
💃 Close the Loop on that AWS Integration 🕺

ChaCha Style 😊

Good morning, Salesforce Nerds! A few weeks ago we reviewed how to set up an OOTB integration with AWS. Super easy ➕ powerful stuff that lets you send data asynchronously between Salesforce and AWS. Today, we're going to look at how to close the loop and configure the bi-directional ↔️ piece of this design! Put your integration hats 🎩 on today, folks!

Agenda for today includes:

  • Foundations

  • Some witty title

  • Daily Principle

  • All the Memes

Foundations

✋ Before we move on ✋

AWS EventBridge enforces a limit on the size of the payload placed onto its event bus. That limit is 256 KB. Not much data. This works well for small payloads, but how likely is it that we can guarantee we'll be returning a small payload every time? What if AWS needs to send Salesforce something larger? 🤔

Enterprise Integration Patterns to the rescue! ⛑️ Specifically, the Claim Check pattern!

At a high level, this is an approach for creating additional capacity on an event bus: persist the payload into an external datastore, then send a reference to the datastore on the bus. 🔥 Let's look at what we need to implement!
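To make the idea concrete, here's a sketch of what a claim-check-style EventBridge entry could look like. The source, detail-type, bucket, and key values are all hypothetical; the point is that the event carries only a small reference, not the large payload itself.

```json
{
  "Source": "com.example.myapp",
  "DetailType": "LargePayloadReady",
  "Detail": "{ \"s3Bucket\": \"my-claim-check-bucket\", \"s3ObjectKey\": \"payloads/order-1234.json\" }"
}
```

The consumer on the other side uses the `s3ObjectKey` to fetch the full payload from the datastore, so the event itself stays comfortably under the 256 KB limit.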

Some witty title

✅ AWS Configuration

Let's start out by making sure we're set up on the AWS side. We just need to set up a few things.

  • S3 Bucket | This will serve as the external datastore that will hold the large payload. Once the data is saved here, our application will pass the "S3 Object Key" on the event bus as a reference to the full payload.

  • EventBridge API Destination | This is an OAuth connection + callout details to a Salesforce endpoint. Be sure to use an integration account for the connection details. 😎 AWS will pass the data placed onto the event bus as the payload to the endpoint. So, we're going to spin up a custom endpoint to consume this payload.

  • EventBridge Rule | This is used by the event bus to route events to the right place. Set up a rule and set the Target = the EventBridge API Destination you just created.
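For the rule, the event pattern might look like the sketch below. These values are hypothetical and must match whatever source and detail-type your producer actually puts on the bus:

```json
{
  "source": ["com.example.myapp"],
  "detail-type": ["LargePayloadReady"]
}
```

Any event matching this pattern gets routed to the API Destination target, which then POSTs it to our Salesforce endpoint.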

✅ Salesforce Configuration

Nice, we're set up for success on the AWS side. Time for some Salesforce work. Just a few things to do here, too. Be warned: this side will require some Apex, so have that code ninja ready! 🥷

  • S3 Named Credential | This makes reaching into the S3 Bucket we created earlier soooooo easy. Here's a how-to. It's actually very easy to set up and use afterward!

  • Custom Endpoint | This is an Apex class exposed as a REST endpoint. Earlier, we configured the EventBridge API Destination to hit this endpoint with its payload. We'll see some Apex here to read the payload into a data structure you can work with. Apex has built-in support for this. 💙

  • Apex Callout | This is the part that ties it all together! Write a bit of code that will use the S3 Named Credential + S3 Object Key from the payload passed in by AWS to callout to S3 and read the file.
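The custom endpoint and callout pieces above can be sketched together in Apex. This is a minimal sketch, not a production implementation: the URL mapping, the `AWS_S3` Named Credential name, and the `s3ObjectKey` field in the event detail are all assumptions — swap in your own names.

```apex
// Hypothetical REST endpoint that the EventBridge API Destination will POST to.
@RestResource(urlMapping='/claimcheck/*')
global with sharing class ClaimCheckEndpoint {

    @HttpPost
    global static void handleEvent() {
        // Read the raw EventBridge payload into a generic data structure.
        String body = RestContext.request.requestBody.toString();
        Map<String, Object> event =
            (Map<String, Object>) JSON.deserializeUntyped(body);

        // Assumption: the producer placed the S3 Object Key in the event detail.
        Map<String, Object> detail = (Map<String, Object>) event.get('detail');
        String objectKey = (String) detail.get('s3ObjectKey');

        // Claim check: use the reference to fetch the full payload from S3.
        String payload = fetchFromS3(objectKey);
        // ... process the full payload here ...
    }

    // Callout to S3 via a Named Credential (assumed name: AWS_S3).
    private static String fetchFromS3(String objectKey) {
        HttpRequest req = new HttpRequest();
        req.setMethod('GET');
        req.setEndpoint('callout:AWS_S3/' + objectKey);
        HttpResponse res = new Http().send(req);
        return res.getBody();
    }
}
```

Because the Named Credential handles authentication, the callout is just `callout:AWS_S3/` plus the object key — no signing code needed in Apex.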

✅ Recap

So let's look this over again. We're using an S3 bucket to create additional capacity on the AWS event bus. Our application persists larger payloads there, in the external datastore, and then sends the S3 Object Key to Salesforce as a reference back to the datastore. Salesforce is configured to easily reach back into S3 with a Named Credential and has code that will read the S3 Object Key, fetch the file, and read the data.

Daily Principle

"External things are not the problem. Itโ€™s your assessment of them. Which you can erase right now."

Marcus Aurelius

and now... Your Daily Memes
