💃 Close the Loop on that AWS Integration 🕺
ChaCha Style 💃
Good morning, Salesforce Nerds! A few weeks ago we reviewed how to set up an OOTB integration with AWS. Super easy, powerful stuff that lets you send data asynchronously between Salesforce and AWS. Today, we're going to look at how to close the loop and configure the bi-directional ↔️ piece of this design! Put your integration hats 🎩 on today, folks!

Agenda for today includes
Foundations
Some witty title
Daily Principle
All the Memes
Foundations
✋ Before we move on ✋
AWS EventBridge enforces a limit on the size of the payload placed onto its event bus: 256 KB. Not much data. That works well for small payloads, but how likely is it that we can guarantee we'll be returning a small payload every time? What if AWS needs to send Salesforce something larger? 🤔
Enterprise Integration Patterns to the rescue! ⚔️ Specifically, the Claim Check pattern!
From a high level: this is an approach for creating additional capacity on an event bus by persisting the payload into an external datastore and then sending a reference to that datastore on the bus. 🔥 Let's look at what we need to implement!
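To make the idea concrete, here's a rough sketch of what a claim-check event might look like on the bus. The field names under "detail" are illustrative, not a required schema: the full payload lives in S3, and the event carries only a pointer to it.

```json
{
  "detail-type": "LargeResultReady",
  "source": "my.app",
  "detail": {
    "s3Bucket": "my-claim-check-bucket",
    "s3ObjectKey": "payloads/2024/order-12345.json"
  }
}
```

The consumer uses the bucket + key to fetch the real payload, so the event itself stays comfortably under the 256 KB limit.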
Some witty title
✅ AWS Configuration
Let's start out by making sure we're set up on the AWS side. We just need to set up a few things.
S3 Bucket | This will serve as the external datastore that holds the large payload. Once the data is saved here, our application will pass the "S3 Object Key" on the event bus as a reference to the full payload.
EventBridge API Destination | This is an OAuth connection + callout details for a Salesforce endpoint. Be sure to use an integration account for the connection details. 🔐 AWS will pass the data placed onto the event bus as the payload to the endpoint, so we're going to spin up a custom endpoint to consume this payload.
EventBridge Rule | This is used by the event bus to route events to the right place. Set up a rule and set its Target to the EventBridge API Destination you just created.
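As a sketch, the EventBridge rule for the steps above might match our claim-check events with an event pattern like this (the "source" and "detail-type" values are assumptions carried over from the example event, not fixed names):

```json
{
  "source": ["my.app"],
  "detail-type": ["LargeResultReady"]
}
```

With the rule's Target set to the API Destination, any event matching this pattern gets POSTed to the Salesforce endpoint automatically.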
✅ Salesforce Configuration
Nice, we're set up for success on the AWS side. Time for some Salesforce work. Just a few things to do here, too. Be warned: this side will require some Apex, so have that code ninja ready! 🥷
S3 Named Credential | This makes reaching into the S3 Bucket we created earlier soooooo easy. Here's a how-to. It's actually very easy to set up and use afterward!
Custom Endpoint | This is an Apex class exposed as a REST endpoint. Earlier, we configured the EventBridge API Destination to hit this endpoint with its payload. We'll see some Apex here to read the payload into a data structure you can work with. Apex has built-in support for this. 🙌
Apex Callout | This is the part that ties it all together! Write a bit of code that uses the S3 Named Credential + the S3 Object Key from the payload passed in by AWS to call out to S3 and read the file.
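The Custom Endpoint and Apex Callout pieces can be sketched together in one small class. This is a minimal illustration, not production code: the URL mapping, the `detail`/`s3ObjectKey` field names, and the `S3_Claim_Check` Named Credential name are all placeholders you'd swap for your own.

```apex
// Hypothetical endpoint: /services/apexrest/claimcheck
@RestResource(urlMapping='/claimcheck/*')
global with sharing class ClaimCheckReceiver {

    @HttpPost
    global static void receive() {
        // Read the raw JSON body POSTed by the EventBridge API Destination
        String body = RestContext.request.requestBody.toString();
        Map<String, Object> event =
            (Map<String, Object>) JSON.deserializeUntyped(body);

        // Field names depend on how you shaped the event on the AWS side
        Map<String, Object> detail = (Map<String, Object>) event.get('detail');
        String s3Key = (String) detail.get('s3ObjectKey');

        // Use the Named Credential to fetch the full payload from S3.
        // 'S3_Claim_Check' is a placeholder for your Named Credential name.
        HttpRequest req = new HttpRequest();
        req.setMethod('GET');
        req.setEndpoint('callout:S3_Claim_Check/' + s3Key);

        HttpResponse res = new Http().send(req);
        String fullPayload = res.getBody();

        // ...process fullPayload here (e.g. deserialize and upsert records)...
    }
}
```

Note how the Named Credential keeps the S3 host and auth out of the code entirely: the callout endpoint is just `callout:<NamedCredential>/<path>`.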
✅ Recap
So let's look this over again. We're using an S3 bucket to create additional capacity on the AWS event bus. Our application is set up to persist larger payloads there, in the external datastore, and then we send the S3 Object Key to Salesforce as a reference back to the datastore. Salesforce is configured to reach back into S3 easily with a Named Credential, and has code that will read the S3 Object Key, fetch the file, and read the data.
Daily Principle
"External things are not the problem. Itโs your assessment of them. Which you can erase right now."
And now... your Daily Memes



What did you think about today's newsletter?