Triggering Multiple Lambdas from a Single S3 Event

We had a use case that led me to write this, and I am sure many of you have faced a similar situation: on a put event in S3, we wanted to trigger two Lambda functions that copy data into different Redshift clusters in different regions.

As we know, we cannot have multiple Lambdas triggered directly from a single S3 event, because S3 is currently limited to a single event notification per event type. Trying to add more than one Lambda function for the same event results in an overlap error, so we have to look at alternative architectures. There is a comprehensive blog post from Amazon that details the solution, called Fanout S3 Event Notifications to Multiple Endpoints. Based on that detailed post, here is a simplified, shorter version covering the approach described there and the alternative approach that we implemented.

Note: Our use case was copying data into Redshift, but the pattern can easily be adapted to your own use case.

Approach One (mentioned in the blog): The first solution is to use an SNS topic to forward information from an S3 event to multiple Lambda functions.
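As a rough sketch of the wiring for this fanout, the bucket's notification configuration points object-created events at the SNS topic instead of a Lambda. The bucket name and topic ARN below are placeholders, not values from our setup:

```javascript
// Sketch: route S3 object-created events to an SNS topic, which then
// fans out to both Lambda functions. Names/ARNs are placeholders.
function buildNotificationParams(bucket, topicArn) {
  return {
    Bucket: bucket,
    NotificationConfiguration: {
      TopicConfigurations: [{
        TopicArn: topicArn,
        Events: ['s3:ObjectCreated:Put']
      }]
    }
  };
}

var params = buildNotificationParams(
  'my-source-bucket',
  'arn:aws:sns:us-east-1:123456789012:s3-fanout-topic'
);
// With the AWS SDK for JavaScript this would be applied via:
// new AWS.S3().putBucketNotificationConfiguration(params, callback);
```

Both Lambda functions are then subscribed to the topic, so each put event reaches both of them.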
So to overcome this, we use a single event notification on our S3 bucket, which is sent to an SNS topic. This SNS topic is then configured as the event trigger for both Lambda functions. The only downside of this implementation is that your current configuration (if you have one) needs to be changed so that the S3 event notification goes to the SNS topic. Your existing Lambda function also needs to be amended to include a function that extracts the S3 event object from the SNS message. In NodeJS, this function would look like the following sample:
function getSNSMessageObject(msgString) {
  // Strip the escape backslashes and the outer quotes,
  // then parse the remaining string as JSON
  var x = msgString.replace(/\\/g, '');
  var y = x.substring(1, x.length - 1);
  var z = JSON.parse(y);
  return z;
}
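For illustration, here is how that helper could be used inside the SNS-triggered handler to recover the bucket and key. The helper is repeated so the sketch is self-contained, and the sample event below only mimics the double-encoded shape the helper expects; field values are made up:

```javascript
// Assumption: the S3 event arrives double-encoded inside
// event.Records[0].Sns.Message, as the helper above expects.
function getSNSMessageObject(msgString) {
  var x = msgString.replace(/\\/g, '');
  var y = x.substring(1, x.length - 1);
  return JSON.parse(y);
}

// Pull the bucket name and object key out of the SNS-wrapped S3 event
function extractS3Record(event) {
  var s3Event = getSNSMessageObject(event.Records[0].Sns.Message);
  var record = s3Event.Records[0];
  return {
    bucket: record.s3.bucket.name,
    key: record.s3.object.key
  };
}

// Sample event shaped like an SNS-wrapped S3 put notification (made up):
var sampleEvent = {
  Records: [{
    Sns: {
      Message: '"{\\"Records\\":[{\\"s3\\":{\\"bucket\\":{\\"name\\":\\"my-bucket\\"},' +
               '\\"object\\":{\\"key\\":\\"data/file.csv\\"}}}]}"'
    }
  }]
};
```

From there, each Lambda function proceeds exactly as it would with a direct S3 trigger, using the extracted bucket and key.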
Approach Two (alternative): We could keep the current configuration of S3 -> Lambda -> Redshift and replicate these resources as a test environment. Once that is done, we amend our current Lambda function so that it not only sends our S3 object to Redshift but also copies it to another S3 bucket, which acts as the source for the test Lambda and test Redshift.
The benefit of this solution is that the test environment can be configured as an exact replica of the current environment, and the existing Lambda function only needs one additional operation: after sending the S3 object to Redshift, it also copies the object to the test source bucket.

Of the two options outlined above, we implemented the second, as it had the least impact on our currently functioning resources. However, I would suggest reviewing both options to determine which one is more applicable to your desired result. Hope this helps you in some way!
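That second operation can be sketched as follows. The bucket names are placeholders, and the actual copy would go through the AWS SDK's S3 copyObject call:

```javascript
// Sketch: after loading the object into Redshift, also copy it to the
// test source bucket so the test pipeline fires. Names are placeholders.
function buildCopyParams(sourceBucket, key, testBucket) {
  return {
    Bucket: testBucket,                   // destination: the test source bucket
    CopySource: sourceBucket + '/' + key, // "bucket/key" of the original object
    Key: key                              // keep the same key in the test bucket
  };
}

var params = buildCopyParams('prod-source-bucket', 'data/file.csv', 'test-source-bucket');
// With the AWS SDK for JavaScript:
// new AWS.S3().copyObject(params, function(err, data) { /* handle result */ });
```

Because the copy lands in the test source bucket, its own S3 event notification triggers the test Lambda, and the two pipelines stay fully independent.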