Just as we have an event mapping for DynamoDB, AWS also provides an event mapping for SNS-to-Lambda integration. A Lambda function can be triggered each time a new message is published to an existing SNS topic. When triggered, the function can perform tasks such as reading the contents of the message payload, processing it, or even forwarding it to other AWS services that can use the SNS notification to perform some action. An important thing to note while using SNS event mappings is that SNS invokes Lambda functions asynchronously. If Lambda is able to process the SNS event successfully, it sends back a successful delivery status. In the case of errors, SNS will try to invoke the particular function up to three times, after which it will log a failure message that can be viewed from Amazon CloudWatch.
Now, onward with the use case. This particular use case is a fairly simple representation of user registration: a username is published to an SNS topic which, in turn, triggers a Lambda function that reads the SNS message payload, generates an MD5 checksum of the supplied username, and writes the first 10 characters of that checksum to a DynamoDB table.
To get started with the use case, we first create a corresponding directory structure. Type in the following command:
# mkdir ~/workdir/apex/event_driven/functions/mySnsToLambdaFunc
With the directory created, we only need to create a function.dev.json file and an index.js file here as well. Remember, the function.dev.json file is unique to each use case, so in this case the file will contain the following set of instructions:
{
  "description": "Node.js lambda function using sns as a trigger to generate an md5 of the message received and store it in the database",
  "role": "arn:aws:iam::<account_id>:role/myLambdaSNSFuncRole",
  "handler": "index.handler",
  "environment": {}
}
Next, create the corresponding IAM role that grants the Lambda function permissions to create and publish logs in CloudWatch, as well as to put items into a particular DynamoDB table:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "myLogsPermissions",
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": [
        "*"
      ]
    },
    {
      "Sid": "myDynamodbPermissions",
      "Effect": "Allow",
      "Action": [
        "dynamodb:PutItem"
      ],
      "Resource": [
        "arn:aws:dynamodb:us-east-1:<account_id>:table/LambdaTriggerSNS"
      ]
    }
  ]
}
Remember, the IAM role will not get pushed to AWS IAM by Apex. For now, you will have to create it by some other means, such as the IAM console or the AWS CLI.
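If you choose the CLI route, a minimal sketch would look like the following. Note that the policy name (myLambdaSNSFuncPolicy) and the file names used here are just placeholders; policy.json is assumed to contain the permissions document shown above, and trust.json the standard Lambda trust policy:
# cat > trust.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF
# aws iam create-role --role-name myLambdaSNSFuncRole \
    --assume-role-policy-document file://trust.json
# aws iam put-role-policy --role-name myLambdaSNSFuncRole \
    --policy-name myLambdaSNSFuncPolicy \
    --policy-document file://policy.json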
Finally, create the index.js file and paste the code as provided here: https://github.com/PacktPublishing/Mastering-AWS-Lambda.
The first section of the code is fairly understandable on its own. We check whether the message string is empty, null, or undefined; if so, we simply return the callback() with an error message. Otherwise, we create an MD5 checksum of the supplied message and slice off its first 10 characters:
function getMessageHash(message, hashCB){
    if(message === ""){
        return hashCB("Message is empty");
    }
    else if((message === null) || (message === undefined)){
        return hashCB("Message is null or undefined");
    }
    else{
        var crypto = require('crypto');
        var messageHash = crypto.createHash('md5').update(message).digest("hex");
        return hashCB(null, messageHash.slice(0,10));
    }
}
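To get a feel for the helper before wiring it up to SNS, you can exercise it from a throwaway local script; the username used here is purely an example:
// Hypothetical local check, e.g. node test.js, with getMessageHash pasted in or exported
getMessageHash("john_doe", function(err, hash){
    if(err){
        console.error(err);
    }
    else{
        console.log(hash); // prints the first 10 hex characters of the MD5 checksum
    }
});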
The second piece is where we define the insert function that will populate the DynamoDB table.
function insertItem(insertParams, insertCB){
    var AWS = require('aws-sdk');
    AWS.config.update({
        region: "us-east-1",
        endpoint: "http://dynamodb.us-east-1.amazonaws.com"
    });
    var dynamodb = new AWS.DynamoDB({apiVersion: '2012-08-10'});
    dynamodb.putItem(insertParams, function(err, data) {
        if(err){
            insertCB(err);
        }
        else{
            insertCB(null, data);
        }
    });
}
And finally, we have the handler of our function defined.
exports.handler = (event, context, callback) => {
    var tableName = "LambdaTriggerSNS";
    var message, recordVal;
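    // NOTE: the remainder of the handler shown here is only a sketch; the complete,
    // canonical version is part of the code available at the GitHub link given earlier.
    // It assumes the standard SNS event shape (event.Records[0].Sns.Message) and uses
    // an illustrative attribute name (md5Hash) for the shortened checksum.
    message = event.Records[0].Sns.Message;
    getMessageHash(message, function(hashErr, hashValue){
        if(hashErr){
            return callback(hashErr);
        }
        recordVal = hashValue;
        // Build the low-level PutItem parameters for the table created later in this section
        var insertParams = {
            TableName: tableName,
            Item: {
                "userName": { "S": message },   // the table's primary key
                "md5Hash": { "S": recordVal }   // assumed attribute name for the 10-character checksum
            }
        };
        insertItem(insertParams, function(insertErr){
            if(insertErr){
                return callback(insertErr);
            }
            return callback(null, "Successfully inserted the item");
        });
    });
};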
With the basic steps done, your work directory should resemble the following screenshot:

With this, we are ready to package and upload the function to Lambda. To do so, simply run the following command from your project directory:
# apex --env dev deploy mySnsToLambdaFunc
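If you want to confirm what got deployed, Apex can also list the functions in the project along with their current configuration (the exact output varies a bit between Apex versions):
# apex --env dev list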
Next up, create a simple DynamoDB table and give it the same name used in the function's IAM role, that is, LambdaTriggerSNS. Make sure the Primary key of the table is set to userName. Accept the default settings for the table, and click on Create to complete the process.
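If you prefer the AWS CLI over the console for this step, a roughly equivalent command would be the following; the read and write capacity values of 5 simply mirror the console defaults, so adjust them as needed:
# aws dynamodb create-table \
    --table-name LambdaTriggerSNS \
    --attribute-definitions AttributeName=userName,AttributeType=S \
    --key-schema AttributeName=userName,KeyType=HASH \
    --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=5 \
    --region us-east-1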
Similarly, go ahead and create the corresponding SNS topic. Log in to the AWS Management Console and select the SNS service from the main landing page. Next, create a simple topic by selecting the Topics option from the navigation pane to the left of the SNS dashboard. Click on Create topic and fill out the Topic name and Display name for your topic in the pop-up dialog box. Click Create topic once done.
With the topic created, the only thing left to do is to subscribe the Lambda function to it. To do so, select the newly created topic and, from the Actions tab, select the Subscribe to topic option. This will bring up the Create subscription dialog, as shown in the following screenshot (a CLI alternative for creating the topic and the subscription is sketched after the list):

- Topic ARN: Provide the SNS Topic ARN here
- Protocol: Select AWS Lambda from the dropdown list
- Endpoint: From the drop-down list, select the ARN of our deployed function
- Version or alias: You can leave this at its default value for now; however, you can always use the $LATEST flag to point to the latest version of your function code
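If you would rather script the topic and subscription instead of using the console, something along the following lines should work. One caveat: unlike the console flow, subscribing through the CLI does not automatically grant SNS permission to invoke your function, so the add-permission call is required. The topic name, statement ID, and function name placeholders below are purely illustrative:
# aws sns create-topic --name myLambdaTriggerTopic
# aws sns subscribe \
    --topic-arn arn:aws:sns:us-east-1:<account_id>:myLambdaTriggerTopic \
    --protocol lambda \
    --notification-endpoint arn:aws:lambda:us-east-1:<account_id>:function:<your_function_name>
# aws lambda add-permission \
    --function-name <your_function_name> \
    --statement-id sns-invoke-permission \
    --action "lambda:InvokeFunction" \
    --principal sns.amazonaws.com \
    --source-arn arn:aws:sns:us-east-1:<account_id>:myLambdaTriggerTopic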
Verify that the event mapping was indeed created successfully by viewing the Triggers tab of your function; you should see the SNS trigger configured there automatically. So, that should pretty much do it! You can now go ahead and test the event mapping. To do so, simply publish a username as a message in your topic using the SNS dashboard itself. Back at Lambda, our deployed function will automatically get triggered once SNS publishes the message. It will read the contents of the message payload, create an MD5 checksum of it, take only the first 10 characters of the checksum, and store them in the DynamoDB table we created a while back.
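The test can also be driven entirely from the CLI; a quick, illustrative round trip (the username value is just an example) might look like this:
# aws sns publish \
    --topic-arn arn:aws:sns:us-east-1:<account_id>:myLambdaTriggerTopic \
    --message "john_doe"
# aws dynamodb scan --table-name LambdaTriggerSNS --region us-east-1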
You can verify the output by viewing the logs of your function's execution using Amazon CloudWatch as well:

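Apex can also tail these CloudWatch logs for you straight from the command line; assuming the function name used throughout this use case, the following should print the most recent invocations:
# apex --env dev logs mySnsToLambdaFunc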