Process Inbound Emails | AWS SES

Vipin Kumar
3 min read · Jan 15, 2021


Problem Statement: An enterprise is integrating with a third-party vendor that provides some data. The vendor can only deliver the data via weekly emails, as attachments. What can the enterprise do to create an automated email pipeline that stores this data in a database?

Solution: The following services can be combined to achieve this.

  1. Amazon SES: Amazon’s Simple Email Service can not only send emails but also receive them and take predefined actions on each incoming message.
  2. Amazon S3: Amazon SES offers S3 as one of those action types, storing inbound emails in a bucket.
  3. AWS Lambda: Since Lambda is a serverless compute platform, functions can be used to read inbound emails from S3, extract the attachments, and process them.
  4. Amazon DynamoDB: AWS Lambda can store data in any datastore of choice. DynamoDB is a serverless database that integrates well with AWS Lambda.

Step 1: Create a receipt rule in SES to receive emails and store them directly in S3. While creating this rule, make sure to create a new S3 bucket when prompted to provide one, so that the bucket policy allowing SES to write to it is set up automatically.

Reference: https://docs.aws.amazon.com/ses/latest/DeveloperGuide/receiving-email-getting-started.html
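If you prefer scripting over the console, the same rule can be created with the AWS SDK for JavaScript. This is only a sketch: the rule set name, rule name, recipient address, and bucket name below are placeholders, and the rule set must already exist and be active.

```javascript
// Sketch: create an SES receipt rule that stores inbound mail in S3 (aws-sdk v2).
const AWS = require('aws-sdk');
const ses = new AWS.SES({ region: 'us-east-1' }); // SES email receiving is region-specific

ses.createReceiptRule({
  RuleSetName: 'default-rule-set',            // assumed: an existing, active rule set
  Rule: {
    Name: 'store-inbound-email',
    Enabled: true,
    Recipients: ['data@example.com'],          // assumed recipient address
    Actions: [
      {
        S3Action: {
          BucketName: 'inbound-email-bucket',  // assumed bucket from this step
          ObjectKeyPrefix: 'emails/',
        },
      },
    ],
  },
}).promise()
  .then(() => console.log('Receipt rule created'))
  .catch(console.error);
```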

Step 2: Create a DynamoDB table to store the data parsed from the email attachments. A sort key won’t be required for this demo; a partition key “id” is sufficient.

Reference: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/getting-started-step-1.html
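The table can also be created programmatically if you prefer. This is a sketch; the table name is an assumption, and on-demand billing is chosen just to avoid capacity planning for the demo.

```javascript
// Sketch: create the DynamoDB table with a single "id" partition key (aws-sdk v2).
const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB({ region: 'us-east-1' });

dynamodb.createTable({
  TableName: 'email-attachment-data',          // assumed table name
  AttributeDefinitions: [{ AttributeName: 'id', AttributeType: 'S' }],
  KeySchema: [{ AttributeName: 'id', KeyType: 'HASH' }],
  BillingMode: 'PAY_PER_REQUEST',              // on-demand, no capacity planning needed
}).promise()
  .then(() => console.log('Table created'))
  .catch(console.error);
```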

Step 3: Create a Lambda function (a sketch is shown after the dependency note below) that retrieves the stored email from the S3 bucket, extracts the CSV data file attachments, parses them, and stores the rows in the DynamoDB table created in Step 2.

Make sure your Lambda function’s execution role has the following permissions:

  1. s3:GetObject
  2. dynamodb:PutItem

Note: you will also need a couple of dependencies for this, csv-parser and mailparser from npm. I would suggest creating a Node.js package locally and then uploading a zip file, including the node_modules folder, through the Lambda console.
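A minimal sketch of such a handler is shown below. It assumes the bucket and table from the earlier steps, a TABLE_NAME environment variable pointing at the DynamoDB table, CSV attachments with a header row, and an id built from the S3 object key plus the row index; all of these are illustrative choices, not requirements.

```javascript
// Sketch of the Lambda handler: fetch the raw email from S3, extract CSV
// attachments with mailparser, parse rows with csv-parser, and write them
// to DynamoDB. TABLE_NAME is assumed to be set as an environment variable.
const AWS = require('aws-sdk');
const { simpleParser } = require('mailparser');
const csv = require('csv-parser');
const { Readable } = require('stream');

const s3 = new AWS.S3();
const documentClient = new AWS.DynamoDB.DocumentClient();
const TABLE_NAME = process.env.TABLE_NAME;

exports.handler = async (event) => {
  // The S3 notification contains the bucket and key of the stored email
  const record = event.Records[0];
  const bucket = record.s3.bucket.name;
  const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));

  // Fetch the raw MIME message that the SES receipt rule wrote to the bucket
  const email = await s3.getObject({ Bucket: bucket, Key: key }).promise();

  // Parse the MIME message to get its attachments
  const parsed = await simpleParser(email.Body);

  for (const attachment of parsed.attachments) {
    if (!attachment.filename || !attachment.filename.toLowerCase().endsWith('.csv')) {
      continue; // only CSV attachments are of interest here
    }

    // Stream the attachment buffer through csv-parser to get one object per row
    const rows = await new Promise((resolve, reject) => {
      const results = [];
      Readable.from(attachment.content)
        .pipe(csv())
        .on('data', (row) => results.push(row))
        .on('end', () => resolve(results))
        .on('error', reject);
    });

    // Store each row as an item; the id is derived from the object key and row index
    let rowIndex = 0;
    for (const row of rows) {
      await documentClient.put({
        TableName: TABLE_NAME,
        Item: { id: `${key}#${rowIndex++}`, ...row },
      }).promise();
    }
  }
};
```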

Step 4: Create an S3 notification on the bucket created in Step 1 to invoke the Lambda function whenever a new object is created.

Reference: https://docs.aws.amazon.com/lambda/latest/dg/with-s3-example.html#with-s3-example-configure-event-source
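The console reference above is the easiest route, but the same wiring can be scripted. The sketch below uses placeholder names and ARNs for the function and the Step 1 bucket, and shows the usual two-part setup: a permission letting S3 invoke the function, followed by the bucket notification itself.

```javascript
// Sketch: let the bucket invoke the Lambda function on new objects (aws-sdk v2).
const AWS = require('aws-sdk');
const lambda = new AWS.Lambda({ region: 'us-east-1' });
const s3 = new AWS.S3({ region: 'us-east-1' });

async function configureNotification() {
  // Grant S3 permission to invoke the function
  await lambda.addPermission({
    FunctionName: 'process-inbound-email',            // assumed function name
    StatementId: 's3-invoke',
    Action: 'lambda:InvokeFunction',
    Principal: 's3.amazonaws.com',
    SourceArn: 'arn:aws:s3:::inbound-email-bucket',   // assumed bucket from Step 1
  }).promise();

  // Send every ObjectCreated event in the bucket to the function
  await s3.putBucketNotificationConfiguration({
    Bucket: 'inbound-email-bucket',
    NotificationConfiguration: {
      LambdaFunctionConfigurations: [
        {
          // placeholder account id and function ARN
          LambdaFunctionArn: 'arn:aws:lambda:us-east-1:123456789012:function:process-inbound-email',
          Events: ['s3:ObjectCreated:*'],
        },
      ],
    },
  }).promise();
}

configureNotification().catch(console.error);
```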

Result: Once you send an email with CSV attachments, you will see your DynamoDB table populated with the data.

This solution can be further enhanced by integrating with other AWS services to create a richer architecture. For example:

  1. Amazon Simple Notification Service (SNS): Once an email is received via SES, another action can be added to send a notification over multiple channels announcing that a new email has arrived. The same can be done when the processing inside Lambda completes.
  2. Amazon Simple Queue Service (SQS): If the volume/frequency of inbound emails is very high and your Lambda function cannot keep up, then instead of invoking the Lambda function directly from S3, push a message to SQS; Lambda can then pull the messages off the queue in batches (see the sketch after this list).
  3. Amazon API Gateway: You can provision a combination of API Gateway and a Lambda function on top of the DynamoDB table to create a REST API that other clients can consume.
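For the SQS variant, the handler from Step 3 changes only slightly: S3 pushes its notification to a queue, and Lambda receives the messages in batches, with the original S3 event JSON in each message body. A rough sketch, assuming the Step 3 logic has been factored into a hypothetical processEmailObject(bucket, key) helper:

```javascript
// Sketch: consume S3 notifications from an SQS queue in batches.
// processEmailObject is a hypothetical helper containing the Step 3 logic.
const { processEmailObject } = require('./processEmailObject');

exports.handler = async (event) => {
  // Each SQS record body carries the original S3 notification JSON
  for (const sqsRecord of event.Records) {
    const s3Event = JSON.parse(sqsRecord.body);
    for (const s3Record of s3Event.Records || []) {
      const bucket = s3Record.s3.bucket.name;
      const key = decodeURIComponent(s3Record.s3.object.key.replace(/\+/g, ' '));
      await processEmailObject(bucket, key);
    }
  }
};
```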
