Using AWS Lambda with S3 and DynamoDB

For any application, storage is a major concern, and you can manage your storage well by choosing a good AWS consultant. With AWS we can create applications that users can operate globally from any device.


What is AWS Lambda?

   Simply put, it's a service that runs a given piece of code in response to certain events.

Why Lambda?

    Obviously, we could use the SQS or SNS services for event-based computation, but Lambda makes it easy, and it also logs the code's stdout to CloudWatch Logs.

Using Lambda with S3 and DynamoDB:

     Here we are going to configure a Lambda function so that whenever an object is created in the S3 bucket, the function downloads that file and logs the filename into our DynamoDB table.

Prerequisites:

     Access to S3 and DynamoDB for put and execute operations. Here we assume that you have already created a DynamoDB table whose key is the filename.
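If you haven't created the table yet, here is a minimal sketch of a matching schema. The table name 'dynamodb_table_name' and the key attribute name 'key' are placeholders; make them match what your handler code uses.

```python
# Table definition matching the handler below: a single string hash key
# named 'key' that stores the uploaded object's filename.
# 'dynamodb_table_name' is a placeholder; use your own table name.
table_schema = {
    'TableName': 'dynamodb_table_name',
    'KeySchema': [{'AttributeName': 'key', 'KeyType': 'HASH'}],
    'AttributeDefinitions': [{'AttributeName': 'key', 'AttributeType': 'S'}],
    'ProvisionedThroughput': {'ReadCapacityUnits': 1, 'WriteCapacityUnits': 1},
}

# With boto3 installed and credentials configured, you would create it via:
#   boto3.client('dynamodb').create_table(**table_schema)
```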

1. Go to the AWS console, click on AWS Lambda, then click on 'Create a Lambda function'.
2. You can see blueprints (sample code) for different languages. Choose s3-get-object-python.
3. Select S3 as the event source type and select the desired bucket.
4. The event type should be 'Object Created', as we want to capture events only when objects are created, then click Next.
5. Provide a name for your function. Now you can see the sample code, which imports the boto3 library by default. If you need to include other libraries, you should create a zip file containing the main code file and all required libraries.
       If you upload a zip file whose main code file is named main_file and the handler function inside main_file is lambda_handler, then the 'Handler' option should be set to: main_file.lambda_handler.
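As a sketch of that packaging step, you can build such a zip with Python's standard zipfile module (the file and archive names here are hypothetical; include any vendored library files in the paths list):

```python
import zipfile

def build_bundle(paths, archive='lambda_bundle.zip'):
    """Zip the handler module plus any vendored library files for upload."""
    with zipfile.ZipFile(archive, 'w', zipfile.ZIP_DEFLATED) as zf:
        for path in paths:
            zf.write(path)
    return archive

# e.g. build_bundle(['main_file.py']) produces lambda_bundle.zip,
# to be uploaded with 'Handler' set to main_file.lambda_handler
```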

example main.py file

import traceback

import boto3

s3_client = boto3.client('s3')
dynamodb_client = boto3.client('dynamodb')
table_name = 'dynamodb_table_name'

def lambda_handler(event, context):
    bucket_name = event['Records'][0]['s3']['bucket']['name']
    key = event['Records'][0]['s3']['object']['key']
    if not key.endswith('/'):
        try:
            split_key = key.split('/')
            file_name = split_key[-1]
            # /tmp is the only writable path inside Lambda
            s3_client.download_file(bucket_name, key, '/tmp/' + file_name)
            item = {'key': {'S': file_name}}
            dynamodb_client.put_item(TableName=table_name, Item=item)
        except Exception:
            print(traceback.format_exc())
    return (bucket_name, key)

6. For the role, you can select the S3 execution role.
7. Leave the remaining options at their defaults and click Next.
8. You can enable the event source now, but it's recommended not to until you have tested that the code works, so leave it disabled and create the function.
Now that you have configured Lambda, click on 'Test', which shows the test data; change the bucket name to the desired one and click 'Test'.
This will display the result at the bottom, where you can check the output for any syntax errors or bugs.
Once you are confident that the code is fine, you can enable the event source.
Now go to your bucket and create a file. As you create the file, S3 invokes the Lambda function associated with the bucket. You can check the output in CloudWatch Logs.
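The console's sample test data is an S3 put event. Below is a trimmed sketch of its shape, showing only the fields the handler reads; the bucket name and object key are placeholders:

```python
# Minimal S3 ObjectCreated test event, trimmed to the fields the handler uses.
# Bucket name and object key below are placeholders.
test_event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "your-bucket-name"},
                "object": {"key": "uploads/report.csv"},
            }
        }
    ]
}

bucket_name = test_event['Records'][0]['s3']['bucket']['name']
key = test_event['Records'][0]['s3']['object']['key']
file_name = key.split('/')[-1]
print(bucket_name, key, file_name)
```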

Note:

  • The number of concurrent executions of a Lambda function at any given time depends on how long the function takes and on the number of events per second. So if 10 events are triggered per second and the function takes 3 seconds to complete, the number of concurrent executions will be 10 * 3, i.e. 30 concurrent executions.
  • A function can run for a maximum of 300 seconds. The default timeout is 3 seconds.
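The arithmetic in the first note can be sketched as a steady-state estimate: new events keep arriving while earlier invocations are still running.

```python
def concurrent_executions(events_per_second, duration_seconds):
    # Steady-state concurrency estimate: arrival rate times average
    # function duration.
    return events_per_second * duration_seconds

print(concurrent_executions(10, 3))  # 30, as in the note above
```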

Posted On 28 May 2016 By MicroPyramid

