How to process message queues using Amazon SQS


Amazon Simple Queue Service (Amazon SQS) is a distributed message queuing service.

Queued items in SQS are called messages. They are variable in size but can be no larger than 256 KB. SQS doesn't guarantee delivery order, nor that a message will be delivered only once. Using the visibility timeout, we can ensure that once a message has been retrieved it will not be resent for a given period of time.

In this tutorial, we'll see how to manage SQS queues and messages using boto3.

    import boto3

    # connect to the SQS service with boto3
    sqs = boto3.resource(
        'sqs',
        region_name=AWS_REGION_NAME,
        aws_access_key_id=AWS_ACCESS_KEY_ID,
        aws_secret_access_key=AWS_SECRET_ACCESS_KEY)

In the above code, we connect to the SQS resource in a given region using an access key ID and secret access key with boto3.
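Later sections also use the low-level client API (client.delete_message, client.delete_queue). A minimal sketch of creating that client, assuming the same credential placeholders as above:

    # low-level SQS client, used later for delete_message / delete_queue
    client = boto3.client(
        'sqs',
        region_name=AWS_REGION_NAME,
        aws_access_key_id=AWS_ACCESS_KEY_ID,
        aws_secret_access_key=AWS_SECRET_ACCESS_KEY)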

Creating A Queue:

    queue = sqs.create_queue(QueueName='testqueue', Attributes={'DelaySeconds': '5'})


We must give a queue name and can also pass other attributes such as DelaySeconds (the number of seconds to wait before a message becomes available for processing) or MaximumMessageSize; read-only attributes such as ApproximateNumberOfMessages can be inspected on the queue afterwards.

It returns a Queue resource with a unique queue URL through which we can access the queue and its messages.
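As a quick sanity check, a small sketch (assuming the queue created above) that prints the queue URL and a couple of its attributes:

    # the Queue resource exposes its URL and attributes directly
    print(queue.url)
    print(queue.attributes.get('DelaySeconds'))
    print(queue.attributes.get('ApproximateNumberOfMessages'))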

Connecting to an existing Queue:

After connecting to the service, we can connect to an existing SQS queue by passing its name to the get_queue_by_name method.

    queue = sqs.get_queue_by_name(QueueName=AWS_QUEUE_NAME)

Sending Messages

In SQS, we can send a single message or a batch of messages to a queue using the send_message and send_messages methods.

Adding a single message:

    # send a single message; the Queue resource already knows its URL
    response = queue.send_message(
        MessageBody='message1',
        MessageAttributes={
            'Type': {
                'StringValue': 'String',
                'DataType': 'String'
            }
        }
    )
    print(response)

It returns a message ID and an MD5 digest of the message body for the sent message. We can also attach user-defined attributes to an individual message.
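For example, a minimal sketch that reads those fields back from the response dict returned above:

    # MessageId and the MD5 digest of the body are returned for the sent message
    print(response['MessageId'])
    print(response['MD5OfMessageBody'])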

Adding messages in bulk:

    # each entry needs an Id that is unique within the batch
    response = queue.send_messages(Entries=[
        {
            'Id': '1',
            'MessageBody': 'message1',
            'MessageAttributes': {
                'Type': {
                    'StringValue': 'String',
                    'DataType': 'String'
                }
            }
        },
        {
            'Id': '2',
            'MessageBody': 'message2',
            'MessageAttributes': {
                'Size': {
                    'StringValue': '20',
                    'DataType': 'Number'
                }
            }
        }
    ])
    print(response)

The response contains information about both the successfully sent messages and any failed messages.
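A small sketch of inspecting that response, assuming the batch call above:

    # 'Successful' and 'Failed' list the per-entry results of the batch send
    for ok in response.get('Successful', []):
        print('sent:', ok['Id'], ok['MessageId'])
    for err in response.get('Failed', []):
        print('failed:', err['Id'], err['Code'])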

Processing Messages

    # retrieve a single message from the queue
    message = queue.receive_messages()[0]

    # retrieve a single message along with its 'Type' message attribute
    message = queue.receive_messages(MessageAttributeNames=['Type'])[0]

SQS messages are retrieved in batches; we can retrieve all available messages or filter particular messages based on attribute names in a queue. Each returned Message resource exposes the message body and attributes, which we can process directly.
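A minimal processing loop under these assumptions, draining up to ten messages at a time and deleting each one after it is handled, might look like this:

    while True:
        # long-poll for up to 10 messages, including all message attributes
        messages = queue.receive_messages(
            MessageAttributeNames=['All'],
            MaxNumberOfMessages=10,
            WaitTimeSeconds=5)
        if not messages:
            break
        for message in messages:
            print(message.body, message.message_attributes)
            message.delete()  # remove the message once it has been processed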


Deleting Messages

    # delete a single message with the low-level client, using the
    # receipt handle returned when the message was received
    response = client.delete_message(
        QueueUrl=url,
        ReceiptHandle=RECEIPT_HANDLE
    )

    # delete messages in a batch with the Queue resource
    response = queue.delete_messages(
        Entries=[
            {
                'Id': MESSAGE_ID,
                'ReceiptHandle': RECEIPT_HANDLE
            },
        ]
    )

Here we give the queue URL and the receipt handle (returned when the message was received, not the message ID) of the message to be deleted.
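A small sketch tying this together with the client API (assuming the client and url defined earlier): receive a message, then delete it with its receipt handle:

    resp = client.receive_message(QueueUrl=url, MaxNumberOfMessages=1)
    for msg in resp.get('Messages', []):
        print('processing:', msg['Body'])
        # the receipt handle from this receive call authorises the delete
        client.delete_message(
            QueueUrl=url,
            ReceiptHandle=msg['ReceiptHandle'])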

Delete Queue

    response = client.delete_queue(

        QueueUrl=url

    )

After you delete a queue, you must wait at least 60 seconds before creating a new queue with the same name.
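A small sketch of the same cleanup with the Queue resource, waiting out the 60-second window before recreating a queue with the same (assumed) name:

    import time

    queue.delete()    # delete the queue via the Queue resource
    time.sleep(60)    # SQS requires ~60 seconds before the name can be reused
    queue = sqs.create_queue(QueueName='testqueue')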
