
I have the following code. It is supposed to receive an SQS message, read the body, then update a DynamoDB record with the information contained in that body. The update is not working, which is one issue, but even stranger, I'm not getting any output from the DynamoDB update at all. The last line of output is the console.log that details the SQS message, and then the function ends.

How is this possible? Shouldn't DynamoDB return some kind of output?

console.log('Loading function');
const util = require('util')
const AWS = require('aws-sdk');
var documentClient = new AWS.DynamoDB.DocumentClient();


exports.handler = async(event) => {
    //console.log('Received event:', JSON.stringify(event, null, 2));
    for (const { messageId, body } of event.Records) {
        //const { body } = event.Records[0];
        //console.log(body)
        console.log('SQS message %s: %j', messageId, body);
        const JSONBody = JSON.parse(body)
        //const message = JSON.parse(test["Message"]);
        const id = JSONBody.id;
        const city = JSONBody.City;
        const address = JSONBody.Address;

        const params = {
            TableName: 'myTable',
            Key: {
                ID: ':id',
            },
            UpdateExpression: 'set address = :address',
            ExpressionAttributeValues: {
                ':id': id,
                ':address': address,
                ':sortKey': "null"
            }
            //ReturnValues: "UPDATED_NEW"
        };

        documentClient.update(params, function(err, data) {
            if (err) console.log(err);
            else console.log(data);
        });
    }
    return `Successfully processed ${event.Records.length} messages.`;
};
  • Is the first console.log logging anything 'SQS message %s: %j'? Commented May 2, 2020 at 19:18
  • Yeah it logs everything correctly Commented May 3, 2020 at 3:01

2 Answers


There are a couple of ways to do this, but the right choice depends on your use case: Are the operations critical? Do failed items need to be handled? Does performance need to be boosted for a large dataset? etc.

// I don't recommend this implementation
const { DynamoDB } = require('aws-sdk');
const documentClient = new DynamoDB.DocumentClient();


exports.handler = async (event) => {
    for (const { messageId, body } of event.Records) {
        console.log('SQS message %s: %j', messageId, body);
        // Parsing JSON without knowing the structure is dangerous; remember to
        // handle any error that occurs
        const JSONBody = JSON.parse(body)
        const id = JSONBody.id;
        const address = JSONBody.Address;

        const params = {
            TableName: 'myTable',
            Key: {
                ID: id, // the actual key value, not the placeholder string ':id'
            },
            UpdateExpression: 'set address = :address',
            ExpressionAttributeValues: {
                // only values referenced in the UpdateExpression may appear here;
                // unused values such as ':id' or ':sortKey' cause a ValidationException
                ':address': address
            },
            ReturnValues: "UPDATED_NEW"
        };

        // Wait for each update operation to finish before starting the next;
        // updates run sequentially, so total IO time grows with the record count
        await documentClient.update(params)
            .promise()
            .then(res => {
                console.log(res)
            })
            .catch(err => {
                console.error(err);
            })
    }

    // Even if an update operation fails, this message is still returned by the Lambda handler
    return `Successfully processed ${event.Records.length} messages.`;
};
// My recommended way
const AWS = require('aws-sdk');
const documentClient = new AWS.DynamoDB.DocumentClient();


exports.handler = async (event) => {
    // All the update operations are fired nearly concurrently,
    // so total IO time is reduced
    return Promise.all(event.Records.map(({ messageId, body }) => {
        console.log('SQS message %s: %j', messageId, body);
        // Parsing JSON without knowing the structure is dangerous; remember to
        // handle any error that occurs
        const JSONBody = JSON.parse(body)
        const id = JSONBody.id;
        const address = JSONBody.Address;

        const params = {
            TableName: 'myTable',
            Key: {
                ID: id, // the actual key value, not the placeholder string ':id'
            },
            UpdateExpression: 'set address = :address',
            ExpressionAttributeValues: {
                // only values referenced in the UpdateExpression may appear here
                ':address': address
            },
            ReturnValues: "UPDATED_NEW"
        };

        return documentClient.update(params)
            .promise()
            .then(res => {
                console.log(res)
            })
    }))
        // When all the updates have finished, the handler returns a string
        .then(() => {
            return `Successfully processed ${event.Records.length} messages.`
        })
        // If any update fails, Promise.all rejects immediately (note that the
        // in-flight updates are not cancelled) and the handler returns undefined
        .catch(error => {
            console.error(error);
            // return some error for lambda response.
        })
};
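
If failed items do need to be handled individually (one of the questions above), here is a minimal sketch of an alternative handler body, assuming a Node.js runtime with Promise.allSettled (12.9+) and a hypothetical updateRecord helper that wraps documentClient.update(...).promise() for a single record (neither is part of the original answer):

// updateRecord is a hypothetical helper wrapping the update call for one SQS record
const results = await Promise.allSettled(
    event.Records.map(record => updateRecord(record))
);
// Each result is { status: 'fulfilled', value } or { status: 'rejected', reason },
// so failed records can be collected for retry or a dead-letter queue
const failed = results.filter(result => result.status === 'rejected');
console.log('%d of %d updates failed', failed.length, results.length);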

P/s: My two cents: before you do any kind of Lambda development with the Node.js runtime, you should understand the differences between callbacks, promises, and async/await in JavaScript.
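
To illustrate the difference (a minimal sketch, reusing the params and documentClient from above): the callback style only starts the request, while awaiting the promise form suspends the handler until the call completes.

// Callback style inside an async handler: update() starts the request and
// returns immediately; the handler's promise resolves first, Lambda freezes
// the execution environment, and the callback may never run, so nothing is logged
documentClient.update(params, (err, data) => {
    if (err) console.error(err);
    else console.log(data);
});

// Promise style: .promise() converts the AWS.Request into a Promise, and
// await suspends the handler until DynamoDB responds
const data = await documentClient.update(params).promise();
console.log(data);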


1 Comment

Thanks trmaphi, in my case it was fine to just make the function synchronous

Fixed it by making the handler synchronous, i.e., removed async from the function definition. With a non-async handler, the Node.js runtime waits (by default) for the event loop to drain before ending the invocation, which gives the callback-based update time to complete.
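
A minimal sketch of that shape, assuming the question's documentClient, table, and message format: the non-async handler relies on the runtime's default callbackWaitsForEmptyEventLoop behavior, so the callback-based update finishes before the invocation ends.

const AWS = require('aws-sdk');
const documentClient = new AWS.DynamoDB.DocumentClient();

// Non-async (callback-style) handler: callbackWaitsForEmptyEventLoop defaults
// to true, so Lambda waits for the event loop to empty before returning the
// response, giving the in-flight update callbacks time to run and log
exports.handler = (event, context, callback) => {
    for (const { body } of event.Records) {
        const { id, Address: address } = JSON.parse(body);
        documentClient.update({
            TableName: 'myTable',
            Key: { ID: id },
            UpdateExpression: 'set address = :address',
            ExpressionAttributeValues: { ':address': address }
        }, (err, data) => {
            if (err) console.error(err);
            else console.log(data);
        });
    }
    callback(null, `Successfully processed ${event.Records.length} messages.`);
};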

