Moving an On-Premises Data Store to AWS DynamoDB: Achieving Triggers and Updates Easily

Question

Your development team is currently planning to move an on-premises data store to AWS DynamoDB.

The prior database had triggers defined that were used to act on updates to existing items.

How can you achieve the same behavior in the easiest way possible once the move to DynamoDB is made?

Answers

A. Define triggers directly on the DynamoDB table, as was done in the prior database.
B. Enable DynamoDB Streams on the table and associate the stream with an AWS Lambda function that performs the required logic.
C. Enable DynamoDB Streams on the table and publish the stream records to Amazon SNS.
D. Enable DynamoDB Streams on the table and send the stream records to Amazon SQS.

Explanations

Answer - B.

The AWS Documentation mentions the following.

Amazon DynamoDB is integrated with AWS Lambda so that you can create triggers: pieces of code that automatically respond to events in DynamoDB Streams.

With triggers, you can build applications that react to data modifications in DynamoDB tables.

If you enable DynamoDB Streams on a table, you can associate the stream ARN with a Lambda function that you write.

Immediately after an item in the table is modified, a new record appears in the table's stream.

AWS Lambda polls the stream and invokes your Lambda function synchronously when it detects new stream records.
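To illustrate what an invoked function receives, here is a minimal Python sketch of a Lambda handler for a DynamoDB Streams event source; the handler logic (print statements) and the assumption of a NEW_AND_OLD_IMAGES view type are placeholders rather than anything specified in the question.

```python
# Minimal sketch of a Lambda handler for a DynamoDB Streams event source.
# The processing logic below (print statements) is a placeholder.

def lambda_handler(event, context):
    """React to item-level changes delivered from a DynamoDB stream."""
    for record in event.get("Records", []):
        event_name = record["eventName"]        # INSERT, MODIFY, or REMOVE
        keys = record["dynamodb"].get("Keys", {})

        if event_name == "MODIFY":
            # Old and new item images are present when the stream view type
            # is NEW_AND_OLD_IMAGES.
            old_image = record["dynamodb"].get("OldImage", {})
            new_image = record["dynamodb"].get("NewImage", {})
            print(f"Item {keys} updated: {old_image} -> {new_image}")
        elif event_name == "INSERT":
            print(f"Item {keys} created")
        elif event_name == "REMOVE":
            print(f"Item {keys} removed")
```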

Option A is invalid since triggers cannot be defined by default in DynamoDB.

Options C and D are invalid since the streams need to be integrated with Lambda.

For more information on using streams with AWS Lambda, please refer to the URL below:

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Streams.Lambda.html

When migrating an on-premises data store to Amazon DynamoDB, triggers defined in the prior database can be replicated using DynamoDB Streams. DynamoDB Streams captures a time-ordered sequence of item-level modifications (creates, updates, and deletes) in a table in near real time, and those change records can then be consumed by services such as AWS Lambda, or forwarded by a consumer to Amazon SNS or Amazon SQS.
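As a rough illustration (the table name "Orders" and the use of the boto3 SDK are assumptions, not part of the question), a stream could be enabled on an existing table like this:

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Enable a stream on an existing table. The view type controls what each
# stream record carries: KEYS_ONLY, NEW_IMAGE, OLD_IMAGE, or NEW_AND_OLD_IMAGES.
response = dynamodb.update_table(
    TableName="Orders",  # hypothetical table name
    StreamSpecification={
        "StreamEnabled": True,
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
)

# The stream ARN is needed to associate the stream with a Lambda function.
stream_arn = response["TableDescription"]["LatestStreamArn"]
print(stream_arn)
```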

Option A is incorrect because DynamoDB does not support triggers natively.

Option B is the correct answer. Developers can define AWS Lambda functions that respond to events from DynamoDB Streams. When a change occurs in a DynamoDB table, such as an item being created, updated, or deleted, a record is written to the table's associated stream. The Lambda function can be configured to be invoked when a matching event is written to the stream, and it can then perform the necessary logic, such as updating other data stores, sending notifications, or executing a business process.
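As a sketch of that configuration, assuming a hypothetical function name, a hypothetical stream ARN, and an optional event filter limited to MODIFY (update) events, the stream can be associated with the Lambda function through an event source mapping:

```python
import json

import boto3

lambda_client = boto3.client("lambda")

# Associate the table's stream ARN with a Lambda function. The filter limits
# invocations to MODIFY (update) events, mirroring the original update triggers.
lambda_client.create_event_source_mapping(
    EventSourceArn=(
        "arn:aws:dynamodb:us-east-1:123456789012:"
        "table/Orders/stream/2024-01-01T00:00:00.000"  # hypothetical ARN
    ),
    FunctionName="orders-update-trigger",  # hypothetical function name
    StartingPosition="LATEST",
    BatchSize=100,
    FilterCriteria={
        "Filters": [
            {"Pattern": json.dumps({"eventName": ["MODIFY"]})}
        ]
    },
)
```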

Option C is incorrect because while SNS can be used to notify subscribers about changes in DynamoDB tables, it does not provide the ability to process or manipulate the data in the same way as AWS Lambda.

Option D is also incorrect because SQS is not designed to execute code in response to events like DynamoDB Streams. It's a message queuing service that enables asynchronous communication between decoupled systems. Developers could use SQS to buffer or batch changes to DynamoDB tables for processing later by AWS Lambda or other services.

In summary, when moving an on-premises data store to DynamoDB, developers can replicate the triggers defined in the prior database by using AWS Lambda functions that are invoked in response to events from DynamoDB Streams.