AWS Certified Developer - Associate: DynamoDB Table Update and Secondary Table Insertion

Inserting Records into a Secondary Table on DynamoDB Update


Question

Your application currently interacts with a DynamoDB table.

Records are inserted into the table via the application.

There is now a requirement to ensure that another record is inserted into a secondary table whenever items are updated in the DynamoDB primary table.

Which of the following features should be used when developing such a solution?

Answers

Explanations


A. AWS DynamoDB Encryption
B. AWS DynamoDB Streams
C. AWS DynamoDB Accelerator (DAX)
D. AWS Table Accelerator

Answer - B.

This is also mentioned as a use case in the AWS Documentation.

DynamoDB Streams Use Cases and Design Patterns.

This post describes some common use cases you might encounter, along with their design options and solutions, when migrating data from relational data stores to Amazon DynamoDB. We will consider how to manage the following scenarios:

How do you set up a relationship across multiple tables in which, based on the value of an item from one table, you update the item in a second table?

How do you trigger an event based on a particular transaction?

How do you audit or archive transactions?

How do you replicate data across multiple tables (similar to that of materialized views/streams/replication in relational data stores)?

Relational databases provide native support for transactions, triggers, auditing, and replication.

Typically, a transaction in a database refers to performing create, read, update, and delete (CRUD) operations against multiple tables in a block.

A transaction can have only two states: success or failure.

In other words, there is no partial completion.

As a NoSQL database, DynamoDB is not designed to support transactions.

Although client-side libraries are available to mimic transaction capabilities, they are neither scalable nor cost-effective.

For example, the Java Transaction Library for DynamoDB creates 7N+4 additional writes for every write operation.

This is partly because the library holds metadata to manage transactions and ensure that they are consistent and can be rolled back before commit.

You can use DynamoDB Streams to address all these use cases.

DynamoDB Streams is a powerful service that you can combine with other AWS services to solve many similar problems.

When enabled, DynamoDB Streams captures a time-ordered sequence of item-level modifications in a DynamoDB table and durably stores the information for up to 24 hours.

Applications can access a series of stream records, which contain an item change, from a DynamoDB stream in near real-time.
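As a rough illustration of how streaming is turned on, the sketch below (a minimal example assuming boto3 and a hypothetical table named `PrimaryTable`) enables a stream that captures both old and new item images:

```python
import boto3

# Hypothetical table name used for illustration only.
TABLE_NAME = "PrimaryTable"

dynamodb = boto3.client("dynamodb")

# Enable a stream that records both the old and new item images
# for every item-level modification on the table.
response = dynamodb.update_table(
    TableName=TABLE_NAME,
    StreamSpecification={
        "StreamEnabled": True,
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
)

# The stream ARN is needed later to read records or wire up a Lambda trigger.
stream_arn = response["TableDescription"]["LatestStreamArn"]
print(stream_arn)
```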

AWS maintains separate endpoints for DynamoDB and DynamoDB Streams.

To work with database tables and indexes, your application must access a DynamoDB endpoint.

To read and process DynamoDB Streams records, your application must access a DynamoDB Streams endpoint in the same Region.
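To illustrate that separation, the following sketch (assuming boto3 and a stream ARN such as the one returned above; the ARN value here is only a placeholder) reads records through the dedicated `dynamodbstreams` client rather than the `dynamodb` client:

```python
import boto3

# The Streams API has its own client (and endpoint), separate from "dynamodb".
streams = boto3.client("dynamodbstreams")

# Placeholder for illustration; in practice this comes from LatestStreamArn.
stream_arn = "arn:aws:dynamodb:..."

# A stream is divided into shards; read the first one for demonstration.
shards = streams.describe_stream(StreamArn=stream_arn)["StreamDescription"]["Shards"]
iterator = streams.get_shard_iterator(
    StreamArn=stream_arn,
    ShardId=shards[0]["ShardId"],
    ShardIteratorType="TRIM_HORIZON",  # start from the oldest available record
)["ShardIterator"]

# Each record describes one item-level change (INSERT, MODIFY, or REMOVE).
for record in streams.get_records(ShardIterator=iterator)["Records"]:
    print(record["eventName"], record["dynamodb"].get("Keys"))
```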

Option A is incorrect because DynamoDB Encryption helps with security; it does not add data to secondary tables.

Option C is incorrect because DynamoDB Accelerator is a fully managed, highly available, in-memory cache for DynamoDB that delivers up to a 10x performance improvement.

It does not solve the problem.

Option D is incorrect because there is no service named Table Accelerator.

For more information on use cases and design patterns for DynamoDB Streams, please refer to the link below:

https://aws.amazon.com/blogs/database/dynamodb-streams-use-cases-and-design-patterns/

The correct answer is B. AWS DynamoDB Streams.

DynamoDB Streams is a feature of DynamoDB that captures a time-ordered sequence of item-level modifications made to a DynamoDB table. With DynamoDB Streams, you can process these data modifications in near-real time and use them to build applications that react to changes in the table.

In this scenario, you can use DynamoDB Streams to capture the item-level modifications made to the primary table and insert a new record into the secondary table whenever an item is updated. This can be achieved by enabling a stream on the primary table and configuring it to trigger an AWS Lambda function whenever an item is updated. The Lambda function can then insert a new record into the secondary table based on the updated item in the primary table, as sketched below.
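A minimal Lambda handler for this pattern might look like the sketch below. The secondary table name (`SecondaryTable`) and the assumption that it shares the primary table's key schema are illustrative only:

```python
import boto3

# Hypothetical secondary table that receives a record for every update.
SECONDARY_TABLE = "SecondaryTable"

dynamodb = boto3.client("dynamodb")


def handler(event, context):
    """Triggered by DynamoDB Streams; mirrors updates into a secondary table."""
    for record in event["Records"]:
        # Only react to updates; ignore inserts and deletes on the primary table.
        if record["eventName"] != "MODIFY":
            continue

        # NewImage is already in DynamoDB's attribute-value format, so it can
        # be written to the secondary table as-is (assuming a matching key schema).
        new_image = record["dynamodb"]["NewImage"]
        dynamodb.put_item(TableName=SECONDARY_TABLE, Item=new_image)

    return {"processedRecords": len(event["Records"])}
```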

A. AWS DynamoDB Encryption is not relevant to this scenario. It is a feature that allows you to encrypt your data at rest in DynamoDB, and does not provide a solution for inserting records into a secondary table when items are updated in the primary table.

C. AWS DynamoDB Accelerator (DAX) is an in-memory cache for DynamoDB that can improve the performance of DynamoDB applications. While DAX can improve the performance of DynamoDB queries, it does not provide a solution for inserting records into a secondary table when items are updated in the primary table.

D. There is no AWS service or feature called Table Accelerator, so this option is not relevant to the scenario.