Mitigating Security Concerns with Amazon S3 and Amazon EC2

Maintaining Data Integrity and Secure Code Deployment Measures

Question

You have a code repository that uses Amazon S3 as a data store.

During a recent audit of your security controls, some concerns were raised about maintaining the integrity of the data in the Amazon S3 bucket.

Another concern was raised around securely deploying code from Amazon S3 to applications running on Amazon EC2 in a virtual private cloud.

What are some measures that you can implement to mitigate these concerns? Choose two answers from the options given below.

Answers

A. Add an Amazon S3 bucket policy with a condition statement to allow access only from Amazon EC2 instances with RFC 1918 IP addresses, and enable bucket versioning.

B. Add an Amazon S3 bucket policy with a condition statement that requires multi-factor authentication in order to delete objects, and enable bucket versioning.

C. Use a configuration management service to deploy AWS Identity and Access Management user credentials to the Amazon EC2 instances.

D. Create an AWS Identity and Access Management role with authorization to access the Amazon S3 bucket, and launch all of your application's Amazon EC2 instances with this role.

E. Use AWS Data Pipeline to lifecycle the data in your Amazon S3 bucket to Amazon Glacier on a weekly basis.

F. Use AWS Data Pipeline with multi-factor authentication to securely deploy code from the Amazon S3 bucket to your Amazon EC2 instances.

Explanations

Answer - B and D.

You can add another layer of protection by enabling MFA Delete on a versioned bucket.

Once you do so, you must provide your AWS account's access keys and a valid code from the account's MFA device to permanently delete an object version or suspend or reactivate versioning on the bucket.

For more information on MFA, please refer to the below link:

https://aws.amazon.com/blogs/security/securing-access-to-aws-using-mfa-part-3/
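As an illustration, MFA Delete is turned on together with versioning through the bucket's versioning configuration. The following boto3 sketch uses a placeholder bucket name and MFA device serial; the call has to be made with the root user's credentials and a current code from the MFA device:

    import boto3

    s3 = boto3.client('s3')

    # Placeholder bucket name, MFA device ARN, and current six-digit code.
    bucket = 'example-code-bucket'
    mfa_serial_and_code = 'arn:aws:iam::123456789012:mfa/root-account-mfa-device 123456'

    # Enable versioning and MFA Delete on the bucket in a single call.
    s3.put_bucket_versioning(
        Bucket=bucket,
        MFA=mfa_serial_and_code,
        VersioningConfiguration={'Status': 'Enabled', 'MFADelete': 'Enabled'},
    )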

IAM roles are designed so that your applications can securely make API requests from your instances without requiring you to manage the security credentials that the applications use.

Instead of creating and distributing your AWS credentials, you can delegate permission to make API requests using IAM roles.

For more information on Roles for EC2, please refer to the below link:

http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html
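For example, on an instance launched with such a role, the AWS SDK resolves temporary credentials from the instance metadata service automatically, so the deployment script never handles access keys. A minimal boto3 sketch (bucket and object names are placeholders):

    import boto3

    # No access keys are configured on the instance; boto3 obtains temporary
    # credentials from the instance metadata service because the instance was
    # launched with an IAM role that can read the bucket.
    s3 = boto3.client('s3')
    s3.download_file('example-code-bucket', 'releases/app-1.0.zip', '/opt/app/app-1.0.zip')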

Option A is invalid because restricting access by source IP address, even with versioning enabled, does not fully address either the data-integrity concern or the secure code deployment concern.

Option C is invalid because long-term IAM user credentials should never be distributed to or stored on EC2 instances to access AWS resources; IAM roles provide temporary credentials for this purpose.

Options E and F are invalid because AWS Data Pipeline adds unnecessary overhead when Amazon S3 and IAM already provide built-in controls for data integrity and secure access.

To recap, the two measures that can be implemented to mitigate the concerns raised are options B and D:

B. Adding an Amazon S3 bucket policy with a condition statement that requires multi-factor authentication to delete objects, together with bucket versioning, addresses the concern of maintaining the integrity of the data in the Amazon S3 bucket. Requiring MFA for deletions reduces the risk of accidental or malicious removal of objects, and versioning preserves previous versions of every object, allowing an easy rollback if data is deleted or modified.
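A bucket policy along the following lines could enforce the MFA requirement for deletions; this is only a sketch, and the bucket name is a placeholder:

    import json
    import boto3

    bucket = 'example-code-bucket'  # placeholder bucket name

    # Deny object deletions unless the request was authenticated with MFA.
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyDeleteWithoutMFA",
            "Effect": "Deny",
            "Principal": "*",
            "Action": ["s3:DeleteObject", "s3:DeleteObjectVersion"],
            "Resource": f"arn:aws:s3:::{bucket}/*",
            "Condition": {"BoolIfExists": {"aws:MultiFactorAuthPresent": "false"}}
        }]
    }

    boto3.client('s3').put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))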

D. Creating an AWS Identity and Access Management role with authorization to access the Amazon S3 bucket and launching all of your application's Amazon EC2 instances with this role addresses the concern of securely deploying code from Amazon S3 to applications running on Amazon EC2 in the virtual private cloud. The instances receive temporary, automatically rotated credentials through the role, so no long-term access keys need to be created, distributed, or stored on the instances.
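For instance, the role is attached to the instances at launch through an instance profile; the AMI, subnet, and profile names below are placeholders:

    import boto3

    ec2 = boto3.client('ec2')
    ec2.run_instances(
        ImageId='ami-0123456789abcdef0',        # placeholder AMI
        InstanceType='t3.micro',
        MinCount=1,
        MaxCount=1,
        SubnetId='subnet-0123456789abcdef0',    # placeholder subnet in the VPC
        IamInstanceProfile={'Name': 'app-s3-deploy-role'},  # placeholder instance profile
    )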

Options A, C, E, and F do not adequately address the concerns raised in the question.

A. Adding an Amazon S3 bucket policy that allows access only from Amazon EC2 instances with RFC 1918 IP addresses, even with bucket versioning enabled, does not fully address either concern. A source IP condition restricts where requests may originate, but it does not authenticate the application making the request or secure the deployment of code to the instances.

C. Using a configuration management service to deploy AWS Identity and Access Management user credentials to the Amazon EC2 instances is not a secure approach. Long-term user credentials should not be stored on instances when an IAM role can supply temporary credentials instead.

E. Using AWS Data Pipeline to lifecycle the data in your Amazon S3 bucket to Amazon Glacier on a weekly basis is an archival measure; it neither protects the integrity of the data in the bucket nor secures the deployment of code to Amazon EC2.

F. Using AWS Data Pipeline with multi-factor authentication to deploy code from the Amazon S3 bucket to your Amazon EC2 instances adds unnecessary overhead and does not address the data-integrity concern; an IAM role already provides a secure deployment path.