Implementing Azure Data Lake Gen 1 Storage Logging

Question

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

A company uses Azure Data Lake Gen 1 Storage to store big data related to consumer behavior.

You need to implement logging.

Solution: Create an Azure Automation runbook to copy events.

Does the solution meet the goal?

Answers

A. Yes
B. No

Correct Answer: B

Explanations

Instead, configure Azure Data Lake Storage diagnostics to store logs and metrics in a storage account.

Note:

You can enable diagnostic logging for your Azure Data Lake Storage Gen1 accounts, blobs, files, queues and tables.

Diagnostic logs aren't available for Data Lake Storage Gen2 accounts [as of August 2019].

https://docs.microsoft.com/en-us/azure/data-lake-store/data-lake-store-diagnostic-logs
https://github.com/MicrosoftDocs/azure-docs/issues/34286
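
For illustration, here is a minimal sketch of enabling diagnostic logging on a Data Lake Storage Gen1 account with the Azure Python SDK. The subscription ID, resource IDs, setting name, and 30-day retention are placeholders, and the "Requests" and "Audit" log categories follow the diagnostic-logs documentation linked above.

```python
# Minimal sketch: enable ADLS Gen1 diagnostic logging with azure-mgmt-monitor.
# All IDs below are placeholders; adjust to your environment.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import (
    DiagnosticSettingsResource,
    LogSettings,
    RetentionPolicy,
)

SUBSCRIPTION_ID = "<subscription-id>"
ADLS_GEN1_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.DataLakeStore/accounts/<adls-account>"
)
STORAGE_ACCOUNT_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)

client = MonitorManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Route request and audit logs to the destination storage account,
# keeping each category for 30 days (an arbitrary example value).
setting = DiagnosticSettingsResource(
    storage_account_id=STORAGE_ACCOUNT_ID,
    logs=[
        LogSettings(category="Requests", enabled=True,
                    retention_policy=RetentionPolicy(enabled=True, days=30)),
        LogSettings(category="Audit", enabled=True,
                    retention_policy=RetentionPolicy(enabled=True, days=30)),
    ],
)

client.diagnostic_settings.create_or_update(
    resource_uri=ADLS_GEN1_ID,
    name="adls-gen1-diagnostics",
    parameters=setting,
)
```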

The proposed solution, creating an Azure Automation runbook to copy events, does not meet the stated goal of implementing logging for Azure Data Lake Gen 1 Storage.

An Azure Automation runbook automates operational tasks in Azure and could be used to copy events from one location to another. However, copying events on a schedule does not provide the comprehensive logging capability required to monitor the big data on consumer behavior stored in Azure Data Lake Gen 1 Storage.

To implement logging for Azure Data Lake Gen 1 Storage, the events the service generates must be captured and stored in a centralized location for monitoring and analysis. That is exactly what diagnostic logging provides: enabling diagnostics on the Data Lake Storage Gen1 account routes its request and audit logs, along with metrics, to an Azure Storage account, and the same diagnostic settings can also send them to Azure Log Analytics for centralized querying and analysis.
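
Once diagnostics are enabled, the logs land as JSON blobs in the destination storage account. Below is a minimal sketch of listing them, assuming the azure-storage-blob package and the per-category container naming (insights-logs-requests, insights-logs-audit) described in the documentation above.

```python
# Minimal sketch: list the diagnostic log blobs written to the storage account.
# The account URL and container name are assumptions based on the docs above.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)

container = service.get_container_client("insights-logs-requests")
for blob in container.list_blobs():
    print(blob.name)  # hourly PT1H.json blobs partitioned by date
```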

Therefore, the proposed solution of creating an Azure Automation runbook to copy events does not meet the goal of implementing logging in Azure Data Lake Gen 1 Storage. The correct answer is B (No).