You are developing a data engineering solution for a company. The solution will store a large set of key-value pair data by using Microsoft Azure Cosmos DB.

The solution has the following requirements:

• Data must be partitioned into multiple containers.

• Data containers must be configured separately.

• Data must be accessible from applications hosted around the world.

• The solution must minimize latency.

You need to provision Azure Cosmos DB.
A . Configure account-level throughput.
B . Provision an Azure Cosmos DB account with the Azure Table API. Enable geo-redundancy.
C . Configure table-level throughput.
D . Replicate the data globally by manually adding regions to the Azure Cosmos DB account.
E . Provision an Azure Cosmos DB account with the Azure Table API.
F . Enable multi-region writes.

Answer: E

Explanation:

Scale read and write throughput globally. You can enable every region to be writable and elastically scale reads and writes all around the world. The throughput that your application configures on an Azure Cosmos database or a container is guaranteed to be delivered across all regions associated with your Azure Cosmos account, and the provisioned throughput is guaranteed by financially backed SLAs.

References: https://docs.microsoft.com/en-us/azure/cosmos-db/distribute-data-globally
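As a rough sketch of how such an account might be provisioned with the Azure CLI (the resource group, account name, region names, and table name below are placeholder assumptions, not part of the question):

```shell
# Create a Cosmos DB account with the Table API, replicated to two
# regions, with multi-region writes enabled to minimize write latency.
# All names and regions here are illustrative placeholders.
az cosmosdb create \
  --resource-group my-resource-group \
  --name my-kv-account \
  --capabilities EnableTable \
  --locations regionName=westus failoverPriority=0 isZoneRedundant=false \
  --locations regionName=northeurope failoverPriority=1 isZoneRedundant=false \
  --enable-multiple-write-locations true

# Each table (data container) can then be created and configured
# separately, including its own provisioned throughput.
az cosmosdb table create \
  --resource-group my-resource-group \
  --account-name my-kv-account \
  --name kvdata \
  --throughput 400
```

Provisioning throughput per table rather than at the account level is what lets each data container be configured separately, as the requirements state.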
