Topic 1, Proseware Inc
Background
Proseware, Inc. develops and manages a product named Poll Taker. The product is used for delivering public opinion polling and analysis.
Polling data comes from a variety of sources, including online surveys, house-to-house interviews, and booths at public events.
Polling data
Polling data is stored in one of two locations:
– An on-premises Microsoft SQL Server 2019 database named PollingData
– Azure Data Lake Storage Gen2
Data in the data lake is queried by using PolyBase.
Poll metadata
Each poll has associated metadata with information about the poll including the date and number of respondents. The data is stored as JSON.
Phone-based polling
Security
– Phone-based poll data must only be uploaded by authorized users from authorized devices
– Contractors must not have access to any polling data other than their own
– Access to polling data must be set on a per-Active Directory-user basis
Data migration and loading
– All data migration processes must use Azure Data Factory
– All data migrations must run automatically during non-business hours
– Data migrations must be reliable and retry when needed
Performance
After six months, raw polling data should be moved to a storage account. The storage must be available in the event of a regional disaster. The solution must minimize costs.
Deployments
– All deployments must be performed by using Azure DevOps. Deployments must use templates that can be reused across multiple environments
– No credentials or secrets should be used during deployments
Reliability
All services and processes must be resilient to a regional Azure outage.
Monitoring
All Azure services must be monitored by using Azure Monitor. On-premises SQL Server performance must be monitored.
You need to ensure that phone-based polling data can be analyzed in the PollingData database.
How should you configure Azure Data Factory?
A. Use a tumbling window trigger
B. Use an event-based trigger
C. Use a schedule trigger
D. Use manual execution
Answer: C
Explanation:
When creating a schedule trigger, you specify a schedule (start date, recurrence, end date, etc.) for the trigger and associate it with a Data Factory pipeline. This satisfies the requirement that migrations run automatically during non-business hours, which a manual or event-based trigger cannot guarantee.
Scenario:
All data migration processes must use Azure Data Factory
All data migrations must run automatically during non-business hours
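A schedule trigger that starts the migration pipeline nightly could be defined in Data Factory JSON along these lines (a minimal sketch; the trigger and pipeline names, start time, and time zone are hypothetical, chosen here to fall outside business hours):

```json
{
  "name": "NightlyMigrationTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2019-01-01T02:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "type": "PipelineReference",
          "referenceName": "CopyPollingDataPipeline"
        }
      }
    ]
  }
}
```

The `startTime` anchors the daily recurrence at 02:00 UTC, so each run falls during non-business hours; retry behavior for reliability would be configured on the pipeline activities themselves.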
References: https://docs.microsoft.com/en-us/azure/data-factory/how-to-create-schedule-trigger