Question 1 of 55
1 point(s)
You need to import data into a Microsoft SQL Data Warehouse. The data that needs to be ingested resides in parquet files. These files are stored in an Azure Data Lake Gen 2 storage account. You need to load the data from the storage account into the data warehouse.
You decide to implement the following steps.
• Create an external data source pointing to the Azure storage account.
• Create a workload group using the Azure storage account name as the pool name.
• Load the data using the CREATE TABLE AS SELECT statement.
Would these steps fulfill the requirement?
Question 2 of 55
1 point(s)
You need to import data into a Microsoft SQL Data Warehouse. The data that needs to be ingested resides in parquet files. These files are stored in an Azure Data Lake Gen 2 storage account. You need to load the data from the storage account into the data warehouse.
You decide to implement the following steps.
• Create a remote service binding pointing to the Azure Data Lake Gen 2 storage account.
• Create an external file format and external table using the external data source.
• Load the data using the CREATE TABLE AS SELECT statement.
Would these steps fulfill the requirement?
Question 3 of 55
1 point(s)
You need to import data into a Microsoft SQL Data Warehouse. The data that needs to be ingested resides in parquet files. These files are stored in an Azure Data Lake Gen 2 storage account. You need to load the data from the storage account into the data warehouse.
You decide to implement the following steps.
• Use Azure Data Factory to convert the parquet files to CSV files.
• Create an external data source pointing to the Azure storage account.
• Create an external file format and external table using the external data source.
• Load the data using the INSERT...SELECT statement.
Would these steps fulfill the requirement?
Question 4 of 55
1 point(s)
You need to import data into a Microsoft SQL Data Warehouse. The data that needs to be ingested resides in parquet files. These files are stored in an Azure Data Lake Gen 2 storage account. You need to load the data from the storage account into the data warehouse.
You decide to implement the following steps.
• Create an external data source that points to the Azure Data Lake Gen 2 storage account.
• Then create an external file format and an external table by making use of an external data source.
• Then load the data using the CREATE TABLE AS SELECT statement.
Would these steps fulfill the requirement?
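For reference, the pattern these four questions revolve around can be sketched in T-SQL as shown below. This is a minimal illustration only; the credential, object names, storage path, and columns are hypothetical and are not taken from the exam scenario (a database master key is also assumed to exist already).

-- Hypothetical names throughout; shown only to illustrate the external data source,
-- external file format, external table, and CTAS sequence.
CREATE DATABASE SCOPED CREDENTIAL adls_credential
WITH IDENTITY = 'Managed Service Identity';

CREATE EXTERNAL DATA SOURCE ipslab_adls
WITH (
    TYPE = HADOOP,
    LOCATION = 'abfss://data@ipslabstore.dfs.core.windows.net',
    CREDENTIAL = adls_credential
);

CREATE EXTERNAL FILE FORMAT parquet_format
WITH (FORMAT_TYPE = PARQUET);

CREATE EXTERNAL TABLE dbo.ext_sales
(
    SaleId INT,
    Amount DECIMAL(18, 2)
)
WITH (
    LOCATION = '/sales/',
    DATA_SOURCE = ipslab_adls,
    FILE_FORMAT = parquet_format
);

-- Load the data into the data warehouse with CREATE TABLE AS SELECT (CTAS).
CREATE TABLE dbo.sales
WITH (DISTRIBUTION = ROUND_ROBIN)
AS
SELECT * FROM dbo.ext_sales;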
Question 5 of 55
1 point(s)
A company wants to use Azure Storage accounts for file storage purposes. A single storage account would be required to perform all read, write and delete operations. The company also needs to keep a copy of all historical operations in an on-premises server.
Which of the following actions should be performed to accomplish this requirement? Choose 2 answers from the options given below.
Question 6 of 55
1 point(s)
Your company has an enterprise data warehouse hosted in Azure Synapse Analytics. The name of the data warehouse is ipslab-data and the name of the server is ipslabserver3000. You have to verify whether the size of the transaction log file for each distribution of the data warehouse is smaller than 160 GB.
Which of the following would you implement for this requirement?
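For background on how log usage can be inspected per distribution in a dedicated SQL pool, the DMV query below is one common approach; it is shown as a general illustration, not as the specific option this question expects.

-- Report transaction log file usage per distribution; the DMV reports the counter in KB.
SELECT
    instance_name                  AS distribution_db,
    cntr_value * 1.0 / 1048576     AS log_file_used_gb
FROM sys.dm_pdw_nodes_os_performance_counters
WHERE counter_name = 'Log File(s) Used Size (KB)'
  AND instance_name LIKE 'Distribution_%';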
Question 7 of 55
1 point(s)
A company is planning to create an Azure SQL database to support a mission-critical application. The application needs to be highly available and not have any performance degradation during maintenance windows. Which of the following technologies can be used to implement this solution? Choose 3 answers from the options given below.
Question 8 of 55
1 point(s)
Case Study
A company is hosting a computational data analysis processing application on a set of virtual machines. The application runs daily jobs and then stores the results in virtual hard disk drives. The application uses data from the previous day and stores the results in a snapshot of the VHD. When a new month starts, the application creates a new VHD. You need to implement a data lifecycle policy that would meet the following requirements:
• Ensure that results are kept for 60 days
• The data for the current year must be available for weekly reports
• Data from the prior 10 years must be available for auditing purposes
• Data that is required for auditing must be made available within 10 days of the request being made
You have to enforce the data lifecycle policy and also ensure costs are kept to a minimum
You need to complete the below lifecycle policy JSON segment.
Which of the following would go into Slot 1?
Question 9 of 55
1 point(s)
Case Study
A company is hosting a computational data analysis processing application on a set of virtual machines. The application runs daily jobs and then stores the results in virtual hard disk drives. The application uses data from the previous day and stores the results in a snapshot of the VHD. When a new month starts, the application creates a new VHD. You need to implement a data lifecycle policy that would meet the following requirements:
• Ensure that results are kept for 60 days
• The data for the current year must be available for weekly reports
• Data from the prior 10 years must be available for auditing purposes
• Data that is required for auditing must be made available within 10 days of the request being made
You have to enforce the data lifecycle policy and also ensure costs are kept to a minimum
You need to complete the below lifecycle policy JSON segment
Which of the following would go into Slot 2?
Question 10 of 55
1 point(s)
Case Study
A company is hosting a computational data analysis processing application on a set of virtual machines. The application runs daily jobs and then stores the results in virtual hard disk drives. The application uses data from the previous day and stores the results in a snapshot of the VHD. When a new month starts, the application creates a new VHD. You need to implement a data lifecycle policy that would meet the following requirements:
• Ensure that results are kept for 60 days
• The data for the current year must be available for weekly reports
• Data from the prior 10 years must be available for auditing purposes
• Data that is required for auditing must be made available within 10 days of the request being made
You have to enforce the data lifecycle policy and also ensure costs are kept to a minimum
You need to complete the below lifecycle policy JSON segment
Which of the following would go into Slot 3?
Question 11 of 55
1 point(s)
Case Study
A company is hosting a computational data analysis processing application on a set of virtual machines. The application runs daily jobs and then stores the results in virtual hard disk drives. The application uses data from the previous day and stores the results in a snapshot of the VHD. When a new month starts, the application creates a new VHD. You need to implement a data lifecycle policy that would meet the following requirements:
• Ensure that results are kept for 60 days
• The data for the current year must be available for weekly reports
• Data from the prior 10 years must be available for auditing purposes
• Data that is required for auditing must be made available within 10 days of the request being made
You have to enforce the data lifecycle policy and also ensure costs are kept to a minimum
You need to complete the below lifecycle policy JSON segment
Which of the following would go into Slot 4?
Question 12 of 55
1 point(s)
Case Study
A company is hosting a computational data analysis processing application on a set of virtual machines. The application runs daily jobs and then stores the results in virtual hard disk drives. The application uses data from the previous day and stores the results in a snapshot of the VHD. When a new month starts, the application creates a new VHD. You need to implement a data lifecycle policy that would meet the following requirements:
• Ensure that results are kept for 60 days
• The data for the current year must be available for weekly reports
• Data from the prior 10 years must be available for auditing purposes
• Data that is required for auditing must be made available within 10 days of the request being made
You have to enforce the data lifecycle policy and also ensure costs are kept to a minimum
You need to complete the below lifecycle policy JSON segment
Which of the following would go into Slot 5?
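The slotted lifecycle policy JSON referred to in Questions 8 through 12 is not reproduced in this document. For orientation, a generic Blob storage lifecycle management rule has the shape shown below; the rule name, prefix filter, and day thresholds are illustrative only and do not represent the missing segment.

{
  "rules": [
    {
      "enabled": true,
      "name": "illustrative-rule",
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "results/" ]
        },
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 60 },
            "tierToArchive": { "daysAfterModificationGreaterThan": 365 }
          },
          "snapshot": {
            "delete": { "daysAfterCreationGreaterThan": 60 }
          }
        }
      }
    }
  ]
}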
Question 13 of 55
1 point(s)
A company has an Azure SQL data warehouse. They want to use PolyBase to retrieve data from an Azure Blob storage account and ingest it into the Azure SQL data warehouse. The files are stored in parquet format. The data needs to be loaded into a table called ips_sales. Which of the following actions need to be performed to implement this requirement? Choose 4 answers from the options given below.
Question 14 of 55
1 point(s)
Your company has to develop a solution using Azure Stream Analytics. The solution will consist of the following components.
• The solution would accept a file named orders that is stored in Azure Blob storage. The order information stored contains the address of the items in the order.
• The storage account also contains another file that contains the estimated delivery time for each location. The file does not change.
• You have to configure Azure Stream Analytics to process the orders in the files based on the estimated delivery time for each location.
• The data must be sent to an Azure SQL Database for immediate use.
• The data must be sent to an Azure Data Lake Storage Gen2 storage account for long-term retention.
You decide to implement a Stream Analytics job that has two streaming inputs, one query, and two outputs.
Would this fulfill the requirement?
Question 15 of 55
1 point(s)
Your company has to develop a solution using Azure Stream Analytics. The solution will consist of the following components.
• The solution would accept a file named orders that is stored in Azure Blob storage. The order information stored contains the address of the items in the order.
• The storage account also contains another file which contains the estimated delivery time for each location. The file does not change.
• You have to configure Azure Stream Analytics to process the orders in the files based on the estimated delivery time for each location.
• The data must be sent to an Azure SQL Database for immediate use.
• The data must be sent to an Azure Data Lake Storage Gen2 storage account for long-term retention.
You decide to implement a Stream Analytics job that has one streaming input, one reference input, two queries, and four outputs.
Would this fulfill the requirement?
Question 16 of 55
1 point(s)
Your company has to develop a solution using Azure Stream Analytics. The solution will consist of the following components.
• The solution would accept a file named orders that is stored in Azure Blob storage. The order information stored contains the address of the items in the order.
• The storage account also contains another file that contains the estimated delivery time for each location. The file does not change.
• You have to configure Azure Stream Analytics to process the orders in the files based on the estimated delivery time for each location.
• The data must be sent to an Azure SQL Database for immediate use.
• The data must be sent to an Azure Data Lake Storage Gen2 storage account for long-term retention.
You decide to implement a Stream Analytics job that has one streaming input, one reference input, one query, and two outputs.
Would this fulfill the requirement?
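For context on Questions 14 through 16, a single Stream Analytics query that joins a streaming input with a static reference input and writes to two outputs can be sketched as below; the input, output, and column names are assumptions made purely for illustration.

-- OrdersStream is assumed to be the streaming Blob input and DeliveryTimesRef the reference input.
WITH EnrichedOrders AS
(
    SELECT
        o.OrderId,
        o.Location,
        d.EstimatedDeliveryTime
    FROM OrdersStream o
    JOIN DeliveryTimesRef d
        ON o.Location = d.Location
)
SELECT * INTO SqlDatabaseOutput FROM EnrichedOrders   -- immediate use
SELECT * INTO DataLakeOutput FROM EnrichedOrders      -- long-term retention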
Question 17 of 55
1 point(s)
A company is designing a new lambda architecture on Microsoft Azure. Below are the requirements for each architecture layer.
Ingestion
• Have the ability to receive millions of events per second.
• Be fully managed platform-as-a-service solution.
• Integrate with Azure Functions.
Stream Processing
• Have the ability to process data on a per-job basis.
• Have the ability to connect seamlessly to Azure services.
• Use a SQL based query language.
Analytical datastore
• Perform as a managed service.
• Ability to behave as a document store.
• Provide data encryption at rest.
Which of the following would you consider using at the Ingestion layer?
Question 18 of 55
1 point(s)
A company is designing a new lambda architecture on Microsoft Azure. Below are the requirements for each architecture layer.
Ingestion
• Have the ability to receive millions of events per second.
• Be fully managed platform-as-a-service solution.
• Integrate with Azure Functions.
Stream Processing
• Have the ability to process data on a per-job basis.
• Have the ability to connect seamlessly to Azure services.
• Use a SQL based query language.
Analytical datastore
• Perform as a managed service.
• Ability to behave as a document store.
• Provide data encryption at rest.
Which of the following would you consider using at the Stream Processing layer?
Question 19 of 55
1 point(s)
A company is designing a new lambda architecture on Microsoft Azure. Below are the requirements for each architecture layer.
Ingestion
• Have the ability to receive millions of events per second.
• Be fully managed platform-as-a-service solution.
• Integrate with Azure Functions.
Stream Processing
• Have the ability to process data on a per-job basis.
• Have the ability to connect seamlessly to Azure services.
• Use a SQL based query language.
Analytical datastore
• Perform as a managed service.
• Ability to behave as a document store.
• Provide data encryption at rest.
Which of the following would you consider using at the Analytical data store layer?
Question 20 of 55
1 point(s)
Your company has an Azure storage account as part of its Azure subscription. The company wants to ensure that blobs that were NOT modified during the last 50 days are identified and deleted. You decide to apply an Azure policy that makes use of tags for the storage account.
Would this fulfill the requirement?
Question 21 of 55
1 point(s)
Your company has an Azure storage account as part of its Azure subscription. The company wants to ensure that blobs that were NOT modified during the last 50 days are identified and deleted. You decide to apply an expired tag to the blobs in the storage account.
Would this fulfill the requirement?
Question 22 of 55
1 point(s)
Your company has an Azure storage account as part of its Azure subscription. The company wants to ensure that blobs that were NOT modified during the last 50 days are identified and deleted. You decide to apply an Azure Blob storage lifecycle policy.
Would this fulfill the requirement?
Question 23 of 55
1 point(s)
Your company has an Azure Databricks cluster. They want to collect application metrics, logs, and streaming query events for the cluster and send them to Azure Monitor.
Which of the following libraries would you implement for Azure Databricks for this requirement?
Question 24 of 55
1 point(s)
Your company has an Azure Databricks cluster. They want to collect application metrics, logs, and streaming query events for the cluster and send them to Azure Monitor.
Which of the following would you implement for the storage of data?
Question 25 of 55
1 point(s)
Your company has an Azure virtual machine that has Microsoft SQL Server installed. The instance has a table named ipslaborders. You have to copy the data from the table to an Azure Data Lake Gen2 storage account with the help of Azure Data Factory.
Which of the following would you use as the type of integration runtime for the copy activity?
Question 26 of 55
1 point(s)
A company wants to develop a solution in Azure. The solution would handle streaming data from Twitter. Azure Event Hubs would be used to ingest the streaming data. Azure Databricks would then be used to receive the data from Event Hubs. Which of the following actions would you implement for this requirement? Choose 3 answers from the options given below.
Question 27 of 55
1 point(s)
A company has an on-premises Microsoft SQL Server database. They want to copy the data from the instance to Azure Blob storage. They want to configure Azure Data Factory to connect to the on-premises SQL Server instance. Which of the following steps must be carried out for this requirement? Choose 3 answers from the options given below.
Question 28 of 55
1 point(s)
A company wants to make use of Azure Stream Analytics. The incoming data is formatted as one record per row. You need to complete the following REST API segment which would be used to create the input stream.
"inputs": [
  {
    "name": "ipslabsource",
    "properties": {
      "type": "stream",
      "serialization": {
        Area 1
        "properties": {
          "fieldDelimiter": ",",
          "encoding": "UTF8"
        }
      },
      "datasource": {
        Area 2
        "properties": {
          "serviceBusNamespace": "ipsbus2020",
          "sharedAccessPolicyName": "ipspolicy",
          "sharedAccessPolicyKey": "***/#**#99ss0s ssasscccaanansnsesesccccascosce",
          "eventHubName": "ipshub"
        }
      }
    }
  }
]
Which of the following would go into Area 1?
Question 29 of 55
1 point(s)
A company wants to make use of Azure Stream Analytics. The incoming data is formatted as one record per row. You need to complete the following REST API segment which would be used to create the input stream.
"inputs": [
  {
    "name": "ipslabsource",
    "properties": {
      "type": "stream",
      "serialization": {
        Area 1
        "properties": {
          "fieldDelimiter": ",",
          "encoding": "UTF8"
        }
      },
      "datasource": {
        Area 2
        "properties": {
          "serviceBusNamespace": "ipsbus2020",
          "sharedAccessPolicyName": "ipspolicy",
          "sharedAccessPolicyKey": "***/#**#99ss0s ssasscccaanansnsesesccccascosce",
          "eventHubName": "ipshub"
        }
      }
    }
  }
]
Which of the following would go into Area 2?
Question 30 of 55
1 point(s)
A company wants to implement a solution using Azure Stream Analytics. Below are the key requirements of the solution.
• Ingest data from Blob storage.
• Be able to analyze data in real-time.
• Be able to store the processed data in Azure Cosmos DB.
Which of the following actions would you implement for this requirement? Choose 3 answers from the options given below.
Question 31 of 55
1 point(s)
You are going to create an Azure Databricks environment. You will be accessing data in an Azure Blob storage account. The data must be available to all Azure Databricks workspaces. Which of the following actions would you perform for this requirement? Choose 3 answers from the options given below.
Question 32 of 55
1 point(s)
Your company is making use of the Azure Stream Analytics service. You have to implement complex stateful business logic within the Azure Stream Analytics service.
Which of the following would you implement for this requirement?
Question 33 of 55
1 point(s)
A company is planning to use Azure SQL Database along with Elastic Database jobs. You have to analyze, troubleshoot, and report on the various components that are responsible for running Elastic database jobs.
Which of the following would be used for the task of storing “Execution results and diagnostics”?
Question 34 of 55
1 point(s)
A company is planning to use Azure SQL Database along with Elastic Database jobs. You have to analyze, troubleshoot, and report on the various components that are responsible for running Elastic database jobs.
Which of the following would be used for the task of handling the “Job launcher and tracker”?
Question 35 of 55
1 point(s)
A company is planning to use Azure SQL Database along with Elastic Database jobs. You have to analyze, troubleshoot, and report on the various components that are responsible for running Elastic database jobs.
Which of the following would be used for the task of storing the “Job metadata and state”?
Question 36 of 55
1 point(s)
A company wants to set up a NoSQL database in Azure to store data. They want to have a database that can be used to store key-value pairs. They also want to have a database that can store wide-column data values. Which of the following API types would you choose for Cosmos DB for these requirements?
Choose 2 answers from the options given below.
Question 37 of 55
1 point(s)
A company wants to deploy a sales application as part of their application portfolio. The application needs to store its data in an Azure SQL Database. The data for sales will be stored in Azure SQL database from multiple regions. After a week, the sales data needs to be stored in another Azure SQL database to perform analytics. Analytics is a very resource-intensive process and can generate up to 40 TB of data. You have to provision the right type of database for each operation. The database provisioning must ensure that performance is maximized and the cost is minimized.
Which of the following would you choose as the database instance type for uploading daily sales data?
Question 38 of 55
1 point(s)
A company wants to deploy a sales application as part of their application portfolio. The application needs to store its data in an Azure SQL Database. The data for sales will be stored in Azure SQL database from multiple regions. After a week, the sales data needs to be stored in another Azure SQL database to perform analytics. Analytics is a very resource-intensive process and can generate up to 40 TB of data. You have to provision the right type of database for each operation. The database provisioning must ensure that performance is maximized and the cost is minimized.
Which of the following would you choose as the database instance type for the weekly sales data on which analytics needs to be performed?
Question 39 of 55
1 point(s)
A company wants to synchronize data from an on-premises Microsoft SQL Server database to an Azure SQL Database. You have to perform an assessment to understand whether the data can actually be moved without any compatibility issues. Which of the following would you use to perform the assessment?
Question 40 of 55
1 point(s)
A company wants to start using Azure Cosmos DB. They would be using the Cassandra API for the database.
Which of the following would they need to choose as the container type?
Question 41 of 55
1 point(s)
A company wants to start using Azure Cosmos DB. They would be using the Cassandra API for the database.
Which of the following would they need to choose as the item type?
Question 42 of 55
1 point(s)
A company is planning to set up an Azure SQL data warehouse. They want to set up different tables. The different tables have different requirements, as stated below.
• ipslab_sales – Here the rows should be distributed in such a way that it offers high performance.
• ipslab_offers – Here data should be available on all nodes to achieve better performance on table joins.
• ipslab_orders – Here data should be loaded faster on the underlying table.
Which of the following table types would you use for the table ipslab_sales?
Question 43 of 55
1 point(s)
A company is planning to set up an Azure SQL data warehouse. They want to set up different tables. The different tables have different requirements as stated below.
• ipslab_sales – Here the rows should be distributed in such a way that it offers high performance.
• ipslab_offers – Here data should be available on all nodes to achieve better performance on table joins.
• ipslab_orders – Here data should be loaded faster on the underlying table.
Which of the following table types would you use for the table ipslab_offers?
Question 44 of 55
1 point(s)
A company is planning to set up an Azure SQL data warehouse. They want to set up different tables. The different tables have different requirements as stated below.
• ipslab_sales – Here the rows should be distributed in such a way that it offers high performance.
• ipslab_offers – Here data should be available on all nodes to achieve better performance on table joins.
• ipslab_orders – Here data should be loaded faster on the underlying table.
Which of the following table types would you use for the table ipslab_orders?
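Questions 42 through 44 turn on the table distribution options available in a dedicated SQL pool. The sketch below shows only the syntax for each option against hypothetical tables, without mapping them to the three tables in the scenario.

-- Hash-distributed: rows are spread across distributions by hashing the chosen column.
CREATE TABLE dbo.example_hash (Id INT NOT NULL, Amount DECIMAL(18, 2))
WITH (DISTRIBUTION = HASH(Id));

-- Replicated: a full copy of the table is cached on each compute node, which helps joins.
CREATE TABLE dbo.example_replicated (Id INT NOT NULL, OfferName NVARCHAR(100))
WITH (DISTRIBUTION = REPLICATE);

-- Round-robin: rows are assigned evenly across distributions, which makes loads fast.
CREATE TABLE dbo.example_round_robin (Id INT NOT NULL, OrderDate DATE)
WITH (DISTRIBUTION = ROUND_ROBIN);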
Question 45 of 55
1 point(s)
A company wants to migrate several on-premises Microsoft SQL Server databases to Azure. They want to migrate to Azure using the backup process available for Microsoft SQL servers. Which of the following is the data technology they should use on Azure?
Question 46 of 55
1 point(s)
A company wants to create an Azure Cosmos DB account. They need to ensure that the database uses the SQL API and the latency is minimized.
You have to complete the following Azure CLI command for this requirement
az cosmosdb create --name "ipslabaccount2021" --resource-group "ipslab-rg"
--kind Area 1 --default-consistency-level Area 2
Which of the following would go into Area 1?
Question 47 of 55
1 point(s)
A company wants to create an Azure Cosmos DB account. They need to ensure that the database uses the SQL API and the latency is minimized.
You have to complete the following Azure CLI command for this requirement
az cosmosdb create --name "ipslabaccount2021" --resource-group "ipslab-rg"
--kind Area 1 --default-consistency-level Area 2
Which of the following would go into Area 2?
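As general background on the two parameters being completed in Questions 46 and 47, the command accepts values such as the following; this is an illustration of the syntax only and is not meant to state which values the question expects.

# Illustrative values: --kind GlobalDocumentDB corresponds to the SQL (Core) API;
# --default-consistency-level accepts Strong, BoundedStaleness, Session, ConsistentPrefix, or Eventual.
az cosmosdb create --name "ipslabaccount2021" --resource-group "ipslab-rg" \
    --kind GlobalDocumentDB --default-consistency-level Session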
Question 48 of 55
1 point(s)
Case Study
Overview
Ipslabs is an online training provider. They regularly conduct polls on the effectiveness of their training material. The polling data comes from various sources, such as online surveys and public events.
The polling data would be stored in 2 locations
• An on-premises Microsoft SQL Server database
• Azure Data Lake Gen 2 storage
The data in the data lake would be queried using PolyBase
Each poll also has associated metadata, which is stored as JSON. The metadata contains the date and the number of people who took the poll.
Polling data is also taken via phone calls. Below are the security requirements for polling data taken via phone calls
• The poll data must be uploaded by authorized users from authorized devices
• External contractors can only access their own polling data
• The access to the polling data would be given to users on a per Active Directory user basis
Other requirements
• All data migration processes must be carried out using Azure Data Factory
• All of the data migrations must run automatically and be carried out during non-business hours
• All services and processes must be resilient to regional Azure outages.
• All services must be monitored using Azure Monitor
• The performance of the on-premises SQL server must also be monitored
• After 3 months all polling data must be moved to low cost storage
• All deployments must be performed using Azure DevOps
• Deployments must make use of templates
• No credentials or secrets of any kind must be exposed during deployments
You have to implement the deployment of Azure Data Factory pipelines. Which of the following would you use for authorization of the deployments?
Question 49 of 55
1 point(s)
Case Study
Overview
Ipslabs is an online training provider. They regularly conduct polls on the effectiveness of their training material. The polling data comes from various sources, such as online surveys and public events.
The polling data would be stored in 2 locations
• An on-premises Microsoft SQL Server database
• Azure Data Lake Gen 2 storage
The data in the data lake would be queried using PolyBase
Each poll also has associated metadata, which is stored as JSON. The metadata contains the date and the number of people who took the poll.
Polling data is also taken via phone calls. Below are the security requirements for polling data taken via phone calls
• The poll data must be uploaded by authorized users from authorized devices
• External contractors can only access their own polling data
• The access to the polling data would be given to users on a per Active Directory user basis
Other requirements
• All data migration processes must be carried out using Azure Data Factory
• All of the data migrations must run automatically and be carried out during non-business hours
• All services and processes must be resilient to regional Azure outages.
• All services must be monitored using Azure Monitor
• The performance of the on-premises SQL server must also be monitored
• After 3 months all polling data must be moved to low-cost storage
• All deployments must be performed using Azure DevOps
• Deployments must make use of templates
• No credentials or secrets of any kind must be exposed during deployments
You have to implement the deployment of Azure Data Factory pipelines. Which of the following would you use for authentication of the deployments?
Question 50 of 55
1 point(s)
Case Study
Overview
Ipslabs is an online training provider. They regularly conduct polls on the effectiveness of their training material. The polling data comes from various sources, such as online surveys and public events.
The polling data would be stored in 2 locations
• An on-premises Microsoft SQL Server database
• Azure Data Lake Gen 2 storage
The data in the data lake would be queried using PolyBase
Each poll also has associated metadata, which is stored as JSON. The metadata contains the date and the number of people who took the poll.
Polling data is also taken via phone calls. Below are the security requirements for polling data taken via phone calls
• The poll data must be uploaded by authorized users from authorized devices
• External contractors can only access their own polling data
• The access to the polling data would be given to users on a per Active Directory user basis
Other requirements
• All data migration processes must be carried out using Azure Data Factory
• All of the data migrations must run automatically and be carried out during non-business hours
• All services and processes must be resilient to regional Azure outages.
• All services must be monitored using Azure Monitor
• The performance of the on-premises SQL server must also be monitored
• After 3 months all polling data must be moved to low cost storage
• All deployments must be performed using Azure DevOps
• Deployments must make use of templates
• No credentials or secrets of any kind must be exposed during deployments
You have to create the storage account that would be used to store the polling data. Which of the following would you use as the Account type?
Question 51 of 55
1 point(s)
Case Study
Overview
Ipslabs is an online training provider. They regularly conduct polls on the effectiveness of their training material. The polling data comes from various sources, such as online surveys and public events.
The polling data would be stored into 2 locations
• An on-premises Microsoft SQL Server database
• Azure Data Lake Gen 2 storage
The data in the data lake would be queried using PolyBase
Each poll also has associated metadata, which is stored as JSON. The metadata contains the date and the number of people who took the poll.
Polling data is also taken via phone calls. Below are the security requirements for polling data taken via phone calls
• The poll data must be uploaded by authorized users from authorized devices
• External contractors can only access their own polling data
• The access to the polling data would be given to users on a per Active Directory user basis
Other requirements
• All data migration processes must be carried out using Azure Data Factory
• All of the data migrations must run automatically and be carried out during non-business hours
• All services and processes must be resilient to regional Azure outages.
• All services must be monitored using Azure Monitor
• The performance of the on-premises SQL server must also be monitored
• After 3 months all polling data must be moved to low cost storage
• All deployments must be performed using Azure DevOps
• Deployments must make use of templates
• No credentials or secrets of any kind must be exposed during deployments
You have to create the storage account that would be used to store the polling data. Which of the following would you use as the replication type?
Question 52 of 55
1 point(s)
Case Study
Overview
Ipslabs is an online training provider. They regularly conduct polls on the effectiveness of their training material. The polling data comes from various sources, such as online surveys and public events.
The polling data would be stored into 2 locations
• An on-premises Microsoft SQL Server database
• Azure Data Lake Gen 2 storage
The data in the data lake would be queried using PolyBase
Each poll also has associated metadata, which is stored as JSON. The metadata contains the date and the number of people who took the poll.
Polling data is also taken via phone calls. Below are the security requirements for polling data taken via phone calls
• The poll data must be uploaded by authorized users from authorized devices
• External contractors can only access their own polling data
• The access to the polling data would be given to users on a per Active Directory user basis
Other requirements
• All data migration processes must be carried out using Azure Data Factory
• All of the data migrations must run automatically and be carried out during non-business hours
• All services and processes must be resilient to regional Azure outages.
• All services must be monitored using Azure Monitor
• The performance of the on-premises SQL server must also be monitored
• After 3 months all polling data must be moved to low cost storage
• All deployments must be performed using Azure DevOps
• Deployments must make use of templates
• No credentials or secrets of any kind must be exposed during deployments
You have to ensure that Azure Data Factory would run to make the polling data available in the polling data database. Which of the following would you configure in Azure Data Factory?
Question 53 of 55
1 point(s)
Case Study
Overview
Ipslabs is an online training provider. They regularly conduct polls on the effectiveness of their training material. The polling data comes from various sources, such as online surveys and public events.
The polling data would be stored into 2 locations
• An on-premises Microsoft SQL Server database
• Azure Data Lake Gen 2 storage
The data in the data lake would be queried using PolyBase
Each poll also has associated metadata, which is stored as JSON. The metadata contains the date and the number of people who took the poll.
Polling data is also taken via phone calls. Below are the security requirements for polling data taken via phone calls
• The poll data must be uploaded by authorized users from authorized devices
• External contractors can only access their own polling data
• The access to the polling data would be given to users on a per Active Directory user basis
Other requirements
• All data migration processes must be carried out using Azure Data Factory
• All of the data migrations must run automatically and be carried out during non-business hours
• All services and processes must be resilient to regional Azure outages.
• All services must be monitored using Azure Monitor
• The performance of the on-premises SQL server must also be monitored
• After 3 months all polling data must be moved to low cost storage
• All deployments must be performed using Azure DevOps
• Deployments must make use of templates
• No credentials or secrets of any kind must be exposed during deployments
You have to ensure that the polling data security requirements are met. Which of the following would you set for Polybase?
Question 54 of 55
1 point(s)
You need to grant access to a storage account from a virtual network. Which of the following would you need to enable first for this requirement?
Question 55 of 55
1 point(s)
A company has a storage account named ipsstore2020. They want to ensure that they can recover a blob object if it was deleted in the last 10 days.
Which of the following would they implement for this requirement?
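For background on one recovery mechanism in this space, blob soft delete can be configured from the Azure CLI as sketched below; the retention period shown is illustrative and the account name is taken from the question scenario.

# Enable blob soft delete with an illustrative 10-day retention window.
az storage blob service-properties delete-policy update \
    --account-name ipsstore2020 \
    --enable true \
    --days-retained 10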