[2020.4] Free Microsoft DP-201 Dumps and pdf online practice questions and answers

How do I get the latest free Microsoft DP-201 exam practice questions?
The latest DP-201 exam dumps, DP-201 pdf, and online Microsoft DP-201 practice test questions are free to help you improve your skills and experience.
With a 98.5% exam pass rate, we recommend the Lead4Pass DP-201 dumps: https://www.leads4pass.com/dp-201.html (latest update)

Microsoft DP-201 exam pdf free download

[PDF Q1-Q13] Free Microsoft DP-201 pdf dumps download from Google Drive: https://drive.google.com/open?id=1FBB57K45lNPc0xvTkMqwGiykdhBr4ebr

Exam DP-201: Designing an Azure Data Solution: https://docs.microsoft.com/en-us/learn/certifications/exams/dp-201

Candidates for this exam are Microsoft Azure data engineers who collaborate with business stakeholders to identify and meet the data
requirements to design data solutions that use Azure data services.

Azure data engineers are responsible for data-related design tasks that include designing Azure data storage solutions that use relational
and non-relational data stores, batch and real-time data processing solutions, and data security and compliance solutions.

Candidates for this exam must design data solutions that use the following Azure services: Azure Cosmos DB, Azure SQL Database,
Azure Synapse Analytics, Azure Data Lake Storage, Azure Data Factory, Azure Stream Analytics, Azure Databricks, and Azure Blob storage.

Skills measured

  • The content of this exam was updated on March 26, 2020. Please download the skills measured document below to see what changed.
  • NOTE: The bullets that appear below each of the skills measured in the document below are intended to illustrate how we are assessing that skill. This list is not definitive or exhaustive.
  • Design Azure data storage solutions (40-45%)
  • Design data processing solutions (25-30%)
  • Design for data security and compliance (25-30%)

Latest effective Microsoft DP-201 exam practice questions

QUESTION 1
You are designing an Azure Databricks cluster that runs user-defined local processes. You need to recommend a cluster configuration that meets the following requirements:
1. Minimize query latency
2. Reduce overall costs
3. Maximize the number of users that can run queries on the cluster at the same time
Which cluster type should you recommend?
A. Standard with Autoscaling
B. High Concurrency with Auto Termination
C. High Concurrency with Autoscaling
D. Standard with Auto Termination
Correct Answer: C
High Concurrency clusters allow multiple users to run queries on the cluster at the same time while minimizing query
latency. Autoscaling clusters can reduce overall costs compared to a statically-sized cluster.
Incorrect Answers:
A, D: Standard clusters are recommended for a single user.
References:
https://docs.azuredatabricks.net/user-guide/clusters/create.html
https://docs.azuredatabricks.net/user-guide/clusters/high-concurrency.html#high-concurrency
https://docs.azuredatabricks.net/user-guide/clusters/terminate.html
https://docs.azuredatabricks.net/user-guide/clusters/sizing.html#enable-and-configure-autoscaling
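As a sketch of what answer C looks like in practice, here is a hypothetical cluster-creation payload in the shape of the Databricks Clusters API. The cluster name, runtime version, and node type are illustrative assumptions, and on older Databricks platform versions High Concurrency mode was selected via the `spark.databricks.cluster.profile` setting shown below.

```python
# Hypothetical payload for the Databricks Clusters API (POST /api/2.0/clusters/create);
# the field names follow the public Clusters API, but all values are illustrative.
import json

cluster_spec = {
    "cluster_name": "shared-analytics",      # illustrative name
    "spark_version": "6.4.x-scala2.11",      # illustrative runtime version
    "node_type_id": "Standard_DS3_v2",       # illustrative VM size
    # High Concurrency mode lets many users share the cluster with low query latency.
    "spark_conf": {
        "spark.databricks.cluster.profile": "serverless",
        "spark.databricks.repl.allowedLanguages": "sql,python,r",
    },
    # Autoscaling sizes the cluster to the workload, reducing cost versus a fixed size.
    "autoscale": {"min_workers": 2, "max_workers": 8},
}

def is_high_concurrency_with_autoscaling(spec: dict) -> bool:
    """Check that a cluster spec matches answer C from Question 1."""
    profile = spec.get("spark_conf", {}).get("spark.databricks.cluster.profile")
    return profile == "serverless" and "autoscale" in spec

print(json.dumps(cluster_spec, indent=2))
```

Note that auto termination (answers B and D) saves cost only while the cluster is idle; autoscaling also saves cost while queries are running, which is why C satisfies all three requirements.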

QUESTION 2
You are designing a solution for a company. The solution will use model training for objective classification.
You need to design the solution.
What should you recommend?
A. an Azure Cognitive Services application
B. a Spark Streaming job
C. interactive Spark queries
D. Power BI models
E. a Spark application that uses Spark MLlib
Correct Answer: E
Spark in SQL Server big data cluster enables AI and machine learning.
You can use Apache Spark MLlib to create a machine learning application to do simple predictive analysis on an open
dataset.
MLlib is a core Spark library that provides many utilities useful for machine learning tasks, including utilities that are
suitable for:
1. Classification
2. Regression
3. Clustering
4. Topic modeling
5. Singular value decomposition (SVD) and principal component analysis (PCA)
6. Hypothesis testing and calculating sample statistics
References: https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-machine-learning-mllib-ipython

QUESTION 3
A company stores sensitive information about customers and employees in Azure SQL Database.
You need to ensure that the sensitive data remains encrypted in transit and at rest.
What should you recommend?
A. Transparent Data Encryption
B. Always Encrypted with secure enclaves
C. Azure Disk Encryption
D. SQL Server AlwaysOn
Correct Answer: B
Incorrect Answers:
A: Transparent Data Encryption (TDE) encrypts SQL Server, Azure SQL Database, and Azure SQL Data Warehouse
data files, known as encrypting data at rest. TDE does not provide encryption across communication channels.
References: https://cloudblogs.microsoft.com/sqlserver/2018/12/17/confidential-computing-using-always-encrypted-with-secure-enclaves-in-sql-server-2019-preview/

QUESTION 4
HOTSPOT
You plan to create a real-time monitoring app that alerts users when a device travels more than 200 meters away from a
designated location.
You need to design an Azure Stream Analytics job to process the data for the planned app. The solution must minimize
the amount of code developed and the number of technologies used.
What should you include in the Stream Analytics job? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:


Input type: Stream
You can process real-time IoT data streams with Azure Stream Analytics.
Input source: Azure IoT Hub
In a real-world scenario, you could have hundreds of these sensors generating events as a stream. Ideally, a gateway
device would run code to push these events to Azure Event Hubs or Azure IoT Hubs.
Function: Geospatial
With built-in geospatial functions, you can use Azure Stream Analytics to build applications for scenarios such as fleet
management, ride sharing, connected cars, and asset tracking.
Reference:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-get-started-with-azure-stream-analytics-to-process-data-from-iot-devices
https://docs.microsoft.com/en-us/azure/stream-analytics/geospatial-scenarios
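To make the hotspot concrete, the sketch below embeds an illustrative Stream Analytics query using the built-in ST_DISTANCE and CreatePoint geospatial functions (the input name, output name, and coordinates are assumptions), plus a plain-Python haversine check that mirrors the 200-meter geofence logic.

```python
import math

# Illustrative Stream Analytics query; ST_DISTANCE returns the distance in
# meters between two point values. Input/output names are assumptions.
ASA_QUERY = """
SELECT deviceId
INTO alerts
FROM iothubinput
WHERE ST_DISTANCE(
        CreatePoint(latitude, longitude),
        CreatePoint(47.6062, -122.3321)   -- designated location (illustrative)
      ) > 200
"""

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_alert(lat, lon, home=(47.6062, -122.3321), limit_m=200.0):
    """True when the device is more than 200 m from the designated location."""
    return haversine_m(lat, lon, *home) > limit_m
```

Doing the distance check with a built-in geospatial function keeps both the amount of code and the number of technologies to a minimum, which is exactly what the question asks for.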

QUESTION 5
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
A company is developing a solution to manage inventory data for a group of automotive repair shops. The solution will
use Azure SQL Data Warehouse as the data store.
Shops will upload data every 10 days.
Data corruption checks must run each time data is uploaded. If corruption is detected, the corrupted data must be
removed.
You need to ensure that upload processes and data corruption checks do not impact reporting and analytics processes
that use the data warehouse.
Proposed solution: Create a user-defined restore point before the data is uploaded. Delete the restore point after data
corruption checks complete.
Does the solution meet the goal?
A. Yes
B. No
Correct Answer: A
User-defined restore points: this feature enables you to manually trigger snapshots to create restore points of your data warehouse before and after large modifications. This capability ensures that restore points are logically consistent, which provides additional data protection in case of any workload interruptions or user errors, for quick recovery time.
Note: A data warehouse restore is a new data warehouse that is created from a restore point of an existing or deleted
data warehouse. Restoring your data warehouse is an essential part of any business continuity and disaster recovery
strategy because it re-creates your data after accidental corruption or deletion.
References: https://docs.microsoft.com/en-us/azure/sql-data-warehouse/backup-and-restore
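The proposed workflow can be sketched with a toy in-memory model; the `Warehouse` class below is purely illustrative (not an Azure SDK type) and stands in for the restore-point operations the question describes.

```python
# Toy in-memory model of the Question 5 workflow: snapshot before a risky load,
# roll back if corruption is detected, then drop the user-defined restore point.

class Warehouse:
    """Illustrative stand-in for a data warehouse with restore points."""

    def __init__(self, rows):
        self.rows = list(rows)
        self._restore_points = {}

    def create_restore_point(self, name):
        # A logically consistent snapshot of the current data.
        self._restore_points[name] = list(self.rows)
        return name

    def restore(self, name):
        self.rows = list(self._restore_points[name])

    def delete_restore_point(self, name):
        del self._restore_points[name]

def upload_with_restore_point(wh, new_rows, is_corrupt):
    """Create a restore point, load data, roll back if corruption checks fail."""
    rp = wh.create_restore_point("pre-upload")
    wh.rows.extend(new_rows)
    if any(is_corrupt(r) for r in new_rows):
        wh.restore(rp)           # corrupted data is removed by rolling back
    wh.delete_restore_point(rp)  # clean up once the checks complete
```

Because the restore point captures the pre-upload state, a failed corruption check can discard the bad upload without touching the reporting copy of the data.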

QUESTION 6
HOTSPOT
You are designing the security for a mission-critical Azure SQL database named DB1. DB1 contains several columns that store Personally Identifiable Information (PII) data. You need to recommend a security solution that meets the following requirements:
1. Ensures that DB1 is encrypted at rest
2. Ensures that data from the columns containing PII data is encrypted in transit
Which security solution should you recommend for DB1 and the columns? To answer, select the appropriate options in
the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:


Correct Answer:


DB1: Transparent Data Encryption
Azure SQL Database currently supports encryption at rest for Microsoft-managed service side and client-side encryption
scenarios.
Support for server encryption is currently provided through the SQL feature called Transparent Data Encryption.
Columns: Always encrypted
Always Encrypted is a feature designed to protect sensitive data stored in Azure SQL Database or SQL Server
databases. Always Encrypted allows clients to encrypt sensitive data inside client applications and never reveal the
encryption
keys to the database engine (SQL Database or SQL Server).
Note: Most data breaches involve the theft of critical data such as credit card numbers or personally identifiable information. Databases can be treasure troves of sensitive information. They can contain customers' personal data (like national identification numbers), confidential competitive information, and intellectual property. Lost or stolen data, especially customer data, can result in brand damage, competitive disadvantage, and serious fines, even lawsuits.
References:
https://docs.microsoft.com/en-us/azure/security/fundamentals/encryption-atrest
https://docs.microsoft.com/en-us/azure/security/fundamentals/database-security-overview

QUESTION 7
You need to design the authentication and authorization methods for sensors.
What should you recommend? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:


From the scenario: Sensor data must be stored in a Cosmos DB database named trydata, in a collection named SensorData. Sensors must have permission only to add items to the SensorData collection.
Box 1: Resource Token
Resource tokens provide access to the application resources within a Cosmos DB database. They enable clients to read, write, and delete resources in the Cosmos DB account according to the permissions they've been granted.
Box 2: Cosmos DB user
You can use a resource token (by creating Cosmos DB users and permissions) when you want to provide access to
resources in your Cosmos DB account to a client that cannot be trusted with the master key.
References:
https://docs.microsoft.com/en-us/azure/cosmos-db/secure-access-to-data
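As a sketch of how the pieces fit together, the snippet below builds a permission document in the shape of the Cosmos DB REST permission resource. Cosmos DB permissions support only the `Read` and `All` modes, so an add-only sensor is scoped with `All` on just the SensorData collection; the ids and resource links here are illustrative.

```python
# Sketch of the Cosmos DB user/permission model behind resource tokens.
# Field names follow the Cosmos DB REST permission resource; all ids,
# database names, and links below are illustrative assumptions.

def make_permission(user_id: str, collection_link: str, mode: str = "All") -> dict:
    """Build a permission document scoping one Cosmos DB user to one collection."""
    assert mode in ("Read", "All")  # the only permission modes Cosmos DB offers
    return {
        "id": f"{user_id}-sensordata-perm",  # illustrative permission id
        "permissionMode": mode,              # "All" allows writes to the collection
        "resource": collection_link,         # e.g. dbs/sensors/colls/SensorData
    }

# A resource token minted from this permission grants the sensor access only
# to the SensorData collection, never the account master key.
perm = make_permission("sensor-001", "dbs/sensors/colls/SensorData")
```

The key design point is scope: a leaked resource token exposes only one collection for a limited time, whereas a leaked master key exposes the entire account.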

QUESTION 8
You need to recommend an Azure SQL Database service tier.
What should you recommend?
A. Business Critical
B. General Purpose
C. Premium
D. Standard
E. Basic
Correct Answer: C
Scenario: The data engineers must set the SQL Data Warehouse compute resources to consume 300 DWUs.
Note: There are three architectural models that are used in Azure SQL Database:
1. General Purpose/Standard
2. Business Critical/Premium
3. Hyperscale
Incorrect Answers:
A: Business Critical service tier is designed for the applications that require low-latency responses from the underlying
SSD storage (1-2 ms on average), fast recovery if the underlying infrastructure fails, or need to off-load reports,
analytics, and read-only queries to the free of charge readable secondary replica of the primary database.
References: https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tier-business-critical

QUESTION 9
You need to design image processing and storage solutions.
What should you recommend? To answer, select the appropriate configuration in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:


From the scenario:
The company identifies the following business requirements:
1. You must transfer all images and customer data to cloud storage and remove on-premises servers.
2. You must develop an image object and color tagging solution.
The solution has the following technical requirements:
1. Image data must be stored in a single data store at minimum cost.
2. All data must be backed up in case disaster recovery is required.
3. All cloud data must be encrypted at rest and in transit.
The solution must support hyper-scale storage of images.
References: https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/batch-processing
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tier-hyperscale

QUESTION 10
You need to design the runtime environment for the Real-Time Response system. What should you recommend?
A. General Purpose nodes without the Enterprise Security package
B. Memory-Optimized Nodes without the Enterprise Security package
C. Memory-Optimized nodes with the Enterprise Security package
D. General Purpose nodes with the Enterprise Security package
Correct Answer: B
Scenario: You must maximize the performance of the Real-Time Response system.

QUESTION 11
You need to recommend a storage solution for a sales system that will receive thousands of small files per minute. The
files will be in JSON, text, and CSV formats. The files will be processed and transformed before they are loaded into an
Azure data warehouse. The files must be stored and secured in folders.
Which storage solution should you recommend?
A. Azure Data Lake Storage Gen2
B. Azure Cosmos DB
C. Azure SQL Database
D. Azure Blob storage
Correct Answer: A
Azure provides several solutions for working with CSV and JSON files, depending on your needs. The primary landing place for these files is either Azure Storage or Azure Data Lake Store.
Azure Data Lake Storage is optimized storage for big data analytics workloads.
Incorrect Answers:
D: Azure Blob Storage containers are a general-purpose object store for a wide variety of storage scenarios. Blobs are
stored in containers, which are similar to folders.
References: https://docs.microsoft.com/en-us/azure/architecture/data-guide/scenarios/csv-and-json

QUESTION 12
You are designing an Azure Databricks interactive cluster.
You need to ensure that the cluster meets the following requirements:
1. Enable auto-termination
2. Retain cluster configuration indefinitely after cluster termination
What should you recommend?
A. Start the cluster after it is terminated.
B. Pin the cluster
C. Clone the cluster after it is terminated.
D. Terminate the cluster manually at process completion.
Correct Answer: B
To keep an interactive cluster configuration even after it has been terminated for more than 30 days, an administrator
can pin a cluster to the cluster list.
References: https://docs.azuredatabricks.net/user-guide/clusters/terminate.html

QUESTION 13
You have an on-premises data warehouse that includes the following fact tables. Both tables have the following columns: DataKey, ProductKey, RegionKey. There are 120 unique product keys and 65 unique region keys.

Queries that use the data warehouse take a long time to complete.
You plan to migrate the solution to use Azure SQL Data Warehouse. You need to ensure that the Azure-based solution
optimizes query performance and minimizes processing skew.
What should you recommend? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:


Correct Answer:


Box 1: Hash-distributed
Box 2: ProductKey
ProductKey is used extensively in joins.
Hash-distributed tables improve query performance on large fact tables.
Box 3: Round-robin
Box 4: RegionKey
Round-robin tables are useful for improving loading speed.
Consider using the round-robin distribution for your table in the following scenarios:
1. When getting started, as a simple starting point, since it is the default
2. If there is no obvious joining key
3. If there is no good candidate column for hash distributing the table
4. If the table does not share a common join key with other tables
5. If the join is less significant than other joins in the query
6. When the table is a temporary staging table
Note: A distributed table appears as a single table, but the rows are actually stored across 60 distributions. The rows are
distributed with a hash or round-robin algorithm.
References:
https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-tables-distribute
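As a rough illustration of why hash distribution suits the join in Question 13, the stdlib-only sketch below (CRC32 is a stand-in; the engine's real hash function is internal) shows that every row with the same ProductKey lands in the same one of the 60 distributions, so both fact tables co-locate matching rows and a join on ProductKey needs no cross-distribution data movement.

```python
import zlib
from collections import defaultdict

N_DISTRIBUTIONS = 60  # rows of a distributed table are spread across 60 distributions

def distribution_of(key) -> int:
    # Stand-in for the engine's internal hash function (illustrative only):
    # the same key value always maps to the same distribution.
    return zlib.crc32(str(key).encode()) % N_DISTRIBUTIONS

# Illustrative fact rows shaped as (DataKey, ProductKey, RegionKey).
sales_rows = [("2020-01-01", pk, pk % 65) for pk in range(120)]
returns_rows = [("2020-01-02", pk, pk % 65) for pk in range(0, 120, 2)]

# Hash-distributing both tables on ProductKey co-locates matching rows.
placement = defaultdict(list)
for _, product_key, _ in sales_rows + returns_rows:
    placement[distribution_of(product_key)].append(product_key)
```

RegionKey would be a poor hash column here: with only 65 unique values across 60 distributions, some distributions would carry far more data than others, which is exactly the processing skew the question asks you to minimize.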

Share Lead4Pass Microsoft DP-201 discount codes for free (2020)

Lead4Pass coupon code: lead4pass2020

Lead4Pass Reviews

Lead4Pass offers the latest exam exercise questions for free! Microsoft exam questions are updated throughout the year.
Lead4Pass has many professional exam experts! Guaranteed valid passing of the exam! The highest pass rate at the best value!
Lead4Pass helps you pass the exam easily on your first attempt.

About Lead4Pass

What you need to know:

Pursue4pass shares the latest Microsoft DP-201 exam dumps, DP-201 pdf, and DP-201 exam exercise questions for free.
You can improve your skills and exam experience online. To get the complete set of exam questions and answers,
guaranteed to pass the exam, we recommend the Lead4Pass DP-201 exam dumps.

Latest update Lead4pass DP-201 exam dumps: https://www.leads4pass.com/dp-201.html (145 Q&As)

[Q1-Q13 PDF] Free Microsoft DP-201 pdf dumps download from Google Drive: https://drive.google.com/open?id=1FBB57K45lNPc0xvTkMqwGiykdhBr4ebr