DP-300 Updated Dumps - Administering Microsoft Azure SQL Solutions

Share Administering Microsoft Azure SQL Solutions DP-300 updated dumps with you.

Uploaded by

timblin843
Copyright
© All Rights Reserved

DP-300

Exam Name: Administering Microsoft Azure SQL Solutions

Full version: 341 Q&As

Full version of DP-300 Dumps

Share some DP-300 exam dumps below.

1. You have a Microsoft SQL Server 2019 instance in an on-premises datacenter. The instance
contains a 4-TB database named DB1.
You plan to migrate DB1 to an Azure SQL Database managed instance.
What should you use to minimize downtime and data loss during the migration?
A. database mirroring
B. distributed availability groups
C. Always On Availability Group
D. Azure Database Migration Service
Answer: D

2. What should you do after a failover of SalesSQLDb1 to ensure that the database remains
accessible to SalesSQLDb1App1?
A. Configure SalesSQLDb1 as writable.
B. Update the connection strings of SalesSQLDb1App1.
C. Update the firewall rules of SalesSQLDb1.
D. Update the users in SalesSQLDb1.
Answer: B
Explanation:
Scenario: SalesSQLDb1 uses database firewall rules and contained database users.

3. DRAG DROP
Your company analyzes images from security cameras and sends alerts to security teams that
respond to unusual activity. The solution uses Azure Databricks.
You need to send Apache Spark level events, Spark Structured Streaming metrics, and
application metrics to Azure Monitor.
Which three actions should you perform in sequence? To answer, move the appropriate actions
from the list of actions in the answer area and arrange them in the correct order.
Answer:

Explanation:
Send application metrics using Dropwizard.
Spark uses a configurable metrics system based on the Dropwizard Metrics Library.
To send application metrics from Azure Databricks application code to Azure Monitor, follow
these steps:
Step 1: Configure your Azure Databricks cluster to use the monitoring library.
Step 2: Build the spark-listeners-loganalytics-1.0-SNAPSHOT.jar JAR file.
Step 3: Create Dropwizard gauges or counters in your application code.

4. You create five Azure SQL Database instances on the same logical server.
In each database, you create a user for an Azure Active Directory (Azure AD) user named
User1.
User1 attempts to connect to the logical server by using Azure Data Studio and receives a login
error.
You need to ensure that when User1 connects to the logical server by using Azure Data Studio,
User1 can see all the databases.
What should you do?
A. Create User1 in the master database.
B. Assign User1 the db_datareader role for the master database.
C. Assign User1 the db_datareader role for the databases that User1 creates.
D. Grant select on sys.databases to public in the master database.
Answer: A
Explanation:
Reference: https://docs.microsoft.com/en-us/azure/azure-sql/database/logins-create-manage
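Option A can be sketched in Transact-SQL. This is a hedged illustration; the user principal name domain is an assumption, since the question gives only the name User1:

```sql
-- Run while connected to the master database as the Azure AD admin
-- of the logical server. The UPN below is an illustrative assumption.
CREATE USER [User1@contoso.com] FROM EXTERNAL PROVIDER;
```

A user created in master can authenticate to the logical server itself, which lets tools such as Azure Data Studio connect and enumerate the databases on the server.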

5. HOTSPOT
You configure backups for an Azure SQL database as shown in the following exhibit.
Use the drop-down menus to select the answer choice that completes each statement based on
the information presented in the graphic. NOTE: Each correct selection is worth one point.

Answer:

6. HOTSPOT
You have a SQL Server on Azure Virtual Machines instance that hosts a 10-TB SQL database
named DB1.
You need to identify and repair any physical or logical corruption in DB1.
The solution must meet the following requirements:
• Minimize how long it takes to complete the procedure.
• Minimize data loss.
How should you complete the command? To answer, select the appropriate options in the
answer area NOTE: Each correct selection is worth one point.
Answer:

7. HOTSPOT
You have an Azure SQL managed instance named Server1 and an Azure Blob Storage account
named storage1 that contains Microsoft SQL Server database backup files.
You plan to use Log Replay Service to migrate the backup files from storage1 to Server1. The
solution must use the highest level of security when connecting to storage1.
Which PowerShell cmdlet should you run, and which parameter should you specify to secure
the connection? To answer, select the appropriate options in the answer area. NOTE: Each
correct selection is worth one point.
Answer:

8. You have a Microsoft SQL Server 2019 database named DB1 that uses the following
database-level and instance-level features.
• Clustered columnstore indexes
• Automatic tuning
• Change tracking
• PolyBase
You plan to migrate DB1 to an Azure SQL database.
What feature should be removed or replaced before DB1 can be migrated?
A. Clustered columnstore indexes
B. PolyBase
C. Change tracking
D. Automatic tuning
Answer: B
Explanation:
This table lists the key features for PolyBase and the products in which they're available.
Reference: https://docs.microsoft.com/en-us/sql/relational-databases/polybase/polybase-
versioned-feature-summary

9. You plan to build a structured streaming solution in Azure Databricks. The solution will count
new events in five-minute intervals and report only events that arrive during the interval.
The output will be sent to a Delta Lake table.
Which output mode should you use?
A. complete
B. append
C. update
Answer: A
Explanation:
Complete mode: You can use Structured Streaming to replace the entire table with every batch.
Incorrect Answers:
B: By default, streams run in append mode, which adds new records to the table.
Reference: https://docs.databricks.com/delta/delta-streaming.html

10. Schedule task (Step 4)
Steps:
11. HOTSPOT
You plan to deploy an Always On failover cluster instance (FCI) on Azure virtual machines.
You need to provision an Azure Storage account to host a cloud witness for the deployment.
How should you configure the storage account? To answer, select the appropriate options in the
answer area. NOTE: Each correct selection is worth one point.

Answer:
12. Note: This question is part of a series of questions that present the same scenario. Each
question in the series contains a unique solution that might meet the stated goals. Some
question sets might have more than one correct solution, while others might not have a correct
solution.
After you answer a question in this section, you will NOT be able to return to it. As a result,
these questions will not appear in the review screen.
You have two Azure SQL Database servers named Server1 and Server2. Each server contains
an Azure SQL database named Database1.
You need to restore Database1 from Server1 to Server2. The solution must replace the existing
Database1 on Server2.
Solution: You restore Database1 from Server1 to the Server2 by using the RESTORE Transact-
SQL command and the REPLACE option.
Does this meet the goal?
A. Yes
B. No
Answer: A
Explanation:
The REPLACE option overrides several important safety checks that restore normally performs.
The overridden checks are as follows:
• Restoring over an existing database with a backup taken of another database.
With the REPLACE option, restore allows you to overwrite an existing database with whatever
database is in the backup set, even if the specified database name differs from the database
name recorded in the backup set. This can result in accidentally overwriting a database by a
different database.
Reference: https://docs.microsoft.com/en-us/sql/t-sql/statements/restore-statements-transact-
sql
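As a hedged sketch of the Transact-SQL form described in the explanation (the storage URL and backup file name are illustrative placeholders):

```sql
-- Restore over the existing Database1, skipping the safety checks that
-- normally prevent overwriting with a backup of a different database.
RESTORE DATABASE Database1
FROM URL = 'https://mystorageacct.blob.core.windows.net/backups/Database1.bak'
WITH REPLACE;
```

Because REPLACE suppresses those safety checks, confirm the backup set really contains the intended database before running it.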

13. You have a SQL pool in Azure Synapse that contains a table named dbo.Customers. The
table contains a column name Email.
You need to prevent nonadministrative users from seeing the full email addresses in the Email
column. The users must see values in a format of [email protected] instead.
What should you do?
A. From the Azure portal, set a mask on the Email column.
B. From the Azure portal, set a sensitivity classification of Confidential for the Email column.
C. From Microsoft SQL Server Management Studio, set an email mask on the Email column.
D. From Microsoft SQL Server Management Studio, grant the SELECT permission to the users
for all the columns in the dbo.Customers table except Email.
Answer: A
Explanation:
The Email masking method, which exposes the first letter and replaces the domain with
XXX.com using a constant string prefix in the form of an email address.
Example: [email protected]
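The masking described in the explanation can be expressed in Transact-SQL as a sketch, using the table and column names from the question:

```sql
-- Apply the built-in email() dynamic data masking function to the Email
-- column. Users without the UNMASK permission see masked values only.
ALTER TABLE dbo.Customers
ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
```

Setting the mask from the Azure portal produces the equivalent column definition.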

14. You have an on-premises Microsoft SQL Server 2019 database named SQL1 that uses
merge replication. You need to migrate SQL1 to Azure.
Which service should you use?
A. Azure SQL Edge
B. Azure SQL Database
C. SQL Server on Azure Virtual Machines
D. Azure SQL Managed instance
Answer: C

15. HOTSPOT
You have an Azure SQL database named db1.
You need to retrieve the resource usage of db1 from the last week.
How should you complete the statement? To answer, select the appropriate options in the
answer area. NOTE: Each correct selection is worth one point.
Answer:
Explanation:
Box 1: sys.resource_stats
sys.resource_stats returns CPU usage and storage data for an Azure SQL Database. It has
database_name and start_time columns.
Box 2: DateAdd
The following example returns all databases that are averaging at least 80% of compute
utilization over the last one week.
DECLARE @s datetime;
DECLARE @e datetime;
SET @s= DateAdd(d,-7,GetUTCDate());
SET @e= GETUTCDATE();
SELECT database_name, AVG(avg_cpu_percent) AS Average_Compute_Utilization FROM
sys.resource_stats
WHERE start_time BETWEEN @s AND @e
GROUP BY database_name
HAVING AVG(avg_cpu_percent) >= 80
Incorrect Answers:
sys.dm_exec_requests:
sys.dm_exec_requests returns information about each request that is executing in SQL Server.
It does not have a column named database_name.
sys.dm_db_resource_stats:
sys.dm_db_resource_stats does not have any start_time column.
Note: sys.dm_db_resource_stats returns CPU, I/O, and memory consumption for an Azure SQL
Database database. One row exists for every 15 seconds, even if there is no activity in the
database. Historical data is maintained for approximately one hour.
sys.dm_user_db_resource_governance returns actual configuration and capacity settings used
by resource governance mechanisms in the current database or elastic pool. It does not have
any start_time column.
Reference: https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-
resource-stats-azure-sql-database

16. You have an Azure Data Factory pipeline that is triggered hourly.
The pipeline has had 100% success for the past seven days.
The pipeline execution fails, and two retries that occur 15 minutes apart also fail.
The third failure returns the following error.
What is a possible cause of the error?
A. From 06:00 to 07:00 on January 10, 2021, there was no data in wwi/BIKES/CARBON.
B. The parameter used to generate year=2021/month=01/day=10/hour=06 was incorrect.
C. From 06:00 to 07:00 on January 10, 2021, the file format of data in wwi/BIKES/CARBON was
incorrect.
D. The pipeline was triggered too early.
Answer: B

17. You have an on-premises multi-tier application named App1 that includes a web tier, an
application tier, and a Microsoft SQL Server tier. All the tiers run on Hyper-V virtual machines.
Your new disaster recovery plan requires that all business-critical applications can be recovered
to Azure.
You need to recommend a solution to fail over the database tier of App1 to Azure. The solution
must provide the ability to test failover to Azure without affecting the current environment.
What should you include in the recommendation?
A. Azure Backup
B. Azure Information Protection
C. Windows Server Failover Cluster
D. Azure Site Recovery
Answer: D
Explanation:
Reference: https://docs.microsoft.com/en-us/azure/site-recovery/site-recovery-test-failover-to-
azure

18. You have an Azure Data Factory that contains 10 pipelines.


You need to label each pipeline with its main purpose of either ingest, transform, or load. The
labels must be
available for grouping and filtering when using the monitoring experience in Data Factory.
What should you add to each pipeline?
A. an annotation
B. a resource tag
C. a run group ID
D. a user property
E. a correlation ID
Answer: A
Explanation:
Azure Data Factory annotations help you easily filter different Azure Data Factory objects based
on a tag. You
can define tags so you can see their performance or find errors faster.
Reference: https://www.techtalkcorner.com/monitor-azure-data-factory-annotations/

19. HOTSPOT
You need to recommend a configuration for ManufacturingSQLDb1 after the migration to Azure.
The solution must meet the business requirements.
What should you include in the recommendation? To answer, select the appropriate options in
the answer area. NOTE: Each correct selection is worth one point.

Answer:
Explanation:
Scenario: Business Requirements
Litware identifies business requirements include: meet an SLA of 99.99% availability for all
Azure deployments.
Box 1: Cloud witness
If you have a Failover Cluster deployment, where all nodes can reach the internet (by extension
of Azure), it is recommended that you configure a Cloud Witness as your quorum witness
resource.
Box 2: Azure Standard Load Balancer
Microsoft guarantees that a Load Balanced Endpoint using Azure Standard Load Balancer,
serving two or more Healthy Virtual Machine Instances, will be available 99.99% of the time.
Note: There are two main options for setting up your listener: external (public) or internal. The
external (public) listener uses an internet facing load balancer and is associated with a public
Virtual IP (VIP) that is accessible over the internet. An internal listener uses an internal load
balancer and only supports clients within the same Virtual Network.
Reference: https://technet.microsoft.com/windows-server-docs/failover-clustering/deploy-cloud-
witness
https://azure.microsoft.com/en-us/support/legal/sla/load-balancer/v1_0/

20. You have an on-premises datacenter that contains a 2-TB Microsoft SQL Server 2019
database named DB1.
You need to recommend a solution to migrate DB1 to an Azure SQL managed instance. The
solution must minimize downtime and administrative effort.
What should you include in the recommendation?
A. Log Replay Service (LRS)
B. log shipping
C. transactional replication
D. SQL Data Sync
Answer: B

21. You need to recommend a solution to ensure that the performance of DB3 is optimized after
the migration to Azure SQL Database. The solution must meet availability requirements.
What should you include in the recommendation?
A. Resource Governor
B. a custom resource pool
C. vertical scaling
D. horizontal scaling
Answer: C

22. You have an Azure SQL database named DB1.


You have a table named Table1 that has 20 columns of type CHAR(400). Row compression for
Table1 is enabled.
During a database audit, you discover that none of the fields contain more than 150 characters.
You need to ensure that you can apply page compression to Table1.
What should you do?
A. Configure the columns as sparse.
B. Change the column type to nvarchar (MAX).
C. Change the column type to varchar (MAX).
D. Change the column type to varchar (200).
Answer: D
Explanation:
Reference:
https://www.sqlshack.com/sql-varchar-data-type-deep-dive/
https://36chambers.wordpress.com/2020/06/18/nvarchar-everywhere-a-thought-experiment/
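A hedged Transact-SQL sketch of option D followed by enabling page compression; Column1 stands in for each of the 20 CHAR(400) columns:

```sql
-- Shrink the fixed-width column to a variable-width type that still
-- holds the observed 150-character maximum...
ALTER TABLE dbo.Table1 ALTER COLUMN Column1 varchar(200);

-- ...then rebuild the table with page compression.
ALTER TABLE dbo.Table1 REBUILD WITH (DATA_COMPRESSION = PAGE);
```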

23. You have an Azure virtual machine named VM1 that runs Windows Server 2022 and hosts
a Microsoft SQL Server 2019 instance named SQL1. You need to configure SQL1 to use mixed
mode authentication.
Which procedure should you run?
A. sp_addremotelogin
B. xp_instance_regwrite
C. sp_change_users_login
D. xp_grant_login
Answer: B

24. HOTSPOT
You have a SQL Server on Azure Virtual Machines instance named SQLVM1 that contains two
databases named DB1 and DB2. The database and log files for DB1 and DB2 are hosted on
managed disks.
You need to perform a snapshot backup of DB1 and DB2
How should you complete the Transact-SQL statements? To answer, select the appropriate options in
the answer area. NOTE: Each correct selection is worth one point.

Answer:
25. You have SQL Server 2019 on an Azure virtual machine that runs Windows Server 2019.
The virtual machine has 4 vCPUs and 28 GB of memory.
You scale up the virtual machine to 8 vCPUs and 64 GB of memory.
You need to provide the lowest latency for tempdb.
What is the total number of data files that tempdb should contain?
A. 2
B. 4
C. 8
D. 64
Answer: C
Explanation:
The number of files depends on the number of (logical) processors on the machine. As a
general rule, if the number of logical processors is less than or equal to eight, use the same
number of data files as logical processors. If the number of logical processors is greater than
eight, use eight data files and then if contention continues, increase the number of data files by
multiples of 4 until the contention is reduced to acceptable levels or make changes to the
workload/code.
Reference: https://docs.microsoft.com/en-us/sql/relational-databases/databases/tempdb-
database
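The rule in the explanation can be applied as a Transact-SQL sketch. File names, paths, and sizes are assumptions; the intent is eight equally sized tempdb data files for eight logical processors:

```sql
-- Add data files until tempdb has eight, keeping every file the same
-- size so the proportional-fill algorithm spreads allocations evenly.
ALTER DATABASE tempdb
ADD FILE (NAME = tempdev5,
          FILENAME = 'D:\tempdb\tempdev5.ndf',
          SIZE = 8GB,
          FILEGROWTH = 64MB);
-- Repeat for tempdev6, tempdev7, and tempdev8.
```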

26. You need to recommend an availability strategy for an Azure SQL database.
The strategy must meet the following requirements:
• Support failovers that do not require client applications to change their connection strings.
• Replicate the database to a secondary Azure region.
• Support failover to the secondary region.
What should you include in the recommendation?
A. failover groups
B. transactional replication
C. Availability Zones
D. geo-replication
Answer: A
Explanation:
Active geo-replication is an Azure SQL Database feature that allows you to create readable
secondary databases of individual databases on a server in the same or different data center
(region).
Incorrect Answers:
C: Availability Zones are unique physical locations within a region. Each zone is made up of one
or more datacenters equipped with independent power, cooling, and networking.
Reference: https://docs.microsoft.com/en-us/azure/azure-sql/database/active-geo-replication-
overview

27. DRAG DROP


You have SQL Server on an Azure virtual machine that contains a database named DB1. DB1
is 30 TB and has a 1-GB daily rate of change.
You back up the database by using a Microsoft SQL Server Agent job that runs Transact-SQL
commands. You perform a weekly full backup on Sunday, daily differential backups at 01:00,
and transaction log backups every five minutes.
The database fails on Wednesday at 10:00.
Which three backups should you restore in sequence? To answer, move the appropriate
backups from the list of backups to the answer area and arrange them in the correct order.
Answer:
28. You need to trigger an Azure Data Factory pipeline when a file arrives in an Azure Data
Lake Storage Gen2 container.
Which resource provider should you enable?
A. Microsoft.EventHub
B. Microsoft.EventGrid
C. Microsoft.Sql
D. Microsoft.Automation
Answer: B
Explanation:
Event-driven architecture (EDA) is a common data integration pattern that involves production,
detection, consumption, and reaction to events. Data integration scenarios often require Data
Factory customers to trigger pipelines based on events happening in storage account, such as
the arrival or deletion of a file in Azure Blob Storage account. Data Factory natively integrates
with Azure Event Grid, which lets you trigger pipelines on such events.
Reference: https://docs.microsoft.com/en-us/azure/data-factory/how-to-create-event-trigger

29. Topic 6, Misc. Questions


You have an Azure SQL database that contains a table named factSales.
FactSales contains the columns shown in the following table.

FactSales has 6 billion rows and is loaded nightly by using a batch process.
Which type of compression provides the greatest space reduction for the database?
A. page compression
B. row compression
C. columnstore compression
D. columnstore archival compression
Answer: D
Explanation:
Columnstore tables and indexes are always stored with columnstore compression. You can
further reduce the size of columnstore data by configuring an additional compression called
archival compression.
Note: Columnstore: The columnstore index is also logically organized as a table with rows and
columns, but the data is physically stored in a column-wise data format.
Incorrect Answers:
B: Rowstore: The rowstore index is the traditional style that has been around since the initial
release of SQL Server.
For rowstore tables and indexes, use the data compression feature to help reduce the size of
the database.
Reference: https://docs.microsoft.com/en-us/sql/relational-databases/data-compression/data-
compression
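A hedged sketch of enabling archival compression on an existing clustered columnstore index; the index name is an illustrative assumption:

```sql
-- Rebuild the clustered columnstore index with archival compression to
-- further reduce on-disk size, trading some CPU cost on reads.
ALTER INDEX cci_factSales ON dbo.factSales
REBUILD WITH (DATA_COMPRESSION = COLUMNSTORE_ARCHIVE);
```

Archival compression suits data that is loaded in batches and queried infrequently, which matches the nightly-loaded factSales table described above.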
30. HOTSPOT
You have an Azure SQL database.
You run the following PowerShell script.

For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.

Answer:
Explanation:
Reference:
https://docs.microsoft.com/en-us/powershell/module/az.sql/set-
azsqldatabasebackupshorttermretentionpolicy?view=azps-7.2.0
https://docs.microsoft.com/en-us/powershell/module/az.sql/set-
azsqldatabasebackuplongtermretentionpolicy?view=azps-7.2.0

31. HOTSPOT
You have an Azure SQL database named db1 on a server named server1.
You use Query Performance Insight to monitor db1.
You need to modify the Query Store configuration to ensure that performance monitoring data is
available as soon as possible.
Which configuration setting should you modify and which value should you configure? To
answer, select the appropriate options in the answer area. NOTE: Each correct selection is
worth one point.

Answer:
32. HOTSPOT
You have an Azure subscription.
You need to deploy a logical SQL server by using an Azure Resource Manager (ARM) template.
The solution must ensure that the server will allow inbound connectivity from any Azure
resource.
How should you complete the template? To answer, select the appropriate options in the
answer area. NOTE: Each correct selection is worth one point.
Answer:
33. HOTSPOT
You have an Azure subscription.
You need to deploy an Azure SQL resource that will support cross database queries by using
an Azure Resource Manager (ARM) template.
How should you complete the ARM template? To answer, select the appropriate options in the
answer area. NOTE: Each correct selection is worth one point.
Answer:
Explanation:
Reference: https://docs.microsoft.com/en-us/azure/azure-sql/managed-instance/create-template-
quickstart?tabs=azure-powershell

34. You have an Azure SQL database named SQL1.


You need to implement a disaster recovery solution for SQL1.
The solution must minimize the following:
• The recovery point objective (RPO)
• The recovery time objective (RTO)
• Administrative effort
What should you include in the solution?
A. auto failover groups
B. Azure Site Recovery
C. availability groups
D. active geo-replication
Answer: A

35. DRAG DROP


You need to implement statistics maintenance for SalesSQLDb1. The solution must meet the
technical requirements.
Which four actions should you perform in sequence? To answer, move the appropriate actions
from the list of actions to the answer area and arrange them in the correct order.
Answer:
Explanation:
Automating Azure SQL DB index and statistics maintenance using Azure Automation:
36. SIMULATION
Task 8
You plan to perform performance testing of db1.
You need to prevent db1 from reverting to the last known good query plan.
Answer:
To prevent db1 from reverting to the last known good query plan, disable the automatic plan
correction feature for the database. This feature is enabled by default and allows the Query
Store to detect and fix plan performance regressions by forcing the last good plan. To test the
performance of different plans without interference from the Query Store, turn off this feature by
using the ALTER DATABASE SCOPED CONFIGURATION statement. Here are the steps to
disable automatic plan correction for db1:
Connect to db1 using SQL Server Management Studio, Azure Data Studio, or any other tool
that supports Transact-SQL statements.
Open a new query window and run the following command: ALTER DATABASE SCOPED
CONFIGURATION SET AUTOMATIC_TUNING (FORCE_LAST_GOOD_PLAN = OFF); GO
This command disables automatic plan correction for db1 and allows the Query Optimizer to
choose the best plan based on the current statistics and parameters.
To verify that automatic plan correction is disabled for db1, query the
sys.database_automatic_tuning_options catalog view. The actual state of the
FORCE_LAST_GOOD_PLAN option should be OFF.
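The steps above can be consolidated into one script, shown as a sketch to run while connected to db1:

```sql
-- Turn off automatic plan correction for the current database.
ALTER DATABASE SCOPED CONFIGURATION
SET AUTOMATIC_TUNING (FORCE_LAST_GOOD_PLAN = OFF);
GO

-- Verify: actual_state_desc should report OFF for FORCE_LAST_GOOD_PLAN.
SELECT name, desired_state_desc, actual_state_desc
FROM sys.database_automatic_tuning_options
WHERE name = 'FORCE_LAST_GOOD_PLAN';
```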

37. You have an Azure subscription that contains a SQL Server on Azure Virtual Machines
instance named SQLVM1.
SQLVM1 has the following configurations:
• Automated patching is enabled.
• The SQL Server IaaS Agent extension is installed.
• The Microsoft SQL Server instance on SQLVM1 is managed by using the Azure portal.
You need to automate the deployment of cumulative updates to SQLVM1 by using Azure
Update Manager. The solution must ensure that the SQL Server instance on SQLVM1 can be
managed by using the Azure portal.
What should you do first on SQLVM1?
A. Install the Azure Monitor Agent.
B. Uninstall the SQL Server IaaS Agent extension.
C. Install the Log Analytics agent.
D. Set Automated patching to Disabled.
Answer: B
38. You have an Azure SQL database named DB1.
You need to display the estimated execution plan of a query by using the query editor in the
Azure portal.
What should you do first?
A. Run the set showplan_all Transact-SQL statement.
B. For DB1, set QUERY_CAPTURE_MODE of Query Store to All.
C. Run the set forceplan Transact-SQL statement.
D. Enable Query Store for DB1.
Answer: A
Explanation:
Reference: https://docs.microsoft.com/en-us/sql/t-sql/statements/set-showplan-all-transact-
sql?view=sql-server-ver15
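Option A can be sketched as follows; the sample query is an illustrative placeholder:

```sql
-- Return the estimated execution plan instead of executing the query.
SET SHOWPLAN_ALL ON;
GO
SELECT COUNT(*) FROM sys.objects;  -- placeholder query; plan rows are returned, not results
GO
-- Restore normal execution afterward.
SET SHOWPLAN_ALL OFF;
GO
```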

39. You have an Azure subscription.


You need to deploy an Azure SQL database.
The solution must meet the following requirements:
• Dynamically scale CPU resources.
• Ensure that the database can be paused to reduce costs.
What should you use?
A. the Business Critical service tier
B. the serverless compute tier
C. an elastic pool
D. the General Purpose service tier
Answer: B

40. You have an Azure virtual machine named VM1 on a virtual network named VNet1.
Outbound traffic from VM1 to the internet is blocked.
You have an Azure SQL database named SqlDb1 on a logical server named SqlSrv1.
You need to implement connectivity between VM1 and SqlDb1 to meet the following
requirements:
? Ensure that all traffic to the public endpoint of SqlSrv1 is blocked.
? Minimize the possibility of VM1 exfiltrating data stored in SqlDb1.
What should you create on VNet1?
A. a VPN gateway
B. a service endpoint
C. a private link
D. an ExpressRoute gateway
Answer: C
Explanation:
Azure Private Link enables you to access Azure PaaS Services (for example, Azure Storage
and SQL Database) and Azure hosted customer-owned/partner services over a private endpoint
in your virtual network.
Traffic between your virtual network and the service travels the Microsoft backbone network.
Exposing your service to the public internet is no longer necessary.
Reference: https://docs.microsoft.com/en-us/azure/private-link/private-link-overview

41. Note: This question is part of a series of questions that present the same scenario. Each
question in the series contains a unique solution that might meet the stated goals. Some
question sets might have more than one correct solution, while others might not have a correct
solution.
After you answer a question in this section, you will NOT be able to return to it. As a result,
these questions will not appear in the review screen.
You have an Azure Data Lake Storage account that contains a staging zone.
You need to design a daily process to ingest incremental data from the staging zone, transform
the data by executing an R script, and then insert the transformed data into a data warehouse in
Azure Synapse Analytics.
Solution: You schedule an Azure Databricks job that executes an R notebook, and then inserts
the data into the data warehouse.
Does this meet the goal?
A. Yes
B. No
Answer: B
Explanation:
Must use an Azure Data Factory, not an Azure Databricks job.
Reference: https://docs.microsoft.com/en-US/azure/data-factory/transform-data

42. You have an Azure SQL database named DB1. DB1 has a table named Table1 that
contains the following columns.
You plan to enable Always Encrypted for Table1.
Which two columns support encryption? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point
A. Column1
B. Column2
C. Column3
D. Column4
E. Column5
Answer: A, D

43. You are designing a dimension table in an Azure Synapse Analytics dedicated SQL pool.
You need to create a surrogate key for the table. The solution must provide the fastest query
performance.
What should you use for the surrogate key?
A. an IDENTITY column
B. a GUID column
C. a sequence object
Answer: A
Explanation:
Dedicated SQL pool supports many, but not all, of the table features offered by other
databases.
Surrogate keys are not supported. Implement it with an Identity column.
Reference: https://docs.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/sql-
data-warehouse-tablesoverview

44. You need to identify the event_file target for monitoring DB3 after the migration to Azure
SQL Database. The solution must meet the management requirements.
What should you use as the event_file target?
A. an Azure SQL database
B. an Azure Blob Storage container
C. a SQL Server filegroup
D. an Azure Files share
Answer: B

45. Topic 3, ADatum Corporation

Overview
This is a case study. Case studies are not timed separately. You can use as much exam time as
you would like to complete each case. However, there may be additional case studies and
sections on this exam. You must manage your time to ensure that you are able to complete all
questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is
provided in the case study. Case studies might contain exhibits and other resources that
provide more information about the scenario that is described in the case study. Each question
is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your
answers and to make changes before you move to the next section of the exam. After you begin
a new section, you cannot return to this section.

To start the case study


To display the first question in this case study, click the Next button. Use the buttons in the left
pane to explore the content of the case study before you answer the questions. Clicking these
buttons displays information such as business requirements, existing environment, and problem
statements. If the case study has an All Information tab, note that the information displayed is
identical to the information displayed on the subsequent tabs. When you are ready to answer a
question, click the Question button to return to the question.

Overview
ADatum Corporation is a retailer that sells products through two sales channels: retail stores
and a website.
Existing Environment
ADatum has one database server that has Microsoft SQL Server 2016 installed. The server
hosts three mission-critical databases named SALESDB, DOCDB, and REPORTINGDB.
SALESDB collects data from the stores and the website.
DOCDB stores documents that connect to the sales data in SALESDB. The documents are
stored in two different JSON formats based on the sales channel.
REPORTINGDB stores reporting data and contains several columnstore indexes. A daily
process creates reporting data in REPORTINGDB from the data in SALESDB. The process is
implemented as a SQL Server Integration Services (SSIS) package that runs a stored
procedure from SALESDB.

Requirements
Planned Changes
ADatum plans to move the current data infrastructure to Azure.
The new infrastructure has the following requirements:
• Migrate SALESDB and REPORTINGDB to an Azure SQL database.
• Migrate DOCDB to Azure Cosmos DB.
• The sales data, including the documents in JSON format, must be gathered as it arrives and
analyzed online by using Azure Stream Analytics. The analytics process will perform
aggregations that must be done continuously, without gaps, and without overlapping.
• As they arrive, all the sales documents in JSON format must be transformed into one
consistent format.
• Azure Data Factory will replace the SSIS process of copying the data from SALESDB to
REPORTINGDB.

Technical Requirements
The new Azure data infrastructure must meet the following technical requirements:
• Data in SALESDB must be encrypted by using Transparent Data Encryption (TDE). The
encryption must use your own key.
• SALESDB must be restorable to any given minute within the past three weeks.
• Real-time processing must be monitored to ensure that workloads are sized properly based
on actual usage patterns.
• Missing indexes must be created automatically for REPORTINGDB.
• Disk IO, CPU, and memory usage must be monitored for SALESDB.

46. Based on the PaaS prototype, which Azure SQL Database compute tier should you use?
A. Business Critical 4-vCore
B. Hyperscale
C. General Purpose 4-vCore
D. Serverless
Answer: A
Explanation:
There are CPU and Data I/O spikes for the PaaS prototype. Business Critical 4-vCore is
needed.
Reference: https://docs.microsoft.com/en-us/azure/azure-sql/database/reserved-capacity-
overview

47. DRAG DROP


You create a new Azure SQL managed instance named SQL1 and enable Database Mail
extended stored procedures.
You need to ensure that SQL Server Agent jobs running on SQL1 can notify administrators
when a failure occurs.
Which three actions should you perform in sequence? To answer, move the appropriate actions
from the list of actions to the answer area and arrange them in the correct order.

Answer:
48. DRAG DROP
You have two on-premises Microsoft SQL Server instances named SQL1 and SQL2.
You have an Azure subscription
You need to sync a subset of tables between the databases hosted on SQL1 and SQL2 by
using SQL Data Sync.
Which five actions should you perform in sequence? To answer, move the appropriate actions
from the list of actions to the answer area and arrange them in the correct order.

Answer:

49. You need to implement authentication for ResearchDB1. The solution must meet the
security and compliance requirements.
What should you run as part of the implementation?
A. CREATE LOGIN and the FROM WINDOWS clause
B. CREATE USER and the FROM CERTIFICATE clause
C. CREATE USER and the FROM LOGIN clause
D. CREATE USER and the ASYMMETRIC KEY clause
E. CREATE USER and the FROM EXTERNAL PROVIDER clause
Answer: E
Explanation:
Scenario: Authenticate database users by using Active Directory credentials.
(Create a new Azure SQL database named ResearchDB1 on a logical server named
ResearchSrv01.)
Authenticate the user in SQL Database or SQL Data Warehouse based on an Azure Active
Directory user:
CREATE USER [[email protected]] FROM EXTERNAL PROVIDER;
Reference: https://docs.microsoft.com/en-us/sql/t-sql/statements/create-user-transact-sql
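A minimal sketch of the full flow (the principal name and the role grant below are illustrative assumptions, not requirements from the scenario):

```sql
-- Run in ResearchDB1 while connected as an Azure AD admin of the logical server.
-- [ResearchUser] is a placeholder for the Azure AD user, group, or application name.
CREATE USER [ResearchUser] FROM EXTERNAL PROVIDER;

-- Example role grant; actual permissions depend on the compliance requirements.
ALTER ROLE db_datareader ADD MEMBER [ResearchUser];
```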

50. DRAG DROP


You have an instance of SQL Server on Azure Virtual Machines named SQL1. SQL1 contains a
database named DB1.
You need to enable Transparent Data Encryption (TDE) for DB1.
Which three objects should you create in sequence? To answer, move the appropriate objects
from the list of objects to the answer area and arrange them in the correct order.

Answer:
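For server-based TDE on a SQL Server VM, the usual object sequence is a database master key, a server certificate, and a database encryption key. A hedged T-SQL sketch follows; the certificate name and password are placeholders:

```sql
USE master;
-- 1. Database master key in the master database.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

-- 2. Certificate protected by the master key.
--    Back up the certificate and its private key before relying on TDE.
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate for DB1';

USE DB1;
-- 3. Database encryption key protected by the certificate.
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;

-- Finally, turn encryption on for the database.
ALTER DATABASE DB1 SET ENCRYPTION ON;
```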
51. HOTSPOT
You have an Azure Data Lake Storage Gen2 account named account1 that stores logs as
shown in the following table.

You do not expect that the logs will be accessed during the retention periods.
You need to recommend a solution for account1 that meets the following requirements:
• Automatically deletes the logs at the end of each retention period
• Minimizes storage costs
What should you include in the recommendation? To answer, select the appropriate options in
the answer area. NOTE: Each correct selection is worth one point.

Answer:
Explanation:
Box 1: Store the infrastructure logs in the Cool access tier and the application logs in the
Archive access tier
Hot - Optimized for storing data that is accessed frequently.
Cool - Optimized for storing data that is infrequently accessed and stored for at least 30 days.
Archive - Optimized for storing data that is rarely accessed and stored for at least 180 days with
flexible latency requirements, on the order of hours.
Box 2: Azure Blob storage lifecycle management rules
Blob storage lifecycle management offers a rich, rule-based policy that you can use to transition
your data to the best access tier and to expire data at the end of its lifecycle.
Reference: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-storage-tiers
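A hedged sketch of such a lifecycle management policy (the blob prefix and the retention period are placeholders, since the retention table from the exhibit is not reproduced here):

```json
{
  "rules": [
    {
      "name": "delete-expired-logs",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "logs/infrastructure/" ]
        },
        "actions": {
          "baseBlob": {
            "delete": { "daysAfterModificationGreaterThan": 60 }
          }
        }
      }
    }
  ]
}
```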

52. You have an Azure SQL database.


You need to implement a disaster recovery solution that meets the following requirements:
• Minimizes how long it takes to recover the database if a datacenter fails
• Minimizes administrative effort
What should you include in the solution?
A. Azure Backup
B. active geo-replication
C. Azure Site Recovery
D. auto-failover groups
Answer: D

53. DRAG DROP


You have an Azure subscription that contains an Azure SQL managed instance, a database
named db1, and an Azure web app named App1. App1 uses db1.
You need to enable Resource Governor for App1. The solution must meet the following
requirements:
• App1 must be able to consume all available CPU resources.
• App1 must have at least half of the available CPU resources always available.
Which three actions should you perform in sequence? To answer, move the appropriate actions
from the list of actions to the answer area and arrange them in the correct order. NOTE: More
than one order of answer choices is correct. You will receive credit for any of the correct orders
you select.

Answer:
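A hedged sketch of the configuration those actions produce (the pool, workload group, login, and function names are assumptions): a resource pool with MIN_CPU_PERCENT = 50 guarantees half the CPU, while MAX_CPU_PERCENT = 100 lets App1 consume all of it when available.

```sql
USE master;
-- Pool: guarantee 50% of CPU, allow up to 100%.
CREATE RESOURCE POOL App1Pool
    WITH (MIN_CPU_PERCENT = 50, MAX_CPU_PERCENT = 100);

CREATE WORKLOAD GROUP App1Group USING App1Pool;
GO

-- Classifier routes App1's login to the dedicated workload group.
CREATE FUNCTION dbo.fn_App1Classifier() RETURNS sysname
WITH SCHEMABINDING
AS
BEGIN
    DECLARE @grp sysname = N'default';
    IF SUSER_SNAME() = N'App1Login'  -- login name is an assumption
        SET @grp = N'App1Group';
    RETURN @grp;
END;
GO

ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fn_App1Classifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;
```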

54. You plan to deploy two instances of SQL Server on Azure virtual machines in a highly
available configuration that will use an Always On availability group.
You need to recommend a deployment solution that meets the following requirements:
• Provides a Service Level Agreement (SLA) of at least 99.95%
• Replicates databases in the same group synchronously
• Minimizes the latency of database writes
What should you recommend?
A. Create a proximity group and an availability set. Deploy each virtual machine to the
availability set. Add both virtual machines to the proximity group.
B. Create two proximity groups and a single availability set. Deploy both virtual machines to the
availability set. Add one virtual machine to each proximity group.
C. Create two proximity groups and two availability sets. Deploy each virtual machine to a
unique availability set. Add one virtual machine to each proximity group.
D. Create a proximity group and two availability sets. Deploy each virtual machine to a unique
availability set. Add both virtual machines to the proximity group.
Answer: A
Explanation:
To get VMs as close as possible, achieving the lowest possible latency, you should deploy them
within a proximity placement group. https://learn.microsoft.com/en-us/azure/virtual-machines/co-
location
