
ExamsBrite Dumps

Implementing Data Engineering Solutions Using Microsoft Fabric Questions and Answers

Implementing Data Engineering Solutions Using Microsoft Fabric

Last Update Nov 30, 2025
Total Questions : 109


Questions 1

You have a Fabric workspace that contains an eventhouse and a KQL database named Database1. Database1 has the following:

A table named Table1

A table named Table2

An update policy named Policy1

Policy1 sends data from Table1 to Table2.

The following is a sample of the data in Table2.

Recently, the following actions were performed on Table1:

An additional element named temperature was added to the StreamData column.

The data type of the Timestamp column was changed to date.

The data type of the DeviceId column was changed to string.

You plan to load additional records to Table2.

Which two records will load from Table1 to Table2? Each correct answer presents a complete solution.

NOTE: Each correct selection is worth one point.

A)

B)

C)

D)

Options:

A.  

Option A

B.  

Option B

C.  

Option C

D.  

Option D

Questions 2

HOTSPOT

You have a Fabric workspace that contains an eventstream named EventStream1.

You discover that an EventStream1 transformation fails.

You need to find the following error information:

The error details, including the occurrence time

The total number of errors

What should you use? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Questions 3

You have a Fabric warehouse named DW1 that loads data by using a data pipeline named Pipeline1. Pipeline1 uses a Copy data activity with a dynamic SQL source. Pipeline1 is scheduled to run every 15 minutes.

You discover that Pipeline1 keeps failing.

You need to identify which SQL query was executed when the pipeline failed.

What should you do?

Options:

A.  

From Monitoring hub, select the latest failed run of Pipeline1, and then view the output JSON.

B.  

From Monitoring hub, select the latest failed run of Pipeline1, and then view the input JSON.

C.  

From Real-time hub, select Fabric events, and then review the details of Microsoft.Fabric.ItemReadFailed.

D.  

From Real-time hub, select Fabric events, and then review the details of Microsoft.Fabric.ItemUpdateFailed.

Questions 4

You have a Fabric workspace named Workspace1 that contains a lakehouse named Lakehouse1. Lakehouse1 contains the following tables:

Orders

Customer

Employee

The Employee table contains Personally Identifiable Information (PII).

A data engineer is building a workflow that requires writing data to the Customer table; however, the data engineer does NOT have the elevated permissions required to view the contents of the Employee table.

You need to ensure that the data engineer can write data to the Customer table without reading data from the Employee table.

Which three actions should you perform? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Options:

A.  

Share Lakehouse1 with the data engineer.

B.  

Assign the data engineer the Contributor role for Workspace2.

C.  

Assign the data engineer the Viewer role for Workspace2.

D.  

Assign the data engineer the Contributor role for Workspace1.

E.  

Migrate the Employee table from Lakehouse1 to Lakehouse2.

F.  

Create a new workspace named Workspace2 that contains a new lakehouse named Lakehouse2.

G.  

Assign the data engineer the Viewer role for Workspace1.

Questions 5

HOTSPOT

You are processing streaming data from an external data provider.

You have the following code segment.

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

NOTE: Each correct selection is worth one point.

Options:

Questions 6

You need to develop an orchestration solution in Fabric that will load each item one after the other. The solution must be scheduled to run every 15 minutes. Which type of item should you use?

Options:

A.  

warehouse

B.  

data pipeline

C.  

Dataflow Gen2 dataflow

D.  

notebook

Questions 7

HOTSPOT

You need to troubleshoot the ad-hoc query issue.

How should you complete the statement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Questions 8

You need to ensure that WorkspaceA can be configured for source control. Which two actions should you perform? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Options:

A.  

Assign WorkspaceA to Cap1.

B.  

From Tenant settings, set Users can synchronize workspace items with their Git repositories to Enabled.

C.  

Configure WorkspaceA to use a Premium Per User (PPU) license

D.  

From Tenant settings, set Users can sync workspace items with GitHub repositories to Enabled.

Questions 9

You need to recommend a method to populate the POS1 data to the lakehouse medallion layers.

What should you recommend for each layer? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Questions 10

You need to ensure that usage of the data in the Amazon S3 bucket meets the technical requirements.

What should you do?

Options:

A.  

Create a workspace identity and enable high concurrency for the notebooks.

B.  

Create a shortcut and ensure that caching is disabled for the workspace.

C.  

Create a workspace identity and use the identity in a data pipeline.

D.  

Create a shortcut and ensure that caching is enabled for the workspace.

Questions 11

You need to create the product dimension.

How should you complete the Apache Spark SQL code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Questions 12

You need to recommend a solution for handling old files. The solution must meet the technical requirements. What should you include in the recommendation?

Options:

A.  

a data pipeline that includes a Copy data activity

B.  

a notebook that runs the VACUUM command

C.  

a notebook that runs the OPTIMIZE command

D.  

a data pipeline that includes a Delete data activity

Questions 13

You need to schedule the population of the medallion layers to meet the technical requirements.

What should you do?

Options:

A.  

Schedule a data pipeline that calls other data pipelines.

B.  

Schedule a notebook.

C.  

Schedule an Apache Spark job.

D.  

Schedule multiple data pipelines.

Questions 14

You need to populate the MAR1 data in the bronze layer.

Which two types of activities should you include in the pipeline? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Options:

A.  

ForEach

B.  

Copy data

C.  

WebHook

D.  

Stored procedure

Questions 15

You need to ensure that the data engineers are notified if any step in populating the lakehouses fails. The solution must meet the technical requirements and minimize development effort.

What should you use? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Questions 16

You need to ensure that the data analysts can access the gold layer lakehouse.

What should you do?

Options:

A.  

Add the DataAnalyst group to the Viewer role for WorkspaceA.

B.  

Share the lakehouse with the DataAnalysts group and grant the Build reports on the default semantic model permission.

C.  

Share the lakehouse with the DataAnalysts group and grant the Read all SQL Endpoint data permission.

D.  

Share the lakehouse with the DataAnalysts group and grant the Read all Apache Spark permission.

Questions 17

You need to recommend a solution to resolve the MAR1 connectivity issues. The solution must minimize development effort. What should you recommend?

Options:

A.  

Add a ForEach activity to the data pipeline.

B.  

Configure retries for the Copy data activity.

C.  

Configure Fault tolerance for the Copy data activity.

D.  

Call a notebook from the data pipeline.

Questions 18

You have two Fabric workspaces named Workspace1 and Workspace2.

You have a Fabric deployment pipeline named deployPipeline1 that deploys items from Workspace1 to Workspace2. DeployPipeline1 contains all the items in Workspace1.

You recently modified the items in Workspace1.

The workspaces currently contain the items shown in the following table.

Items in Workspace1 that have the same name as items in Workspace2 are currently paired.

You need to ensure that the items in Workspace1 overwrite the corresponding items in Workspace2. The solution must minimize effort.

What should you do?

Options:

A.  

Delete all the items in Workspace2, and then run deployPipeline1.

B.  

Rename each item in Workspace2 to have the same name as the items in Workspace1.

C.  

Back up the items in Workspace2, and then run deployPipeline1.

D.  

Run deployPipeline1 without modifying the items in Workspace2.

Questions 19

HOTSPOT

You have a Fabric workspace.

You are debugging a statement and discover the following issues:

Sometimes, the statement fails to return all the expected rows.

The PurchaseDate output column is NOT in the expected format of mmm dd, yy.

You need to resolve the issues. The solution must ensure that the data types of the results are retained. The results can contain blank cells.

How should you complete the statement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Questions 20

You have a Fabric workspace that contains an eventstream named EventStream1. EventStream1 outputs events to a table in a lakehouse.

You need to remove files that are older than seven days and are no longer in use.

Which command should you run?

Options:

A.  

VACUUM

B.  

COMPUTE

C.  

OPTIMIZE

D.  

CLONE
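
For context on the retention arithmetic behind this question: Delta Lake's VACUUM removes data files that are no longer referenced by the table and are older than the retention window, which is expressed in hours (the default is 7 days, i.e. 168 hours). A minimal sketch, assuming a hypothetical table name, builds the statement that a Fabric notebook would pass to spark.sql:

```python
# Sketch only: "sales_events" is an illustrative table name, not from the question.
retention_days = 7
retention_hours = retention_days * 24  # VACUUM expresses retention in hours

vacuum_sql = f"VACUUM sales_events RETAIN {retention_hours} HOURS"
# In a Fabric notebook this would be run as: spark.sql(vacuum_sql)
print(vacuum_sql)  # VACUUM sales_events RETAIN 168 HOURS
```

Note that OPTIMIZE, by contrast, compacts small files into larger ones; it does not delete unreferenced files.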

Questions 21

You are building a Fabric notebook named MasterNotebook1 in a workspace. MasterNotebook1 contains the following code.

You need to ensure that the notebooks are executed in the following sequence:

1. Notebook_03

2. Notebook_01

3. Notebook_02

Which two actions should you perform? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Options:

A.  

Split the Directed Acyclic Graph (DAG) definition into three separate definitions.

B.  

Change the concurrency to 3.

C.  

Move the declaration of Notebook_03 to the top of the Directed Acyclic Graph (DAG) definition.

D.  

Move the declaration of Notebook_02 to the bottom of the Directed Acyclic Graph (DAG) definition.

E.  

Add dependencies to the execution of Notebook_02.

F.  

Add dependencies to the execution of Notebook_03.
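
To make the ordering concrete: a notebook DAG run with mssparkutils.notebook.runMultiple executes activities in parallel up to the concurrency limit unless dependencies force a sequence. A minimal sketch of a DAG definition that yields the order Notebook_03, Notebook_01, Notebook_02 (the exact schema keys shown are a best-effort assumption about the runMultiple format):

```python
# Sketch of a DAG definition for mssparkutils.notebook.runMultiple.
# Each activity lists the notebooks it must wait for in "dependencies",
# so the chain Notebook_03 -> Notebook_01 -> Notebook_02 runs sequentially.
dag = {
    "activities": [
        {"name": "Notebook_03", "path": "Notebook_03"},
        {"name": "Notebook_01", "path": "Notebook_01",
         "dependencies": ["Notebook_03"]},
        {"name": "Notebook_02", "path": "Notebook_02",
         "dependencies": ["Notebook_01"]},
    ],
}
# Inside MasterNotebook1 this would be executed as:
#   mssparkutils.notebook.runMultiple(dag)
```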

Questions 22

You have a Fabric workspace that contains a warehouse named Warehouse1. Warehouse1 contains a table named Customer. Customer contains the following data.

You have an internal Microsoft Entra user named User1 that has an email address of user1@contoso.com.

You need to provide User1 with access to the Customer table. The solution must prevent User1 from accessing the CreditCard column.

How should you complete the statement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:
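
For background on the technique this question tests: T-SQL supports column-level permissions, where GRANT SELECT names an explicit column list so that unlisted columns (here, CreditCard) remain inaccessible. A minimal sketch that assembles such a statement; the column names other than CreditCard are illustrative assumptions, since the question's data sample is not shown:

```python
# Column-level security sketch: grant SELECT only on non-sensitive columns.
# "CustomerID", "Name", "Email" are hypothetical column names.
allowed_columns = ["CustomerID", "Name", "Email"]  # CreditCard deliberately excluded

grant_sql = (
    "GRANT SELECT ON dbo.Customer ("
    + ", ".join(allowed_columns)
    + ") TO [user1@contoso.com];"
)
print(grant_sql)
```

An alternative pattern is DENY SELECT on the single CreditCard column combined with a table-level GRANT.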
