
Microsoft Exam DP-203 Topic 1 Question 103 Discussion

Actual exam question for Microsoft's DP-203 exam
Question #: 103
Topic #: 1

You have an Azure Databricks workspace that contains a Delta Lake dimension table named Table1. Table1 is a Type 2 slowly changing dimension (SCD) table. You need to apply updates from a source table to Table1. Which Apache Spark SQL operation should you use?

Suggested Answer: A, B
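
For reference, a common way to handle a Type 2 SCD update in Delta Lake is a single MERGE statement that expires the current row for each changed key and inserts a new row carrying the latest values. The sketch below is only illustrative: the column names (customer_id, name, city, is_current, start_date, end_date) and the staging table source_updates are hypothetical and would need to match the real schema of Table1 and its source.

-- Stage each changed key twice: once under its real key (to expire the
-- current row) and once under a NULL merge_key (to insert the new version).
MERGE INTO Table1 AS target
USING (
  SELECT src.customer_id AS merge_key, src.*
  FROM source_updates src
  UNION ALL
  SELECT NULL AS merge_key, src.*
  FROM source_updates src
  JOIN Table1 t
    ON src.customer_id = t.customer_id
  WHERE t.is_current = true
    AND (src.name <> t.name OR src.city <> t.city)
) AS staged
ON target.customer_id = staged.merge_key AND target.is_current = true
WHEN MATCHED
  AND (staged.name <> target.name OR staged.city <> target.city) THEN
  -- Type 2 behaviour: close out the old version instead of overwriting it
  UPDATE SET is_current = false, end_date = current_date()
WHEN NOT MATCHED THEN
  -- New keys and the NULL-merge_key copies of changed rows land here
  INSERT (customer_id, name, city, is_current, start_date, end_date)
  VALUES (staged.customer_id, staged.name, staged.city, true, current_date(), NULL);

The NULL merge_key copies deliberately fail the ON condition, so one MERGE both closes the old version (WHEN MATCHED) and inserts the new one (WHEN NOT MATCHED). A plain UPDATE would overwrite history, which defeats the purpose of a Type 2 dimension.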

Contribute your Thoughts:

Izetta
1 day ago
Hmm, I was also considering UPDATE, but MERGE is definitely the more robust and comprehensive solution for this problem.
upvoted 0 times
...
Thaddeus
5 days ago
I agree, MERGE is the way to go. It's designed specifically for this kind of data integration use case.
upvoted 0 times
...
Sabrina
6 days ago
The MERGE operation seems like the obvious choice here to update the Type 2 SCD table. It can handle both insert and update scenarios seamlessly.
upvoted 0 times
...
Felix
24 days ago
I'm not sure, but I think MERGE makes sense because it allows you to insert, update, or delete records based on a condition.
upvoted 0 times
...
Lakeesha
26 days ago
I agree with Rikki, MERGE is the correct operation for updating Type 2 SCD tables.
upvoted 0 times
...
Rikki
30 days ago
I think the answer is C) MERGE.
upvoted 0 times
...
