
Salesforce Exam Data Architect Topic 1 Question 22 Discussion

Actual exam question for Salesforce's Data Architect exam
Question #: 22
Topic #: 1

Universal Containers (UC) is in the process of selling half of its company. As part of this split, UC's main Salesforce org will be divided into two orgs: org A and org B. UC has delivered these requirements to its data architect:

1. The data model for Org B will drastically change with different objects, fields, and picklist values.

2. Three million records will need to be migrated from org A to org B for compliance reasons.

3. The migration will need to occur within the next two months, prior to the split.

Which migration strategy should a data architect use to successfully migrate the data?
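Requirement 1 means records cannot simply be copied: an ETL-style transform step must rename fields and remap picklist values between the two orgs. A minimal sketch of that transform, assuming hypothetical field API names and picklist values (none of these identifiers come from the question itself):

```python
# Hypothetical transform step for an org A -> org B migration.
# Field names and picklist values below are illustrative assumptions.

# Org A picklist values mapped to the new Org B values (assumed).
STATUS_MAP = {
    "In Progress": "Active",
    "Complete": "Closed",
}

# Org A field API names mapped to the new Org B field API names (assumed).
FIELD_MAP = {
    "Legacy_Status__c": "Status__c",
    "Old_Region__c": "Region__c",
}

def transform_record(record: dict) -> dict:
    """Rename fields and remap picklist values for one Org A record."""
    out = {}
    for old_field, value in record.items():
        new_field = FIELD_MAP.get(old_field, old_field)
        if new_field == "Status__c":
            # Fall back to the original value if no mapping exists.
            value = STATUS_MAP.get(value, value)
        out[new_field] = value
    return out

print(transform_record({"Legacy_Status__c": "Complete", "Old_Region__c": "EMEA"}))
# {'Status__c': 'Closed', 'Region__c': 'EMEA'}
```

In practice an ETL tool would apply a mapping like this declaratively rather than in hand-written code, which is the main argument for option A given the drastic data-model changes.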

Suggested Answer: A

Contribute your Thoughts:

Rosio
8 days ago
Hmm, I'm not sure about that. Option D with the Salesforce CLI might be a bit more complex, but it could provide better performance and reliability for a migration of this scale.
upvoted 0 times
Dewitt
15 days ago
Option C seems like the best choice to me. Using the Bulk API with a custom script will allow for more control and flexibility during the migration process.
upvoted 0 times
Ahmed
5 days ago
I think option C is the best choice for this migration.
upvoted 0 times
Matt
25 days ago
I'm not sure, but I think option B could be a good choice for smaller data sets.
upvoted 0 times
Lenny
25 days ago
I think option B would be the easiest and most straightforward approach. The Data Loader and Data Import Wizard are user-friendly tools that can handle large data volumes efficiently.
upvoted 0 times
Noel
11 days ago
I agree, option B seems like the best choice for this situation.
upvoted 0 times
Meghan
29 days ago
I disagree, I believe option C is more efficient as the Bulk API can handle large volumes of data.
upvoted 0 times
Felicitas
1 month ago
I think option A is the best choice because an ETL tool can handle complex data transformations.
upvoted 0 times
