
Salesforce Exam Data Architect Topic 5 Question 35 Discussion

Actual exam question for Salesforce's Data Architect exam
Question #: 35
Topic #: 5

UC has migrated its back-office data into an on-premise database with REST API access. UC recently implemented Sales Cloud for its sales organization, but users are complaining about a lack of order data inside Salesforce.

UC is concerned about Salesforce storage limits but would still like Sales Cloud to have access to the data.

Which design pattern should a data architect select to satisfy the requirement?

Suggested Answer: B

Virtualizing the order data with Salesforce Connect is the best option here. Salesforce Connect maps the on-premise order data to external objects, so Sales Cloud users can view and work with orders in real time without the records being copied into Salesforce. Because the data stays in the on-premise system and is retrieved on demand through its API, it does not consume Salesforce data storage, which addresses UC's storage concern while still giving the sales organization access to order data.
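As a rough illustration only: assuming an external object named Order__x (with hypothetical custom fields) has been configured against the on-premise data source via Salesforce Connect, it can be queried in Apex like any other sObject, while the rows themselves remain in the external system.

public with sharing class ExternalOrderService {
    // Returns recent orders for an account from the virtualized Order__x
    // external object. Object and field names are hypothetical; the real
    // names depend on how the external data source is configured.
    public static List<Order__x> getRecentOrders(Id accountId) {
        return [
            SELECT ExternalId, OrderNumber__c, Status__c, TotalAmount__c
            FROM Order__x
            WHERE AccountId__c = :accountId
            LIMIT 50
        ];
    }
}

Because each such query is resolved by a callout to the external system at run time, none of the order rows count against Salesforce data storage.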


Contribute your Thoughts:

Yolande
8 days ago
I would go with Option C. A bidirectional integration between the systems ensures that the data is always in sync, and it gives the sales team the access they need without compromising the on-premise system.
upvoted 0 times
...
Marica
9 days ago
Option B looks like the way to go. Virtualizing the data in Salesforce seems like the most efficient solution to avoid storage limits. Plus, it keeps the data secure in the on-premise system.
upvoted 0 times
...
Eve
14 days ago
That's a good point, but I still think option B is more efficient in this scenario.
upvoted 0 times
...
Jaime
15 days ago
I disagree, I believe option C is the way to go as it ensures a seamless integration between the systems.
upvoted 0 times
...
Eve
17 days ago
I think option B is the best choice because it allows us to access the data without worrying about storage limits.
upvoted 0 times
...
Shizue
18 days ago
I'm not sure about option B. I think option C might be better, as it allows for bidirectional integration between the systems.
upvoted 0 times
...
Ethan
19 days ago
I agree with Deeanna. Using SF Connect to virtualize the data seems like the most efficient solution.
upvoted 0 times
...
Deeanna
20 days ago
I think option B sounds like a good idea. It would allow Sales Cloud to access the data without worrying about storage limits.
upvoted 0 times
...
