
Salesforce Exam Data Architect Topic 5 Question 35 Discussion

Actual exam question for Salesforce's Data Architect exam
Question #: 35
Topic #: 5

UC has migrated its back-office data into an on-premise database with REST API access. UC recently implemented Sales Cloud for its sales organization, but users are complaining about a lack of order data inside Salesforce.

UC is concerned about Salesforce storage limits but would still like Sales Cloud to have access to the data.

Which design pattern should a data architect select to satisfy the requirement?

Suggested Answer: B

Salesforce Connect is the best fit for this requirement. It exposes the back-office order data as external objects in Sales Cloud, so sales users can view, search, and report on the orders in real time while the records themselves remain in the on-premise database. Because external objects are retrieved on demand from the external source (via OData or a custom Apex Connector Framework adapter over the existing REST API) rather than copied into Salesforce, they do not count against data storage limits. Migrating the data into Salesforce (option A) would consume exactly the storage UC is trying to avoid, and a full bidirectional integration (option C) adds synchronization complexity and data duplication that the requirement does not call for.
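As a rough sketch only (the Order__x external object and its OrderNumber__c and Amount__c fields below are hypothetical names, assuming an external data source has already been defined and synced through Salesforce Connect), the virtualized order data can then be queried with ordinary SOQL, with each query fetching rows from the on-premise system on demand rather than from Salesforce storage:

    // Hypothetical external object; the __x suffix marks it as virtualized via Salesforce Connect.
    // ExternalId and DisplayUrl are standard fields on external objects; no rows are stored in Salesforce.
    List<Order__x> recentOrders = [
        SELECT ExternalId, DisplayUrl, OrderNumber__c, Amount__c
        FROM Order__x
        LIMIT 10
    ];
    for (Order__x ord : recentOrders) {
        System.debug('Order ' + ord.OrderNumber__c + ' -> ' + ord.DisplayUrl);
    }

Since the external object behaves like a regular sObject in list views, related lists, and SOQL, the sales team gets the order data inside Sales Cloud without any of it counting toward storage limits.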


Contribute your Thoughts:

Sylvie
1 month ago
Option F: Develop a secret handshake protocol to transfer the data between the systems. It'll be like a spy thriller, but with more spreadsheets.
upvoted 0 times
...
Maynard
1 month ago
Option E: Teach the sales team to use carrier pigeons to fetch the data from the on-premise system. No storage limits, and it's a great team-building exercise!
upvoted 0 times
Dulce
6 days ago
C) Develop a bidirectional integration between the on-premise system and Salesforce.
upvoted 0 times
...
Vanna
11 days ago
B) Use SF Connect to virtualize the data in SF and avoid storage limits.
upvoted 0 times
...
...
Kenneth
2 months ago
Option A might work, but I'm not sure I'd want to migrate all that data into Salesforce just to take advantage of the native functionality. Seems like a lot of overhead given the storage limits.
upvoted 0 times
...
Eric
2 months ago
Hmm, Option D seems interesting. Iframing the on-premise system in Salesforce could be a creative way to give the sales team the data they need without too much integration complexity.
upvoted 0 times
Alease
7 days ago
It's a creative approach to address the data access issue.
upvoted 0 times
...
Eileen
11 days ago
I agree, it could simplify things for the sales team.
upvoted 0 times
...
Daniela
1 month ago
Option D seems like a good solution.
upvoted 0 times
...
...
Yolande
2 months ago
I would go with Option C. A bidirectional integration between the systems ensures that the data is always in sync, and it gives the sales team the access they need without compromising the on-premise system.
upvoted 0 times
Lisha
3 days ago
True, a bidirectional integration would provide the best of both worlds for UC.
upvoted 0 times
...
Blair
17 days ago
I see your point, but with Option C, we can avoid storage limits and keep both systems updated.
upvoted 0 times
...
Doretha
2 months ago
But wouldn't Option A be easier to implement? Just migrate the data into SF.
upvoted 0 times
...
Sheridan
2 months ago
I think Option C is the best choice. It keeps the data in sync between systems.
upvoted 0 times
...
...
Marica
2 months ago
Option B looks like the way to go. Virtualizing the data in Salesforce seems like the most efficient solution to avoid storage limits. Plus, it keeps the data secure in the on-premise system.
upvoted 0 times
...
Eve
2 months ago
That's a good point, but I still think option B is more efficient in this scenario.
upvoted 0 times
...
Jaime
2 months ago
I disagree, I believe option C is the way to go as it ensures a seamless integration between the systems.
upvoted 0 times
...
Eve
2 months ago
I think option B is the best choice because it allows us to access the data without worrying about storage limits.
upvoted 0 times
...
Shizue
2 months ago
I'm not sure about option B). I think option C) might be better as it allows for bidirectional integration between systems.
upvoted 0 times
...
Ethan
2 months ago
I agree with Deeanna. Using SF Connect to virtualize the data seems like the most efficient solution.
upvoted 0 times
...
Deeanna
2 months ago
I think option B) sounds like a good idea. It would allow Sales cloud to access the data without worrying about storage limits.
upvoted 0 times
...
