
Salesforce Exam Data-Architect Topic 1 Question 33 Discussion

Actual exam question for the Salesforce Certified Data Architect exam
Question #: 33
Topic #: 1

Northern Trail Outfitters needs to implement an archive solution for Salesforce data. This archive solution needs to help NTO do the following:

1. Remove outdated information not required on a day-to-day basis.

2. Improve Salesforce performance.

Which solution should be used to meet these requirements?

A. Identify a location to store archived data, and use scheduled batch jobs to migrate and purge the aged data on a nightly basis.

B. Identify a location to store archived data, and move the data to that location using a time-based workflow.

C. Use a formula field that flags records past a defined age, and archive the flagged records with a report export.

D. Create a full copy sandbox, and use it as a source for retaining archived data.

Suggested Answer: A

Identifying a location to store archived data and using scheduled batch jobs to migrate and purge the aged data on a nightly basis meets both requirements: outdated records are removed from the org on a schedule, and reclaiming that storage improves Salesforce performance. The referenced article describes a use case that combines Heroku Connect, Postgres, and Salesforce Connect to archive old data, free up space in the org, and still retain the option to unarchive the data if needed, while meeting data retention policies.
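For illustration, here is a minimal Apex sketch of the scheduled-batch approach in option A. The object (Case), the 24-month cutoff, and the batch size are assumptions for the example, not details from the question; in practice each batch would first write the records to the external store (for example, Heroku Postgres via Heroku Connect) before purging them.

global class ArchiveAgedRecordsBatch implements Database.Batchable<SObject>, Schedulable {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Assumed retention rule for this example: closed Cases older than 24 months.
        Datetime cutoff = Datetime.now().addMonths(-24);
        return Database.getQueryLocator(
            'SELECT Id FROM Case WHERE IsClosed = true AND ClosedDate < :cutoff'
        );
    }

    global void execute(Database.BatchableContext bc, List<SObject> scope) {
        // A real job would copy these rows to the archive store here, before deleting.
        delete scope;
        Database.emptyRecycleBin(scope); // reclaim storage immediately
    }

    global void finish(Database.BatchableContext bc) {}

    // Schedulable entry point so the purge can run nightly.
    global void execute(SchedulableContext sc) {
        Database.executeBatch(new ArchiveAgedRecordsBatch(), 200);
    }
}

To run it nightly (for example, at 2 AM): System.schedule('Nightly archive purge', '0 0 2 * * ?', new ArchiveAgedRecordsBatch());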


Contribute your Thoughts:

Lauran
8 days ago
Agreed, I think option A is the way to go here. It's a straightforward, well-established approach to archiving data, and it should help us meet the key requirements.
upvoted 0 times
Buddy
10 days ago
Option D is the one that's got me scratching my head. Using a full copy sandbox as a source for archived data? That doesn't sound like the most efficient or practical solution to me.
upvoted 0 times
Marguerita
11 days ago
Option C is an interesting idea, but I'm not sure a formula field and a report export are the best way to go about this. It seems like a bit of a workaround, and we'd have to maintain that formula field and report on an ongoing basis.
upvoted 0 times
Gertude
12 days ago
I'm not too sure about option B. Moving data to a location using a time-based workflow seems a bit more complex than the batch job approach. Plus, we'd still need to figure out where to store the archived data.
upvoted 0 times
Danica
14 days ago
Okay, let's go through the options. Option A seems like a good fit - we can identify a location to store the archived data and use scheduled batch jobs to migrate and purge the aged data. That way, we're keeping the data but removing it from the main Salesforce instance.
upvoted 0 times
Nelida
16 days ago
Hmm, this question seems pretty straightforward. We need to implement an archive solution for Salesforce data, and the key requirements are to remove outdated information and improve Salesforce performance. Let's see what the options are.
upvoted 0 times
