
Exam Certified Data Architect topic 1 question 37 discussion

Actual exam question from Salesforce's Certified Data Architect
Question #: 37
Topic #: 1

Universal Containers (UC) has a very large and complex Salesforce org with hundreds of validation rules and triggers. The triggers are responsible for system updates and data manipulation as records are created or updated by users. A majority of the automation tools within UC's org were not designed to run during a data load. UC is importing 100,000 records into Salesforce across several objects over the weekend.
What should a data architect do to mitigate any unwanted results during the import?

  • A. Ensure validation rules, triggers, and other automation tools are disabled
  • B. Ensure duplication and matching rules are defined
  • C. Bulkify the triggers to handle import loads
  • D. Import the data in smaller batches over a 24-hour period
Suggested Answer: A
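
As several commenters note below, answer A is normally implemented with a bypass switch rather than by deleting or commenting out code. A minimal sketch, assuming a hierarchy custom setting is used as the switch (the setting Data_Load_Settings__c and its checkbox field Bypass_Triggers__c are hypothetical names, not part of the question):

// The trigger consults a hierarchy custom setting so automation can be switched
// off for the specific user running the import, without a code deployment.
trigger AccountTrigger on Account (before insert, before update) {
    // getInstance() resolves the setting for the running user, falling back to
    // profile and org defaults, so the bypass can be enabled only for the
    // integration user performing the weekend load.
    Data_Load_Settings__c settings = Data_Load_Settings__c.getInstance();
    if (settings != null && settings.Bypass_Triggers__c) {
        return; // skip all trigger automation during the bulk import
    }

    // ... normal system updates and data manipulation on Trigger.new
}

Validation rules can check the same flag by AND-ing their error condition with NOT($Setup.Data_Load_Settings__c.Bypass_Triggers__c), so both triggers and rules can be turned off for the load user and re-enabled after the import.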

Comments

lizbette
6 months, 4 weeks ago
Selected Answer: A
A is correct and is general best practice.
upvoted 1 times
...
ETH777
10 months, 3 weeks ago
Selected Answer: A
Not D - it helps manage governor limits, but doesn't prevent issues caused by automation tools running during each batch, and it also extends the overall import time. A - Automation tools are not designed for large-scale data loads; disabling them avoids trigger recursion and speeds up the import process.
upvoted 3 times
...
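
A short illustration of the trigger recursion ETH777 mentions: when a trigger's own updates (or a workflow field update) cause the same records to be saved again, the trigger fires a second time in the same transaction. A common guard is a static variable; the names below are hypothetical, and the class and trigger would live in separate files.

public class TriggerGuard {
    // IDs already handled in this transaction. A plain static Boolean would also
    // skip later 200-record chunks of the same transaction, so a Set is safer.
    public static Set<Id> processedAccountIds = new Set<Id>();
}

trigger AccountTrigger on Account (after update) {
    List<Account> toProcess = new List<Account>();
    for (Account acc : Trigger.new) {
        if (!TriggerGuard.processedAccountIds.contains(acc.Id)) {
            TriggerGuard.processedAccountIds.add(acc.Id);
            toProcess.add(acc);
        }
    }
    // ... run the system updates only for the records in toProcess
}

Static state lasts only for the current transaction, so it resets between import batches; it only prevents the same record from being reprocessed within one transaction.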
tobicky
12 months ago
Selected Answer: A
The most accurate answer is A: ensure validation rules, triggers, and other automation tools are disabled. During a large data import, these features can cause delays and conflicts; disabling them helps prevent errors and improves the performance of the import. D (import the data in smaller batches over a 24-hour period) could help avoid overloading the system, but it might not be enough to prevent all unwanted results, especially if the issues are caused by validation rules, triggers, or other automation tools.
upvoted 2 times
...
Amine98ma
1 year, 1 month ago
Selected Answer: A
Because we want to load all of the data and prevent any problems, validation rules should be disabled. Salesforce also recommends bypassing validation rules and triggers during an import and preprocessing the data instead.
upvoted 2 times
...
ksho
1 year, 2 months ago
Selected Answer: D
A is only plausible if there is a trigger framework in place that allows code to be disabled via custom settings. Otherwise, the unit tests would fail while trying to disable them, and you'd end up spending more time commenting code out, redeploying, and dealing with that fallout than on your import. But even disabling validation rules and flows can cause well-written unit tests to fail if there are dependencies. For this reason, I think the answer is D. It's slower, but you'll get the data in without tampering with code. The data can be preprocessed so that validation rules aren't triggered. Realistically, the answer is a combination of A and D: disable what you can and lower the batch size to accommodate what you cannot.
upvoted 2 times
...
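
On ksho's unit-test concern: if the bypass is driven by a hierarchy custom setting, tests can be kept deterministic by inserting their own setting record instead of depending on whatever value the org currently has. A minimal sketch, reusing the hypothetical Data_Load_Settings__c / Bypass_Triggers__c names from the earlier sketch:

@IsTest
private class AccountTriggerTest {
    @TestSetup
    static void createBypassSetting() {
        // Org-default record with the bypass explicitly off, so the trigger logic
        // always runs in tests regardless of the org's current setting.
        insert new Data_Load_Settings__c(
            SetupOwnerId = UserInfo.getOrganizationId(),
            Bypass_Triggers__c = false
        );
    }

    @IsTest
    static void triggerRunsWhenBypassIsOff() {
        Test.startTest();
        insert new Account(Name = 'UC Test Account');
        Test.stopTest();
        // Basic sanity assert; a real test would also verify the trigger's updates.
        System.assertEquals(1, [SELECT COUNT() FROM Account WHERE Name = 'UC Test Account']);
    }
}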
Oleg_M
1 year, 2 months ago
Selected Answer: A
The answer is A. Even though you can mitigate any lack of bulkification by importing records in small batches, that won't bypass validation rules, so in the end you'll still have a lot of data not imported because of them. So I'd say A, since it'll allow you to import the full set of data.
upvoted 2 times
...
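
For completeness, the bulkification Oleg_M refers to (and option C) means writing the trigger so it handles a whole chunk of up to 200 records with a fixed number of SOQL queries and DML statements. A minimal sketch with hypothetical object and field names:

trigger OrderTrigger on Order__c (before insert) {
    // Collect parent IDs across the whole chunk instead of querying per record.
    Set<Id> accountIds = new Set<Id>();
    for (Order__c ord : Trigger.new) {
        if (ord.Account__c != null) {
            accountIds.add(ord.Account__c);
        }
    }

    // One SOQL query for the entire chunk.
    Map<Id, Account> parentAccounts = new Map<Id, Account>(
        [SELECT Id, Region__c FROM Account WHERE Id IN :accountIds]
    );

    // Set defaults in memory; a before-insert trigger needs no extra DML.
    for (Order__c ord : Trigger.new) {
        Account parent = parentAccounts.get(ord.Account__c);
        if (parent != null) {
            ord.Region__c = parent.Region__c;
        }
    }
}

Even with bulkified code, the question's premise is that the automation was not designed to run during a data load at all, which is why disabling it (A) remains the suggested answer.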
thneeb
1 year, 4 months ago
Selected Answer: D
I would tend more to D. Sure, in a well-designed Salesforce org it would not have a big impact: if the triggers are disabled during the load, the processing of the triggers can be executed afterwards. But here I would say import the 100K records in smaller batches (D).
upvoted 1 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other