Exam Certified Data Architect topic 1 question 90 discussion

Actual exam question from Salesforce's Certified Data Architect
Question #: 90
Topic #: 1

Northern Trail Outfitters (NTO) needs to extract 50 million records from a custom object every day from its Salesforce org. NTO is facing query time-out issues while extracting these records.
What should a data architect recommend in order to get around the time-out issue?

  • A. Use a custom auto number and formula field to chunk records while extracting data.
  • B. Use the Rest API to extract data as it automatically chunks records by 200.
  • C. Use extract, transform, load (ETL) tool for extraction of records.
  • D. Ask Salesforce support to increase the query timeout value.
Suggested Answer: C

Comments

lizbette
6 months, 4 weeks ago
Selected Answer: C
C is the better answer. A is still too manual; PK chunking would be better.
upvoted 1 times
...
Ullr
9 months, 1 week ago
A is correct: Chunking the Data into Smaller Sets ... 2. Create a formula field that converts the auto-number field text value into a numeric value—you cannot use an index with comparison operators such as “<=” (less than or equal to) or “>” (greater than) for the text-based auto-number field. In this example, we’ll name this field “ExportID.” https://developer.salesforce.com/blogs/engineering/2013/06/extracting-large-data-volume-ldv-in-force-com
upvoted 2 times
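For illustration only, here is a minimal sketch of the chunked extraction described in that blog post, assuming the numeric formula field is named ExportID__c, using a hypothetical custom object Trail_Record__c and the simple_salesforce Python library:

```python
from simple_salesforce import Salesforce

# Assumptions: ExportID__c is a numeric formula field over an auto-number field,
# and Trail_Record__c stands in for NTO's real custom object API name.
sf = Salesforce(username="user@example.com", password="***", security_token="***")

CHUNK_SIZE = 250_000        # records pulled per range query
TOTAL_RECORDS = 50_000_000  # rough upper bound on ExportID__c values

def extract_chunk(lo, hi):
    """Pull one ExportID__c range; each query stays small enough to avoid time-outs."""
    soql = (
        "SELECT Id, Name FROM Trail_Record__c "
        f"WHERE ExportID__c > {lo} AND ExportID__c <= {hi}"
    )
    # query_all follows nextRecordsUrl pages until the chunk is complete
    return sf.query_all(soql)["records"]

for lo in range(0, TOTAL_RECORDS, CHUNK_SIZE):
    records = extract_chunk(lo, lo + CHUNK_SIZE)
    # ... write records to a file or staging database here
```

Each range predicate keeps the query selective (the formula field should be custom-indexed), which is the point of the approach in the linked post.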
...
gokuSuperSayan4
9 months, 3 weeks ago
So the correct answer is A or C? I vote C.
upvoted 2 times
...
tobicky
12 months ago
Selected Answer: C
The most accurate answer would be Option C: use an extract, transform, load (ETL) tool for the extraction. ETL tools are designed to handle large volumes of data and can efficiently extract, transform, and load data without running into time-outs. They also chunk data into manageable sizes for extraction, which is what avoids the query time-out.
upvoted 3 times
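As a rough illustration of what many ETL tools do under the hood, the sketch below submits a Bulk API (1.0) query job with PK chunking enabled, so Salesforce itself splits the extract into Id-range batches. The instance URL, session id, and object name are placeholders:

```python
import requests

# Placeholders: substitute your My Domain instance, a valid session id,
# and the real custom object API name.
INSTANCE = "https://yourInstance.my.salesforce.com"
SESSION_ID = "<session id from OAuth or login>"
API_VERSION = "52.0"

job_headers = {
    "X-SFDC-Session": SESSION_ID,
    "Content-Type": "application/json",
    # Salesforce splits the query into batches of up to 250,000 records by Id range.
    "Sforce-Enable-PKChunking": "chunkSize=250000",
}

# 1. Create the bulk query job with PK chunking enabled.
job = requests.post(
    f"{INSTANCE}/services/async/{API_VERSION}/job",
    headers=job_headers,
    json={"operation": "query", "object": "Trail_Record__c", "contentType": "CSV"},
).json()

# 2. Add the SOQL query as a batch; Salesforce creates one batch per Id chunk.
requests.post(
    f"{INSTANCE}/services/async/{API_VERSION}/job/{job['id']}/batch",
    headers={"X-SFDC-Session": SESSION_ID, "Content-Type": "text/csv"},
    data="SELECT Id, Name FROM Trail_Record__c",
)

# 3. Poll the job's batches and download each result set as CSV (omitted here).
```

Because PK chunking ranges on the record Id, no extra auto-number or formula fields are needed, which is one reason ETL connectors commonly take this route for extracts of this size.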
...
vip_10
1 year, 3 months ago
Selected Answer: B
B is correct, as it also depends on the ETL tool and which API it uses.
upvoted 1 times
tobicky
12 months ago
Not Option B: Use the REST API to extract data as it automatically chunks records by 200. While the REST API does chunk records, a chunk size of 200 is far too small for extracting 50 million records efficiently (roughly 50,000,000 ÷ 200 = 250,000 calls). This would mean a very large number of API calls and could exhaust the org's API limits.
upvoted 2 times
...
...
Community vote distribution: A (35%), C (25%), B (20%), Other