Exam Certified Data Architect topic 1 question 23 discussion

Actual exam question from Salesforce's Certified Data Architect
Question #: 23
Topic #: 1

Universal Containers (UC) is in the process of selling half of its company. As part of this split, UC’s main Salesforce org will be divided into two orgs: Org A and Org B. UC has delivered these requirements to its data architect:
1. The data model for Org B will drastically change with different objects, fields, and picklist values.
2. Three million records will need to be migrated from Org A to Org B for compliance reasons.
3. The migration will need to occur within the next two months, prior to the split.
Which migration strategy should a data architect use to successfully migrate the data?

  • A. Use an ETL tool to orchestrate the migration.
  • B. Write a script to use the Bulk API.
  • C. Use the Salesforce CLI to query, export, and import.
  • D. Use Data Loader for export and Data Import Wizard for import.
Suggested Answer: A
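
Requirement 1 (a drastically different data model) is what tips this question toward A: the hard part is not moving three million records but transforming them. Below is a minimal, hypothetical Python sketch of the transform step an ETL tool would orchestrate; every object, field, and picklist name is invented for illustration.

    # Hypothetical transform step between Org A's and Org B's data models.
    # All field and picklist names below are invented for illustration.
    FIELD_MAP = {"Region__c": "Territory__c", "Status__c": "Stage__c"}
    PICKLIST_MAP = {"Stage__c": {"In Progress": "Active", "Done": "Closed"}}

    def transform(record_a: dict) -> dict:
        """Remap one Org A record (as a dict) into Org B's shape."""
        record_b = {}
        for old_field, value in record_a.items():
            new_field = FIELD_MAP.get(old_field, old_field)
            # Translate picklist values that changed between the orgs.
            value = PICKLIST_MAP.get(new_field, {}).get(value, value)
            record_b[new_field] = value
        return record_b

    assert transform({"Region__c": "EMEA", "Status__c": "Done"}) == \
        {"Territory__c": "EMEA", "Stage__c": "Closed"}

An ETL tool lets you declare this kind of mapping and adds scheduling, retries, and logging on top; with option B you would be writing and testing all of it by hand.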

Comments

Rangya
5 months, 2 weeks ago
Selected Answer: A
A script would facilitate extract and load, but transformation is also required here.
upvoted 2 times
...
Nilesh_Nanda
6 months, 1 week ago
A is correct
upvoted 1 times
...
lizbette
6 months, 4 weeks ago
Selected Answer: A
A is correct
upvoted 2 times
...
6967185
7 months, 4 weeks ago
The Data Migration section of the study guide covers: serial load, parallel mode, deferred sharing calculation, record locks, hierarchical relationships, and Bulk API limits. Given that this is a test on data migration, I would opt for a solution that is mentioned there.
upvoted 1 times
6967185
7 months, 4 weeks ago
Scratch that, a single batch of records can contain a maximum of 10,000 records. The requirement states 3,000,000 records. :)
upvoted 1 times
...
...
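To put rough numbers on the batch-limit point above (assuming the commonly cited Bulk API 1.0 ceiling of 10,000 records per batch):

    # Back-of-the-envelope batch count, assuming 10,000 records per batch.
    records = 3_000_000
    batch_size = 10_000
    print(records // batch_size)  # -> 300 batches

That is 300 batches, well under the documented daily batch allowance, so volume alone does not rule out the Bulk API. As other commenters note, the stronger argument for A is the transformation requirement.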
ETH777
10 months, 2 weeks ago
Selected Answer: A
Not B - Bulk API is efficient for bulk data transfer, but it requires significant scripting effort, especially for data mapping and transformation in this complex scenario. A - ETL tool handles complexity, mapping, and orchestration.
upvoted 3 times
...
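To make the "significant scripting effort" point above concrete, here is a hypothetical sketch of just the load plumbing a hand-rolled Bulk API 2.0 script needs before any mapping or transformation logic is written. The endpoint paths follow Salesforce's documented ingest-job pattern; the instance URL, access token, object, file name, and API version are placeholders.

    import requests

    INSTANCE = "https://example.my.salesforce.com"  # placeholder org
    HEADERS = {"Authorization": "Bearer <access_token>",  # placeholder auth
               "Content-Type": "application/json"}

    # 1. Create a Bulk API 2.0 ingest job for the target object in Org B.
    job = requests.post(
        f"{INSTANCE}/services/data/v58.0/jobs/ingest",
        headers=HEADERS,
        json={"object": "Account", "operation": "insert"},
    ).json()

    # 2. Upload the already-transformed records as CSV.
    with open("org_b_accounts.csv", "rb") as csv_file:
        requests.put(
            f"{INSTANCE}/services/data/v58.0/jobs/ingest/{job['id']}/batches",
            headers={**HEADERS, "Content-Type": "text/csv"},
            data=csv_file,
        )

    # 3. Signal that the upload is complete so processing starts.
    requests.patch(
        f"{INSTANCE}/services/data/v58.0/jobs/ingest/{job['id']}",
        headers=HEADERS,
        json={"state": "UploadComplete"},
    )

Even this omits authentication, splitting files that exceed the per-request upload limit, polling job status, and handling failed records; an ETL tool ships those pieces out of the box, which is why A fits the two-month window better.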
DavidHolland
11 months, 4 weeks ago
Selected Answer: A
Big changes to the data model mean I would select A.
upvoted 2 times
...
tobicky
12 months ago
Selected Answer: A
The most accurate answer is A: use an ETL tool to orchestrate the migration. Given the complexity of the migration (drastic changes in the data model, large volume of records, and tight timeline), an ETL (Extract, Transform, Load) tool would be the most suitable option. ETL tools are designed to handle complex data migrations, including changes in data models and large volumes of data. They also provide robust error handling and logging capabilities, which are crucial for a successful migration.

B (write a script to use the Bulk API) could be a viable option, but it would require significant development effort and may not be feasible given the two-month timeline. Additionally, this approach would require extensive testing to ensure the accuracy of the data migration.
upvoted 4 times
...
ksho
1 year, 2 months ago
Selected Answer: A
ETL and batch scripting would both require investment in development resources. However, a batch script would be a "from scratch" development effort, and there are only two months to complete it. Using an ETL tool would greatly shorten the development time to transform and migrate the data, and it can be easily updated during testing.
upvoted 3 times
...
thneeb
1 year, 4 months ago
Selected Answer: A
Drastic changes in the data model lead me to choose A.
upvoted 3 times
...
BorisBoris
1 year, 4 months ago
On reflection, B is probably more appropriate since this is a one-off operation (maybe in batches); the investment in an ETL tool seems illogical unless one is already in use, and in this scenario we cannot assume that. Therefore batch scripting is appropriate and will yield the required results with accuracy and reliability.
upvoted 1 times
...
BorisBoris
1 year, 4 months ago
Answer A. An ETL tool provides a robust and scalable solution for data migration between Salesforce orgs, especially when dealing with large volumes of data and complex transformations. Here's why it is the recommended approach: Scalability: An ETL tool can handle large data volumes efficiently by leveraging parallel processing capabilities. With three million records to migrate, using an ETL tool can ensure optimal performance and faster data transfer.
upvoted 2 times
...
Alokv
1 year, 5 months ago
In this scenario, considering the complexity of the data model changes, the volume of records, and the timeframe for migration, the most suitable migration strategy would be: A. Use an ETL tool to orchestrate the migration.
upvoted 4 times
...
Alokv
1 year, 5 months ago
I think A is the correct option. A batch script is also a kind of ETL tool, but it is developed by the user/developer and is therefore prone to errors.
upvoted 3 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other