
Exam AWS Certified Database - Specialty topic 1 question 29 discussion

Exam question from Amazon's AWS Certified Database - Specialty
Question #: 29
Topic #: 1

A company wants to migrate its existing on-premises Oracle database to Amazon Aurora PostgreSQL. The migration must be completed with minimal downtime using AWS DMS. A Database Specialist must validate that the data was migrated accurately from the source to the target before the cutover. The migration must have minimal impact on the performance of the source database.
Which approach will MOST effectively meet these requirements?

  • A. Use the AWS Schema Conversion Tool (AWS SCT) to convert source Oracle database schemas to the target Aurora DB cluster. Verify the datatype of the columns.
  • B. Use the table metrics of the AWS DMS task created for migrating the data to verify the statistics for the tables being migrated and to verify that the data definition language (DDL) statements are completed.
  • C. Enable the AWS Schema Conversion Tool (AWS SCT) premigration validation and review the premigration checklist to make sure there are no issues with the conversion.
  • D. Enable AWS DMS data validation on the task so the AWS DMS task compares the source and target records, and reports any mismatches.
Suggested Answer: D
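For reference, enabling DMS data validation is a task-setting change rather than a separate tool. Below is a minimal sketch of the settings JSON you might attach to a replication task; `EnableValidation` is the documented flag, while the other values shown are illustrative tuning knobs, and the surrounding boto3 call is indicated only in comments:

```python
import json

# DMS data validation is switched on through the task settings JSON.
# EnableValidation is the documented flag; ThreadCount and FailureMaxCount
# are optional tuning parameters (the values here are illustrative).
task_settings = {
    "ValidationSettings": {
        "EnableValidation": True,   # compare source and target rows, report mismatches
        "ThreadCount": 5,           # parallel validation threads (illustrative)
        "FailureMaxCount": 10000,   # stop validating after this many failures
    }
}

settings_json = json.dumps(task_settings)
# The serialized settings would be passed as ReplicationTaskSettings to
# boto3's dms.create_replication_task(...) or dms.modify_replication_task(...).
print(settings_json)
```

Note that validation issues SELECT queries against both endpoints, so it adds some load; the question accepts this because validation is the only option that actually compares source and target records.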

Comments

grekh001
Highly Voted 3 years, 4 months ago
"To ensure that your data was migrated accurately from the source to the target, we highly recommend that you use data validation." https://docs.aws.amazon.com/dms/latest/userguide/CHAP_BestPractices.html Answer is D.
upvoted 9 times
...
Hisayuki
Most Recent 1 year, 3 months ago
Selected Answer: D
DMS supports validation between source and target data, but SCT does not.
upvoted 1 times
...
sachin
2 years, 10 months ago
Table metrics of an AWS DMS task can be verified manually, but there is no mechanism that automatically reads these metrics and confirms that everything went well. If you are working with 100+ tables, DMS data validation is the only practical option. D is the best fit.
upvoted 2 times
...
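On sachin's point about manual review: once data validation is enabled, the per-table validation state can at least be scanned programmatically. A hedged sketch follows; the `describe_table_statistics` call and field names follow the boto3 DMS API, but the sample response here is fabricated for illustration:

```python
# Scans DMS per-table statistics for tables whose validation did not pass.
# In practice the list would come from:
#   boto3.client("dms").describe_table_statistics(
#       ReplicationTaskArn=...)["TableStatistics"]
def tables_with_validation_issues(table_statistics):
    """Return (schema, table, state) for every table not in the 'Validated' state."""
    return [
        (t["SchemaName"], t["TableName"], t.get("ValidationState", "Not enabled"))
        for t in table_statistics
        if t.get("ValidationState") != "Validated"
    ]

# Illustrative sample response (not real data).
sample = [
    {"SchemaName": "HR", "TableName": "EMPLOYEES", "ValidationState": "Validated"},
    {"SchemaName": "HR", "TableName": "JOBS", "ValidationState": "Mismatched records"},
]
print(tables_with_validation_issues(sample))
```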
novice_expert
2 years, 12 months ago
Selected Answer: D
D runs SELECT queries on the source and target to compare rows, so it adds some load; B is also a candidate and avoids that load, but the data validation option in the DMS task has to be activated before DMS runs. You can find individual table metrics on the Table statistics tab for each task. These metrics include rows loaded during the full load; inserts, updates, and deletes since the task started; and DDL operations since the task started.
upvoted 2 times
...
tugboat
3 years, 1 month ago
Selected Answer: D
D for data validation
upvoted 3 times
...
SMAZ
3 years, 4 months ago
https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Validating.html During data validation, AWS DMS compares each row in the source with its corresponding row at the target, verifies the rows contain the same data, and reports any mismatches. To accomplish this AWS DMS issues appropriate queries to retrieve the data. Note that these queries will consume additional resources at the source and target as well as additional network resources. So Answer should be B
upvoted 1 times
awsmonster
3 years, 3 months ago
Answer should be D: The data validation option in the DMS task has to be activated before DMS performs what SMAZ has written. B does not mention anything about enabling it.
upvoted 3 times
...
...
SMAZ
3 years, 4 months ago
'The migration should have a negligible effect on the source database's performance.' I believe the answer should be B. Table metrics: you can find individual table metrics on the Table statistics tab for each task. These metrics include rows loaded during the full load; inserts, updates, and deletes since the task started; and DDL operations since the task started.
upvoted 2 times
RotterDam
3 years, 1 month ago
No, data validation can ONLY be done using task validation, and it has to be enabled before the DMS task starts; validation then runs as the migration completes. It's a very common question. D is the correct choice.
upvoted 1 times
...
...
2025flakyt
3 years, 4 months ago
D is the correct answer
upvoted 3 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other