Welcome to ExamTopics


Exam MCIA - Level 1 topic 1 question 75 discussion

Actual exam question from Mulesoft's MCIA - Level 1
Question #: 75
Topic #: 1
[All MCIA - Level 1 Questions]

A Mule application is being designed to receive a nightly CSV file containing millions of records from an external vendor over SFTP. The records from the file need to be validated, transformed, and then written to a database. Records can be inserted into the database in any order.
In this use case, what combination of Mule components provides the most effective and performant way to write these records to the database?

  • A. Use a Batch Job scope to bulk insert records into the database
  • B. Use a Scatter-Gather to bulk insert records into the database
  • C. Use a Parallel For Each scope to insert records one by one into the database
  • D. Use a DataWeave map operation and an Async scope to insert records one by one into the database
Suggested Answer: A

Comments

Alandt
4 months, 2 weeks ago
Selected Answer: A
A is correct according to the official practice exam.
upvoted 1 times
madgeezer
2 years, 2 months ago
Selected Answer: A
A. Use a Batch Job scope to bulk insert records into the database.
Reliability: if processing must survive a runtime crash or other unhappy scenarios and, on restart, resume with all remaining records, go with Batch, since it uses persistent queues.
Memory footprint: since the question says there are millions of records to process, Parallel For Each would aggregate all the processed records at the end and could cause an Out Of Memory error. A Batch Job instead provides a BatchJobResult in the On Complete phase, where you can get the counts of failures and successes. For huge file processing where order is not a concern, definitely go with Batch Job.
upvoted 2 times
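A minimal sketch of what option A could look like as a Mule 4 configuration. The connector config names, directory, cron schedule, table and column names, and the aggregator size of 1000 are illustrative assumptions, not part of the question:

```xml
<flow name="nightly-csv-to-db">
  <!-- Nightly pickup of the vendor file over SFTP; declaring the CSV MIME
       type lets Mule read and parse the file as a stream, so the whole
       file is never held in memory at once -->
  <sftp:listener config-ref="SFTP_Config" directory="/inbound"
                 outputMimeType="application/csv">
    <scheduling-strategy>
      <cron expression="0 0 2 * * ?"/>  <!-- every night at 02:00 -->
    </scheduling-strategy>
  </sftp:listener>

  <batch:job jobName="csv-records-job">
    <batch:process-records>
      <batch:step name="validate-and-transform">
        <!-- per-record validation and DataWeave mapping go here -->
      </batch:step>
      <batch:step name="write-to-db">
        <!-- Collect records into fixed-size groups and issue one
             bulk insert per group instead of one insert per record -->
        <batch:aggregator size="1000">
          <db:bulk-insert config-ref="Database_Config">
            <db:sql>INSERT INTO vendor_records (id, name) VALUES (:id, :name)</db:sql>
          </db:bulk-insert>
        </batch:aggregator>
      </batch:step>
    </batch:process-records>
    <batch:on-complete>
      <!-- payload here is a BatchJobResult exposing success/failure counts -->
      <logger level="INFO" message="#[payload]"/>
    </batch:on-complete>
  </batch:job>
</flow>
```

The aggregator inside the batch step is what turns millions of individual inserts into a much smaller number of bulk operations, which is where the performance gain over options C and D comes from.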
Outdoor25
2 years, 10 months ago
Selected Answer: A
A is the correct answer. A bulk insert into the database is more efficient than multiple individual inserts.
upvoted 1 times
Outdoor25
2 years, 10 months ago
Should be A.
upvoted 1 times
KrishnVams
3 years ago
Correct Answer: A. Use a Batch Job scope to bulk insert records into the database
upvoted 2 times
Community vote distribution: A (35%), C (25%), B (20%), Other
