Exam Certified Associate Developer for Apache Spark topic 1 question 184 discussion

Which of the following cluster configurations will fail to ensure completion of a Spark application in the event of a worker node failure?

Note: Each configuration has roughly the same compute power, using 100 GB of RAM and 200 cores. (The scenario details were shown in an image that is not reproduced here.)

  • A. Scenario #5
  • B. Scenario #4
  • C. Scenario #6
  • D. Scenario #1
  • E. They should all ensure completion because worker nodes are fault tolerant
Suggested Answer: D

Comments

Oks_An
2 months ago
Selected Answer: D
Scenario #1 uses a configuration with only one executor. If the worker node running that executor fails, the entire application fails, because there are no other executors available to take over its tasks. Spark tolerates node failures by rescheduling the lost tasks on surviving executors, so it needs executors spread across multiple worker nodes. With a single executor, this configuration has no redundancy and therefore no fault tolerance.
upvoted 3 times
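To make the reasoning above concrete: since the actual scenario table is an image that is not reproduced here, the following is only a hypothetical sketch of how the stated 100 GB / 200 cores might be split across many executors (here, 10) instead of one, so that a worker failure costs only a fraction of the cluster and Spark can reschedule the lost tasks on the surviving executors. The master URL, application file, and the 10/20/10G split are all illustrative assumptions, not taken from the question.

```shell
# Hypothetical submission (assumed YARN master and app name) splitting the
# stated 100 GB RAM / 200 cores across 10 executors rather than 1.
# If one worker dies, the other executors keep running and Spark
# re-runs the lost tasks there; with a single executor there is
# nothing left to reschedule onto, and the application fails.
spark-submit \
  --master yarn \
  --num-executors 10 \
  --executor-cores 20 \
  --executor-memory 10G \
  my_app.py
```

On a standalone cluster the equivalent knobs are configuration properties such as `spark.executor.instances`, `spark.executor.cores`, and `spark.executor.memory`; the point is the same: executor-level redundancy across workers, not total compute, is what lets the application survive a worker node failure.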
Community vote distribution
A (35%)
C (25%)
B (20%)
Other
