
Exam Certified Associate Developer for Apache Spark topic 1 question 88 discussion

The code block shown below contains an error. It is intended to adjust the number of partitions used in wide transformations like join() to 32. Identify the error.

Code block:

spark.conf.set("spark.default.parallelism", "32")

  • A. spark.default.parallelism is not the right Spark configuration parameter – spark.sql.shuffle.partitions should be used instead.
  • B. There is no way to adjust the number of partitions used in wide transformations – it defaults to the number of total CPUs in the cluster.
  • C. Spark configuration parameters cannot be set at runtime.
  • D. Spark configuration parameters are not set with spark.conf.set().
  • E. The second argument should not be the string version of "32" – it should be the integer 32.
Suggested Answer: A
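For reference, here is a minimal PySpark sketch of the corrected configuration. The session name, example datasets, and partition check are illustrative assumptions, not part of the original question.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("shuffle-partitions-demo").getOrCreate()

# Disable adaptive query execution so the configured partition count is
# observed directly (AQE may otherwise coalesce shuffle partitions).
spark.conf.set("spark.sql.adaptive.enabled", "false")

# spark.default.parallelism governs RDD operations; for DataFrame wide
# transformations such as join(), spark.sql.shuffle.partitions applies.
spark.conf.set("spark.sql.shuffle.partitions", "32")

left = spark.range(1000).withColumnRenamed("id", "key")
right = spark.range(1000).withColumnRenamed("id", "key")

# A join is a wide transformation, so its output is shuffled into
# spark.sql.shuffle.partitions partitions.
joined = left.join(right, "key")

print(joined.rdd.getNumPartitions())  # expected: 32

Note that with adaptive query execution left at its default, Spark may coalesce the shuffle partitions, which is why the sketch disables it before checking the count.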

Comments

Anweee
9 months, 3 weeks ago
A is correct
upvoted 1 times
...
newusername
1 year ago
Selected Answer: A
spark.conf.set("spark.sql.shuffle.partitions", "32")
upvoted 3 times
...
Ram459
1 year, 3 months ago
Selected Answer: A
should be spark.sql.shuffle.partitions for joins
upvoted 4 times
...
Community vote distribution
A (35%)
C (25%)
B (20%)
Other