The code block shown below contains an error. It is intended to adjust the number of partitions used in wide transformations like join() to 32. Identify the error.
Code block:
spark.conf.set("spark.default.parallelism", "32")
A. spark.default.parallelism is not the right Spark configuration parameter – spark.sql.shuffle.partitions should be used instead.
B. There is no way to adjust the number of partitions used in wide transformations – it defaults to the number of total CPUs in the cluster.
C. Spark configuration parameters cannot be set at runtime.
D. Spark configuration parameters are not set with spark.conf.set().
E. The second argument should not be the string version of "32" – it should be the integer 32.
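For reference, here is a minimal sketch of the configuration described in option A, assuming an existing PySpark SparkSession named spark. The spark.sql.shuffle.partitions parameter controls the number of partitions used by wide transformations (joins, aggregations) in the DataFrame/SQL API and can be changed at runtime via spark.conf.set():

```python
# Minimal sketch, assuming a standard PySpark setup; the app name is illustrative.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("shuffle-partitions-demo").getOrCreate()

# spark.sql.shuffle.partitions governs shuffles triggered by wide
# transformations such as join() and groupBy() in the DataFrame/SQL API.
spark.conf.set("spark.sql.shuffle.partitions", "32")

# Verify the setting took effect at runtime.
print(spark.conf.get("spark.sql.shuffle.partitions"))  # -> "32"
```

By contrast, spark.default.parallelism applies to RDD operations and is not the parameter that controls shuffle partitioning for DataFrame joins.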