In the Spark UI, a stage’s detail page provides key metrics about that stage, including how much data was spilled to disk. Non-zero values in the “Spill (Memory)” or “Spill (Disk)” columns indicate that partitions did not fit in execution memory and were spilled to disk.
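If you would rather collect these numbers programmatically than read them off the UI, the same per-stage spill metrics are exposed by Spark's status REST API on the driver's UI port. A minimal sketch, assuming a single application running locally on the default port 4040 and the `requests` package installed:

```python
import requests

# The driver serves the status REST API on the Spark UI port (4040 by default).
base = "http://localhost:4040/api/v1"

# Look up the running application's ID, then fetch its per-stage metrics.
app_id = requests.get(f"{base}/applications").json()[0]["id"]
stages = requests.get(f"{base}/applications/{app_id}/stages").json()

for stage in stages:
    mem_spilled = stage.get("memoryBytesSpilled", 0)
    disk_spilled = stage.get("diskBytesSpilled", 0)
    if mem_spilled or disk_spilled:
        print(f"Stage {stage['stageId']}: "
              f"spill (memory)={mem_spilled} bytes, spill (disk)={disk_spilled} bytes")
```

These are the same numbers that back the UI columns, so the script is just a convenience for monitoring or alerting.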
The executor log files can also provide valuable information about spill. If a task is spilling a lot of data, you’ll see messages along the lines of “Spilling UnsafeExternalSorter to disk” or “Task memory spill”. These messages indicate that the task ran out of execution memory and had to write data to disk.
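To see both signals in one place, you can run a shuffle-heavy job under deliberately tight memory and watch the UI and logs while it executes. A rough sketch; the memory setting, row count, and key cardinality are illustrative only, and whether it actually spills depends on your environment:

```python
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .master("local[2]")
    .config("spark.driver.memory", "1g")          # small heap so spills are more likely
    .config("spark.sql.shuffle.partitions", "4")  # few, larger shuffle partitions
    .getOrCreate()
)

# High-cardinality grouping key, so the aggregation state can outgrow execution memory.
df = spark.range(0, 50_000_000).withColumn("key", F.col("id") % 5_000_000)

# The groupBy forces a shuffle; count() just triggers execution.
df.groupBy("key").agg(F.sum("id").alias("total")).count()

# While this runs, check the Stages tab at http://localhost:4040 for the
# "Spill (Memory)" / "Spill (Disk)" columns, and the executor/driver logs
# for spill-related messages.
```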