

Exam Certified Associate Developer for Apache Spark topic 1 question 74 discussion

Which of the following operations can be used to return a DataFrame with no duplicate rows? Please select the most complete answer.

  • A. DataFrame.distinct()
  • B. DataFrame.dropDuplicates() and DataFrame.distinct()
  • C. DataFrame.dropDuplicates()
  • D. DataFrame.drop_duplicates()
  • E. DataFrame.dropDuplicates(), DataFrame.distinct() and DataFrame.drop_duplicates()
Suggested Answer: E

Comments

Ahlo
9 months ago
Answer E. drop_duplicates() is an alias for dropDuplicates(); it also works in PySpark.
upvoted 1 times
azure_bimonster
9 months, 3 weeks ago
Selected Answer: E
It asks for the "most complete" answer, so E is correct: all three of these methods work in PySpark.
upvoted 1 times
thanab
1 year, 2 months ago
Selected Answer: B
The most complete answer is B: DataFrame.dropDuplicates() and DataFrame.distinct(). Both methods in PySpark return a new DataFrame with duplicate rows removed. The DataFrame.drop_duplicates() method is used in pandas, not in PySpark.
upvoted 1 times
juadaves
1 year, 1 month ago
It should be E; drop_duplicates() works in PySpark too.
upvoted 1 times
Community vote distribution
A (35%)
C (25%)
B (20%)
Other
