Exam Certified Machine Learning Professional topic 1 question 31 discussion

Actual exam question from Databricks's Certified Machine Learning Professional
Question #: 31
Topic #: 1

Which of the following statements describes streaming with Spark as a model deployment strategy?

  • A. The inference of batch processed records as soon as a trigger is hit
  • B. The inference of all types of records in real-time
  • C. The inference of batch processed records as soon as a Spark job is run
  • D. The inference of incrementally processed records as soon as a trigger is hit
  • E. The inference of incrementally processed records as soon as a Spark job is run
Suggested Answer: E

Comments

BokNinja
Highly Voted 11 months, 1 week ago
The correct answer is D: the inference of incrementally processed records as soon as a trigger is hit. In this context, a “trigger” refers to the condition that initiates processing of the next set of data. This could be a time interval (e.g., process new data every second), a data size (e.g., process every 1000 records), or another custom condition.
upvoted 5 times
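As an illustration of what a "trigger" means here, below is a minimal sketch of the trigger options Spark Structured Streaming exposes. The source format, paths, and interval are placeholders, not values from the question.

```python
# Illustrative sketch only: the trigger decides when the next micro-batch
# of newly arrived records is processed. Paths and intervals are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
stream = spark.readStream.format("delta").load("/path/to/input")

query = (
    stream.writeStream
    .format("delta")
    .option("checkpointLocation", "/path/to/checkpoint")
    # Time-based trigger: fire every 30 seconds.
    .trigger(processingTime="30 seconds")
    # Alternatives: .trigger(availableNow=True) to drain available data and stop,
    # or .trigger(once=True) for a single micro-batch on older Spark versions.
    .start("/path/to/output")
)
```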
64934ca
Most Recent 4 months, 3 weeks ago
Selected Answer: D
Streaming with Spark as a model deployment strategy involves processing data in small, incremental batches (micro-batches) as it arrives. Spark Structured Streaming allows continuous processing of streaming data, where records are processed incrementally and results are updated in near real time. Processing is typically kicked off at regular intervals defined by a trigger.
upvoted 1 times
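To make the micro-batch inference itself concrete, here is a hedged sketch along the same lines. The registered model name, feature columns, and paths are assumptions, and mlflow.pyfunc.spark_udf is one common way on Databricks to score each incrementally processed batch as the trigger fires.

```python
# Sketch of streaming model inference: each micro-batch of new records is
# scored as soon as the trigger fires. Model URI, columns, and paths are hypothetical.
import mlflow.pyfunc
from pyspark.sql import SparkSession
from pyspark.sql.functions import struct

spark = SparkSession.builder.getOrCreate()

# Load a registered model as a Spark UDF (assumed model name and stage).
predict_udf = mlflow.pyfunc.spark_udf(spark, model_uri="models:/my_model/Production")

features = spark.readStream.format("delta").load("/path/to/feature_stream")
scored = features.withColumn("prediction", predict_udf(struct(*features.columns)))

query = (
    scored.writeStream
    .format("delta")
    .option("checkpointLocation", "/path/to/inference_checkpoint")
    .trigger(processingTime="1 minute")  # inference on each triggered micro-batch
    .start("/path/to/predictions")
)
```

Only records that arrived since the previous micro-batch are scored on each firing, which is what option D's "incrementally processed records as soon as a trigger is hit" describes.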
JaydeepT
9 months, 4 weeks ago
Selected Answer: D
Incrementally processed records with a Spark job: Spark jobs are typically used to initiate processing, but triggers are what drive continuous inference in streaming scenarios.
upvoted 1 times
Community vote distribution: A (35%), C (25%), B (20%), Other