Exam Professional Machine Learning Engineer topic 1 question 134 discussion

Actual exam question from Google's Professional Machine Learning Engineer
Question #: 134
Topic #: 1

You are the Director of Data Science at a large company, and your Data Science team has recently begun using the Kubeflow Pipelines SDK to orchestrate their training pipelines. Your team is struggling to integrate their custom Python code into the Kubeflow Pipelines SDK. How should you instruct them to proceed in order to quickly integrate their code with the Kubeflow Pipelines SDK?

  • A. Use the func_to_container_op function to create custom components from the Python code.
  • B. Use the predefined components available in the Kubeflow Pipelines SDK to access Dataproc, and run the custom code there.
  • C. Package the custom Python code into Docker containers, and use the load_component_from_file function to import the containers into the pipeline.
  • D. Deploy the custom Python code to Cloud Functions, and use Kubeflow Pipelines to trigger the Cloud Function.
Suggested Answer: A

Comments

hiromi
Highly Voted 10 months, 1 week ago
Selected Answer: A
A: https://kubeflow-pipelines.readthedocs.io/en/stable/source/kfp.components.html?highlight=func_to_container_op%20#kfp.components.func_to_container_op
upvoted 5 times
M25
Most Recent 5 months, 2 weeks ago
Selected Answer: A
Went with A
upvoted 2 times
Antmal
6 months, 1 week ago
Selected Answer: A
The answer is A because the Kubeflow Pipelines SDK provides a convenient way to create custom components from existing Python code using the func_to_container_op function. This lets the data science team encapsulate their custom code as containerized components that can be easily integrated into a Kubeflow pipeline, without requiring additional dependencies or infrastructure setup.
upvoted 2 times
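To make the suggested approach concrete, here is a minimal sketch assuming the KFP v1 SDK (func_to_container_op was superseded by @dsl.component in KFP v2); the function name and base image are illustrative:

```python
from kfp.components import func_to_container_op

def preprocess(text: str) -> str:
    """Ordinary Python code the team already owns (hypothetical)."""
    return text.strip().lower()

# Wrap the function as a pipeline component; the SDK serializes it and
# runs it inside the given base image at pipeline execution time.
preprocess_op = func_to_container_op(preprocess, base_image="python:3.9")
```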
TNT87
7 months, 3 weeks ago
Selected Answer: A
A. Use the func_to_container_op function to create custom components from the Python code. The func_to_container_op function in the Kubeflow Pipelines SDK is specifically designed to convert Python functions into containerized components that can be executed in a Kubernetes cluster. By using this function, the Data Science team can easily integrate their custom Python code into the Kubeflow Pipelines SDK without having to learn the details of containerization or Kubernetes.
upvoted 3 times
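For completeness, a sketch of how such a component is then used in a pipeline and compiled, again assuming the KFP v1 SDK; the pipeline name, output file name, and training stub are made up:

```python
import kfp
from kfp import dsl
from kfp.components import func_to_container_op

def train(steps: int) -> str:
    # Placeholder for the team's real training code.
    return f"trained for {steps} steps"

train_op = func_to_container_op(train, base_image="python:3.9")

@dsl.pipeline(name="custom-code-pipeline", description="Sketch only")
def my_pipeline(steps: int = 100):
    train_op(steps)

# Compile to an artifact that can be uploaded to a Kubeflow Pipelines cluster.
kfp.compiler.Compiler().compile(my_pipeline, "my_pipeline.yaml")
```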
mil_spyro
10 months, 2 weeks ago
Selected Answer: A
Use the func_to_container_op function to create custom components from their code. This function allows you to define a Python function that can be used as a pipeline component; the SDK automatically packages it to run in a container with the necessary dependencies.
upvoted 2 times
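On the dependency point: in the v1 SDK the function can declare pip packages via the packages_to_install parameter, which are installed in the container before the function runs, so no custom image build is needed. A sketch (the pandas usage is illustrative):

```python
from kfp.components import func_to_container_op

def score(csv_path: str) -> float:
    # Imports inside the function body are re-executed in the container.
    import pandas as pd
    return float(pd.read_csv(csv_path).iloc[:, 0].mean())

# packages_to_install is pip-installed before the function executes.
score_op = func_to_container_op(
    score,
    base_image="python:3.9",
    packages_to_install=["pandas"],
)
```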
Community vote distribution: A (35%), C (25%), B (20%), Other