
Exam DP-100 topic 4 question 36 discussion

Actual exam question from Microsoft's DP-100
Question #: 36
Topic #: 4
[All DP-100 Questions]

HOTSPOT
-

You create an Azure Machine Learning model to include model files and a scoring script.

You must deploy the model. The deployment solution must meet the following requirements:

• Provide near real-time inferencing.
• Enable endpoint and deployment level cost estimates.
• Support logging to Azure Log Analytics.

You need to configure the deployment solution.

What should you configure? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Suggested Answer:

Comments

nposteraro
5 months, 1 week ago
For Endpoint type, Managed Online Endpoint is the right answer. I was not convinced, but then I read this: https://learn.microsoft.com/en-us/azure/machine-learning/concept-endpoints-online?view=azureml-api-2#managed-online-endpoints-vs-kubernetes-online-endpoints
upvoted 1 times
...
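The managed online endpoint the comment above argues for can be sketched with the Azure ML CLI v2 YAML schemas. This is a minimal sketch, not the exam's official answer; the names (`my-endpoint`, `blue`), local paths (`./model`, `./scoring/score.py`, `./env/conda.yml`), image, and instance type are all illustrative assumptions:

```yaml
# endpoint.yml -- managed online endpoint (name is hypothetical)
$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineEndpoint.schema.json
name: my-endpoint
auth_mode: key
---
# deployment.yml -- deployment holding the model files and scoring script
$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineDeployment.schema.json
name: blue
endpoint_name: my-endpoint
model:
  path: ./model                # local model files (assumed layout)
code_configuration:
  code: ./scoring
  scoring_script: score.py     # the scoring script from the question
environment:
  image: mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04:latest
  conda_file: ./env/conda.yml  # assumed conda dependencies file
instance_type: Standard_DS3_v2
instance_count: 1
```

These would be created with `az ml online-endpoint create -f endpoint.yml` and `az ml online-deployment create -f deployment.yml --all-traffic`. Because the compute is fully managed, this option lines up with the question's requirements: near real-time scoring, per-endpoint and per-deployment cost attribution, and Azure Monitor/Log Analytics integration.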
evangelist
10 months, 3 weeks ago
Endpoint type: Kubernetes online. Deployment component: Azure Kubernetes Service (AKS) cluster.
upvoted 3 times
...
deyoz
1 year, 2 months ago
answers are correct: A Docker image can be used for real-time inferencing of ML models, depending on the deployment solution and the requirements. For example, you can use a Docker image to create a web service that exposes your model via a REST API and responds to requests with predictions in near real-time. Alternatively, you can use a Docker image to deploy your model to a cloud platform such as Amazon SageMaker or Azure Machine Learning that provides managed online endpoints for real-time inferencing. However, using a Docker image for real-time inferencing may also introduce some challenges, such as ensuring the compatibility and security of the image, synchronizing the image with the online feature store, and scaling the image to handle the traffic and latency demands. Therefore, you should carefully evaluate the trade-offs and best practices of using a Docker image for real-time inferencing of ML models.
upvoted 1 times
...
vprowerty
1 year, 2 months ago
The Deployment component answer might be Docker image, because of this: https://learn.microsoft.com/en-us/azure/machine-learning/how-to-deploy-online-endpoints?view=azureml-api-2&tabs=azure-cli#define-the-deployment "Define the deployment: A deployment is a set of resources required for hosting the model that does the actual inferencing. To deploy a model, you must have: (among others) - An environment in which your model runs. The environment can be a Docker image with Conda dependencies or a Dockerfile."
upvoted 1 times
...
PI_Team
1 year, 8 months ago
Both managed online endpoints and AKS-based online endpoints can be used to deploy machine learning models for real-time inferencing in Azure Machine Learning. Managed online endpoints are fully managed by Azure Machine Learning and provide a simple and cost-effective way to deploy models for real-time inferencing. They also provide out-of-the-box monitoring and logging powered by Azure Monitor and Log Analytics, including key metrics and log tables for endpoints and deployments. AKS-based online endpoints, on the other hand, provide more flexibility and control but put more responsibility on the user to set up and manage them. So the only reason I would go for managed online is the explicit mention of Azure Log Analytics. The second answer should be Azure Kubernetes Service (AKS) for the deployment component. SaM
upvoted 1 times
...
snegnik
1 year, 10 months ago
To enable endpoint and deployment level cost estimates, you can use managed online endpoints. Managed online endpoints work with powerful CPU and GPU machines in Azure in a scalable, fully managed way. They take care of serving, scaling, securing, and monitoring your models, freeing you from the underlying infrastructure management. https://learn.microsoft.com/en-us/azure/machine-learning/concept-endpoints-online?view=azureml-api-2#managed-online-endpoints-vs-kubernetes-online-endpoints
upvoted 2 times
...
ZoeJ
2 years ago
https://learn.microsoft.com/en-us/azure/machine-learning/concept-endpoints?view=azureml-api-2 managed online can provide deployment level cost estimates
upvoted 2 times
ZoeJ
2 years ago
https://learn.microsoft.com/en-us/azure/aks/monitor-aks#analyze-log-data-with-log-analytics There is an 'Analyze log data with Log Analytics' section; maybe this is evidence that we can choose AKS for the second answer. If anyone finds better evidence, please let me know.
upvoted 1 times
...
...
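On the Log Analytics requirement debated above: managed online endpoints can route their logs to a workspace via diagnostic settings, after which they are queryable with KQL. A minimal sketch; the table and column names (`AmlOnlineEndpointConsoleLog`, `DeploymentName`, `Message`) are assumptions based on the managed online endpoint log categories and should be verified against your own workspace schema:

```kusto
// Recent console output from a managed online endpoint's deployments,
// assuming diagnostic settings send endpoint logs to this workspace.
AmlOnlineEndpointConsoleLog
| where TimeGenerated > ago(1h)
| project TimeGenerated, DeploymentName, Message
| order by TimeGenerated desc
```

AKS clusters can also send logs to Log Analytics (as the linked AKS article shows), so the logging requirement alone does not rule out either endpoint type; the cost-estimate requirement is what the managed-online camp leans on.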
sap_dg
2 years, 1 month ago
I would go for EndpointType: managed online, Deployment component: AKS cluster
upvoted 2 times
...
esimsek
2 years, 1 month ago
Was on exam 2023-03-27
upvoted 4 times
...
esimsek
2 years, 1 month ago
Is it correct?
upvoted 1 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other