
Exam AI-102 topic 1 question 59 discussion

Actual exam question from Microsoft's AI-102
Question #: 59
Topic #: 1

DRAG DROP

You have an app that manages feedback.

You need to ensure that the app can detect negative comments by using the Sentiment Analysis API in Azure AI Language. The solution must ensure that the managed feedback remains on your company’s internal network.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

NOTE: More than one order of answer choices is correct. You will receive credit for any of the correct orders you select.


Comments

NullVoider_0
Highly Voted 8 months ago
1. Provision the Language service resource in Azure. This step creates the Azure Language service resource, which provides the credentials and endpoint URL needed to use the Sentiment Analysis API.
2. Deploy a Docker container to an on-premises server. By deploying a Docker container on-premises, you can run the Sentiment Analysis API locally, ensuring that the feedback data does not leave your internal network.
3. Run the container and query the prediction endpoint. Once the container is running on your on-premises server, you can start sending feedback data to the Sentiment Analysis API by querying the prediction endpoint provided by the Language service.
upvoted 18 times
...
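For reference, a minimal sketch of the deploy-and-run steps described above, assuming Docker is installed on the on-premises server; the image name and the Eula/Billing/ApiKey arguments follow the pattern documented for Azure AI Language containers, and the endpoint URI and key placeholders are hypothetical values taken from the Language resource provisioned in step 1:

```python
import subprocess

# Endpoint URI and key from the Language resource provisioned in Azure (step 1).
# These are placeholders - substitute the values shown for your own resource.
ENDPOINT_URI = "https://<your-language-resource>.cognitiveservices.azure.com/"
API_KEY = "<your-api-key>"

# Pull and run the sentiment analysis container on the on-premises Docker host,
# exposing the prediction endpoint on local port 5000. Verify the image name and
# resource sizing against the current Microsoft documentation.
subprocess.run(
    [
        "docker", "run", "--rm",
        "-p", "5000:5000",
        "--memory", "8g", "--cpus", "1",
        "mcr.microsoft.com/azure-cognitive-services/textanalytics/sentiment",
        "Eula=accept",
        f"Billing={ENDPOINT_URI}",
        f"ApiKey={API_KEY}",
    ],
    check=True,
)
```

The Billing endpoint is contacted only for usage metering; the feedback text itself is analyzed inside the container, which is what keeps the data on the internal network.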
Jimmy1017
Highly Voted 7 months, 3 weeks ago
1. Provision the Language service resource in Azure.
2. Deploy a Docker container to an on-premises server.
3. Identify the Language service endpoint URL and query the prediction endpoint.
upvoted 8 times
...
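A minimal sketch of the final step, querying the local prediction endpoint rather than the Azure one, assuming the container listens on port 5000 and exposes the Text Analytics v3.1 sentiment route (the exact path can be confirmed from the container's local Swagger page):

```python
import requests

# Point at the on-premises container, not the Azure endpoint, so the feedback
# text never leaves the internal network. The v3.1 route below is an assumption;
# check the running container for the exact path it exposes.
LOCAL_ENDPOINT = "http://localhost:5000/text/analytics/v3.1/sentiment"

documents = {
    "documents": [
        {"id": "1", "language": "en",
         "text": "The product arrived late and support was unhelpful."}
    ]
}

response = requests.post(LOCAL_ENDPOINT, json=documents, timeout=30)
response.raise_for_status()

for doc in response.json()["documents"]:
    # 'sentiment' is positive/neutral/negative/mixed, with confidence scores.
    print(doc["id"], doc["sentiment"], doc["confidenceScores"])
```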
mrg998
Most Recent 2 months, 2 weeks ago
1) Provision the Language service in Azure.
2) Deploy a Docker container to a container instance. This is because you can inject a container instance into a VNet (remember, there is no requirement for anything to be on-premises; the question just says internal, which could be an internal VNet).
3) Run the container and query the prediction endpoint.
upvoted 1 times
...
moonlightc
3 months, 3 weeks ago
This was in the exam on 15/08/2024
upvoted 3 times
...
krzkrzkra
4 months, 3 weeks ago
1. Provision the Language service resource in Azure
2. Deploy a Docker container to an on-premises server
3. Run the container and query the prediction endpoint
upvoted 1 times
...
nanaw770
6 months, 2 weeks ago
Deploy on-premises, Provision, Run
upvoted 2 times
...
Murtuza
8 months ago
1. Provision the Language service resource in Azure: this is the first step, where you set up the Language service resource in Azure. This service will provide you with the Sentiment Analysis API.
2. Deploy a Docker container to an on-premises server: after provisioning the Language service, deploy a Docker container on an on-premises server. This container will host the Azure AI Language service and ensure that the managed feedback remains on your company’s internal network.
3. Identify the Language service endpoint URL and query the prediction endpoint: once the Docker container is running on your on-premises server, you can identify the Language service endpoint URL and then query the prediction endpoint to analyze the sentiment of the comments.
upvoted 5 times
...
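As an illustration of the "identify the endpoint URL and query the prediction endpoint" step, a rough sketch using the azure-ai-textanalytics Python SDK; pointing the client at the local container endpoint (assumed here to be http://localhost:5000) rather than the Azure endpoint URL is exactly the distinction debated further down this thread:

```python
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

# Key from the provisioned Language resource. The endpoint here is the local
# container (assumed to listen on port 5000) so feedback stays on the internal
# network; using the Azure endpoint URL instead would send the text to the cloud.
client = TextAnalyticsClient(
    endpoint="http://localhost:5000",
    credential=AzureKeyCredential("<your-api-key>"),
)

feedback = [
    "Delivery was quick and the staff were friendly.",
    "Terrible experience, I want a refund.",
]

for result in client.analyze_sentiment(feedback):
    if not result.is_error:
        # Flag negative comments using the returned sentiment label and scores.
        print(result.sentiment, result.confidence_scores.negative)
```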
Murtuza
8 months, 1 week ago
Remember that the order matters: provision the Language service first, then identify the Language service endpoint URL and query the prediction endpoint, and finally deploy the container to your chosen deployment target, which is on-premises 🚀
upvoted 7 times
vovap0vovap
6 months, 1 week ago
The question states that more than one correct order is possible: you can provision the resource first or deploy the container first. However, "Identify the Language service endpoint URL and query the prediction endpoint" should not be correct, as it indirectly assumes the Azure endpoint rather than the local one from Docker.
upvoted 1 times
...
...
Mehe323
9 months ago
That is correct, see the prerequisites, where you need to have 1) Docker installed and 2) a Language resource provisioned: https://learn.microsoft.com/en-us/azure/ai-services/language-service/sentiment-opinion-mining/how-to/use-containers
upvoted 5 times
Ody
8 months, 2 weeks ago
Right, but the question says deploy a Docker container; it is not saying set up a Docker host. You can't run the container without having the API key and endpoint URI. So: provision the Language service (get the API key and endpoint URI), deploy a Docker container to an on-premises host, then run the container and query the prediction endpoint.
upvoted 7 times
Mehe323
8 months, 1 week ago
It says 'deploy a Docker container to an on-premises server'; you forgot the last part. A synonym for deploy is install.
upvoted 3 times
...
...
...