Welcome to ExamTopics


Exam AZ-204 topic 20 question 2 discussion

Actual exam question from Microsoft's AZ-204
Question #: 2
Topic #: 20
[All AZ-204 Questions]

You need to resolve the capacity issue.
What should you do?

  • A. Convert the trigger on the Azure Function to an Azure Blob storage trigger
  • B. Ensure that the consumption plan is configured correctly to allow scaling
  • C. Move the Azure Function to a dedicated App Service Plan
  • D. Update the loop starting on line PC09 to process items in parallel
Suggested Answer: D

Comments

trance13
Highly Voted 3 years, 7 months ago
Receipts are uploaded to the File Storage (not Blob Storage) which does not support triggers. Concurrent processing of a (SINGLE!) receipt must be prevented - so parallel processing is OK. So answer D.
upvoted 31 times
ZodiaC
3 years, 4 months ago
1000% D !!!!!!!! CORRECT!
upvoted 3 times
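For reference, option D's fix amounts to replacing the sequential loop with fan-out over the items within a single invocation (which does not violate the "no concurrent processing of a single receipt" constraint). The exam's code is C#; the Python sketch below and its names (`process_receipt`, `process_items_in_parallel`) are illustrative assumptions, not the exam's actual code:

```python
from concurrent.futures import ThreadPoolExecutor

def process_receipt(item):
    # Stand-in for the real per-file work (download, convert, upload).
    return item.upper()

def process_items_in_parallel(items, max_workers=8):
    # Fan the per-item work out over a thread pool instead of the
    # original sequential for-loop; map() preserves input order.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(process_receipt, items))
```

Each receipt is still processed exactly once; only the items within one timer invocation run concurrently.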
PaulMD
Highly Voted 3 years, 7 months ago
Cleared AZ-204 today. The question appeared, but option "D" was not there; instead there was "replace the solution with durable functions". I went for that.
upvoted 23 times
ferut
3 years, 6 months ago
Durable functions will let the consumer get an immediate (async) response, but the processing still has to happen; the time until the file appears on the website doesn't change. Doing the processing in parallel will make a difference.
upvoted 3 times
ning
3 years, 3 months ago
Correct. If one instance of the timer-triggered function is running, a second instance will not start, even after the 5 minutes pass ... A durable function can return immediately, which allows a second instance to start ...
upvoted 3 times
leonidn
2 years, 10 months ago
That makes sense. Running parallel tasks is not good practice for functions; here we cannot predict the degree of parallelism. Using a durable function is the best choice.
upvoted 2 times
edengoforit
2 years, 10 months ago
If that is the case, shouldn't the answer be C?
upvoted 1 times
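The durable-functions suggestion in this thread boils down to "acknowledge immediately, process in the background". A minimal Python sketch of that idea, assuming a hypothetical `handle_trigger` entry point (this is the general pattern, not the Durable Functions API, which the real fix would use):

```python
import queue
import threading

jobs = queue.Queue()
done = []

def worker():
    # Background worker drains the queue; the trigger handler below never
    # waits on it, so a new invocation could start right away.
    while True:
        item = jobs.get()
        if item is None:
            break
        done.append(f"processed:{item}")
        jobs.task_done()

def handle_trigger(item):
    # Hypothetical entry point: enqueue and return immediately --
    # the "respond now, process later" idea in miniature.
    jobs.put(item)
    return "accepted"

threading.Thread(target=worker, daemon=True).start()
handle_trigger("receipt-1")
jobs.join()    # wait here only so the sketch is deterministic
jobs.put(None) # stop the worker
```

As ferut notes above, this changes when the caller gets a response, not how long the processing itself takes.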
overhill
Most Recent 6 days, 8 hours ago
Why not B????
upvoted 1 times
AndySmith
1 year, 1 month ago
I believe it's "B", since the issue happens in busy periods when the CPU is over-utilized. The only reasonable action then is to scale, and for that we should properly configure the Consumption plan. "D" could be an answer if the question were about slow processing in a normal situation, when CPU resources are sufficient; in that case I/O operations are the bottleneck. But if we try to spawn more threads when the CPU is already super busy, it will only worsen the user experience. And it's not "C", since a Dedicated plan is used in very specific situations. Excerpt: "Consider a dedicated App Service plan in the following situations: - You have existing, underutilized VMs that are already running other App Service instances. - You want to provide a custom image on which to run your functions."
upvoted 4 times
overhill
6 days, 8 hours ago
D can cause trouble with bandwidth.
upvoted 1 times
overhill
6 days, 8 hours ago
I'm going with B buddy, the other options don't seem to make sense
upvoted 1 times
OPT_001122
2 years ago
Selected Answer: D
D. Update the loop starting on line PC09 to process items in parallel
upvoted 1 times
gmishra88
2 years, 1 month ago
"Concurrent processing of a receipt must be prevented." Microsoft has added this line as a red-herring to make the question taker not think parallelism as an option? What does "processing" mean here? What is "a receipt"? That in combination with the listFiles() method. Does "a receipt" contain multiple files? Does "processing of a receipt" in Microsoft dictionary mean uploading (processing, microsoft?) of the multiple files in "a receipt" If the answer has durable functions then go for it without thinking deep. The requirements looks like a requirement for asynchronous processing because employees get an email (asynchronous) later. But any other answer is just not right and the question could send an intelligent developer (Microsoft excluded) into a loop of thoughts.
upvoted 3 times
gmishra88
2 years, 1 month ago
I appreciate all the Microsoft-technology developers finding innovative reasons for the answers. But what is not clear is what that listFiles() method does and which files are returned. That's a lot of assumptions to say you can do the upload in parallel without knowing what the files are and their sizes. No wonder Microsoft technologies are so buggy.
upvoted 1 times
ReniRechner
2 years, 8 months ago
Selected Answer: D
A. Convert the trigger on the Azure Function to an Azure Blob storage trigger => won't help, because we have an Azure file share.
B. Ensure that the consumption plan is configured correctly to allow scaling => the trigger is time-based, and multiple instances scanning the same folder is a bad idea; the requirements also clearly state that parallel processing is not allowed.
C. Move the Azure Function to a dedicated App Service Plan => the trigger every 5 seconds should keep the function "alive". The work is also not CPU-bound, so I cannot see a real benefit of an ASP in this scenario.
D. Update the loop starting on line PC09 to process items in parallel => might help.
D2 (alternative to D, as reported by PaulMD): replace the solution with durable functions => looks even better than D.
If D2 is an option I'd go for that. Maybe they realized that the current "D" is not a really good solution, and D2 is also way more "Azure". Otherwise D.
upvoted 13 times
0cc50bf
3 months, 1 week ago
The trigger is every 5 hours, not seconds.
upvoted 1 times
kozchris
2 years, 9 months ago
The answer is C since this is a cold start problem. "When using Azure Functions in the dedicated plan, the Functions host is always running, which means that cold start isn’t really an issue." https://azure.microsoft.com/en-us/blog/understanding-serverless-cold-start/
upvoted 1 times
coffecold
2 years, 1 month ago
No, the trigger is timed [TimerTrigger...], so the function execution never sleeps.
upvoted 1 times
eMax
2 years, 10 months ago
The answer reference is about JavaScript, not C# :))))))
upvoted 1 times
asdasdasg2
2 years, 10 months ago
D is not correct - while it would speed up processing, the prompt states that users report high delay during BUSY PERIODS, and uploading files sequentially rather than in parallel would clearly not cause that. The problem must be that the consumption plan is not scaling the function app correctly to handle the load. C could theoretically help, but B is better. Correct answer: B.
upvoted 2 times
ning
3 years, 3 months ago
The only possible answer is D ... The file mount is not Blob storage, so it cannot be a trigger ... This is a timer trigger, so scaling out will not help; only one instance will run ... That leaves us with D.
upvoted 4 times
Onuoa92
3 years, 6 months ago
Nobody has given us a correct answer.
upvoted 2 times
ZodiaC
3 years, 4 months ago
D is 1000% correct
upvoted 1 times
Molte
2 years, 10 months ago
Your 1000% comments under every single question do not help at all!
upvoted 16 times
[Removed]
3 years, 6 months ago
I vote for B. Reasoning:
A. Convert the trigger on the Azure Function to an Azure Blob storage trigger => we are not dealing with a defect but with performance degradation, so this would not help.
B. Ensure that the consumption plan is configured correctly to allow scaling => it seems that "Maximum Scale Out Limit" is set to a value not appropriate for the usage pattern.
C. Move the Azure Function to a dedicated App Service Plan => won't help.
D. Update the loop starting on line PC09 to process items in parallel => I don't think it is a good idea to call an async method from within a foreach loop, nor from within Parallel.ForEach. https://stackoverflow.com/questions/23137393/parallel-foreach-and-async-await
upvoted 5 times
anirbanzeus
3 years, 6 months ago
Well, the function is started by a timer, meaning the "events" that should trigger scaling won't increase. Hence I do not think B is the correct choice (ref: https://docs.microsoft.com/en-us/azure/azure-functions/event-driven-scaling). Considering that we are uploading receipts to Azure file storage, A is also incorrect. In the given scenario, D is the one that makes the most sense.
upvoted 2 times
warchoon
1 year, 7 months ago
Don't use parallel extensions in Azure; there are dedicated Azure constructs for that.
upvoted 1 times
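On the Parallel.ForEach-with-async pitfall raised in this thread: the safe pattern is to await every task, with bounded concurrency, rather than firing async work from inside a plain parallel loop where nothing awaits it. A Python sketch of the equivalent asyncio pattern (the name `process_file` and the simulated I/O are illustrative assumptions):

```python
import asyncio

async def process_file(name, sem):
    # Bound concurrency so a busy period can't spawn unbounded work.
    async with sem:
        await asyncio.sleep(0)  # stand-in for the real async I/O
        return f"processed:{name}"

async def process_all(names, limit=4):
    sem = asyncio.Semaphore(limit)
    # Every task is awaited; nothing is fired and forgotten, which is
    # the pitfall of launching async work inside a plain parallel loop.
    return await asyncio.gather(*(process_file(n, sem) for n in names))

results = asyncio.run(process_all(["a.png", "b.png"]))
```

In C# the analogous shape would be collecting tasks and awaiting them together (e.g. with a concurrency limiter), instead of Parallel.ForEach over async lambdas.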
VR
3 years, 7 months ago
So what is the answer?
upvoted 4 times
kwaazaar
3 years, 7 months ago
D is the right answer: the loop picks up all files in the container, and scaling would potentially cause files to be processed more than once. Change feed is not supported for file shares, so D is the only remaining option (though ugly as hell).
upvoted 2 times
jokergester
3 years, 7 months ago
A and C - convert to a blob trigger with a dedicated plan (not consumption) to avoid cold starts and to keep the function highly available. D - is not enough, since the trigger is scheduled every 5 minutes, so users will still have to wait even if the file has already been processed.
upvoted 1 times
nicolaus
3 years, 6 months ago
The answer is C. A is not possible, as receipts can also be uploaded using Azure Files. The consumption plan has a cold start (up to 10 minutes), so moving to a dedicated plan will help.
upvoted 6 times
PhilLI
2 years, 10 months ago
Two questions about C: Will a cold start be an issue at all when the function is triggered by a timer? And could a dedicated App Service plan have a stronger CPU, allowing it to process the files faster? Besides that: if parallel processing is an option, I would go for that, especially with the autoscaling options of a consumption plan (though how does that help with a timer trigger?).
upvoted 2 times
kabbas
1 year, 4 months ago
I agree; parallel processing is not going to help much here, since listfile() is already doing that. A dedicated plan will provide more resources.
upvoted 1 times
Community vote distribution: A (35%), C (25%), B (20%), Other