Receipts are uploaded to File Storage (not Blob Storage), which does not support triggers.
Concurrent processing of a (SINGLE!) receipt must be prevented - so parallel processing of different receipts is OK.
So answer D.
Durable functions let the caller get an immediate (async) response, but the processing itself remains; the time until the file appears on the website doesn't change.
Doing the processing in parallel will make a difference.
Correct: if one instance of the timer-triggered function is still running, a second instance will not start, even after the next 5-minute mark passes ... A durable function returns immediately, which allows the next instance to start ...
That makes sense. Running parallel tasks inside a function is not good practice, and here we cannot predict the degree of parallelism. Using a durable function is the best choice.
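For illustration, a minimal sketch of the durable-function variant (the function names and the activity bodies are my placeholders, not from the question's code): the timer-triggered client only schedules an orchestration and returns at once, and the orchestrator fans out across receipts.

```csharp
// Sketch only (Durable Functions 2.x, in-process C#); names and activity
// bodies are assumptions, not the question's actual code.
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class ReceiptFunctions
{
    // The timer client only schedules the orchestration and returns at once,
    // so the next 5-minute tick is never blocked by slow processing.
    [FunctionName("TimerClient")]
    public static async Task TimerClient(
        [TimerTrigger("0 */5 * * * *")] TimerInfo timer,
        [DurableClient] IDurableOrchestrationClient starter)
    {
        await starter.StartNewAsync("ProcessReceiptsOrchestrator");
    }

    [FunctionName("ProcessReceiptsOrchestrator")]
    public static async Task Orchestrator(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        var files = await context.CallActivityAsync<List<string>>("ListFiles", null);

        // Fan out: receipts are processed in parallel, but each single
        // receipt goes to exactly one activity, so concurrent processing
        // of one receipt is still prevented.
        var tasks = new List<Task>();
        foreach (var file in files)
            tasks.Add(context.CallActivityAsync("ProcessReceipt", file));
        await Task.WhenAll(tasks);
    }

    [FunctionName("ListFiles")]
    public static List<string> ListFiles([ActivityTrigger] object input)
        => new List<string>();   // placeholder for the question's listFiles()

    [FunctionName("ProcessReceipt")]
    public static Task ProcessReceipt([ActivityTrigger] string file)
        => Task.CompletedTask;   // placeholder for the real per-receipt work
}
```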
I believe it's "B", since the issue happens in busy periods when the CPU is over-utilized. The only reasonable action then is to scale out, and for that the Consumption plan must be configured properly.
"D" could be an answer if the question were about slow processing in a normal situation, when CPU resources are sufficient and I/O operations are the bottleneck. But spawning more threads when the CPU is already saturated would only worsen the user experience.
And it's not "C", since a Dedicated plan is used in very specific situations. Excerpt:
"Consider a dedicated App Service plan in the following situations:
- You have existing, underutilized VMs that are already running other App Service instances.
- You want to provide a custom image on which to run your functions."
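For completeness: if B is right, one way the scale-out cap could be lifted is via the WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT app setting. A sketch, assuming the Azure CLI; the app and resource-group names are placeholders:

```bash
# Remove any scale-out cap on the Consumption-plan app so event-driven
# scaling isn't artificially limited. "receipts-func" and "receipts-rg"
# are placeholder names.
az functionapp config appsettings delete \
  --name receipts-func \
  --resource-group receipts-rg \
  --setting-names WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT
```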
"Concurrent processing of a receipt must be prevented."
Has Microsoft added this line as a red herring so the test taker rules out parallelism as an option? What does "processing" mean here, and what is "a receipt"? Combine that with the listFiles() method: does "a receipt" contain multiple files? Does "processing of a receipt" in Microsoft's dictionary mean uploading (processing, Microsoft?) the multiple files in "a receipt"?
If an answer mentions durable functions, go for it without thinking too deeply. The requirements read like a case for asynchronous processing, because employees get an email (asynchronously) later. But any other answer is just not right, and the question could send an intelligent developer (Microsoft excluded) into a loop of thoughts.
I appreciate all the Microsoft-technology developers finding innovative reasons for the answers. But what is not clear is what that listFiles() method does and which files it returns. It takes a lot of assumptions to say you can upload in parallel without knowing the files and their sizes. No wonder Microsoft technologies are so buggy.
A. Convert the trigger on the Azure Function to an Azure Blob storage trigger
=> won't help because we have an Azure file share
B. Ensure that the consumption plan is configured correctly to allow scaling
=> The trigger is time-based. Multiple instances scanning the same folder is a bad idea; the requirements also clearly state that parallel processing of a receipt is not allowed
C. Move the Azure Function to a dedicated App Service Plan
=> the trigger every 5 minutes should keep the function "alive". The work is also not CPU-bound, so I cannot see a real benefit from an App Service plan in this scenario
D. Update the loop starting on line PC09 to process items in parallel
=> might help.
D2 (alternative to D, as suggested by PaulMD): replace the solution with durable functions
=> looks even better than D
If D2 is an option I'd go for that.
Maybe they realized that the current "D" is not a really good solution; D2 is also way more "Azure".
Otherwise D.
The answer is C since this is a cold start problem.
"When using Azure Functions in the dedicated plan, the Functions host is always running, which means that cold start isn’t really an issue."
https://azure.microsoft.com/en-us/blog/understanding-serverless-cold-start/
D is not correct: while it would speed up processing, the prompt states that users report a high delay during BUSY PERIODS. Uploading files in parallel would clearly not solve that.
The problem must be that the consumption plan is not scaling the function app correctly to handle the load. C could theoretically help, but B is better.
Correct answer: B
The only thing possible is D ...
A file mount is not Blob Storage, so it cannot be a trigger ...
This is a timer trigger, so scaling out will not help; only one instance will run ...
That only leaves us with D.
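To illustrate the single-instance behaviour (the 5-minute schedule is from the scenario; the class name and log line are mine):

```csharp
// Sketch: by default a timer trigger takes a singleton lock, so even on a
// scaled-out plan only one instance executes this function at a time; an
// overrunning execution makes the next one late (IsPastDue) rather than
// running concurrently.
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class TimerSingleton
{
    [FunctionName("ProcessReceipts")]
    public static void Run(
        [TimerTrigger("0 */5 * * * *")] TimerInfo timer, ILogger log)
    {
        log.LogInformation($"Timer fired, past due: {timer.IsPastDue}");
        // The question's listFiles() loop would run here, one run at a time.
    }
}
```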
I vote for B. Reasoning:
A. Convert the trigger on the Azure Function to an Azure Blob storage trigger
> We are not dealing with a defect but with performance degradation, so this would not help.
B. Ensure that the consumption plan is configured correctly to allow scaling
> It seems that "Maximum Scale Out Limit" is set to a value not appropriate for the usage pattern.
C. Move the Azure Function to a dedicated App Service Plan
> Won't help.
D. Update the loop starting on line PC09 to process items in parallel
> I don't think it is a good idea to call an async method from within a foreach loop, nor from within Parallel.ForEach.
https://stackoverflow.com/questions/23137393/parallel-foreach-and-async-await
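For reference, a sketch of how option D could avoid that pitfall: start all the tasks, then await them together with Task.WhenAll, instead of awaiting inside Parallel.ForEach. ListFiles and ProcessReceiptAsync below are hypothetical stand-ins for the question's code:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

public static class ParallelUpload
{
    // Hypothetical stand-ins for the question's listFiles() and the
    // per-receipt work; both are assumptions, not the real code.
    static IEnumerable<string> ListFiles() => new[] { "r1.pdf", "r2.pdf" };
    static Task ProcessReceiptAsync(string file) => Task.Delay(100);

    public static async Task RunAsync()
    {
        // Start every upload first, then await them together. Parallel.ForEach
        // with an async lambda would fire-and-forget the tasks instead.
        var tasks = new List<Task>();
        foreach (var file in ListFiles())
            tasks.Add(ProcessReceiptAsync(file));
        await Task.WhenAll(tasks);
    }
}
```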
Well, the function is started by a timer, meaning the "events" that drive scaling won't increase. Hence I do not think B is the correct choice (Ref: https://docs.microsoft.com/en-us/azure/azure-functions/event-driven-scaling).
Considering that we are uploading receipts to Azure file storage, A is also incorrect.
In the given scenario D is the one that makes the most sense.
D is the right answer, since the loop picks up all files in the container and scaling out could potentially cause files to be processed more than once.
Change feed is not supported for file shares, so D is the only remaining option (though ugly as hell).
A and C: convert to a blob trigger and use a dedicated plan instead of consumption, to avoid cold starts and get high availability for the function.
D is not enough, since the trigger is scheduled every 5 minutes, so users will still need to wait even after their receipt has already been processed.
The answer is C. A is not possible, as receipts can also be uploaded using Azure Files. The Consumption plan has a cold start (up to 10 minutes), so moving to a dedicated plan will help.
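For comparison, this is roughly what option A would look like if the receipts did land in Blob Storage (the "receipts" container name is an assumption):

```csharp
// Sketch of option A, which only applies if receipts were moved to Blob
// Storage; the container name is a placeholder.
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class BlobTriggered
{
    [FunctionName("OnReceiptUploaded")]
    public static void Run(
        [BlobTrigger("receipts/{name}")] Stream receipt,
        string name,
        ILogger log)
    {
        // Fires per uploaded blob instead of polling on a 5-minute timer.
        log.LogInformation($"Processing receipt {name} ({receipt.Length} bytes)");
    }
}
```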
2 questions about C:
Will a cold start be an issue at all when the function is started by a timer trigger?
Could it be that a dedicated App Service plan has a stronger CPU, allowing the files to be processed faster?
Besides that: if parallel processing is an option, I would go for that, especially with the autoscaling options of a consumption plan (though autoscaling doesn't help with a timer trigger?).