You are helping the QA team to roll out a new load-testing tool to test the scalability of your primary cloud services that run on Google Compute Engine with Cloud Bigtable. Which three requirements should they include? (Choose three.)
A.
Ensure that the load tests validate the performance of Cloud Bigtable
B.
Create a separate Google Cloud project to use for the load-testing environment
C.
Schedule the load-testing tool to regularly run against the production environment
D.
Ensure all third-party systems your services use are capable of handling high load
E.
Instrument the production services to record every transaction for replay by the load-testing tool
F.
Instrument the load-testing tool and the target services with detailed logging and metrics collection
after reading link: https://cloud.google.com/bigtable/docs/performance
A: From "Run your typical workloads against Bigtable": "Always run your own typical workloads against a Bigtable cluster when doing capacity planning, so you can figure out the best resource allocation for your applications."
B. Create a separate Google Cloud project to use for the load-testing environment
F: The most important and standard element of testing: you gather logs and metrics in the TEST environment to inform further scaling.
I agree. It is important to verify that the current Bigtable cluster can deal with incoming traffic:
A cluster must have enough nodes to support its current workload and the amount of data it stores. Otherwise, the cluster might not be able to handle incoming requests, and latency could go up.
So although it is a managed service, it does not auto-scale.
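To make that concrete, here is a rough capacity-planning sketch. The per-node figures below are illustrative assumptions, not official limits; check the Bigtable performance docs for current numbers and, as the docs quoted above say, always validate with your own typical workloads:

```python
import math

# Assumed per-node capacity figures -- ILLUSTRATIVE ONLY, not official
# Bigtable limits; see https://cloud.google.com/bigtable/docs/performance.
READS_PER_NODE_QPS = 10_000    # assumed sustained read throughput per node
STORAGE_PER_NODE_TB = 2.5      # assumed usable SSD storage per node

def required_nodes(peak_read_qps: float, stored_tb: float) -> int:
    """Node count needed to cover both throughput and storage (min 1)."""
    by_qps = math.ceil(peak_read_qps / READS_PER_NODE_QPS)
    by_storage = math.ceil(stored_tb / STORAGE_PER_NODE_TB)
    return max(by_qps, by_storage, 1)

# Throughput-bound example: 45k QPS needs 5 nodes even though
# 4 TB of data would fit on 2 nodes.
print(required_nodes(peak_read_qps=45_000, stored_tb=4))  # -> 5
```

The point of the load test is then to confirm (or refute) whatever per-node figures you plugged in.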
A, B & F
Creating a separate project is highly recommended. It gives you total isolation from your production environment and ensures the load tests will not share resources, such as service quotas, with production.
A: No. Not needed since it's a managed GCP product. It'll scale to satisfy demand.
B: Yes. You could leave it in the same project as the app, but it'll eventually be deployed to production and be a risk if anyone accidentally runs it against prod.
C: No. You mustn't run load testing against prod.
D: Yes. The capability of the third party systems should be tested. They are another link in the chain and if they are not up to the task, they may be replaced.
E: No. There is no need to use real data in the requests, this is a load test, not a behavior one.
F: Yes. Having detailed logs and metrics helps diagnosing problems during the tests.
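On F, a minimal sketch of what instrumenting the load-testing tool with metrics collection can look like in practice. Everything here is illustrative: fake_service_request is a stub standing in for a real RPC to the service under test.

```python
import random
import statistics
import time

def timed_call(fn, *args):
    """Run fn, returning (result, elapsed_seconds)."""
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start

def fake_service_request():
    # Stand-in for a real request; sleep a few ms to simulate latency.
    time.sleep(random.uniform(0.001, 0.005))
    return "ok"

latencies = []
for _ in range(50):
    _, elapsed = timed_call(fake_service_request)
    latencies.append(elapsed * 1000)  # record latency in milliseconds

latencies.sort()
p50 = statistics.median(latencies)
p99 = latencies[min(len(latencies) - 1, int(len(latencies) * 0.99))]
print(f"p50={p50:.2f} ms  p99={p99:.2f} ms  max={latencies[-1]:.2f} ms")
```

A real tool would export these as time series (e.g. to Cloud Monitoring) rather than printing them, but the principle is the same: no recorded latencies, no way to evaluate the test.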
Don't think it would be this option: "Ensure all third-party systems your services use are capable of handling high load." This is extra information, and we are not sure the application in this question even uses any third-party tools.
Here is my take, I respectfully disagree with ya all :)
A. Ensure that the load tests validate the performance of Cloud Bigtable
=> not a requirement.
B. Create a separate Google Cloud project to use for the load-testing environment
=> yes, you don't want to use production quota.
C. Schedule the load-testing tool to regularly run against the production environment
=> sure, go ahead and kill prod! (In other words: no.)
D. Ensure all third-party systems your services use are capable of handling high load
=> well, that is something worth testing, but it is more a task for the development team than for the QA team.
E. Instrument the production services to record every transaction for replay by the load-testing tool
=> yes, this way you can build your test dataset with realistic behavior.
F. Instrument the load-testing tool and the target services with detailed logging and metrics collection
=> yes, otherwise you test for nothing: you have no data at the end to evaluate the system's performance.
E. Instrument the production services to record every transaction for replay by the load-testing tool
=> yes, this way you can build your test dataset with realistic behavior.
lol, you don't test on prod but you still need prod's records... how, bro?
Answer: A, D, F
A: It is important to ensure that the load-testing tool is able to accurately test the performance of Cloud Bigtable in order to ensure that it can handle the expected load.
D: It is important to ensure that all third-party systems that your primary cloud services rely on are able to handle the expected load in order to avoid any potential bottlenecks or failures.
F: Instrumenting the load-testing tool and the target services with detailed logging and metrics collection can provide valuable insights into the performance and behavior of the system under test, allowing the QA team to identify any potential issues or bottlenecks.
Why not B, C and E:
B: creating a separate Google Cloud project to use for the load-testing environment could also be a good idea, but it is not strictly necessary in order to ensure that the load tests do not impact the performance of the production environment.
C: scheduling the load-testing tool to regularly run against the production environment, is not recommended, as this could potentially impact the performance of the production environment and could lead to unexpected behavior or issues.
E: instrumenting the production services to record every transaction for replay by the load-testing tool, could also be a useful requirement, as it would allow the QA team to accurately replay real-world workloads during the load tests in order to more accurately simulate the expected production environment.
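For E, a toy sketch of the record-and-replay idea. The recorded-transaction format and the replay() helper are hypothetical, just to show the shape of such a tool: production instrumentation writes transactions to a log, and the load-testing tool later feeds them back to the test environment.

```python
import json

# Hypothetical recorded transactions, as production instrumentation
# might capture them (field names are illustrative, not a real format).
RECORDED = json.dumps([
    {"method": "GET", "path": "/users/1"},
    {"method": "POST", "path": "/orders", "body": {"sku": "A-1"}},
    {"method": "GET", "path": "/users/1"},
])

def replay(recorded_json: str, send) -> int:
    """Replay each recorded transaction via send(); return count replayed."""
    count = 0
    for txn in json.loads(recorded_json):
        send(txn)  # a real tool would issue the HTTP/gRPC call here
        count += 1
    return count

sent = []
print(replay(RECORDED, sent.append))  # -> 3
```

Replaying captured traffic gives the load test realistic access patterns (hot keys, read/write mix) that synthetic request generators tend to miss.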
There is no necessary reason for running it in a separate project.
A: we have to test Bigtable.
F: important to record all the outputs and be able to review them.
D: important to stress-test third-party solutions, or replace them.
Hi, the question underlines that you want to test the scalability of your primary CLOUD SERVICES, so I think that D is not required. For me it's A, B, F.
BCF
1) B - you need a separate project for the load-testing tool. That would at least separate role-based access (dev and test). Also, the testers will have their own code/project/config for testing, so there is no chance of collision.
2) C - testing on production? Yes, because per Google's recommendation, Bigtable should be performance-tested on production instances (not on development instances), for at least 10 minutes and with at least 300 GB of data. Check "Testing Performance with Cloud Bigtable" in the linked docs.
I understand this as a requirement for integration tests of projects using Bigtable. Testing Bigtable on development instances won't give proper results.
3) F - collecting of metrics would be useful anyway....
Why not A, D, E?
1) A - isolated testing of Bigtable likely doesn't make sense if an integration test needs to run anyway; that's covered by C.
2) D - testing third-party tools in isolation is likely a one-time effort (only useful when upgrading those tools). No point running it regularly.
3) E - collecting transactions on the production environment just to replay them in the load-testing tool? What's the point? We need to test max load anyway...