
Exam CAS-004 topic 1 question 4 discussion

Actual exam question from CompTIA's CAS-004
Question #: 4
Topic #: 1

In preparation for the holiday season, a company redesigned the system that manages retail sales and moved it to a cloud service provider. The new infrastructure did not meet the company's availability requirements. During a postmortem analysis, the following issues were highlighted:
1. International users reported latency when images on the web page were initially loading.
2. During times of report processing, users reported issues with inventory when attempting to place orders.
3. Despite the fact that ten new API servers were added, the load across servers was heavy at peak times.
Which of the following infrastructure design changes would be BEST for the organization to implement to avoid these issues in the future?

  • A. Serve static content via distributed CDNs, create a read replica of the central database and pull reports from there, and auto-scale API servers based on performance.
  • B. Increase the bandwidth for the server that delivers images, use a CDN, change the database to a non-relational database, and split the ten API servers across two load balancers.
  • C. Serve images from an object storage bucket with infrequent read times, replicate the database across different regions, and dynamically create API servers based on load.
  • D. Serve static-content object storage across different regions, increase the instance size on the managed relational database, and distribute the ten API servers across multiple regions.
Suggested Answer: A

Comments

BiteSize
Highly Voted 1 year, 9 months ago
Selected Answer: A
A distributed content delivery network's purpose is to reduce latency for users of the application and to reduce the load on the central server by distributing resources across different geographic areas, so users are served quickly regardless of their location. Source: verifying each answer against ChatGPT, my own experience, other test banks, a written book, and the discussion from all users to create a 100% accurate guide for myself before I take the exam. (It isn't easy because of the time needed, but it is part of doing my due diligence.)
upvoted 8 times
...
blacksheep6r
Most Recent 2 months, 3 weeks ago
Selected Answer: A
Distributed CDNs for Static Content: Serving static content (like images) via a distributed Content Delivery Network means that users—especially international ones—get the images from servers that are geographically closer to them. This greatly reduces latency and speeds up the initial loading of images.
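As a rough illustration of that point (not part of the question itself, and using a placeholder CDN hostname), the application only needs to rewrite static asset URLs so browsers fetch images from the nearest CDN edge instead of the origin server:

```python
# Minimal sketch: point static asset paths at a CDN hostname so the edge
# location closest to the user serves the image, not the origin server.
# "cdn.example.com" is a placeholder; a real deployment would use the
# hostname issued by whichever CDN provider is chosen.

CDN_BASE = "https://cdn.example.com"

def asset_url(path: str) -> str:
    """Return the CDN-backed URL for a static asset path like '/img/banner.png'."""
    return f"{CDN_BASE}/{path.lstrip('/')}"

print(asset_url("/img/banner.png"))  # https://cdn.example.com/img/banner.png
```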
upvoted 1 times
...
fb2fcb1
7 months ago
Selected Answer: A
A. Serve static content via distributed CDNs, create a read replica of the central database and pull reports from there, and auto-scale API servers based on performance. This option provides a solution for each of the identified issues:
  • Serving static content via distributed content delivery networks (CDNs) can reduce latency for international users because it serves content from the geographic location closest to the user.
  • Creating a read replica of the central database and pulling reports from there can reduce the load on the central database, minimizing disruptions to inventory lookups during report processing.
  • Auto-scaling API servers based on performance can ensure that as load increases, new servers are spun up to handle it, distributing the load more evenly and preventing any single server from becoming overloaded.
The other options offer some potential improvements but don't address all of the identified issues as effectively.
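To make the read-replica point concrete, here is a minimal sketch of routing reporting queries to a replica while order writes stay on the primary. The connection strings and the workload labels are hypothetical, not part of the question:

```python
# Minimal sketch (hypothetical DSNs): route read-only reporting queries to a
# read replica so heavy reports do not contend with inventory writes on the
# primary. The DSN strings below are placeholders, not real endpoints.

PRIMARY_DSN = "postgresql://primary.internal:5432/retail"    # writes: orders, inventory updates
REPLICA_DSN = "postgresql://replica-1.internal:5432/retail"  # reads: report processing

def pick_dsn(workload: str) -> str:
    """Return the connection string to use for a given workload type."""
    # Reports are read-only, so they can safely run against a replica that may
    # lag the primary slightly; order placement must hit the primary.
    return REPLICA_DSN if workload == "report" else PRIMARY_DSN

print(pick_dsn("report"))  # replica: keeps reporting load off the primary
print(pick_dsn("order"))   # primary: writes go to the source of truth
```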
upvoted 3 times
...
Delab202
1 year, 3 months ago
Selected Answer: C
The infrastructure design change that would be BEST for the organization to implement to address the highlighted issues is: C. Serve images from an object storage bucket with infrequent read times, replicate the database across different regions, and dynamically create API servers based on load. Explanation: it addresses the latency when loading images, the issues with inventory during report processing, and the heavy load on API servers at peak times.
upvoted 1 times
Trap_D0_r
1 year, 3 months ago
Serving all the images from a single object storage bucket won't help with latency at all. They need to be distributed via edge networks or regionally distributed buckets, i.e., a CDN. "A" is the correct answer here.
upvoted 1 times
...
...
CASP_Master
1 year, 11 months ago
Option A is the BEST choice.
upvoted 2 times
...
user009
2 years ago
Now, let's examine why the other options are not ideal: B. This option partially addresses some issues but falls short in other areas. Increasing the bandwidth and using a CDN would improve image delivery, but changing the database to a non-relational database might not solve the inventory issues during report processing. Additionally, simply splitting the API servers across two load balancers does not provide an auto-scaling solution, which is crucial for handling peak times.
upvoted 3 times
...
user009
2 years ago
The correct answer is A. Serve static content via distributed CDNs, create a read replica of the central database and pull reports from there, and auto-scale API servers based on performance. This option addresses all three issues as follows:
  • Serving static content via distributed CDNs reduces latency for international users because content is served from locations closer to them.
  • Creating a read replica of the central database and pulling reports from there offloads the reporting workload from the main database, reducing the impact on inventory operations during order placement.
  • Auto-scaling API servers based on performance ensures that resources adjust dynamically to handle varying workloads, preventing heavy load on servers during peak times.
upvoted 2 times
...
PeteUtah
2 years, 2 months ago
A is the correct answer. B is not correct; changing the architecture of the database would help in some situations, but you would still have a single database in play. Read replicas mean that you can have several databases handling read requests, with only one master handling writes.
upvoted 2 times
...
lordguck
2 years, 5 months ago
B is good, but A is better because it scales automatically and adds database read capacity through the replica.
upvoted 1 times
...
BHWAZN
2 years, 6 months ago
Selected Answer: A
Moving to the cloud opens up the option to auto-scale servers depending on resource usage. Therefore A is best, since it can account for sudden spikes in load.
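A back-of-the-envelope sketch of that idea follows; the thresholds, server counts, and metric are made up for illustration, and a real deployment would use the cloud provider's managed auto-scaling service rather than hand-rolled logic:

```python
# Minimal sketch of an auto-scaling decision based on a performance metric.
# Thresholds and min/max counts are illustrative assumptions only.

MIN_SERVERS = 2
MAX_SERVERS = 20
SCALE_OUT_CPU = 70.0   # add capacity above 70% average CPU
SCALE_IN_CPU = 30.0    # remove capacity below 30% average CPU

def desired_count(current: int, avg_cpu_percent: float) -> int:
    """Return the target number of API servers for the observed average CPU."""
    if avg_cpu_percent > SCALE_OUT_CPU:
        return min(current + 2, MAX_SERVERS)   # scale out ahead of peak load
    if avg_cpu_percent < SCALE_IN_CPU:
        return max(current - 1, MIN_SERVERS)   # scale in to save cost off-peak
    return current

print(desired_count(10, 85.0))  # 12: spikes get absorbed instead of overloading a fixed fleet of ten
print(desired_count(12, 20.0))  # 11: capacity shrinks again once the peak passes
```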
upvoted 2 times
...
ryanzou
2 years, 6 months ago
Selected Answer: A
A makes more sense.
upvoted 2 times
...
ccryptix
2 years, 6 months ago
Selected Answer: A
I believe B isn't right because the question states that despite ten servers being added, the load was still heavy at peak times. So you would want auto-scaling API servers to keep up with performance, something that will add more servers as needed.
upvoted 3 times
...
adamwella
2 years, 7 months ago
Why wouldn't B be the best answer?
upvoted 2 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other