
Exam Professional Cloud Developer topic 1 question 45 discussion

Actual exam question from Google's Professional Cloud Developer
Question #: 45
Topic #: 1

Case study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study -
To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Company Overview -
HipLocal is a community application designed to facilitate communication between people in close proximity. It is used for event planning and organizing sporting events, and for businesses to connect with their local communities. HipLocal launched recently in a few neighborhoods in Dallas and is rapidly growing into a global phenomenon. Its unique style of hyper-local community communication and business outreach is in demand around the world.

Executive Statement -
We are the number one local community app; it's time to take our local community services global. Our venture capital investors want to see rapid growth and the same great experience for new local and virtual communities that come online, whether their members are 10 or 10,000 miles away from each other.

Solution Concept -
HipLocal wants to expand their existing service, with updated functionality, in new regions to better serve their global customers. They want to hire and train a new team to support these regions in their time zones. They will need to ensure that the application scales smoothly and provides clear uptime data.

Existing Technical Environment -
HipLocal's environment is a mix of on-premises hardware and infrastructure running in Google Cloud Platform. The HipLocal team understands their application well, but has limited experience with applications at global scale. Their existing technical environment is as follows:
* Existing APIs run on Compute Engine virtual machine instances hosted in GCP.
* State is stored in a single instance MySQL database in GCP.
* Data is exported to an on-premises Teradata/Vertica data warehouse.
* Data analytics is performed in an on-premises Hadoop environment.
* The application has no logging.
* There are basic indicators of uptime; alerts are frequently fired when the APIs are unresponsive.

Business Requirements -
HipLocal's investors want to expand their footprint and support the increase in demand they are seeing. Their requirements are:
* Expand availability of the application to new regions.
* Increase the number of concurrent users that can be supported.
* Ensure a consistent experience for users when they travel to different regions.
* Obtain user activity metrics to better understand how to monetize their product.
* Ensure compliance with regulations in the new regions (for example, GDPR).
* Reduce infrastructure management time and cost.
* Adopt the Google-recommended practices for cloud computing.

Technical Requirements -
* The application and backend must provide usage metrics and monitoring.
* APIs require strong authentication and authorization.
* Logging must be increased, and data should be stored in a cloud analytics platform.
* Move to serverless architecture to facilitate elastic scaling.
* Provide authorized access to internal apps in a secure manner.

HipLocal has connected their Hadoop infrastructure to GCP using Cloud Interconnect in order to query data stored on persistent disks.

Which IP strategy should they use?

  • A. Create manual subnets.
  • B. Create an auto mode subnet.
  • C. Create multiple peered VPCs.
  • D. Provision a single instance for NAT.
Suggested Answer: A
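As a hedged illustration of what option A involves, here is a minimal sketch of creating a custom mode (manual subnet) VPC with the google-cloud-compute Python client. The project ID, network name, region, and CIDR range below are hypothetical; in practice the range would be chosen so it does not collide with the on-premises address space reached over Cloud Interconnect.

```python
from google.cloud import compute_v1

PROJECT = "hiplocal-project"  # hypothetical project ID

# Custom (manual) mode VPC: no subnets are created automatically.
network = compute_v1.Network(
    name="hiplocal-vpc",
    auto_create_subnetworks=False,
)
compute_v1.NetworksClient().insert(
    project=PROJECT, network_resource=network
).result()  # wait for the operation to finish

# Manually chosen subnet range, picked to avoid overlap with the
# on-premises Hadoop network reached over Cloud Interconnect.
subnet = compute_v1.Subnetwork(
    name="hiplocal-us-central1",
    ip_cidr_range="10.10.0.0/24",  # hypothetical non-overlapping range
    network=f"projects/{PROJECT}/global/networks/hiplocal-vpc",
)
compute_v1.SubnetworksClient().insert(
    project=PROJECT, region="us-central1", subnetwork_resource=subnet
).result()
```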

Comments

anshad666
1 month, 1 week ago
Selected Answer: A
Since Cloud Interconnect is used, HipLocal is likely setting up a hybrid cloud environment. This requires careful planning of IP ranges to avoid conflicts and to ensure smooth communication between on-premises infrastructure and the cloud, which manual subnets provide.
upvoted 1 times
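A quick way to sanity-check the kind of non-overlap this comment describes is Python's standard ipaddress module; both CIDR ranges below are hypothetical stand-ins for the on-premises and GCP ranges.

```python
import ipaddress

# Hypothetical ranges: the on-premises Hadoop network and a candidate
# GCP subnet that must remain reachable over Cloud Interconnect.
on_prem = ipaddress.ip_network("10.0.0.0/16")
candidate = ipaddress.ip_network("10.10.0.0/24")

if on_prem.overlaps(candidate):
    raise ValueError(f"{candidate} collides with on-prem range {on_prem}")
print(f"{candidate} does not overlap {on_prem}; safe to assign in GCP")
```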
thewalker
4 months, 1 week ago
Selected Answer: B
The best answer here is B. Create an auto mode subnet. Here's why:
* Auto mode subnets: Auto mode subnets automatically assign internal IP addresses to instances within the subnet. This simplifies IP address management and eliminates the need for manual configuration.
* Cloud Interconnect: Cloud Interconnect provides a dedicated, high-bandwidth connection between your on-premises network and Google Cloud. This allows for efficient data transfer between your Hadoop infrastructure and the persistent disks in GCP.
* Simplified management: Auto mode subnets make it easier to manage IP addresses, especially when dealing with a hybrid environment like HipLocal's.
upvoted 1 times
thewalker
4 months, 1 week ago
Why the other options are less suitable:
* A. Create manual subnets: Manual subnets require you to manually assign IP addresses to instances, which can be time-consuming and error-prone, especially in a dynamic environment.
* C. Create multiple peered VPCs: Peering VPCs is useful for connecting different VPCs, but it's not necessary for connecting your Hadoop infrastructure to GCP using Cloud Interconnect.
* D. Provision a single instance for NAT: While NAT can be used for outbound connectivity, it's not the most efficient or secure approach for connecting your Hadoop infrastructure to GCP.
In summary: auto mode subnets provide the most efficient and manageable IP address strategy for HipLocal's hybrid environment, simplifying IP address management and ensuring seamless connectivity between their Hadoop infrastructure and GCP.
upvoted 1 times
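For contrast with the custom mode sketch above, an auto mode VPC (the option this thread argues for) needs only the network resource; Google then creates one predefined /20 subnet per region out of 10.128.0.0/9, which is precisely the automatic behavior being debated. The project and network names below are placeholders.

```python
from google.cloud import compute_v1

# Auto mode VPC: one subnet per region is created automatically,
# with predefined ranges carved out of 10.128.0.0/9.
network = compute_v1.Network(
    name="hiplocal-auto-vpc",
    auto_create_subnetworks=True,  # auto mode
)
compute_v1.NetworksClient().insert(
    project="hiplocal-project",  # hypothetical project ID
    network_resource=network,
).result()
```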
d_ella2001
4 months, 2 weeks ago
Selected Answer: A
Manual subnets:
* Control over IP addressing: Creating manual subnets (also known as custom mode VPCs) provides precise control over IP addressing and subnet creation. This ensures that the IP ranges do not overlap and can be managed to meet the specific requirements of the Hadoop infrastructure and Cloud Interconnect setup.
* Subnet management: With manual subnets, HipLocal can create subnets that are optimised for their data traffic patterns and usage requirements, which is crucial for performance and efficient utilisation of network resources.
* Integration: Custom mode VPCs allow for better integration with on-premises networks through Cloud Interconnect, ensuring a seamless and efficient network setup.
Why not B? Auto mode subnets automatically create subnets in each region with pre-defined IP ranges. This lack of control over IP address assignment and network segmentation is not suitable for complex and specific networking requirements like those of HipLocal's Hadoop infrastructure.
upvoted 1 times
santoshchauhan
8 months, 3 weeks ago
Selected Answer: B
B. Create an auto mode subnet. When integrating Hadoop infrastructure with Google Cloud Platform (GCP) via Cloud Interconnect and querying data stored on persistent disks, the IP strategy should simplify network management while ensuring efficient and secure data access. Creating an auto mode subnet in their VPC is a suitable approach for this scenario.
upvoted 1 times
Bessa24
9 months ago
Selected Answer: B
B is correct
upvoted 1 times
__rajan__
1 year, 2 months ago
Selected Answer: B
For simplicity and ease of management, an auto mode subnet (option B) could be a good choice.
upvoted 1 times
tomato123
2 years, 3 months ago
Selected Answer: A
A is correct
upvoted 1 times
bk7
2 years, 3 months ago
Selected Answer: A
A - You need to take control of IP assignment through a manual subnet, especially when establishing connectivity between on-premises and cloud.
upvoted 4 times
akshaychavan7
2 years, 3 months ago
Selected Answer: B
I will go with auto mode subnet creation, as it will automatically create a subnet inside each region. Moreover, one of the business requirements states 'Reduce infrastructure management time and cost.' Thus, with auto mode subnets we avoid infrastructure management.
upvoted 2 times
syu31svc
3 years, 4 months ago
I would take A based on the second figure in https://cloud.google.com/architecture/hadoop/hadoop-gcp-migration-data
upvoted 1 times
saurabh1805
4 years ago
A is the correct answer here.
upvoted 1 times
Community vote distribution: A (35%), C (25%), B (20%), Other