Welcome to ExamTopics
Exam Professional Machine Learning Engineer topic 1 question 189 discussion

Actual exam question from Google's Professional Machine Learning Engineer
Question #: 189
Topic #: 1

You are implementing a batch inference ML pipeline in Google Cloud. The model was developed using TensorFlow and is stored in SavedModel format in Cloud Storage. You need to apply the model to a historical dataset containing 10 TB of data that is stored in a BigQuery table. How should you perform the inference?

  • A. Export the historical data to Cloud Storage in Avro format. Configure a Vertex AI batch prediction job to generate predictions for the exported data
  • B. Import the TensorFlow model by using the CREATE MODEL statement in BigQuery ML. Apply the historical data to the TensorFlow model
  • C. Export the historical data to Cloud Storage in CSV format. Configure a Vertex AI batch prediction job to generate predictions for the exported data
  • D. Configure a Vertex AI batch prediction job to apply the model to the historical data in BigQuery
Suggested Answer: B
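
The suggested answer (B) corresponds to BigQuery ML's imported-TensorFlow-model workflow. A minimal sketch of that workflow, with hypothetical bucket, dataset, and table names:

```sql
-- Import the SavedModel from Cloud Storage into BigQuery ML.
-- (The bucket path and dataset/model names below are hypothetical.)
CREATE OR REPLACE MODEL `mydataset.imported_tf_model`
  OPTIONS (MODEL_TYPE = 'TENSORFLOW',
           MODEL_PATH = 'gs://my_bucket/saved_model/*');

-- Apply the imported model to the historical table with ML.PREDICT.
SELECT *
FROM ML.PREDICT(MODEL `mydataset.imported_tf_model`,
                (SELECT * FROM `mydataset.historical_data`));
```

Because ML.PREDICT runs inside BigQuery, the 10 TB table never has to leave the warehouse; imported TensorFlow models are, however, subject to the limitations listed in the CREATE MODEL documentation linked in the comments below.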

Comments

Foxy2021
1 month, 2 weeks ago
My answer is D.
upvoted 1 times
...
pinimichele01
7 months, 3 weeks ago
Selected Answer: B
https://cloud.google.com/vertex-ai/docs/tabular-data/classification-regression/get-batch-predictions#input_data_requirements
upvoted 1 times
...
edoo
8 months, 3 weeks ago
Selected Answer: B
The choice is between B and D, both good, BUT: importing a model and making batch predictions is quite straightforward in BQ ML (https://cloud.google.com/bigquery/docs/making-predictions-with-imported-tensorflow-models) if no pre-processing is needed on the data. If we needed a more complete pipeline I'd choose D, but the table would need partitioning (100 GB is the per-table limit in Vertex AI): https://cloud.google.com/vertex-ai/docs/tabular-data/classification-regression/get-batch-predictions#input_data_requirements
upvoted 3 times
...
guilhermebutzke
9 months, 1 week ago
Selected Answer: D
My answer: D. The historical dataset is stored in BigQuery, which Vertex AI can access directly. Vertex AI offers batch prediction, letting you apply the model to the data in BigQuery without exporting it. This approach leverages the scalability of Google Cloud infrastructure and avoids unnecessary data movement: there is no need to export the data to Cloud Storage (options A and C) or to import the TensorFlow model into BigQuery (option B).
upvoted 1 times
...
ddogg
9 months, 3 weeks ago
Selected Answer: B
https://cloud.google.com/bigquery/docs/making-predictions-with-imported-tensorflow-models (section "Import TensorFlow models")
upvoted 2 times
...
sonicclasps
9 months, 4 weeks ago
Selected Answer: B
https://cloud.google.com/bigquery/docs/reference/standard-sql/bigqueryml-syntax-create-tensorflow#limitations
upvoted 1 times
...
Zwi3b3l
10 months ago
Selected Answer: B
Has to be B, because D has limitations: BigQuery data source tables must be no larger than 100 GB. https://cloud.google.com/vertex-ai/docs/tabular-data/classification-regression/get-batch-predictions#input_data_requirements
upvoted 2 times
...
BlehMaks
10 months, 1 week ago
Selected Answer: A
Same platform as data == less computation required to load and pass it to model
upvoted 1 times
BlehMaks
10 months, 1 week ago
i mean B
upvoted 1 times
...
...
b1a8fae
10 months, 2 weeks ago
Selected Answer: D
It could be either B or D. Most of B's limitations (https://cloud.google.com/bigquery/docs/reference/standard-sql/bigqueryml-syntax-create-tensorflow#limitations) are addressed in the problem statement, but some are not, and we are left questioning whether the model meets the remaining requirements. Therefore I would go for D, which can read input data directly from BigQuery. https://cloud.google.com/vertex-ai/docs/predictions/get-batch-predictions#bigquery
upvoted 2 times
...
pikachu007
10 months, 2 weeks ago
Selected Answer: D
Limitations of the other options:
A and C (exporting data): exporting 10 TB to Cloud Storage incurs additional storage costs, transfer time, and potential data-management complexity.
B (BigQuery ML): while BigQuery ML supports some TensorFlow models, it might have limitations with certain model architectures or features, and it might not be as optimized for large-scale batch inference as Vertex AI.
upvoted 1 times
...
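For reference, the export step in options A and C maps to BigQuery's EXPORT DATA statement. A sketch with hypothetical bucket and table names (Avro, as in option A, preserves column types better than CSV):

```sql
-- Export the historical table to Cloud Storage as Avro.
-- The wildcard in the URI lets BigQuery shard the 10 TB output
-- across many files. (Bucket and table names are hypothetical.)
EXPORT DATA
  OPTIONS (
    uri = 'gs://my_bucket/exports/historical-*.avro',
    format = 'AVRO')
AS
SELECT * FROM `mydataset.historical_data`;
```

A Vertex AI batch prediction job could then be pointed at the exported files, at the cost of the extra storage and transfer this comment describes.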
Community vote distribution: A (35%), C (25%), B (20%), Other