Google Professional Machine Learning Engineer Exam Practice Questions (P. 5)
Question #41
Your team is building an application for a global bank that will be used by millions of customers. You built a forecasting model that predicts customers' account balances 3 days in the future. Your team will use the results in a new feature that will notify users when their account balance is likely to drop below $25. How should you serve your predictions?
- A. 1. Create a Pub/Sub topic for each user. 2. Deploy a Cloud Function that sends a notification when your model predicts that a user's account balance will drop below the $25 threshold.
- B. 1. Create a Pub/Sub topic for each user. 2. Deploy an application on the App Engine standard environment that sends a notification when your model predicts that a user's account balance will drop below the $25 threshold.
- C. 1. Build a notification system on Firebase. 2. Register each user with a user ID on the Firebase Cloud Messaging server, which sends a notification when the average of all account balance predictions drops below the $25 threshold.
- D. 1. Build a notification system on Firebase. 2. Register each user with a user ID on the Firebase Cloud Messaging server, which sends a notification when your model predicts that a user's account balance will drop below the $25 threshold. (Most Voted)
Correct Answer:
A
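Whichever serving path is chosen, the per-user decision itself is a simple threshold check on the forecast. A minimal sketch of that logic, with a hypothetical FCM-style payload (function and field names are invented for illustration):

```python
from typing import Optional

THRESHOLD_USD = 25.0  # threshold stated in the question


def should_notify(predicted_balance: float, threshold: float = THRESHOLD_USD) -> bool:
    """True when the 3-day balance forecast falls below the threshold."""
    return predicted_balance < threshold


def build_notification(user_id: str, predicted_balance: float) -> Optional[dict]:
    """Build a hypothetical FCM-style payload for a user who needs an alert."""
    if not should_notify(predicted_balance):
        return None
    return {
        "user_id": user_id,
        "title": "Low balance warning",
        "body": f"Your balance may drop to ${predicted_balance:.2f} within 3 days.",
    }
```

In the voted answer (D), a handler like this would run per prediction and hand the payload to Firebase Cloud Messaging, keyed by the registered user ID.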
Question #42
You work for an advertising company and want to understand the effectiveness of your company's latest advertising campaign. You have streamed 500 MB of campaign data into BigQuery. You want to query the table, and then manipulate the results of that query with a pandas dataframe in an AI Platform notebook.
What should you do?
- A. Use AI Platform Notebooks' BigQuery cell magic to query the data, and ingest the results as a pandas dataframe. (Most Voted)
- B. Export your table as a CSV file from BigQuery to Google Drive, and use the Google Drive API to ingest the file into your notebook instance.
- C. Download your table from BigQuery as a local CSV file, and upload it to your AI Platform notebook instance. Use pandas.read_csv to ingest the file as a pandas dataframe.
- D. From a bash cell in your AI Platform notebook, use the bq extract command to export the table as a CSV file to Cloud Storage, and then use gsutil cp to copy the data into the notebook. Use pandas.read_csv to ingest the file as a pandas dataframe.
Correct Answer:
C
Reference:
https://cloud.google.com/bigquery/docs/bigquery-storage-python-pandas
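The cell magic in option A refers to the `%%bigquery` magic shipped with the google-cloud-bigquery library, which runs a query and binds the result to a pandas DataFrame in one step. A sketch of a notebook cell, with a hypothetical project/dataset/table and columns; this fragment only runs inside an IPython notebook with the extension loaded:

```python
# First, in its own cell, load the magics extension (pre-installed on
# AI Platform / Vertex AI Workbench notebooks):
# %load_ext google.cloud.bigquery

# Then a query cell; `df` names the variable that receives the result:
# %%bigquery df
# SELECT campaign_id, SUM(clicks) AS total_clicks  -- hypothetical columns
# FROM `my-project.ads.campaign_events`            -- hypothetical table
# GROUP BY campaign_id

# After the cell executes, `df` is an ordinary pandas DataFrame:
# df.head()
```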
Question #43
You are an ML engineer at a global car manufacturer. You need to build an ML model to predict car sales in different cities around the world. Which features or feature crosses should you use to train city-specific relationships between car type and number of sales?
- A. Three individual features: binned latitude, binned longitude, and one-hot encoded car type.
- B. One feature obtained as an element-wise product between latitude, longitude, and car type.
- C. One feature obtained as an element-wise product between binned latitude, binned longitude, and one-hot encoded car type. (Most Voted)
- D. Two feature crosses as an element-wise product: the first between binned latitude and one-hot encoded car type, and the second between binned longitude and one-hot encoded car type.
Correct Answer:
C
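The single cross in option C is equivalent to one one-hot vector over every (lat bin, lon bin, car type) cell, so the model can learn a separate weight per city-and-type combination. A minimal pure-Python sketch of that indexing (bin counts and car types are made up):

```python
N_LAT_BINS, N_LON_BINS = 4, 4
CAR_TYPES = ["sedan", "suv", "truck"]  # hypothetical categories


def cross_index(lat_bin: int, lon_bin: int, car_type: str) -> int:
    """Index of the active cell in the lat-bin x lon-bin x car-type cross."""
    t = CAR_TYPES.index(car_type)
    return (lat_bin * N_LON_BINS + lon_bin) * len(CAR_TYPES) + t


def cross_one_hot(lat_bin: int, lon_bin: int, car_type: str) -> list:
    """One-hot vector over all lat-bin x lon-bin x car-type combinations."""
    size = N_LAT_BINS * N_LON_BINS * len(CAR_TYPES)
    vec = [0] * size
    vec[cross_index(lat_bin, lon_bin, car_type)] = 1
    return vec
```

This is exactly the element-wise product of the three one-hot encodings: the product is 1 only in the cell where all three are 1.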
Question #44
You work for a large technology company that wants to modernize their contact center. You have been asked to develop a solution to classify incoming calls by product so that requests can be more quickly routed to the correct support team. You have already transcribed the calls using the Speech-to-Text API. You want to minimize data preprocessing and development time. How should you build the model?
- A. Use the AI Platform Training built-in algorithms to create a custom model.
- B. Use AutoML Natural Language to extract custom entities for classification. (Most Voted)
- C. Use the Cloud Natural Language API to extract custom entities for classification.
- D. Build a custom model to identify the product keywords from the transcribed calls, and then run the keywords through a classification algorithm.
Correct Answer:
A
Question #45
You are training a TensorFlow model on a structured dataset with 100 billion records stored in several CSV files. You need to improve the input/output execution performance. What should you do?
- A. Load the data into BigQuery, and read the data from BigQuery.
- B. Load the data into Cloud Bigtable, and read the data from Bigtable.
- C. Convert the CSV files into shards of TFRecords, and store the data in Cloud Storage. (Most Voted)
- D. Convert the CSV files into shards of TFRecords, and store the data in the Hadoop Distributed File System (HDFS).
Correct Answer:
B
Reference:
https://cloud.google.com/dataflow/docs/guides/templates/provided-batch
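The community-voted option (C) shards the records so that tf.data can read many files in parallel. TensorFlow tooling names shards `prefix-00000-of-00010`; the sharding pattern can be sketched in pure Python (a real pipeline would serialize each row as a tf.train.Example with tf.io.TFRecordWriter rather than writing CSV text):

```python
import csv
import os
import tempfile


def shard_rows(rows, num_shards: int, prefix: str, out_dir: str) -> list:
    """Round-robin rows into num_shards files named prefix-0000i-of-0000N.

    Stand-in for TFRecord writing, to illustrate the shard layout only.
    """
    paths = [
        os.path.join(out_dir, f"{prefix}-{i:05d}-of-{num_shards:05d}")
        for i in range(num_shards)
    ]
    writers = [open(p, "w", newline="") for p in paths]
    try:
        for i, row in enumerate(rows):
            csv.writer(writers[i % num_shards]).writerow(row)
    finally:
        for w in writers:
            w.close()
    return paths
```

With shards in Cloud Storage, `tf.data.Dataset.list_files(...).interleave(...)` can then read them concurrently, which is the I/O win the question is after.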
Question #46
As the lead ML Engineer for your company, you are responsible for building ML models to digitize scanned customer forms. You have developed a TensorFlow model that converts the scanned images into text and stores them in Cloud Storage. You need to use your ML model on the aggregated data collected at the end of each day with minimal manual intervention. What should you do?
- A. Use the batch prediction functionality of AI Platform. (Most Voted)
- B. Create a serving pipeline in Compute Engine for prediction.
- C. Use Cloud Functions for prediction each time a new data point is ingested.
- D. Deploy the model on AI Platform and create a version of it for online inference.
Correct Answer:
D
Question #47
You recently joined an enterprise-scale company that has thousands of datasets. You know that there are accurate descriptions for each table in BigQuery, and you are searching for the proper BigQuery table to use for a model you are building on AI Platform. How should you find the data that you need?
- A. Use Data Catalog to search the BigQuery datasets by using keywords in the table description. (Most Voted)
- B. Tag each of your model and version resources on AI Platform with the name of the BigQuery table that was used for training.
- C. Maintain a lookup table in BigQuery that maps the table descriptions to the table ID. Query the lookup table to find the correct table ID for the data that you need.
- D. Execute a query in BigQuery to retrieve all the existing table names in your project using the INFORMATION_SCHEMA metadata tables that are native to BigQuery. Use the result to find the table that you need.
Correct Answer:
B
Question #48
You started working on a classification problem with time series data and achieved an area under the receiver operating characteristic curve (AUC ROC) value of 99% for training data after just a few experiments. You haven't explored using any sophisticated algorithms or spent any time on hyperparameter tuning. What should your next step be to identify and fix the problem?
- A. Address the model overfitting by using a less complex algorithm.
- B. Address data leakage by applying nested cross-validation during model training. (Most Voted)
- C. Address data leakage by removing features highly correlated with the target value.
- D. Address the model overfitting by tuning the hyperparameters to reduce the AUC ROC value.
Correct Answer:
B
Question #49
You work for an online travel agency that also sells advertising placements on its website to other companies. You have been asked to predict the most relevant web banner that a user should see next. Security is important to your company. The model latency requirements are 300ms@p99, the inventory is thousands of web banners, and your exploratory analysis has shown that navigation context is a good predictor. You want to implement the simplest solution. How should you configure the prediction pipeline?
- A. Embed the client on the website, and then deploy the model on AI Platform Prediction.
- B. Embed the client on the website, deploy the gateway on App Engine, and then deploy the model on AI Platform Prediction. (Most Voted)
- C. Embed the client on the website, deploy the gateway on App Engine, deploy the database on Cloud Bigtable for writing and for reading the user's navigation context, and then deploy the model on AI Platform Prediction.
- D. Embed the client on the website, deploy the gateway on App Engine, deploy the database on Memorystore for writing and for reading the user's navigation context, and then deploy the model on Google Kubernetes Engine.
Correct Answer:
B
Question #50
Your team is building a convolutional neural network (CNN)-based architecture from scratch. The preliminary experiments running on your on-premises CPU-only infrastructure were encouraging, but have slow convergence. You have been asked to speed up model training to reduce time-to-market. You want to experiment with virtual machines (VMs) on Google Cloud to leverage more powerful hardware. Your code does not include any manual device placement and has not been wrapped in Estimator model-level abstraction. Which environment should you train your model on?
- A. A VM on Compute Engine and 1 TPU with all dependencies installed manually.
- B. A VM on Compute Engine and 8 GPUs with all dependencies installed manually.
- C. A Deep Learning VM with an n1-standard-2 machine and 1 GPU with all libraries pre-installed. (Most Voted)
- D. A Deep Learning VM with more powerful CPU e2-highcpu-16 machines with all libraries pre-installed.
Correct Answer:
A