Google Professional Cloud Database Engineer Exam Practice Questions (P. 4)
Question #16
You are managing multiple applications connecting to a database on Cloud SQL for PostgreSQL. You need to be able to monitor database performance to easily identify applications with long-running and resource-intensive queries. What should you do?
- A. Use log messages produced by Cloud SQL.
- B. Use Query Insights for Cloud SQL. (Most Voted)
- C. Use the Cloud Monitoring dashboard with available metrics from Cloud SQL.
- D. Use Cloud SQL instance monitoring in the Google Cloud Console.
Correct Answer: C
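For context on option B, Query Insights can be switched on per instance. Below is a rough sketch, assuming the Cloud SQL Admin API v1 via google-api-python-client and placeholder project/instance names; it illustrates the feature rather than an official solution.

```python
# A rough sketch (placeholder project/instance names) of enabling Query Insights
# on an existing Cloud SQL for PostgreSQL instance via the Cloud SQL Admin API v1.
from googleapiclient import discovery
import google.auth

credentials, project_id = google.auth.default()
sqladmin = discovery.build("sqladmin", "v1", credentials=credentials)

body = {
    "settings": {
        "insightsConfig": {
            "queryInsightsEnabled": True,   # turn on Query Insights
            "recordApplicationTags": True,  # attribute queries to applications
            "recordClientAddress": True,    # attribute load to client hosts
        }
    }
}

operation = (
    sqladmin.instances()
    .patch(project=project_id, instance="my-postgres-instance", body=body)  # placeholder instance
    .execute()
)
print(operation["name"])  # name of the long-running patch operation
```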
Question #17
You are building an application that allows users to customize their website and mobile experiences. The application will capture user information and preferences. User profiles have a dynamic schema, and users can add or delete information from their profile. You need to ensure that user changes automatically trigger updates to your downstream BigQuery data warehouse. What should you do?
- A. Store your data in Bigtable, and use the user identifier as the key. Use one column family to store user profile data, and use another column family to store user preferences.
- B. Use Cloud SQL, and create different tables for user profile data and user preferences from your recommendations model. Use SQL to join the user profile data and preferences.
- C. Use Firestore in Native mode, and store user profile data as a document. Update the user profile with preferences specific to that user and use the user identifier to query. (Most Voted)
- D. Use Firestore in Datastore mode, and store user profile data as a document. Update the user profile with preferences specific to that user and use the user identifier to query.
Correct Answer: A
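For context on the Firestore options (C and D), a user profile maps naturally to a single document whose fields can differ from user to user. The sketch below is a minimal example assuming the google-cloud-firestore client library and a hypothetical user_profiles collection; propagation to BigQuery (for example via a Firestore-triggered Cloud Function) is only noted in the comments.

```python
# A minimal sketch, assuming the google-cloud-firestore client library and a
# hypothetical "user_profiles" collection. Each user document can carry its own
# set of fields, which matches the dynamic-schema requirement.
from google.cloud import firestore

db = firestore.Client()

user_id = "user-123"  # placeholder user identifier, used as the document ID
profile_ref = db.collection("user_profiles").document(user_id)

# Create or replace the profile; fields can differ from user to user.
profile_ref.set({
    "display_name": "Alex",
    "preferences": {"theme": "dark", "newsletter": True},
})

# Later, add or delete individual fields without any schema change.
profile_ref.update({
    "preferences.language": "en",
    "obsolete_field": firestore.DELETE_FIELD,
})

# Downstream, a Firestore-triggered Cloud Function (or the Firestore-to-BigQuery
# extension) can forward each change event into the BigQuery warehouse.
```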
Question #18
Your application uses Cloud SQL for MySQL. Your users run reports that rely on near-real-time data; however, the additional analytics workload caused excessive load on the primary database. You created a read replica for the analytics workloads, but now your users are complaining about the lag in data changes and that their reports are still slow. You need to improve the report performance and shorten the lag in data replication without making changes to the current reports. Which two approaches should you implement? (Choose two.)
- A. Create secondary indexes on the replica.
- B. Create additional read replicas, and partition your analytics users to use different read replicas. (Most Voted)
- C. Disable replication on the read replica, and set the flag for parallel replication on the read replica. Re-enable replication and optimize performance by setting flags on the primary instance. (Most Voted)
- D. Disable replication on the primary instance, and set the flag for parallel replication on the primary instance. Re-enable replication and optimize performance by setting flags on the read replica.
- E. Move your analytics workloads to BigQuery, and set up a streaming pipeline to move data and update BigQuery.
Correct Answer: B, E
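For context on option B, analytics sessions can be spread across several read replicas. The toy sketch below uses placeholder hostnames, with a simple round-robin picker standing in for whatever connection-pooling layer the reports actually use; the parallel-replication flags from option C are instance settings and are not shown here.

```python
# A toy sketch of option B's idea: fan analytics sessions out across several read
# replicas. The hostnames are placeholders, and the round-robin picker stands in
# for a real connection-pooling or load-balancing layer.
import itertools

REPLICA_HOSTS = [
    "10.10.0.11",  # analytics-replica-1 (placeholder)
    "10.10.0.12",  # analytics-replica-2 (placeholder)
    "10.10.0.13",  # analytics-replica-3 (placeholder)
]

_replicas = itertools.cycle(REPLICA_HOSTS)


def next_replica_host() -> str:
    """Return the replica host to use for the next analytics connection."""
    return next(_replicas)


if __name__ == "__main__":
    for _ in range(5):
        print(next_replica_host())
```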
Question #19
You are evaluating Cloud SQL for PostgreSQL as a possible destination for your on-premises PostgreSQL instances. Geography is becoming increasingly relevant to customer privacy worldwide. Your solution must support data residency requirements and include a strategy to:
- configure where data is stored
- control where the encryption keys are stored
- govern the access to data
What should you do?
- A. Replicate Cloud SQL databases across different zones.
- B. Create a Cloud SQL for PostgreSQL instance on Google Cloud for the data that does not need to adhere to data residency requirements. Keep the data that must adhere to data residency requirements on-premises. Make application changes to support both databases.
- C. Allow application access to data only if the users are in the same region as the Google Cloud region for the Cloud SQL for PostgreSQL database.
- D. Use features like customer-managed encryption keys (CMEK), VPC Service Controls, and Identity and Access Management (IAM) policies. (Most Voted)
Correct Answer: C
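For context on option D, the CMEK piece can be set at instance creation time together with the region. The sketch below is a rough example assuming the Cloud SQL Admin API v1 and placeholder project, instance, and key names; VPC Service Controls perimeters and IAM policies are configured separately to govern access and are not shown.

```python
# A rough sketch (placeholder names throughout) of the CMEK piece of option D:
# create a Cloud SQL for PostgreSQL instance in a chosen region, encrypted with a
# customer-managed key from Cloud KMS.
from googleapiclient import discovery
import google.auth

credentials, project_id = google.auth.default()
sqladmin = discovery.build("sqladmin", "v1", credentials=credentials)

region = "europe-west3"  # region chosen to satisfy the residency requirement
body = {
    "name": "pg-residency-demo",          # placeholder instance name
    "region": region,
    "databaseVersion": "POSTGRES_15",
    "settings": {"tier": "db-custom-2-7680"},
    "diskEncryptionConfiguration": {
        # Placeholder key ring and key; the key must live in the same region.
        "kmsKeyName": (
            f"projects/{project_id}/locations/{region}/"
            "keyRings/sql-keys/cryptoKeys/pg-cmek"
        ),
    },
}

operation = sqladmin.instances().insert(project=project_id, body=body).execute()
print(operation["name"])  # name of the long-running create operation
```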
Question #20
Your customer is running a MySQL database on-premises with read replicas. The nightly incremental backups are expensive and add maintenance overhead. You want to follow Google-recommended practices to migrate the database to Google Cloud, and you need to ensure minimal downtime. What should you do?
- A. Create a Google Kubernetes Engine (GKE) cluster, install MySQL on the cluster, and then import the dump file.
- B. Use the mysqldump utility to take a backup of the existing on-premises database, and then import it into Cloud SQL.
- C. Create a Compute Engine VM, install MySQL on the VM, and then import the dump file.
- D. Create an external replica, and use Cloud SQL to synchronize the data to the replica. (Most Voted)
Correct Answer: B
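For context on option D, continuous replication into Cloud SQL can be driven by Database Migration Service or, at a lower level, by the Cloud SQL Admin API's external-server replication. The outline below uses placeholder names, hosts, and credentials, and omits dump seeding, SSL setup, and the final cutover; treat it as a sketch of the flow rather than a complete migration script.

```python
# A rough outline (placeholder names, hosts, and credentials) of option D: a
# source representation instance points at the on-premises MySQL primary, and a
# Cloud SQL replica streams changes from it until cutover.
from googleapiclient import discovery
import google.auth

credentials, project_id = google.auth.default()
sqladmin = discovery.build("sqladmin", "v1", credentials=credentials)

# 1) Source representation instance for the on-premises MySQL primary.
source_repr = {
    "name": "onprem-mysql-source",
    "region": "us-central1",
    "databaseVersion": "MYSQL_8_0",
    "onPremisesConfiguration": {
        "hostPort": "203.0.113.10:3306",  # placeholder on-premises endpoint
    },
}
sqladmin.instances().insert(project=project_id, body=source_repr).execute()

# 2) Cloud SQL replica that replicates from that source until the cutover.
replica = {
    "name": "cloudsql-mysql-replica",
    "region": "us-central1",
    "databaseVersion": "MYSQL_8_0",
    "settings": {"tier": "db-custom-4-15360"},
    "masterInstanceName": "onprem-mysql-source",
    "replicaConfiguration": {
        "mysqlReplicaConfiguration": {
            "username": "replication_user",  # placeholder replication credentials
            "password": "REPLACE_ME",
        }
    },
}
sqladmin.instances().insert(project=project_id, body=replica).execute()
```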