Google Professional Cloud Database Engineer Exam Practice Questions (P. 5)
Question #21
Your team uses thousands of connected IoT devices to collect device maintenance data for your oil and gas customers in real time. You want to design inspection routines, device repair, and replacement schedules based on insights gathered from the data produced by these devices. You need a managed solution that is highly scalable, supports a multi-cloud strategy, and offers low latency for these IoT devices. What should you do?
- A. Use Firestore with Looker.
- B. Use Cloud Spanner with Data Studio.
- C. Use MongoDB Atlas with Charts. (Most voted)
- D. Use Bigtable with Looker.
Correct Answer: C
Question #22
Your application follows a microservices architecture and uses a single large Cloud SQL instance, which is starting to have performance issues as your application grows. In the Cloud Monitoring dashboard, the CPU utilization looks normal. You want to follow Google-recommended practices to resolve and prevent these performance issues while avoiding any major refactoring. What should you do?
- A. Use Cloud Spanner instead of Cloud SQL.
- B. Increase the number of CPUs for your instance.
- C. Increase the storage size for the instance.
- D. Use many smaller Cloud SQL instances. (Most voted)
Correct Answer: A
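The most-voted option (D) maps each microservice to its own smaller Cloud SQL instance so that one noisy service cannot degrade the others. A minimal sketch of provisioning such per-service instances with `gcloud` follows; the service names, region, and machine tier are placeholders, not from the question.

```shell
# Hypothetical sketch: one smaller Cloud SQL for MySQL instance per
# microservice instead of a single large shared instance.
# Names, tier, and region below are illustrative assumptions.
for svc in orders inventory billing; do
  gcloud sql instances create "${svc}-db" \
      --database-version=MYSQL_8_0 \
      --tier=db-custom-2-7680 \
      --region=us-central1
done
```

Each service then points its connection string at its own instance, which isolates load without the schema refactoring a move to Cloud Spanner (option A) would require.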
Question #23
You need to perform a one-time migration of data from a running Cloud SQL for MySQL instance in the us-central1 region to a new Cloud SQL for MySQL instance in the us-east1 region. You want to follow Google-recommended practices to minimize performance impact on the currently running instance. What should you do?
- A. Create and run a Dataflow job that uses JdbcIO to copy data from one Cloud SQL instance to another.
- B. Create two Datastream connection profiles, and use them to create a stream from one Cloud SQL instance to another.
- C. Create a SQL dump file in Cloud Storage using a temporary instance, and then use that file to import into a new instance. (Most voted)
- D. Create a CSV file by running the SQL statement SELECT...INTO OUTFILE, copy the file to a Cloud Storage bucket, and import it into a new instance.
Correct Answer: C
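Option C minimizes impact on the running primary by taking the dump from a temporary copy rather than from the live instance. A hedged `gcloud` sketch of that flow is below; the instance names, bucket, and database are placeholders, and the export step assumes the instance's service account has write access to the bucket.

```shell
# Hypothetical sketch of option C: clone the live instance, export the
# clone to Cloud Storage, and import into a new us-east1 instance.
# All names below are illustrative assumptions.

# 1. Clone the running primary so the export reads from the copy.
gcloud sql instances clone prod-mysql prod-mysql-temp

# 2. Export the clone to a SQL dump file in Cloud Storage.
gcloud sql export sql prod-mysql-temp \
    gs://my-migration-bucket/dump.sql --database=appdb

# 3. Create the target instance in the destination region.
gcloud sql instances create prod-mysql-east \
    --database-version=MYSQL_8_0 \
    --tier=db-custom-4-15360 \
    --region=us-east1

# 4. Import the dump, then discard the temporary clone.
gcloud sql import sql prod-mysql-east \
    gs://my-migration-bucket/dump.sql --database=appdb
gcloud sql instances delete prod-mysql-temp
```

Because the export reads from the clone, the primary serves traffic normally for the duration of the one-time migration.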
Question #24
You are running a mission-critical application on a Cloud SQL for PostgreSQL database with a multi-zonal setup. The primary and read replica instances are in the same region but in different zones. You need to ensure that you split the application load between both instances. What should you do?
- A. Use Cloud Load Balancing for load balancing between the Cloud SQL primary and read replica instances.
- B. Use PgBouncer to set up database connection pooling between the Cloud SQL primary and read replica instances. (Most voted)
- C. Use HTTP(S) Load Balancing for database connection pooling between the Cloud SQL primary and read replica instances.
- D. Use the Cloud SQL Auth proxy for database connection pooling between the Cloud SQL primary and read replica instances.
Correct Answer: B
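With option B, PgBouncer sits in front of both instances and exposes the primary and the read replica as separate logical databases, so the application splits reads from writes at the connection string. A minimal sketch of such a configuration follows; the IP addresses, database names, and pool sizes are hypothetical, not from the question.

```shell
# Generate a minimal pgbouncer.ini (hypothetical IPs and names) that
# routes writes to the Cloud SQL primary and reads to the replica.
cat > pgbouncer.ini <<'EOF'
[databases]
; writes go to the primary instance (placeholder private IP)
app_rw = host=10.0.0.2 port=5432 dbname=appdb
; reads go to the same-region read replica (placeholder private IP)
app_ro = host=10.0.0.3 port=5432 dbname=appdb

[pgbouncer]
listen_addr = 0.0.0.0
listen_port = 6432
auth_type = md5
auth_file = userlist.txt
; transaction pooling keeps the server-side connection count low
pool_mode = transaction
max_client_conn = 1000
default_pool_size = 20
EOF
```

The application connects to PgBouncer on port 6432 and chooses `app_rw` or `app_ro` per query path; note that option D's Cloud SQL Auth proxy secures connections but does not pool or split them.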
Question #25
Your organization deployed a new version of a critical application that uses Cloud SQL for MySQL with high availability (HA) and binary logging enabled to store transactional information. The latest release of the application had an error that caused massive data corruption in your Cloud SQL for MySQL database. You need to minimize data loss. What should you do?
- A. Open the Google Cloud Console, navigate to SQL > Backups, and select the last version of the automated backup before the corruption.
- B. Reload the Cloud SQL for MySQL database using the LOAD DATA command to load data from CSV files that were used to initialize the instance.
- C. Perform a point-in-time recovery of your Cloud SQL for MySQL database, selecting a date and time before the data was corrupted. (Most voted)
- D. Fail over to the Cloud SQL for MySQL HA instance. Use that instance to recover the transactions that occurred before the corruption.
Correct Answer: B
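Point-in-time recovery (the most-voted option C) is possible here precisely because the question states binary logging is enabled; it replays transactions up to a chosen timestamp, losing only the writes after the corruption. A hedged sketch with `gcloud` follows; the instance names and timestamp are placeholders.

```shell
# Hypothetical sketch of option C: clone the corrupted instance to a
# point in time just before the bad release wrote corrupt data.
# Instance names and the RFC 3339 timestamp are illustrative.
gcloud sql instances clone prod-mysql prod-mysql-recovered \
    --point-in-time '2024-05-01T10:00:00Z'
```

Restoring the last automated backup (option A) would instead discard every transaction since that backup, and failing over to the HA standby (option D) does not help because the standby replicates the same corrupted data.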