Snowflake SnowPro Advanced Architect Exam Practice Questions (P. 5)
Question #21
A healthcare company wants to share data with a medical institute. The institute is running a Standard edition of Snowflake; the healthcare company is running a Business Critical edition.
How can this data be shared?
- A. The healthcare company will need to change the institute’s Snowflake edition in the accounts panel.
- B. By default, sharing is supported from a Business Critical Snowflake edition to a Standard edition.
- C. Contact Snowflake and they will execute the share request for the healthcare company.
- D. Set the share_restriction parameter on the shared object to false. (Most Voted)
Correct Answer: C
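For context on option D: when a provider on a Business Critical account adds a lower-edition consumer to a share, Snowflake blocks the operation by default unless the share-restrictions setting is lifted for that consumer. A minimal sketch of that step, assuming hypothetical account and share names and the snowflake-connector-python client:

```python
# Hedged sketch: adding a Standard-edition consumer to a share created in a
# Business Critical account. Account, user, and share names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="healthcare_org-prod",   # hypothetical provider (Business Critical) account
    user="ARCHITECT_USER",
    password="...",
    role="ACCOUNTADMIN",
)

# SHARE_RESTRICTIONS = false lifts the default block on sharing from a
# Business Critical account to a lower-edition (e.g., Standard) consumer account.
conn.cursor().execute(
    "ALTER SHARE patient_metrics_share "
    "ADD ACCOUNTS = medical_institute_account "
    "SHARE_RESTRICTIONS = false"
)
conn.close()
```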
Question #22
An Architect is designing a pipeline to stream event data into Snowflake using the Snowflake Kafka connector. The Architect’s highest priority is to configure the connector to stream data in the MOST cost-effective manner.
Which of the following is recommended for optimizing the cost associated with the Snowflake Kafka connector?
- A. Utilize a higher Buffer.flush.time in the connector configuration. (Most Voted)
- B. Utilize a higher Buffer.size.bytes in the connector configuration.
- C. Utilize a lower Buffer.size.bytes in the connector configuration.
- D. Utilize a lower Buffer.count.records in the connector configuration.
Correct Answer: D
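For context on the buffer settings these options refer to (documented by the connector in lowercase as buffer.flush.time, buffer.size.bytes, and buffer.count.records): the connector buffers records and flushes them as files that Snowpipe then loads, so fewer, larger flushes generally mean fewer file loads and lower Snowpipe cost. A minimal sketch of a connector configuration submitted to the Kafka Connect REST API, with hypothetical connection values; only the buffer.* keys are the point of the example:

```python
# Hedged sketch: registering a Snowflake Kafka connector via the Kafka Connect
# REST API. Topic, account, credentials, and the Connect endpoint are hypothetical.
import json
import urllib.request

connector_config = {
    "name": "event_reviews_sink",
    "config": {
        "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
        "topics": "event_topic",
        "snowflake.url.name": "myorg-myaccount.snowflakecomputing.com",
        "snowflake.user.name": "KAFKA_LOADER",
        "snowflake.private.key": "<private-key>",
        "snowflake.database.name": "RAW",
        "snowflake.schema.name": "EVENTS",
        # Raising the flush interval (and keeping generous size/record limits)
        # produces fewer, larger files per flush, which reduces Snowpipe load charges.
        "buffer.flush.time": "300",
        "buffer.count.records": "10000",
        "buffer.size.bytes": "20000000",
    },
}

# POST the connector definition to a (assumed local) Kafka Connect worker.
req = urllib.request.Request(
    "http://localhost:8083/connectors",
    data=json.dumps(connector_config).encode(),
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(req)
```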
Question #23
An Architect has chosen to separate their Snowflake Production and QA environments using two separate Snowflake accounts.
The QA account is intended to run and test changes to data and database objects before those changes are pushed to the Production account. It is a requirement that all database objects and data in the QA account be an exact copy of the database objects, privileges, and data in the Production account, on at least a nightly basis.
Which is the LEAST complex approach to use to populate the QA account with the Production account’s data and database objects on a nightly basis?
- A.
  1. Create a share in the Production account for each database
  2. Share access to the QA account as a Consumer
  3. The QA account creates a database directly from each share
  4. Create clones of those databases on a nightly basis
  5. Run tests directly on those cloned databases
- B.
  1. Create a stage in the Production account
  2. Create a stage in the QA account that points to the same external object-storage location
  3. Create a task that runs nightly to unload each table in the Production account into the stage
  4. Use Snowpipe to populate the QA account
- C. (Most Voted)
  1. Enable replication for each database in the Production account
  2. Create replica databases in the QA account
  3. Create clones of the replica databases on a nightly basis
  4. Run tests directly on those cloned databases
- D.
  1. In the Production account, create an external function that connects into the QA account and returns all the data for one specific table
  2. Run the external function as part of a stored procedure that loops through each table in the Production account and populates each table in the QA account
Correct Answer: A
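A minimal sketch of the replication-plus-clone pattern described in option C, assuming hypothetical account, database, and role names, issued through snowflake-connector-python:

```python
# Hedged sketch of option C: database replication into the QA account,
# then a nightly zero-copy clone of the replica for testing.
import snowflake.connector

# 1. In the Production (source) account: allow the database to replicate to QA.
prod = snowflake.connector.connect(
    account="myorg-prod", user="...", password="...", role="ACCOUNTADMIN"
)
prod.cursor().execute(
    "ALTER DATABASE sales_db ENABLE REPLICATION TO ACCOUNTS myorg.qa"
)

# 2. In the QA (target) account: create the secondary database once,
#    then refresh and re-clone it on the nightly schedule.
qa = snowflake.connector.connect(
    account="myorg-qa", user="...", password="...", role="ACCOUNTADMIN"
)
cur = qa.cursor()
cur.execute("CREATE DATABASE sales_db_replica AS REPLICA OF myorg.prod.sales_db")  # one-time setup
cur.execute("ALTER DATABASE sales_db_replica REFRESH")                  # pull the latest Production state
cur.execute("CREATE OR REPLACE DATABASE sales_db_qa CLONE sales_db_replica")  # writable copy for tests
```

The REFRESH and CLONE statements could just as well be driven by a scheduled task or an external scheduler; the point of the pattern is that no data is unloaded or re-loaded.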
Question #24
A user can change object parameters using which of the following roles?
- A. ACCOUNTADMIN, SECURITYADMIN
- B. SYSADMIN, SECURITYADMIN
- C. ACCOUNTADMIN, USER with PRIVILEGE (Most Voted)
- D. SECURITYADMIN, USER with PRIVILEGE
Correct Answer: A
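As an illustration of what an object parameter change looks like in practice, a hedged sketch with a hypothetical warehouse: ACCOUNTADMIN can always run it, as can a role that holds the required privilege (for example, ownership) on the object:

```python
# Hedged sketch: setting an object-level parameter on a warehouse.
# Account, warehouse name, and timeout value are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-prod", user="...", password="...", role="SYSADMIN"
)
# STATEMENT_TIMEOUT_IN_SECONDS is an object parameter; changing it requires
# ACCOUNTADMIN or a role with the appropriate privilege on the warehouse.
conn.cursor().execute(
    "ALTER WAREHOUSE analytics_wh SET STATEMENT_TIMEOUT_IN_SECONDS = 600"
)
conn.close()
```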
Question #25
A media company needs a data pipeline that will ingest customer review data into a Snowflake table, and apply some transformations. The company also needs to use Amazon Comprehend to do sentiment analysis and make the de-identified final data set available publicly for advertising companies who use different cloud providers in different regions.
The data pipeline needs to run continuously and efficiently as new records arrive in the object storage, leveraging event notifications. Also, the operational complexity, maintenance of the infrastructure (including platform upgrades and security), and the development effort should be minimal.
Which design will meet these requirements?
- A. Ingest the data using COPY INTO and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.
- B. Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies. (Most Voted)
- C. Ingest the data into Snowflake using Amazon EMR and PySpark using the Snowflake Spark connector. Apply transformations using another Spark job. Develop a Python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.
- D. Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.
Correct Answer: B
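A minimal sketch of the design in option B, with hypothetical object names, stage, warehouse, and API integration (the integration and its proxy endpoint for Amazon Comprehend are assumed to already exist); the Marketplace listing step is omitted here:

```python
# Hedged sketch of option B: Snowpipe for continuous ingestion, a stream plus a
# task for transformation, and an external function for Comprehend sentiment scoring.
# All object names and the endpoint URL are hypothetical.
import snowflake.connector

ddl = [
    # Continuous ingestion: Snowpipe with auto-ingest driven by cloud event notifications.
    """CREATE PIPE IF NOT EXISTS reviews_pipe AUTO_INGEST = TRUE AS
       COPY INTO raw_reviews FROM @reviews_stage FILE_FORMAT = (TYPE = 'JSON')""",

    # Change tracking on the raw table so the task only processes newly arrived rows.
    "CREATE STREAM IF NOT EXISTS raw_reviews_stream ON TABLE raw_reviews",

    # External function that reaches Amazon Comprehend through an existing API integration.
    """CREATE OR REPLACE EXTERNAL FUNCTION comprehend_sentiment(review_text VARCHAR)
       RETURNS VARIANT
       API_INTEGRATION = comprehend_api_int
       AS 'https://example.execute-api.us-east-1.amazonaws.com/prod/sentiment'""",

    # Orchestration: the task runs on a schedule but only when the stream has data.
    """CREATE TASK IF NOT EXISTS score_reviews_task
       WAREHOUSE = transform_wh
       SCHEDULE = '5 MINUTE'
       WHEN SYSTEM$STREAM_HAS_DATA('raw_reviews_stream')
       AS INSERT INTO curated_reviews
          SELECT review_id, comprehend_sentiment(review_text) AS sentiment
          FROM raw_reviews_stream""",

    "ALTER TASK score_reviews_task RESUME",
]

conn = snowflake.connector.connect(
    account="myorg-prod", user="...", password="...", role="SYSADMIN"
)
cur = conn.cursor()
for statement in ddl:
    cur.execute(statement)
conn.close()
```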