Databricks Certified Data Engineer Professional Exam Practice Questions (P. 2)
Question #6
The security team is exploring whether the Databricks secrets module can be used to connect to an external database.
After testing the code with all Python variables defined as strings, they upload the password to the secrets module and configure the correct permissions for the currently active user. They then modify their code to the following (leaving all other variables unchanged).
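The modified code is shown as an image in the original question and is not reproduced here. A minimal sketch of the pattern being described, assuming a hypothetical secret scope and key (the other connection variables remain the plain Python strings from the earlier test):

```python
# Hypothetical scope/key names; only the password line changed.
password = dbutils.secrets.get(scope="jdbc_creds", key="password")

print(password)  # Databricks redacts secret values in notebook output

df = (spark.read
      .format("jdbc")
      .option("url", connection_url)      # still a plain string
      .option("dbtable", table_name)
      .option("user", username)
      .option("password", password)       # real value is passed to the driver
      .load())
```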

Which statement describes what will happen when the above code is executed?
- A. The connection to the external table will fail; the string "REDACTED" will be printed.
- B. An interactive input box will appear in the notebook; if the right password is provided, the connection will succeed and the encoded password will be saved to DBFS.
- C. An interactive input box will appear in the notebook; if the right password is provided, the connection will succeed and the password will be printed in plain text.
- D. The connection to the external table will succeed; the string value of password will be printed in plain text.
- E. The connection to the external table will succeed; the string "REDACTED" will be printed. (Most Voted)
Correct Answer: E
Question #7
The data science team has created and logged a production model using MLflow. The following code correctly imports and applies the production model to output the predictions as a new DataFrame named preds with the schema "customer_id LONG, predictions DOUBLE, date DATE".
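The code referenced here is shown as an image; a hedged reconstruction of the typical pattern, assuming a registered model named churn_model and a customers source table (both hypothetical):

```python
import mlflow.pyfunc
from pyspark.sql import functions as F

# Load the Production-stage model as a Spark UDF (model name assumed).
model_udf = mlflow.pyfunc.spark_udf(spark, "models:/churn_model/Production")

# Apply it to produce the preds DataFrame with the stated schema:
# customer_id LONG, predictions DOUBLE, date DATE
preds = (spark.table("customers")
         .select("customer_id",
                 model_udf(F.struct("*")).alias("predictions"),
                 F.current_date().alias("date")))
```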

The data science team would like predictions saved to a Delta Lake table with the ability to compare all predictions across time. Churn predictions will be made at most once per day.
Which code block accomplishes this task while minimizing potential compute costs?

- A. preds.write.mode("append").saveAsTable("churn_preds") (Most Voted)
- B. preds.write.format("delta").save("/preds/churn_preds")
- C. [code option shown as an image; not reproduced]
- D. [code option shown as an image; not reproduced]
- E. [code option shown as an image; not reproduced]
Correct Answer: A
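A note on why answer A is the low-cost choice: a once-daily append writes only the new day's rows, with no merge or rewrite of existing data, while the date column preserves the full history. A sketch of the write plus a later cross-time comparison (the query is illustrative):

```python
# Append the day's predictions; each run adds files without rewriting
# or merging existing data.
preds.write.mode("append").saveAsTable("churn_preds")

# Compare predictions across days later, e.g. average score per run date.
spark.table("churn_preds").groupBy("date").avg("predictions").show()
```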
Question #8
An upstream source writes Parquet data as hourly batches to directories named with the current date. A nightly batch job runs the following code to ingest all data from the previous day as indicated by the date variable:
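The ingest code is shown as an image in the original; a hedged sketch of the flow the question implies, with a hypothetical path layout and an assumed dedupe-then-append pattern:

```python
date = "2024-03-01"  # example value; set by the nightly job

(spark.read
     .format("parquet")
     .load(f"/mnt/raw/orders/{date}")              # hypothetical path layout
     .dropDuplicates(["customer_id", "order_id"])  # dedupes within this batch only
     .write
     .mode("append")                               # never inspects the target table
     .saveAsTable("orders"))
```

Because deduplication runs only over the newly read batch, a duplicate that lands in a different date directory (e.g., across a midnight boundary) is ingested by a different nightly run and can still reach the target table, which is what answer B describes.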

Assume that the fields customer_id and order_id serve as a composite key to uniquely identify each order.
If the upstream system is known to occasionally produce duplicate entries for a single order hours apart, which statement is correct?

- A. Each write to the orders table will only contain unique records, and only those records without duplicates in the target table will be written.
- B. Each write to the orders table will only contain unique records, but newly written records may have duplicates already present in the target table. (Most Voted)
- C. Each write to the orders table will only contain unique records; if existing records with the same key are present in the target table, these records will be overwritten.
- D. Each write to the orders table will only contain unique records; if existing records with the same key are present in the target table, the operation will fail.
- E. Each write to the orders table will run deduplication over the union of new and existing records, ensuring no duplicate records are present.
Correct Answer: B
Question #9
A junior member of the data engineering team is exploring the language interoperability of Databricks notebooks. The intended outcome of the below code is to register a view of all sales that occurred in countries on the continent of Africa that appear in the geo_lookup table.
Before executing the code, running SHOW TABLES on the current database indicates the database contains only two tables: geo_lookup and sales.
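The two command cells are shown as an image in the original. A hedged reconstruction consistent with the answer choices (column and filter names are assumptions):

```python
# Cmd 1 (Python cell): collect() materializes rows on the driver, so
# countries_af ends up as a plain Python list of strings -- not a view
# or a DataFrame.
countries_af = [row.country for row in
                spark.table("geo_lookup").filter("continent = 'AF'").collect()]

# Cmd 2 (a separate %sql cell, shown here as a comment) fails because the
# SQL interpreter cannot resolve a Python variable:
#
#   CREATE VIEW sales_af AS
#   SELECT * FROM sales WHERE country IN countries_af
```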

Which statement correctly describes the outcome of executing these command cells in order in an interactive notebook?
- A. Both commands will succeed. Executing show tables will show that countries_af and sales_af have been registered as views.
- B. Cmd 1 will succeed. Cmd 2 will search all accessible databases for a table or view named countries_af: if this entity exists, Cmd 2 will succeed.
- C. Cmd 1 will succeed and Cmd 2 will fail. countries_af will be a Python variable representing a PySpark DataFrame.
- D. Both commands will fail. No new variables, tables, or views will be created.
- E. Cmd 1 will succeed and Cmd 2 will fail. countries_af will be a Python variable containing a list of strings. (Most Voted)
Correct Answer: E
Question #10
A Delta table of weather records is partitioned by date and has the following schema: date DATE, device_id INT, temp FLOAT, latitude FLOAT, longitude FLOAT
To find all the records from within the Arctic Circle, you execute a query with the following filter: latitude > 66.3
Which statement describes how the Delta engine identifies which files to load?
- A. All records are cached to an operational database and then the filter is applied
- B. The Parquet file footers are scanned for min and max statistics for the latitude column
- C. All records are cached to attached storage and then the filter is applied
- D. The Delta log is scanned for min and max statistics for the latitude column (Most Voted)
- E. The Hive metastore is scanned for min and max statistics for the latitude column
Correct Answer: B

The Delta engine identifies which files to load by scanning the Parquet file footers, which store min and max statistics for each column. When a query filters on latitude > 66.3, those statistics let the engine determine which files could contain matching records and skip the rest, so only the relevant subset of files is ever read. This file-level data skipping is what keeps selective queries fast over large datasets.
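A small illustrative sketch of the skipping logic (file names and statistics are invented): per-file min/max statistics are compared against the predicate before any data is read, and files that cannot match are pruned.

```python
# Invented file list with per-file min/max latitude statistics.
candidate_files = [
    {"path": "part-0001.parquet", "lat_min": 10.2, "lat_max": 45.0},
    {"path": "part-0002.parquet", "lat_min": 60.1, "lat_max": 71.8},
]

# For the predicate latitude > 66.3, a file can match only if its
# maximum latitude exceeds the threshold.
to_scan = [f["path"] for f in candidate_files if f["lat_max"] > 66.3]
print(to_scan)  # ['part-0002.parquet'] -- part-0001 is skipped unread
```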