Databricks Data Engineer Professional Testing & Deployment

The Testing & Deployment section of the Databricks Certified Data Engineer Professional exam carries 10% of the total available marks, so expect around six questions on this topic. This is the last post in the series on preparing for the Databricks Data Engineer Professional exam.

Find the overview and links to all sections here.

Section 6: Testing & Deployment

  • Adapt a notebook dependency pattern to use Python file dependencies

Run a Databricks notebook from another notebook – Azure Databricks | Microsoft Learn
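The pattern being tested here is replacing a `%run ./other_notebook` dependency with a plain Python import from a `.py` file alongside the notebook. A minimal local sketch (the module name `shared_utils` and function `clean_name` are illustrative, not from the exam; on Databricks the repo root is already on `sys.path`, so only the `import` line is needed):

```python
# Moving from a %run notebook dependency to a plain Python file dependency.
# In a notebook you would previously write:
#   %run ./shared_utils      <- pulls the other notebook's definitions into scope
# With workspace files / Repos, the same helpers can live in shared_utils.py
# next to the notebook and be imported like any module.

import os
import sys
import tempfile

# Simulate a repo checkout containing shared_utils.py.
repo_dir = tempfile.mkdtemp()
with open(os.path.join(repo_dir, "shared_utils.py"), "w") as f:
    f.write("def clean_name(s):\n    return s.strip().lower()\n")

# On Databricks the repo root is already importable; locally we add it.
sys.path.insert(0, repo_dir)

from shared_utils import clean_name  # direct import replaces %run

print(clean_name("  Alice "))  # -> alice
```

Unlike `%run`, an import gives you an explicit namespace (`shared_utils.clean_name`) rather than silently injecting names into the notebook's scope, which makes the dependency visible and testable.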

  • Adapt Python code maintained as Wheels to direct imports using relative paths

Use a Python wheel file in an Azure Databricks job – Azure Databricks | Microsoft Learn
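The adaptation in the other direction: code that was built and installed as a wheel can instead live as a plain package in the repo and use relative imports between its modules. A sketch under made-up names (`etl`, `transforms`, `pipeline`):

```python
# Code previously packaged as a wheel (etl-*.whl) kept as a plain package
# in the repo, with a relative import instead of importing the installed wheel.

import os
import sys
import tempfile

pkg_root = tempfile.mkdtemp()
pkg_dir = os.path.join(pkg_root, "etl")
os.makedirs(pkg_dir)

# etl/transforms.py: a helper formerly shipped inside the wheel.
with open(os.path.join(pkg_dir, "transforms.py"), "w") as f:
    f.write("def double(x):\n    return 2 * x\n")

# etl/pipeline.py: uses a *relative* import within the package.
with open(os.path.join(pkg_dir, "pipeline.py"), "w") as f:
    f.write("from .transforms import double\n\ndef run(x):\n    return double(x) + 1\n")

# __init__.py marks etl/ as a package so the relative import resolves.
with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
    f.write("")

sys.path.insert(0, pkg_root)
from etl.pipeline import run

print(run(10))  # -> 21
```

The trade-off to remember for the exam: a wheel pins a versioned artifact to a cluster or job, while direct repo imports pick up whatever is currently checked out.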

  • Repair and rerun failed jobs

Troubleshoot and repair job failures – Azure Databricks | Microsoft Learn
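The key idea behind repair runs is that only the failed (and optionally downstream) tasks are rerun; successful tasks keep their results. A sketch of the request body for the Jobs API 2.1 repair endpoint (the run ID and task keys are placeholders; a real call needs an HTTP client or the CLI/SDK):

```python
# Repairing a failed job run via POST /api/2.1/jobs/runs/repair.
# Only the tasks listed in rerun_tasks are rerun.

import json

def repair_payload(run_id, failed_task_keys):
    """Build the request body for /api/2.1/jobs/runs/repair."""
    return {
        "run_id": run_id,
        "rerun_tasks": failed_task_keys,  # successful tasks keep their results
    }

body = repair_payload(754933, ["load_silver", "publish_gold"])
print(json.dumps(body, indent=2))

# An actual call would look something like (host/token are placeholders):
#   requests.post(f"{host}/api/2.1/jobs/runs/repair",
#                 headers={"Authorization": f"Bearer {token}"}, json=body)
```

The same repair can be triggered from the job run page in the UI, which is the scenario the exam usually describes.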

  • Create Jobs based on common use cases and patterns
  • Create a multi-task job with multiple dependencies
  • Design systems that control for cost and latency SLAs for production streaming jobs
  • Configure the Databricks CLI and execute basic commands to interact with the workspace and clusters
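For the multi-task job objectives above, the shape to know is the Jobs API's `tasks` list with `depends_on` edges. A sketch with made-up task names, showing fan-out from one root task and fan-in at the final task:

```python
# Illustrative multi-task job spec (names are made up). The structure matches
# the Jobs API 2.1 tasks/depends_on fields; on a real workspace it would be
# submitted via the Jobs UI, API, or CLI.

job_spec = {
    "name": "nightly_medallion_pipeline",
    "tasks": [
        {"task_key": "ingest_bronze"},
        # fan-out: these two depend only on ingest_bronze and run in parallel
        {"task_key": "build_silver",
         "depends_on": [{"task_key": "ingest_bronze"}]},
        {"task_key": "data_quality",
         "depends_on": [{"task_key": "ingest_bronze"}]},
        # fan-in: runs only after both upstream tasks succeed
        {"task_key": "refresh_gold",
         "depends_on": [{"task_key": "build_silver"},
                        {"task_key": "data_quality"}]},
    ],
}

# Tasks with no depends_on can start immediately.
roots = [t["task_key"] for t in job_spec["tasks"] if not t.get("depends_on")]
print(roots)  # -> ['ingest_bronze']
```

For the cost/latency SLA objective, the levers to recall are the same spec: job clusters versus all-purpose clusters, trigger intervals for streaming, and retry/timeout settings per task.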

https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/install

https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/commands#workspace-commands

https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/commands#compute-commands
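The basic CLI commands from the links above can be driven from Python as well. A sketch that prints the command lines and only attempts the non-interactive ones if the `databricks` CLI is actually installed (`configure`, `workspace list`, and `clusters list` are real subcommands; the guard keeps the script runnable without the CLI):

```python
# Basic Databricks CLI interactions, shown as argv lists.
# `databricks configure` is interactive (prompts for host and token),
# so it is listed but not executed here.

import shutil
import subprocess

commands = [
    ["databricks", "configure"],               # one-time setup: host + personal access token
    ["databricks", "workspace", "list", "/"],  # list workspace objects at the root
    ["databricks", "clusters", "list"],        # list clusters in the workspace
]

for cmd in commands:
    print(" ".join(cmd))

# Only run the non-interactive commands, and only where the CLI exists.
if shutil.which("databricks"):
    for cmd in commands[1:]:
        subprocess.run(cmd, check=False)
```

Authentication can also be supplied via the `DATABRICKS_HOST` and `DATABRICKS_TOKEN` environment variables instead of `databricks configure`.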

  • Execute commands from the CLI to deploy and monitor Databricks jobs.
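A typical deploy-and-monitor sequence with the CLI's `jobs` subcommands might look like the following. The job definition file `job.json` and the numeric IDs are placeholders; the subcommand names reflect the current Go-based CLI and should be checked against `databricks jobs --help` on your version:

```python
# Deploy-and-monitor loop sketched as CLI command strings, so the sequence
# is clear without a live workspace. IDs and file names are placeholders.

deploy_and_monitor = [
    "databricks jobs create --json @job.json",  # deploy the job definition
    "databricks jobs run-now 123",              # trigger a run of job 123
    "databricks jobs get-run 456",              # poll run 456's state/result
    "databricks jobs list-runs --job-id 123",   # run history for the job
]

for cmd in deploy_and_monitor:
    print(cmd)
```

For the exam, the mapping to remember is that each of these CLI commands wraps the corresponding Jobs REST API endpoint (create, run-now, runs get, runs list).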
