Databricks Asset Bundles

Automate ETL, analytics, and ML notebook workflows through native Databricks pipeline steps with AI-powered post-run insights via Hummingbird AI.

Prerequisites

  • Databricks account with admin or contributor access

  • Databricks tool registered in Opsera Tool Registry (OAuth or access token)

  • Permissions to configure tools and pipelines in Opsera

  • SCM repository with the Databricks project registered (for the Asset Bundle step)

Step types

| Tool type | Best for |
| --- | --- |
| Databricks Asset Bundles | General workflows — jobs, repo sync, code deploy via YAML |
| Databricks Notebook Execution | Direct notebook runs with custom JSON parameters |

Pipeline setup

1. Create pipeline and add step

Pipelines → New Pipeline → Blank Pipeline Template → Workflow → Edit Workflow. Click +, enter a descriptive step name, and choose a Tool Type.

Configure Asset Bundle step

| Field | Description |
| --- | --- |
| Tool Name | Registered Databricks tool from the Tool Registry |
| SCM Type / Tool | Source control platform (must be pre-registered) |
| Repo Name | Repository containing the Databricks project |
| Branch | Target branch (e.g. main) |
| YAML Path | Path to the pipeline definition (e.g. /jobs/deploy.yaml) |
| Target | Deployment environment (e.g. prod) |
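The file pointed to by YAML Path is a standard Databricks Asset Bundle definition. A minimal sketch, assuming a single notebook job; the bundle name, job key, workspace host, and notebook path below are placeholders, not values Opsera requires:

```yaml
# Minimal asset bundle sketch: names and paths are placeholders.
bundle:
  name: etl-bundle

targets:
  prod:
    workspace:
      host: https://<workspace>.cloud.databricks.com

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: run_notebook
          notebook_task:
            notebook_path: /Users/team/ETL
```

The Target field in the step (e.g. prod) selects one of the entries under `targets` in this file.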

2. Configure Notebook Execution step (alternative)

| Field | Description |
| --- | --- |
| Endpoint URL | Databricks API endpoint (e.g. https://<workspace>.cloud.databricks.com/api/2.0) |
| Auth Token | Access token for authentication |
| Data Package JSON | Notebook parameters, e.g. {"notebook_path": "/Users/team/ETL", "timeout": 3600} |
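Conceptually, the Data Package JSON maps onto a Databricks Jobs API request body such as the one sent to /jobs/runs/submit. A minimal sketch of that mapping, assuming this translation; the helper name, run name, and cluster id are illustrative, not Opsera internals:

```python
import json

def build_runs_submit_body(data_package: str) -> dict:
    """Translate the step's Data Package JSON into a /jobs/runs/submit body.

    Hypothetical helper: the run name and cluster id are placeholders.
    """
    params = json.loads(data_package)  # fails fast on malformed JSON
    return {
        "run_name": "opsera-notebook-run",
        "existing_cluster_id": "<cluster-id>",
        "notebook_task": {"notebook_path": params["notebook_path"]},
        "timeout_seconds": params.get("timeout", 3600),
    }

body = build_runs_submit_body('{"notebook_path": "/Users/team/ETL", "timeout": 3600}')
```

A body like this would be POSTed to the configured Endpoint URL with the Auth Token supplied as a bearer token.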

3. Run and get AI insights

Click Run Pipeline. After execution, click the Hummingbird AI button to generate a technical summary. Use AI chat to debug errors, optimize performance, or request log analysis.


Points to remember

  • Validate the JSON data package syntax before running Notebook Execution steps — malformed JSON will cause the step to fail immediately with a parse error.

  • Ensure SCM tools are registered in Tool Registry before configuring Asset Bundle steps — the pipeline will not resolve the repository without a valid registered SCM connection.

  • If authentication fails, verify token expiration or OAuth permissions in your Databricks workspace before re-authenticating the tool in Tool Registry.

  • Use tags on pipeline steps for easy filtering in Opsera's dashboard and governance reporting.
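The JSON-validation check above can be run locally before the pipeline does. A minimal sketch using only the Python standard library; the sample strings are illustrative:

```python
import json

def check_data_package(raw: str) -> str:
    """Return "OK" or the parse-error location, as a fail-fast pre-check."""
    try:
        json.loads(raw)
        return "OK"
    except json.JSONDecodeError as exc:
        return f"parse error at line {exc.lineno}, column {exc.colno}: {exc.msg}"

print(check_data_package('{"notebook_path": "/Users/team/ETL", "timeout": 3600}'))  # OK
print(check_data_package('{"notebook_path": /Users/team/ETL}'))  # parse error ...
```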
