asset-bundles — a Databricks Asset Bundle (DAB) writer skill for multi-environment deployment of Spark Declarative Pipelines. Works with Claude Code, Cursor, and Windsurf.

v1.0.0
GitHub

About this Skill

asset-bundles is a Databricks Asset Bundle (DAB) writer that facilitates multi-environment deployment through a structured project layout and configuration files such as databricks.yml and resource definitions. It is ideal for Databricks-focused AI agents that need seamless multi-environment deployment for Spark Declarative Pipelines.

Features

Creates Databricks Asset Bundles (DABs) for multi-environment deployment
Supports Spark Declarative Pipeline configurations via SDP_guidance.md
Includes SQL Alert schemas for critical alerts, documenting where the Alert API differs from other resources
Utilizes a structured project layout with databricks.yml and resource definitions
Enables deployment across dev, staging, and prod environments
References alerts_guidance.md for SQL Alert configurations

juanlamadrid20
Updated: 3/8/2026

Agent Capability Analysis

The asset-bundles skill by juanlamadrid20 is an open-source community AI agent skill for Claude Code and other IDE workflows, helping agents execute tasks with better context, repeatability, and domain-specific guidance. It is optimized for Databricks multi-environment deployment and Spark Declarative Pipeline configurations.

Ideal Agent Persona

Ideal for Databricks-focused AI Agents requiring seamless multi-environment deployment capabilities for Spark Declarative Pipelines

Core Value

Empowers agents to create and manage Databricks Asset Bundles (DABs) for streamlined deployment across dev, staging, and prod environments, leveraging Spark Declarative Pipeline configurations and SQL Alert schemas

Capabilities Granted for asset-bundles

Deploying Databricks applications across multiple environments
Managing Spark Declarative Pipelines for consistent workflow execution
Creating and configuring SQL Alerts for critical notifications

! Prerequisites & Limits

  • Requires Databricks environment setup
  • Specific to Databricks and Spark Declarative Pipelines
  • Needs careful configuration of databricks.yml and resource definitions

SKILL.md

Databricks Asset Bundle (DABs) Writer

Overview

Create DABs for multi-environment deployment (dev/staging/prod).

Reference Files

Bundle Structure

project/
├── databricks.yml           # Main config + targets
├── resources/*.yml          # Resource definitions
└── src/                     # Code/dashboard files

Main Configuration (databricks.yml)

```yaml
bundle:
  name: project-name

include:
  - resources/*.yml

variables:
  catalog:
    default: "default_catalog"
  schema:
    default: "default_schema"
  warehouse_id:
    lookup:
      warehouse: "Shared SQL Warehouse"

targets:
  dev:
    default: true
    mode: development
    workspace:
      profile: dev-profile
    variables:
      catalog: "dev_catalog"
      schema: "dev_schema"

  prod:
    mode: production
    workspace:
      profile: prod-profile
    variables:
      catalog: "prod_catalog"
      schema: "prod_schema"
```

Dashboard Resources

```yaml
resources:
  dashboards:
    dashboard_name:
      display_name: "[${bundle.target}] Dashboard Title"
      file_path: ../src/dashboards/dashboard.lvdash.json  # Relative to resources/
      warehouse_id: ${var.warehouse_id}
      permissions:
        - level: CAN_RUN
          group_name: "users"
```

Permission levels: CAN_READ, CAN_RUN, CAN_EDIT, CAN_MANAGE

Pipelines

See SDP_guidance.md for pipeline configuration

SQL Alerts

See alerts_guidance.md - Alert schema differs significantly from other resources

Jobs Resources

```yaml
resources:
  jobs:
    job_name:
      name: "[${bundle.target}] Job Name"
      tasks:
        - task_key: "main_task"
          notebook_task:
            notebook_path: ../src/notebooks/main.py  # Relative to resources/
          new_cluster:
            spark_version: "13.3.x-scala2.12"
            node_type_id: "i3.xlarge"
            num_workers: 2
      schedule:
        quartz_cron_expression: "0 0 9 * * ?"
        timezone_id: "America/Los_Angeles"
      permissions:
        - level: CAN_VIEW
          group_name: "users"
```

Permission levels: CAN_VIEW, CAN_MANAGE_RUN, CAN_MANAGE

⚠️ Cannot modify "admins" group permissions on jobs - verify custom groups exist before use
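As a sketch, a job permissions block that grants a custom group run access alongside viewer access for all users (the "data_engineers" group name is illustrative — confirm any custom group exists in the workspace first):

```yaml
# Illustrative example — "data_engineers" must already exist as a workspace group
permissions:
  - level: CAN_MANAGE_RUN
    group_name: "data_engineers"
  - level: CAN_VIEW
    group_name: "users"
# Do not add an entry for "admins" — DABs cannot modify admin permissions on jobs
```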

Path Resolution

⚠️ Critical: Paths depend on file location:

| File Location | Path Format | Example |
|---|---|---|
| resources/*.yml | ../src/... | ../src/dashboards/file.json |
| databricks.yml targets | ./src/... | ./src/dashboards/file.json |

Why: resources/ files are one level deep, so use ../ to reach bundle root. databricks.yml is at root, so use ./
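For example, the same dashboard file is referenced differently depending on which file holds the reference (file names below are illustrative):

```yaml
# In resources/dashboards.yml — one level below the bundle root, so climb with ../
resources:
  dashboards:
    sales_dashboard:
      file_path: ../src/dashboards/sales.lvdash.json

# In databricks.yml — already at the bundle root, so the same file would be
# referenced as ./src/dashboards/sales.lvdash.json
```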

Volume Resources

```yaml
resources:
  volumes:
    my_volume:
      catalog_name: ${var.catalog}
      schema_name: ${var.schema}
      name: "volume_name"
      volume_type: "MANAGED"
```

⚠️ Volumes use grants not permissions - different format from other resources
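A minimal sketch of a volume with a grants block — the principal and underscore-style privilege names below are assumptions based on the Unity Catalog privilege enums; verify the exact field names with `databricks bundle schema`:

```yaml
resources:
  volumes:
    my_volume:
      catalog_name: ${var.catalog}
      schema_name: ${var.schema}
      name: "volume_name"
      volume_type: "MANAGED"
      grants:                    # note: grants, not permissions
        - principal: "users"     # illustrative principal
          privileges:
            - READ_VOLUME
            - WRITE_VOLUME
```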

Apps Resources

Apps resource support added in Databricks CLI 0.239.0 (January 2025)

Apps in DABs have a minimal configuration - environment variables are defined in app.yaml in the source directory, NOT in databricks.yml.

```bash
# Generate bundle config from existing CLI-deployed app
databricks bundle generate app --existing-app-name my-app --key my_app --profile DEFAULT

# This creates:
# - resources/my_app.app.yml (minimal resource definition)
# - src/app/ (downloaded source files including app.yaml)
```

Manual Configuration

resources/my_app.app.yml:

```yaml
resources:
  apps:
    my_app:
      name: my-app-${bundle.target}  # Environment-specific naming
      description: "My application"
      source_code_path: ../src/app   # Relative to resources/ dir
```

src/app/app.yaml: (Environment variables go here)

```yaml
command:
  - "python"
  - "dash_app.py"

env:
  - name: USE_MOCK_BACKEND
    value: "false"
  - name: DATABRICKS_WAREHOUSE_ID
    value: "your-warehouse-id"
  - name: DATABRICKS_CATALOG
    value: "main"
  - name: DATABRICKS_SCHEMA
    value: "my_schema"
```

databricks.yml:

```yaml
bundle:
  name: my-bundle

include:
  - resources/*.yml

variables:
  warehouse_id:
    default: "default-warehouse-id"

targets:
  dev:
    default: true
    mode: development
    workspace:
      profile: dev-profile
    variables:
      warehouse_id: "dev-warehouse-id"
```

Key Differences from Other Resources

| Aspect | Apps | Other Resources |
|---|---|---|
| Environment vars | In app.yaml (source dir) | In databricks.yml or resource file |
| Configuration | Minimal (name, description, path) | Extensive (tasks, clusters, etc.) |
| Source path | Points to app directory | Points to specific files |

⚠️ Important: When source code is in project root (not src/app), use source_code_path: .. in the resource file

Other Resources

DABs supports schemas, models, experiments, clusters, warehouses, etc. Use databricks bundle schema to inspect schemas.

Reference: DABs Resource Types

Common Commands

Validation

```bash
databricks bundle validate          # Validate default target
databricks bundle validate -t prod  # Validate specific target
```

Deployment

```bash
databricks bundle deploy                 # Deploy to default target
databricks bundle deploy -t prod         # Deploy to specific target
databricks bundle deploy --auto-approve  # Skip confirmation prompts
databricks bundle deploy --force         # Force overwrite remote changes
```

Running Resources

```bash
databricks bundle run resource_name          # Run a pipeline or job
databricks bundle run pipeline_name -t prod  # Run in specific environment

# Apps require bundle run to start after deployment
databricks bundle run app_resource_key -t dev  # Start/deploy the app
```

Monitoring & Logs

View application logs (for Apps resources):

```bash
# View logs for deployed apps
databricks apps logs <app-name> --profile <profile-name>

# Examples:
databricks apps logs my-dash-app-dev -p DEFAULT
databricks apps logs my-streamlit-app-prod -p DEFAULT
```

What logs show:

  • [SYSTEM] - Deployment progress, file updates, dependency installation
  • [APP] - Application output (print statements, errors)
  • Backend connection status
  • Deployment IDs and timestamps
  • Stack traces for errors

Key log patterns to look for:

  • Deployment successful - Confirms deployment completed
  • App started successfully - App is running
  • Initialized real backend - Backend connected to Unity Catalog
  • Error: - Look for error messages and stack traces
  • 📝 Requirements installed - Dependencies loaded correctly

Cleanup

```bash
databricks bundle destroy -t dev
databricks bundle destroy -t prod --auto-approve
```

Common Issues

| Issue | Solution |
|---|---|
| App deployment fails | Check logs: databricks apps logs <app-name> for error details |
| App not connecting to Unity Catalog | Check logs for backend connection errors; verify warehouse ID and permissions |
| Wrong permission level | Dashboards: CAN_READ/RUN/EDIT/MANAGE; Jobs: CAN_VIEW/MANAGE_RUN/MANAGE |
| Path resolution fails | Use ../src/ in resources/*.yml, ./src/ in databricks.yml |
| Catalog doesn't exist | Create catalog first or update variable |
| "admins" group error on jobs | Cannot modify admins permissions on jobs |
| Volume permissions | Use grants not permissions for volumes |
| Hardcoded catalog in dashboard | Create environment-specific files or parameterize JSON |
| App not starting after deploy | Apps require databricks bundle run <resource_key> to start |
| App env vars not working | Environment variables go in app.yaml (source dir), not databricks.yml |
| Wrong app source path | Use ../ from resources/ dir if source is in project root |
| Debugging any app issue | First step: databricks apps logs <app-name> to see what went wrong |

Key Principles

  1. Path resolution: ../src/ in resources/*.yml, ./src/ in databricks.yml
  2. Variables: Parameterize catalog, schema, warehouse
  3. Mode: development for dev/staging, production for prod
  4. Groups: Use "users" for all workspace users
  5. Job permissions: Verify custom groups exist; can't modify "admins"

Resources

FAQ & Installation Steps


? Frequently Asked Questions

What is asset-bundles?

asset-bundles is a Databricks Asset Bundle (DAB) writer that facilitates multi-environment deployment through a structured project layout and configuration files such as databricks.yml and resource definitions. It is ideal for Databricks-focused AI agents that need seamless multi-environment deployment for Spark Declarative Pipelines.

How do I install asset-bundles?

Run the command: npx killer-skills add juanlamadrid20/dbrx-multi-agent-retail-intelligence. It works with Cursor, Windsurf, VS Code, Claude Code, and 19+ other IDEs.

What are the use cases for asset-bundles?

Key use cases include: Deploying Databricks applications across multiple environments, Managing Spark Declarative Pipelines for consistent workflow execution, Creating and configuring SQL Alerts for critical notifications.

Which IDEs are compatible with asset-bundles?

This skill is compatible with Cursor, Windsurf, VS Code, Trae, Claude Code, OpenClaw, Aider, Codex, OpenCode, Goose, Cline, Roo Code, Kiro, Augment Code, Continue, GitHub Copilot, Sourcegraph Cody, and Amazon Q Developer. Use the Killer-Skills CLI for universal one-command installation.

Are there any limitations for asset-bundles?

Requires Databricks environment setup. Specific to Databricks and Spark Declarative Pipelines. Needs careful configuration of databricks.yml and resource definitions.

How To Install

  1. Open your terminal

    Open the terminal or command line in your project directory.

  2. Run the install command

    Run: npx killer-skills add juanlamadrid20/dbrx-multi-agent-retail-intelligence. The CLI will automatically detect your IDE or AI agent and configure the skill.

  3. Start using the skill

    The skill is now active. Your AI agent can use asset-bundles immediately in the current project.
