1 change: 1 addition & 0 deletions NEXT_CHANGELOG.md
@@ -5,5 +5,6 @@
### CLI

### Bundles
* engine/direct: Add declarative `bind` blocks under a target to bring existing workspace resources under bundle management at deploy time, with `bind` and `bind_and_update` actions surfaced in `bundle plan` output ([#4630](https://github.com/databricks/cli/pull/4630)).

### Dependency updates
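
For reference, this is the shape of a bind block as exercised by the acceptance tests below — a minimal sketch in which the resource key and ID are placeholders:

targets:
  default:
    bind:
      jobs:
        my_job:            # placeholder resource key
          id: "1234567890" # placeholder ID of an existing workspace job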
23 changes: 23 additions & 0 deletions acceptance/bundle/deploy/bind/basic/databricks.yml
@@ -0,0 +1,23 @@
bundle:
  name: test-bind-basic

resources:
  jobs:
    foo:
      name: test-bind-job
      environments:
        - environment_key: default
          spec:
            client: "1"
      tasks:
        - task_key: my_task
          environment_key: default
          spark_python_task:
            python_file: ./hello.py

targets:
  default:
    bind:
      jobs:
        foo:
          id: "PLACEHOLDER_JOB_ID"
1 change: 1 addition & 0 deletions acceptance/bundle/deploy/bind/basic/hello.py
@@ -0,0 +1 @@
print("hello")
3 changes: 3 additions & 0 deletions acceptance/bundle/deploy/bind/basic/out.test.toml

66 changes: 66 additions & 0 deletions acceptance/bundle/deploy/bind/basic/output.txt
@@ -0,0 +1,66 @@

>>> [CLI] bundle plan
bind jobs.foo (id: [NEW_JOB_ID])

Plan: 0 to add, 1 to change, 0 to delete, 0 unchanged, 1 to bind

>>> [CLI] bundle deploy --auto-approve
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bind-basic/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!

>>> [CLI] bundle plan
Plan: 0 to add, 0 to change, 0 to delete, 1 unchanged

>>> print_state.py
{
  "state_version": 2,
  "cli_version": "[DEV_VERSION]",
  "lineage": "[UUID]",
  "serial": 1,
  "state": {
    "resources.jobs.foo": {
      "__id__": "[NEW_JOB_ID]",
      "state": {
        "deployment": {
          "kind": "BUNDLE",
          "metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/test-bind-basic/default/state/metadata.json"
        },
        "edit_mode": "UI_LOCKED",
        "environments": [
          {
            "environment_key": "default",
            "spec": {
              "client": "1"
            }
          }
        ],
        "format": "MULTI_TASK",
        "max_concurrent_runs": 1,
        "name": "test-bind-job",
        "queue": {
          "enabled": true
        },
        "tasks": [
          {
            "environment_key": "default",
            "spark_python_task": {
              "python_file": "/Workspace/Users/[USERNAME]/.bundle/test-bind-basic/default/files/hello.py"
            },
            "task_key": "my_task"
          }
        ]
      }
    }
  }
}

>>> [CLI] bundle destroy --auto-approve
The following resources will be deleted:
delete resources.jobs.foo

All files and directories at the following location will be deleted: /Workspace/Users/[USERNAME]/.bundle/test-bind-basic/default

Deleting files...
Destroy complete!
37 changes: 37 additions & 0 deletions acceptance/bundle/deploy/bind/basic/script
@@ -0,0 +1,37 @@
# Create a job in the workspace
NEW_JOB_ID=$($CLI jobs create --json '{"name": "test-import-job", "environments": [{"environment_key": "default", "spec": {"client": "1"}}], "tasks": [{"task_key": "my_task", "environment_key": "default", "spark_python_task": {"python_file": "/Workspace/test.py"}}]}' | jq -r .job_id)
add_repl.py $NEW_JOB_ID NEW_JOB_ID

# Update the databricks.yml with the actual job ID
update_file.py databricks.yml 'PLACEHOLDER_JOB_ID' "$NEW_JOB_ID"

# Run plan - should show bind action
trace $CLI bundle plan

# Deploy with auto-approve
trace $CLI bundle deploy --auto-approve

# Plan again - should show no changes (resource reported as unchanged)
trace $CLI bundle plan

# Verify state file contains the bound job ID
trace print_state.py | contains.py "$NEW_JOB_ID"

# Remove the bind block before destroy (destroy is blocked while bind blocks are active)
python3 << 'PYSCRIPT'
import re
with open('databricks.yml', 'r') as f:
    content = f.read()

# bind is the last key in the file, so strip everything from "bind:" to EOF
content = re.sub(r'\n    bind:.*', '', content, flags=re.DOTALL)

with open('databricks.yml', 'w') as f:
    f.write(content)
PYSCRIPT
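
After the heredoc runs, only the now-empty target remains in databricks.yml, roughly:

targets:
  default:

so the destroy below no longer sees an active bind block.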

# Remove .databricks directory that might cache old config
rm -rf .databricks

# Cleanup
trace $CLI bundle destroy --auto-approve
23 changes: 23 additions & 0 deletions acceptance/bundle/deploy/bind/bind-and-update/databricks.yml
@@ -0,0 +1,23 @@
bundle:
  name: test-bind-update

resources:
  jobs:
    foo:
      name: updated-job-name
      environments:
        - environment_key: default
          spec:
            client: "1"
      tasks:
        - task_key: my_task
          environment_key: default
          spark_python_task:
            python_file: ./hello.py

targets:
  default:
    bind:
      jobs:
        foo:
          id: "PLACEHOLDER_JOB_ID"
1 change: 1 addition & 0 deletions acceptance/bundle/deploy/bind/bind-and-update/hello.py
@@ -0,0 +1 @@
print("hello")
3 changes: 3 additions & 0 deletions acceptance/bundle/deploy/bind/bind-and-update/out.test.toml

37 changes: 37 additions & 0 deletions acceptance/bundle/deploy/bind/bind-and-update/output.txt
@@ -0,0 +1,37 @@

>>> [CLI] bundle plan
bind jobs.foo (id: [NEW_JOB_ID])

Plan: 0 to add, 1 to change, 0 to delete, 0 unchanged, 1 to bind

>>> [CLI] bundle deploy --auto-approve
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bind-update/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!

>>> [CLI] jobs get [NEW_JOB_ID]
updated-job-name

>>> [CLI] bundle plan
update jobs.foo

Plan: 0 to add, 1 to change, 0 to delete, 0 unchanged

>>> [CLI] bundle deploy --auto-approve
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bind-update/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!

>>> [CLI] jobs get [NEW_JOB_ID]
second-update-name

>>> [CLI] bundle destroy --auto-approve
The following resources will be deleted:
delete resources.jobs.foo

All files and directories at the following location will be deleted: /Workspace/Users/[USERNAME]/.bundle/test-bind-update/default

Deleting files...
Destroy complete!
44 changes: 44 additions & 0 deletions acceptance/bundle/deploy/bind/bind-and-update/script
@@ -0,0 +1,44 @@
# Create a job in the workspace whose name differs from the config
NEW_JOB_ID=$($CLI jobs create --json '{"name": "original-job-name", "environments": [{"environment_key": "default", "spec": {"client": "1"}}], "tasks": [{"task_key": "my_task", "environment_key": "default", "spark_python_task": {"python_file": "/Workspace/test.py"}}]}' | jq -r .job_id)
add_repl.py $NEW_JOB_ID NEW_JOB_ID

# Update the databricks.yml with the actual job ID
update_file.py databricks.yml 'PLACEHOLDER_JOB_ID' "$NEW_JOB_ID"

# Run plan - should show bind action (name differs from config)
trace $CLI bundle plan

# Deploy with auto-approve
trace $CLI bundle deploy --auto-approve

# Verify the job was updated
trace $CLI jobs get $NEW_JOB_ID | jq -r .settings.name

# Now update the job name again in the config and deploy again.
# This time the action should be "update", not "bind", since the resource
# is already bound in state.
update_file.py databricks.yml 'updated-job-name' 'second-update-name'
trace $CLI bundle plan
trace $CLI bundle deploy --auto-approve

# Verify the job was updated with the second name
trace $CLI jobs get $NEW_JOB_ID | jq -r .settings.name

# Remove the bind block before destroy (destroy is blocked while bind blocks are active)
python3 << 'PYSCRIPT'
import re
with open('databricks.yml', 'r') as f:
    content = f.read()

# bind is the last key in the file, so strip everything from "bind:" to EOF
content = re.sub(r'\n    bind:.*', '', content, flags=re.DOTALL)

with open('databricks.yml', 'w') as f:
    f.write(content)
PYSCRIPT

# Remove .databricks directory that might cache old config
rm -rf .databricks

# Cleanup
trace $CLI bundle destroy --auto-approve
21 changes: 21 additions & 0 deletions acceptance/bundle/deploy/bind/bind-permissions/databricks.yml
@@ -0,0 +1,21 @@
bundle:
  name: test-bind-permissions

resources:
  jobs:
    foo:
      name: test-job
      tasks:
        - task_key: test
          notebook_task:
            notebook_path: ./nb.py
      permissions:
        - group_name: users
          level: CAN_MANAGE

targets:
  default:
    bind:
      jobs.permissions:
        foo:
          id: "12345"
2 changes: 2 additions & 0 deletions acceptance/bundle/deploy/bind/bind-permissions/nb.py
@@ -0,0 +1,2 @@
# Databricks notebook source
print("Hello, World!")
3 changes: 3 additions & 0 deletions acceptance/bundle/deploy/bind/bind-permissions/out.test.toml

25 changes: 25 additions & 0 deletions acceptance/bundle/deploy/bind/bind-permissions/output.txt
@@ -0,0 +1,25 @@

>>> musterr [CLI] bundle validate
Error: binding jobs.permissions is not allowed

bind can only be used for resources directly under the resources block, not for child resources like permissions or grants.

To manage permissions or grants:
1. First bind the parent resource (without .permissions or .grants)
2. Then define permissions or grants in your bundle configuration

Invalid bind configuration:
  bind:
    jobs.permissions:
      foo:
        id: ...

Instead, remove this bind entry and ensure the parent resource is bound.

Name: test-bind-permissions
Target: default
Workspace:
  User: [USERNAME]
  Path: /Workspace/Users/[USERNAME]/.bundle/test-bind-permissions/default

Found 1 error
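
Per the guidance in the error message, the fix would be to bind the parent resource and keep permissions in the bundle configuration — a sketch based on this test's config:

targets:
  default:
    bind:
      jobs:          # bind the parent resource, not jobs.permissions
        foo:
          id: "12345"

with permissions left under resources.jobs.foo as already defined in databricks.yml above.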
3 changes: 3 additions & 0 deletions acceptance/bundle/deploy/bind/bind-permissions/script
@@ -0,0 +1,3 @@
#!/bin/bash

trace musterr $CLI bundle validate
23 changes: 23 additions & 0 deletions acceptance/bundle/deploy/bind/block-migrate/databricks.yml
@@ -0,0 +1,23 @@
bundle:
  name: test-bind-block-migrate

resources:
  jobs:
    foo:
      name: test-job
      environments:
        - environment_key: default
          spec:
            client: "1"
      tasks:
        - task_key: my_task
          environment_key: default
          spark_python_task:
            python_file: ./hello.py

targets:
  default:
    bind:
      jobs:
        foo:
          id: "12345"
1 change: 1 addition & 0 deletions acceptance/bundle/deploy/bind/block-migrate/hello.py
@@ -0,0 +1 @@
print("hello")
3 changes: 3 additions & 0 deletions acceptance/bundle/deploy/bind/block-migrate/out.test.toml

3 changes: 3 additions & 0 deletions acceptance/bundle/deploy/bind/block-migrate/output.txt
@@ -0,0 +1,3 @@

>>> musterr [CLI] bundle deployment migrate
Error: cannot run 'bundle deployment migrate' when bind blocks are defined in the target configuration; bind blocks are only supported with the direct deployment engine
2 changes: 2 additions & 0 deletions acceptance/bundle/deploy/bind/block-migrate/script
@@ -0,0 +1,2 @@
# Try to run migration with bind blocks - should fail
trace musterr $CLI bundle deployment migrate
2 changes: 2 additions & 0 deletions acceptance/bundle/deploy/bind/block-migrate/test.toml
@@ -0,0 +1,2 @@
# Migration test does not need engine matrix
[EnvMatrix]
@@ -0,0 +1,19 @@
bundle:
  name: test-bind-delete-conflict

resources:
  jobs:
    foo:
      name: test-bind-delete-job
      environments:
        - environment_key: default
          spec:
            client: "1"
      tasks:
        - task_key: my_task
          environment_key: default
          spark_python_task:
            python_file: ./hello.py

targets:
  default:
@@ -0,0 +1,23 @@
bundle:
  name: test-bind-delete-conflict

resources:
  jobs:
    bar:
      name: test-bind-delete-job
      environments:
        - environment_key: default
          spec:
            client: "1"
      tasks:
        - task_key: my_task
          environment_key: default
          spark_python_task:
            python_file: ./hello.py

targets:
  default:
    bind:
      jobs:
        bar:
          id: "PLACEHOLDER_JOB_ID"
@@ -0,0 +1 @@
print("hello")