Update product names: Workflows→Jobs, Delta Live Tables→Spark Declarative Pipelines #4967
README for the `my_default_scala` template:

````diff
@@ -21,7 +21,7 @@ The 'my_default_scala' project was generated by using the default-scala template
 This deploys everything that's defined for this project.
 For example, the default template would deploy a job called
 `[dev yourname] my_default_scala_job` to your workspace.
-You can find that job by opening your workspace and clicking on **Workflows**.
+You can find that job by opening your workspace and clicking on **Jobs**.

 4. Similarly, to deploy a production copy, type:
 ```
````
README for the `my_default_sql` template:

````diff
@@ -21,7 +21,7 @@ The 'my_default_sql' project was generated by using the default-sql template.
 This deploys everything that's defined for this project.
 For example, the default template would deploy a job called
 `[dev yourname] my_default_sql_job` to your workspace.
-You can find that job by opening your workpace and clicking on **Workflows**.
+You can find that job by opening your workpace and clicking on **Jobs**.

 4. Similarly, to deploy a production copy, type:
 ```
````
README for the `my_jobs_as_code` template:

````diff
@@ -40,7 +40,7 @@ The 'my_jobs_as_code' project was generated by using the "Jobs as code" template
 This deploys everything that's defined for this project.
 For example, the default template would deploy a job called
 `[dev yourname] my_jobs_as_code_job` to your workspace.
-You can find that job by opening your workspace and clicking on **Workflows**.
+You can find that job by opening your workspace and clicking on **Jobs**.

 3. Similarly, to deploy a production copy, type:
 ```
````
Resource annotation strings (YAML):

```diff
@@ -328,7 +328,7 @@ github.com/databricks/cli/bundle/config/resources.ModelServingEndpoint:
 github.com/databricks/cli/bundle/config/resources.Pipeline:
   "_":
     "markdown_description": |-
-      The pipeline resource allows you to create Delta Live Tables [pipelines](/api/workspace/pipelines/create). For information about pipelines, see [_](/dlt/index.md). For a tutorial that uses the Declarative Automation Bundles template to create a pipeline, see [_](/dev-tools/bundles/pipelines-tutorial.md).
+      The pipeline resource allows you to create Spark Declarative [Pipelines](/api/workspace/pipelines/create). For information about pipelines, see [_](/dlt/index.md). For a tutorial that uses the Declarative Automation Bundles template to create a pipeline, see [_](/dev-tools/bundles/pipelines-tutorial.md).

     "markdown_examples": |-
       The following example defines a pipeline with the resource key `hello-pipeline`:
```

> **Member** [Nit] The new text reads … Consider …
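The `hello-pipeline` example body itself is not shown in this diff view. A minimal sketch of what a bundle pipeline resource with that key could look like in `databricks.yml` — the notebook path and catalog/target names below are illustrative assumptions, not the PR's actual example:

```yaml
# Hypothetical databricks.yml fragment (sketch only; path, catalog, and
# target values are assumptions for illustration).
resources:
  pipelines:
    hello-pipeline:
      name: hello-pipeline
      catalog: main
      target: default
      libraries:
        - notebook:
            path: ./pipeline.py
```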
```diff
@@ -454,7 +454,7 @@ github.com/databricks/cli/bundle/config/resources.RegisteredModel:
 github.com/databricks/cli/bundle/config/resources.Schema:
   "_":
     "markdown_description": |-
-      The schema resource type allows you to define Unity Catalog [schemas](/api/workspace/schemas/create) for tables and other assets in your workflows and pipelines created as part of a bundle. A schema, different from other resource types, has the following limitations:
+      The schema resource type allows you to define Unity Catalog [schemas](/api/workspace/schemas/create) for tables and other assets in your jobs and pipelines created as part of a bundle. A schema, different from other resource types, has the following limitations:

       - The owner of a schema resource is always the deployment user, and cannot be changed. If `run_as` is specified in the bundle, it will be ignored by operations on the schema.
       - Only fields supported by the corresponding [Schemas object create API](/api/workspace/schemas/create) are available for the schema resource. For example, `enable_predictive_optimization` is not supported as it is only available on the [update API](/api/workspace/schemas/update).
```
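For context on the schema description being edited here, a bundle schema resource is a small block of `databricks.yml`. A hedged sketch — the catalog and schema names are assumptions for illustration:

```yaml
# Hypothetical databricks.yml fragment; catalog/schema names are assumptions.
resources:
  schemas:
    my_schema:
      catalog_name: main
      name: my_schema
      comment: Owned by the deployment user; run_as is ignored for schemas.
```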
`approvalForDeploy` (Go):

```diff
@@ -36,12 +36,12 @@ func approvalForDeploy(ctx context.Context, b *bundle.Bundle, plan *deployplan.P
 	types := []deployplan.ActionType{deployplan.Recreate, deployplan.Delete}
 	schemaActions := filterGroup(actions, "schemas", types...)
-	dltActions := filterGroup(actions, "pipelines", types...)
+	pipelineActions := filterGroup(actions, "pipelines", types...)
 	volumeActions := filterGroup(actions, "volumes", types...)
 	dashboardActions := filterGroup(actions, "dashboards", types...)

 	// We don't need to display any prompts in this case.
-	if len(schemaActions) == 0 && len(dltActions) == 0 && len(volumeActions) == 0 && len(dashboardActions) == 0 {
+	if len(schemaActions) == 0 && len(pipelineActions) == 0 && len(volumeActions) == 0 && len(dashboardActions) == 0 {
 		return true, nil
 	}
@@ -56,10 +56,10 @@ func approvalForDeploy(ctx context.Context, b *bundle.Bundle, plan *deployplan.P
 	}

-	// One or more DLT pipelines is being recreated.
-	if len(dltActions) != 0 {
+	// One or more SDP pipelines is being recreated.
+	if len(pipelineActions) != 0 {
 		cmdio.LogString(ctx, deleteOrRecreatePipelineMessage)
-		for _, action := range dltActions {
+		for _, action := range pipelineActions {
 			cmdio.Log(ctx, action)
 		}
 	}
```

> **Member** [Suggestion] Nice rename from …
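The rename of `dltActions` to `pipelineActions` is mechanical, but the surrounding pattern — filter a flat list of planned actions by resource group and action type before deciding whether to prompt — is worth seeing in isolation. A self-contained sketch; the real `filterGroup` works on `deployplan` types, and all names here are hypothetical stand-ins:

```go
package main

import (
	"fmt"
	"strings"
)

// ActionType stands in for deployplan.ActionType in this sketch.
type ActionType string

const (
	Recreate ActionType = "recreate"
	Delete   ActionType = "delete"
	Update   ActionType = "update"
)

// Action is one planned change, keyed like "pipelines.ingest".
type Action struct {
	Key  string
	Type ActionType
}

// filterGroup keeps actions whose key belongs to the given resource group
// and whose type matches any of the listed types — the shape of the helper
// used in approvalForDeploy.
func filterGroup(actions []Action, group string, types ...ActionType) []Action {
	var out []Action
	for _, a := range actions {
		if !strings.HasPrefix(a.Key, group+".") {
			continue
		}
		for _, t := range types {
			if a.Type == t {
				out = append(out, a)
				break
			}
		}
	}
	return out
}

func main() {
	actions := []Action{
		{Key: "pipelines.ingest", Type: Recreate}, // destructive: prompt
		{Key: "pipelines.report", Type: Update},   // benign: filtered out
		{Key: "schemas.raw", Type: Delete},        // destructive: prompt
	}
	types := []ActionType{Recreate, Delete}
	pipelineActions := filterGroup(actions, "pipelines", types...)
	schemaActions := filterGroup(actions, "schemas", types...)
	fmt.Println(len(pipelineActions), len(schemaActions)) // prints: 1 1
}
```

Only destructive action types (`Recreate`, `Delete`) survive the filter, so the caller can return early and skip the approval prompt when every group comes back empty.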