Describe the bug
dbt docs generate fails with a RuntimeError when any schema in scope returns zero rows from SHOW TABLE EXTENDED. The empty result causes agate to infer text columns (e.g. column_name) as Number. When merged with a non-empty schema result where column_name is Text, agate.Table.merge() throws:
RuntimeError: Tables contain columns with the same names (column_name), but different types
(<agate.data_types.text.Text object at 0x7fc703f6e590> vs
<dbt_common.clients.agate_helper.Number object at 0x7fc70443c1d0>)
The root cause is that _get_schema_for_catalog in dbt-databricks builds the agate table via Table.from_object(columns, DEFAULT_TYPE_TESTER) without using text_only_columns to force metadata columns to Text. When columns is empty (no tables in the schema), agate's type inference defaults everything to Number.
The base adapter's _catalog_filter_table already handles this via text_only_columns (added in dbt-labs/dbt-adapters#1057), but dbt-databricks's catalog pipeline (_get_schema_for_catalog) never calls _catalog_filter_table, so it doesn't benefit from that fix.
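The failure mode above can be demonstrated without agate at all. In this minimal, dependency-free sketch, `infer_column_type` and `merge_tables` are hypothetical stand-ins for agate's type inference and `agate.Table.merge()` (they are not the real agate API), but they reproduce the same logic: with zero rows nothing disproves the numeric candidate, so the empty schema's `column_name` infers as Number and clashes with the populated schema's Text.

```python
def infer_column_type(values, force_text=frozenset(), name=""):
    """Mimic agate-style inference: Number wins unless a value disproves it."""
    if name in force_text:
        return "Text"          # text_only_columns bypasses inference entirely
    if not values:
        return "Number"        # no rows -> nothing disproves Number
    return "Number" if all(str(v).isdigit() for v in values) else "Text"

def merge_tables(types_a, types_b):
    """Mimic agate.Table.merge(): same-named columns must share a type."""
    for col in types_a.keys() & types_b.keys():
        if types_a[col] != types_b[col]:
            raise RuntimeError(
                f"Tables contain columns with the same names ({col}), "
                f"but different types ({types_a[col]} vs {types_b[col]})"
            )
    return {**types_a, **types_b}

# Schema A has rows; schema B is empty (SHOW TABLE EXTENDED returned nothing).
populated = {"column_name": infer_column_type(["id", "name"], name="column_name")}
empty = {"column_name": infer_column_type([], name="column_name")}

try:
    merge_tables(populated, empty)       # Text vs Number -> RuntimeError
except RuntimeError as exc:
    print(exc)

# Forcing metadata columns to Text (the text_only_columns idea) avoids the clash.
forced = frozenset({"column_name"})
empty_fixed = {"column_name": infer_column_type([], force_text=forced,
                                                name="column_name")}
print(merge_tables(populated, empty_fixed))  # {'column_name': 'Text'}
```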
Steps To Reproduce
- Create a dbt project targeting a Databricks Unity Catalog workspace with two schemas in scope:
  - Schema A: contains at least one materialized table
  - Schema B: exists but contains no tables matching the dbt project scope (i.e., SHOW TABLE EXTENDED returns zero rows)
- Run only the populated model:
  dbt run --select schema_a.model_a
- Trigger catalog generation:
  dbt docs generate
Expected behavior
dbt docs generate should complete successfully. Empty catalog results should not cause type conflicts during merge.
Screenshots and log output
14:46:05 On ('your_catalog', 'repro_empty'): Close
14:46:10 SQL status: OK in 4.830 seconds
14:46:10 On ('your_catalog', 'repro_populated'): Close
14:46:10 Encountered an error:
Runtime Error
Tables contain columns with the same names (column_name), but different types
(<agate.data_types.text.Text object at 0x7fc703f6e590> vs
<dbt_common.clients.agate_helper.Number object at 0x7fc70443c1d0>)
System information
- dbt version: 2026.3.11 (also reproducible on other versions)
- Adapter: dbt-databricks 1.11.x
- OS: Linux (dbt Cloud), also reproducible locally
Additional context
Two possible fixes:
- Handle empty results when merging in the base adapter's catch_as_completed. This prevents the crash for all adapters.
- Use text_only_columns (or call _catalog_filter_table) in _get_schema_for_catalog so column types are correct regardless of whether the result is empty. This aligns with how the base adapter handles it after "Fix #1056: Keep table and column metadata as strings" (dbt-labs/dbt-adapters#1057).
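The second option can be sketched in a dependency-free form. Here `build_catalog_table` is a hypothetical stand-in for `Table.from_object(columns, DEFAULT_TYPE_TESTER)`, and `CATALOG_TEXT_COLUMNS` is an illustrative set of metadata column names; the real fix would use agate / dbt_common helpers inside dbt-databricks's `_get_schema_for_catalog`. The point is only the shape: forced-Text columns skip inference, so a zero-row result still types `column_name` as Text and merges cleanly.

```python
# Illustrative set of catalog metadata columns that must stay Text,
# mirroring the intent of the base adapter's _catalog_filter_table.
CATALOG_TEXT_COLUMNS = frozenset(
    {"table_database", "table_schema", "table_name", "column_name"}
)

def build_catalog_table(rows, column_names, text_only_columns=frozenset()):
    """Assign a type per column: forced-Text names skip inference entirely."""
    types = {}
    for i, name in enumerate(column_names):
        if name in text_only_columns:
            types[name] = "Text"
        else:
            values = [row[i] for row in rows]
            # Like agate's default inference, an empty column falls
            # through to Number because no value disproves it.
            types[name] = ("Text"
                           if any(not str(v).isdigit() for v in values)
                           else "Number")
    return types

# Zero rows: without forcing, every column would infer as Number; with
# text_only_columns, metadata columns stay Text and merge safely later.
empty = build_catalog_table([], ["column_name", "stats:bytes:value"],
                            text_only_columns=CATALOG_TEXT_COLUMNS)
print(empty)   # {'column_name': 'Text', 'stats:bytes:value': 'Number'}
```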