
dbt Cross-Database Macros

Hub for writing dbt macros that work across BigQuery, Snowflake, and Databricks — dialect differences, dispatch configuration, built-in macros, and array operations.

Planted
dbt · bigquery · snowflake · databricks · data engineering · data modeling

This hub covers writing macros that adapt automatically to different warehouses. It applies when working across multiple SQL dialects or publishing macros as a package. The notes are ordered from understanding the problem to applying the solutions.

Reading Order

  1. SQL Dialect Divergences Across Warehouses — maps where BigQuery, Snowflake, and Databricks disagree: date functions, type casting, and array operations.
  2. dbt Built-In Cross-Database Macros — the dbt.* namespace functions that handle the most common divergences (dates, strings, types, casting) without custom code.
  3. dbt Cross-Database Array Operations — array operations have no built-in equivalents; bridging BigQuery's UNNEST, Snowflake's LATERAL FLATTEN, and Databricks' EXPLODE requires custom dispatch macros. This note shows the patterns.
  4. dbt Dispatch Configuration — how to control where dbt looks for adapter-specific implementations: overriding package macros, adding Databricks support via spark_utils, and namespace resolution in dbt_project.yml.
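To make the built-ins in item 2 concrete, here is a minimal sketch of a model that leans on the `dbt.*` namespace instead of dialect-specific SQL. The model and column names are hypothetical; the macro calls (`dbt.dateadd`, `dbt.date_trunc`, `dbt.concat`) are real dbt-core cross-database macros available since dbt 1.2:

```sql
-- models/orders_enriched.sql (hypothetical model and columns)
select
    order_id,
    -- compiles to DATE_ADD on BigQuery, DATEADD on Snowflake
    {{ dbt.dateadd('day', 7, 'ordered_at') }} as due_at,
    {{ dbt.date_trunc('month', 'ordered_at') }} as order_month,
    -- concat takes a list of expressions; the middle one is a literal space
    {{ dbt.concat(['first_name', "' '", 'last_name']) }} as customer_name
from {{ ref('orders') }}
```

The same model compiles to valid SQL on all three warehouses because each macro resolves to the adapter's implementation at compile time.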

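For gaps the built-ins don't cover (item 3), the escape hatch is `adapter.dispatch`. A sketch of the pattern, assuming a hypothetical `array_contains` macro in the root project: the `default__` branch serves Databricks (Spark's `array_contains` takes the array first), while BigQuery and Snowflake get their own overrides:

```sql
-- macros/array_contains.sql (hypothetical macro)
{% macro array_contains(array_col, value) %}
    {{ return(adapter.dispatch('array_contains')(array_col, value)) }}
{% endmacro %}

{# default: Spark/Databricks signature array_contains(array, value) #}
{% macro default__array_contains(array_col, value) %}
    array_contains({{ array_col }}, {{ value }})
{% endmacro %}

{# BigQuery has no array_contains; use IN UNNEST #}
{% macro bigquery__array_contains(array_col, value) %}
    {{ value }} in unnest({{ array_col }})
{% endmacro %}

{# Snowflake takes the value first, and it must be a variant #}
{% macro snowflake__array_contains(array_col, value) %}
    array_contains({{ value }}::variant, {{ array_col }})
{% endmacro %}
```

When packaging macros like this, pass your package name as the `macro_namespace` argument to `adapter.dispatch` so downstream projects can override the implementations.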
Connections

The dispatch pattern itself is documented in dbt Macros, which covers macro fundamentals, Jinja templating, and when macros help vs. hurt.
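The dispatch configuration in item 4 is a few lines of `dbt_project.yml`. A sketch of the documented pattern for adding Databricks support to `dbt_utils` via the `spark_utils` shim package:

```yaml
# dbt_project.yml
dispatch:
  - macro_namespace: dbt_utils
    search_order: ['spark_utils', 'dbt_utils']
```

With this in place, any call into the `dbt_utils` namespace checks `spark_utils` for a `spark__`-prefixed implementation before falling back to `dbt_utils` itself.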

For packaging cross-database macros for reuse, see dbt Package Anatomy for structure and dbt Package CI/CD for matrix testing across warehouses and dbt versions.