Before writing custom dispatch macros for cross-database SQL differences, check whether dbt already provides what you need. Since dbt Core v1.2, the most common cross-database functions live in the dbt namespace as part of dbt-core itself. No extra packages required.
The Reference Table
| Macro | Purpose | Example |
|---|---|---|
| dbt.dateadd() | Add intervals to dates | {{ dbt.dateadd("day", 7, "order_date") }} |
| dbt.datediff() | Difference between dates | {{ dbt.datediff("start_date", "end_date", "day") }} |
| dbt.date_trunc() | Truncate to precision | {{ dbt.date_trunc("month", "created_at") }} |
| dbt.concat() | Concatenate strings | {{ dbt.concat(["first_name", "' '", "last_name"]) }} |
| dbt.safe_cast() | Cast with NULL on failure | {{ dbt.safe_cast("col", api.Column.translate_type("string")) }} |
| dbt.type_string() | Adapter's string type | CAST(x AS {{ dbt.type_string() }}) |
| dbt.type_timestamp() | Adapter's timestamp type | CAST(x AS {{ dbt.type_timestamp() }}) |
| dbt.type_int() | Adapter's integer type | CAST(x AS {{ dbt.type_int() }}) |
These macros handle argument reordering, function name differences, and type translation automatically. On BigQuery, dbt.dateadd("day", 7, "order_date") compiles to DATE_ADD(order_date, INTERVAL 7 DAY). On Snowflake, the same call produces DATEADD('day', 7, order_date).
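To make that concrete, here is a sketch of a model that leans on several of these macros at once. The model, source, and column names are hypothetical; the point is that the same file compiles to valid SQL on each adapter:

```sql
-- models/order_facts.sql: portable across BigQuery, Snowflake, and Databricks
select
    order_id,
    {{ dbt.date_trunc("month", "ordered_at") }} as order_month,
    {{ dbt.dateadd("day", 30, "ordered_at") }} as payment_due_at,
    {{ dbt.datediff("ordered_at", "shipped_at", "day") }} as days_to_ship,
    {{ dbt.concat(["first_name", "' '", "last_name"]) }} as customer_name
from {{ ref('stg_orders') }}
```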
Date Functions
Date functions cover the most common cross-database pain point. The dbt.dateadd() macro takes three arguments in a consistent order regardless of the target database:
```sql
{{ dbt.dateadd(datepart="day", interval=7, from_date_or_timestamp="order_date") }}
```

For date differences:

```sql
{{ dbt.datediff("start_date", "end_date", "day") }}
```

Date truncation:

```sql
{{ dbt.date_trunc("month", "created_at") }}
```

These handle the argument ordering differences between BigQuery, Snowflake, and Databricks automatically. For custom date logic beyond what the built-ins provide (fiscal quarters, business day calculations), use the dispatch pattern with adapter-specific implementations:
```sql
{% macro get_fiscal_quarter_start(date_column) %}
    {{ return(adapter.dispatch('get_fiscal_quarter_start')(date_column)) }}
{% endmacro %}

{% macro default__get_fiscal_quarter_start(date_column) %}
    date_trunc('quarter', {{ date_column }})
{% endmacro %}

{% macro bigquery__get_fiscal_quarter_start(date_column) %}
    date_trunc({{ date_column }}, quarter)
{% endmacro %}
```

String Concatenation
dbt.concat() handles the differences between databases that use the CONCAT() function, those that prefer the || operator, and those that have limits on argument count:
```sql
{{ dbt.concat(["first_name", "' '", "last_name"]) }} AS full_name
```

This is particularly useful because some databases limit CONCAT() to two arguments while others accept many. The macro handles it consistently.
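Because dbt.concat() takes a list, you can also assemble that list in Jinja first. A sketch with hypothetical model and column names:

```sql
{% set address_parts = ["street", "', '", "city", "', '", "state"] %}
select
    {{ dbt.concat(address_parts) }} as full_address
from {{ ref('stg_customers') }}
```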
Safe Casting and Type Helpers
Safe casting returns NULL instead of raising an error when a conversion fails. The underlying function name differs across databases (SAFE_CAST on BigQuery, TRY_CAST on Snowflake and Databricks; adapters without a safe-cast function fall back to a plain CAST), but dbt.safe_cast() abstracts this:
```sql
{{ dbt.safe_cast("user_input", api.Column.translate_type("string")) }}
```

The api.Column.translate_type() function converts generic type names to adapter-specific equivalents: "string" becomes STRING on BigQuery and VARCHAR on Snowflake. Combine it with safe_cast for fully portable type conversions.
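Putting the two together in a staging model might look like this (source and column names are hypothetical):

```sql
select
    raw_id,
    {{ dbt.safe_cast("raw_age", api.Column.translate_type("integer")) }} as age,
    {{ dbt.safe_cast("raw_note", api.Column.translate_type("string")) }} as note
from {{ source('app', 'users') }}
```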
For quick access to the most common types without translate_type():
```sql
-- These return the correct type name for your adapter
CAST({{ column }} AS {{ dbt.type_string() }})    -- STRING or VARCHAR
CAST({{ column }} AS {{ dbt.type_int() }})       -- INT64 or INTEGER
CAST({{ column }} AS {{ dbt.type_timestamp() }}) -- TIMESTAMP
```

Use type helpers whenever you're generating DDL or writing type-aware SQL in macros. Hardcoding STRING works on BigQuery but fails on Snowflake.
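As an illustration, here is a hypothetical macro that emits portable DDL for an audit table using the type helpers; the macro name and columns are made up for the example:

```sql
{% macro audit_table_ddl(relation_name) %}
    -- Column types resolve per adapter at compile time
    create table {{ relation_name }} (
        event_id    {{ dbt.type_int() }},
        event_name  {{ dbt.type_string() }},
        occurred_at {{ dbt.type_timestamp() }}
    )
{% endmacro %}
```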
Migration from dbt_utils
Many of these macros used to live in the dbt_utils package. They moved into the dbt namespace in dbt Core v1.2, and dbt-utils v1.0 dropped its own copies in favor of the built-ins. The migration is straightforward:
| Old (deprecated) | New |
|---|---|
| dbt_utils.dateadd() | dbt.dateadd() |
| dbt_utils.datediff() | dbt.datediff() |
| dbt_utils.date_trunc() | dbt.date_trunc() |
| dbt_utils.concat() | dbt.concat() |
| dbt_utils.safe_cast() | dbt.safe_cast() |
| dbt_utils.type_string() | dbt.type_string() |
The dbt_utils versions were deprecated and then removed in dbt-utils v1.0, so upgrading the package forces the rename. If you're upgrading an older project, a global find-and-replace from dbt_utils.dateadd to dbt.dateadd (and likewise for each function) handles the migration. No argument changes are needed; the signatures are identical.
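One way to script that rename, sketched with GNU sed (the macro list and the models/ path are assumptions about your project layout; on macOS/BSD, use sed -i ''):

```shell
# Rewrite deprecated dbt_utils.* cross-database calls to the dbt.* namespace
# in every model file. Signatures are identical, so a textual rename suffices.
for fn in dateadd datediff date_trunc concat safe_cast type_string; do
  grep -rl "dbt_utils\.${fn}" models/ | xargs -r sed -i "s/dbt_utils\.${fn}/dbt.${fn}/g"
done
```

Commit before running it, and review the diff; a text-level rename will also touch comments and docs blocks that mention the old names.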
The move to core means these macros are now available without adding dbt_utils to your packages.yml. For new projects, you may not need dbt_utils at all if you were only using it for cross-database functions. Check what else you’re importing from it — generate_surrogate_key, star, get_column_values, and date_spine still live in dbt_utils.
What the Built-Ins Don’t Cover
The dbt namespace handles dates, strings, types, and casting. It does not cover:
- Array operations: UNNEST vs LATERAL FLATTEN vs EXPLODE requires custom dispatch macros.
- JSON parsing: different functions and syntax across databases.
- Database-specific optimizations: BigQuery's clustering hints, Snowflake's RESULT_SCAN, Databricks' OPTIMIZE.
- Approximate aggregations: APPROX_COUNT_DISTINCT and similar functions vary in availability and behavior.
For these, write your own dispatch macros following the patterns in the macros note, and configure the dispatch search order if you’re building for multi-warehouse use.
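As one sketch of that pattern, here is a hypothetical JSON-extraction macro with adapter-specific implementations. The macro name, the chosen per-adapter functions, and the flat path argument are illustrative assumptions, not part of dbt:

```sql
{% macro json_extract_string(column, path) %}
    {{ return(adapter.dispatch('json_extract_string')(column, path)) }}
{% endmacro %}

{# Default: Postgres/Redshift-style path extraction #}
{% macro default__json_extract_string(column, path) %}
    json_extract_path_text({{ column }}, '{{ path }}')
{% endmacro %}

{% macro bigquery__json_extract_string(column, path) %}
    json_extract_scalar({{ column }}, '$.{{ path }}')
{% endmacro %}

{# Snowflake: colon path syntax on a VARIANT column #}
{% macro snowflake__json_extract_string(column, path) %}
    {{ column }}:{{ path }}::varchar
{% endmacro %}
```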