Note

When to Write Custom dbt Materializations

Decision framework for when custom dbt materializations are worth the maintenance burden versus post-hooks, macros, or built-in incremental strategies.

Planted
dbt · data engineering · data modeling

Custom materializations add real maintenance burden. They’re Jinja code that sits between dbt and your warehouse, invisible to most team members, and they break in ways that are harder to debug than regular models. Before writing one, exhaust the built-in options. Most “I need a custom materialization” situations are actually “I haven’t fully explored what dbt already does.”

What dbt Already Handles

The incremental materialization is more flexible than it looks, and it covers the majority of cases where people first consider going custom.

Append-only behavior is built in. Set incremental_strategy='append' to insert rows without merging or updating. Add full_refresh: false to prevent --full-refresh from dropping the table. This handles audit logs, event streams, and any table where historical rows should never be touched.

{{ config(
    materialized='incremental',
    incremental_strategy='append',
    full_refresh=false
) }}

Partition-level overwrites are built in too. The insert_overwrite strategy in dbt-bigquery replaces entire partitions instead of merging individual rows. It builds a temp table, identifies affected partitions, and swaps them atomically.

{{ config(
    materialized='incremental',
    incremental_strategy='insert_overwrite',
    partition_by={'field': 'event_date', 'data_type': 'date'}
) }}

The delete+insert strategy handles row-level replacement by unique key. Between these three strategies and the full_refresh config, most incremental patterns are covered without custom code.
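For completeness, a delete+insert config looks much like the others — a sketch, assuming a hypothetical unique key column named order_id:

{{ config(
    materialized='incremental',
    incremental_strategy='delete+insert',
    unique_key='order_id'
) }}

On each incremental run, target rows whose order_id appears in the new batch are deleted, then the batch is inserted — row-level replacement without a MERGE.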

The Alternatives to Exhaust First

Before writing a custom materialization, work through these options in order:

Post-hooks handle many post-build operations. If you need a single GRANT after build, one ALTER TABLE, or a security policy on a specific model, a post-hook is simpler:

{{ config(
    materialized='table',
    post_hook="GRANT `roles/bigquery.dataViewer` ON TABLE {{ this }} TO 'group:analysts@company.com'"
) }}

Post-hooks break down when the pattern is complex (multiple statements with nested quoting), when it repeats across many models (copy-pasting walls of SQL), or when the ordering of operations matters in ways hooks can’t control.

Macros in the model SQL can generate complex statements without touching materializations. A macro that builds a MERGE statement with custom conflict resolution logic keeps the complexity visible in the model file rather than hidden in a materialization.
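As a sketch of that approach — the macro name, columns, and conflict rule here are hypothetical, not from any library — the MERGE lives in one macro file instead of a wall of post-hook SQL:

{% macro merge_upsert(target, source, key) %}
    merge into {{ target }} as t
    using {{ source }} as s
    on t.{{ key }} = s.{{ key }}
    when matched and s.updated_at > t.updated_at
        then update set t.status = s.status, t.updated_at = s.updated_at
    when not matched
        then insert ({{ key }}, status, updated_at)
        values (s.{{ key }}, s.status, s.updated_at)
{% endmacro %}

A model or hook can then call {{ merge_upsert(this, ref('stg_orders'), 'order_id') }}, keeping the conflict-resolution logic visible and reusable.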

on-run-end hooks handle project-wide operations that should happen after every dbt run, like revoking temporary permissions or cleaning up staging schemas.
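In dbt_project.yml that looks like the following — the staging-schema naming is an assumption for illustration; target.schema is a real dbt context variable:

on-run-end:
  - "drop schema if exists {{ target.schema }}_staging cascade"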

When Custom Materializations Are Worth It

Write a custom materialization when:

  • The pattern is complex enough that copy-pasting would be error-prone. A secured table with three row access policies, eight column descriptions, and data masking tags means a dozen SQL statements in post-hooks. That’s not maintainable. A materialization that reads a config dict and generates the right DDL is.

  • You need the behavior across many models. If thirty models need the same zero-downtime swap pattern, a materialization gives you one place to maintain the logic instead of thirty sets of post-hooks.

  • Built-in materializations actively work against what you need. The standard table materialization drops and recreates. If you need the old table to remain queryable until the new one is validated and ready, you need to control the swap process yourself.

  • You need validation before replacement. Checking row counts, schema compatibility, or business rules before committing a table swap isn’t something post-hooks can do — by the time a post-hook fires, the old table is already gone.
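To make the first bullet concrete, a model using such a materialization might look like this — the materialization name and config keys are hypothetical, not dbt built-ins:

{{ config(
    materialized='secured_table',
    row_access_policies=['policy_region', 'policy_department'],
    masking_tags={'email': 'pii_mask'}
) }}

select * from {{ ref('stg_customers') }}

The materialization reads that config dict and generates the dozen DDL statements itself, so each model declares intent rather than repeating SQL.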

When to Skip the Custom Materialization

Skip it when:

  • A post-hook solves the problem. One or two additional DDL statements after a build don’t justify a full materialization.

  • Only one or two models need the behavior. The overhead of understanding and maintaining a custom materialization isn’t worth it for isolated cases. Put the logic in a macro or a post-hook and move on.

  • You’re the only one who’ll maintain it. Custom materializations are among the hardest dbt code to debug. If nobody else on the team understands Jinja well enough to troubleshoot materialization failures, you’ve created a single point of failure in your project.

  • The behavior is better expressed as a macro. If what you really need is a custom MERGE statement or a specific CREATE TABLE syntax, write it as a macro that the model calls directly. You get the reuse without the materialization overhead.

The Complexity Spectrum

Think of it as a spectrum from simplest to most complex:

  1. Inline SQL — one-off behavior in a single model
  2. Post-hooks — simple post-build operations, one to two statements
  3. Macros — reusable SQL generation, called explicitly from models
  4. Custom materializations — full lifecycle control, many models, complex multi-step operations

Each level adds power and maintenance burden. Jump to a higher level only when the lower levels genuinely can’t solve your problem. The six-step structure of a materialization isn’t hard to understand, but every abstraction you add is an abstraction someone has to debug at 2 AM when the pipeline fails.
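That structure, stripped to a skeleton, follows dbt's standard materialization anatomy (run_hooks, statement, and the relations return are real dbt constructs; this is a minimal sketch, not the full validate-and-swap implementation):

{% materialization zero_downtime_table, default %}

    {%- set target_relation = this.incorporate(type='table') -%}

    {{ run_hooks(pre_hooks) }}

    {# a real version would build a temp relation, validate it,
       then swap it in; this sketch builds directly #}
    {% call statement('main') -%}
        create or replace table {{ target_relation }} as (
            {{ sql }}
        )
    {%- endcall %}

    {{ run_hooks(post_hooks) }}

    {{ return({'relations': [target_relation]}) }}

{% endmaterialization %}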

The zero-downtime table pattern is the simpler of the common custom materializations and a good place to learn the structure. It solves a clear, concrete problem (build-time gaps and unvalidated swaps) and follows the standard anatomy without too many moving parts. Once you’re comfortable with that, the secured table pattern adds more post-build operations while following the same principles.