dbt model contracts, introduced in Core v1.5 (April 2023), enforce schema guarantees at build time. When you set contract: {enforced: true} on a model, dbt adds a preflight check that prevents the model from materializing if its output doesn’t match the YAML declaration. Three years of production use have made the feature’s strengths and limitations clear.
The Two-Step Enforcement
When a contract is enabled, dbt does two things during the build.
First, a compile-time preflight check. dbt compares the columns your SQL query returns against what you’ve declared in YAML. It checks column names and data types. If there’s a mismatch, dbt throws a compilation error before any table is created. The model never materializes.
Second, DDL constraint inclusion. dbt includes your declared constraints in the DDL statements sent to the warehouse. Instead of a bare CREATE TABLE, the DDL specifies column types and constraints like NOT NULL or PRIMARY KEY. Whether the warehouse actually enforces those constraints is a separate question entirely.
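To make that concrete, here is a rough sketch of the DDL shape dbt emits for a contracted table model on a Postgres-style warehouse. This is illustrative, not the exact statement any adapter generates — schema name and column details are borrowed from the example contract later in this piece:

```sql
-- Approximate shape of contracted-model DDL (adapter-specific details vary).
-- With a contract, dbt creates the table with explicit columns and constraints,
-- then inserts the model query's results, instead of a bare CREATE TABLE AS.
create table analytics.mrt__analytics__customers (
    customer__id             integer not null,
    customer__name           text,
    customer__lifetime_value numeric(38,2),
    customer__is_active      boolean
);
-- followed by an INSERT of the compiled model query's output
```

Whether NOT NULL, PRIMARY KEY, and friends are actually enforced after this DDL runs depends on the warehouse, which is the point of the caveat above.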
The error output is specific enough to act on immediately:
```
Compilation Error in model mrt__analytics__customers (models/marts/analytics/mrt__analytics__customers.sql)
  This model has an enforced contract that failed.

  | column_name  | definition_type | contract_type | mismatch_reason    |
  | ------------ | --------------- | ------------- | ------------------ |
  | customer__id | TEXT            | INT           | data type mismatch |
```

Three mismatch reasons can appear: data type mismatch, missing in contract (column exists in SQL but not YAML), and missing in definition (column declared in YAML but absent from SQL). The diagnostic table tells you exactly what to fix.
Fail-Fast vs Test-After
This fail-fast behavior is what distinguishes contracts from dbt tests. Tests run after a model builds, which means the table already exists and downstream models may have consumed it. Contracts prevent the model from materializing in the first place. The bad state never reaches the warehouse.
But contracts only validate shape. They check that columns exist and have the right types. They don’t verify whether the data in those columns makes sense. A status column could contain values you’ve never seen before, and contracts won’t catch it. Data quality tests remain essential for content validation. Contracts handle structure, tests handle content, and you need both.
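The division of labor can be sketched in one column definition. Here a hypothetical order__status column gets both a contract entry (structure, checked before the build) and an accepted_values test (content, checked after). The column name and value list are invented for illustration:

```yaml
columns:
  - name: order__status        # contract: name + data_type enforced at build time
    data_type: text
    tests:                     # data tests: values checked after the model builds
      - accepted_values:
          values: ['pending', 'shipped', 'delivered', 'cancelled']
```

The contract guarantees the column exists as text; the test catches a surprise status value that the contract would happily wave through.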
Configuration
The configuration lives in your YAML properties file:
```yaml
models:
  - name: mrt__analytics__customers
    access: public
    config:
      materialized: table
      contract:
        enforced: true
    columns:
      - name: customer__id
        data_type: integer
        constraints:
          - type: not_null
      - name: customer__name
        data_type: text
      - name: customer__lifetime_value
        data_type: numeric(38,2)
      - name: customer__is_active
        data_type: boolean
```

Every column in your model must be listed with a name and data_type. Partial contracts aren’t supported. If your SQL returns a column that isn’t in the YAML, or the YAML lists a column the SQL doesn’t return, the build fails. This is intentional: a contract that doesn’t cover everything isn’t much of a contract.
Column ordering in your SQL doesn’t matter — the preflight check is order-agnostic. But dbt reorders the output columns to match the YAML’s column order in the DDL, which matters if anything downstream depends on column position.
You can also enable contracts at the directory level in dbt_project.yml, which is the most common approach for teams that want contracts on all mart models:
```yaml
models:
  my_project:
    marts:
      +contract:
        enforced: true
```

Type Handling
dbt handles type aliasing across platforms (string maps to text on Postgres, for example), but it doesn’t compare granular sizing. varchar(256) and varchar(257) are treated as equivalent.
Precision matters in one specific case: bare numeric types can default to scale=0, which stores only whole numbers. If your model calculates decimals, always specify precision and scale explicitly (numeric(38,2)). dbt 1.7+ warns when numeric types lack explicit precision, which helps catch this before it quietly rounds your revenue numbers to integers.
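In the model SQL, that means casting explicitly rather than trusting the warehouse’s inferred type to match the contract. A sketch, with invented table and column names:

```sql
-- Explicit precision/scale so decimals survive; a bare numeric can land
-- at scale 0 on some warehouses and silently truncate to whole numbers.
select
    customer__id,
    cast(sum(order__amount) as numeric(38,2)) as customer__lifetime_value
from orders
group by customer__id
```

With the contract declaring numeric(38,2) and the SQL casting to the same type, the preflight check and the stored data agree.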
Materialization Support
Contracts work with table, incremental, and view materializations (views have limited constraint support). They don’t work on ephemeral models, materialized views, Python models, sources, seeds, or snapshots.
For incremental models, set on_schema_change to append_new_columns or fail. Avoid sync_all_columns, which removes columns not present in the latest run and creates exactly the kind of breaking change contracts are meant to prevent.
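Putting those two settings together, a contracted incremental mart might be configured like this (a sketch reusing the model name from the earlier example):

```yaml
models:
  - name: mrt__analytics__customers
    config:
      materialized: incremental
      on_schema_change: append_new_columns  # or 'fail'; avoid sync_all_columns
      contract:
        enforced: true
```

With append_new_columns, additive schema changes flow through; anything that would drop or retype a contracted column still fails the preflight check.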
Where Contracts Fit
Contracts are one layer of a quality strategy, not the whole thing. They protect model shape within your dbt project. They don’t validate sources (though you can place a contracted base model directly on top of a source). They don’t check data content. And they don’t prevent bad data from entering your warehouse.
The best candidates for contracts are mart models that serve downstream consumers, especially those marked access: public. In a dbt Mesh setup, contracts combine with access controls and model versions to form the governance foundation for cross-project references. When another team uses ref('your_project', 'mrt__analytics__customers'), the contract guarantees they’ll get the columns and types they expect.
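The consuming side of that guarantee is an ordinary ref with a project argument. A sketch — the file path and selected columns are illustrative:

```sql
-- another_project/models/staging/stg__customers_imported.sql
-- Cross-project ref: the upstream contract guarantees these columns and types.
select
    customer__id,
    customer__lifetime_value
from {{ ref('your_project', 'mrt__analytics__customers') }}
```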
For source-level protection, you’re looking at tools outside dbt: schema registries for event streams, EL tool contracts (dlt’s native schema contracts are particularly capable), or runtime validation. The broader tooling ecosystem covers these enforcement points.