Not every Dataform project should migrate to dbt. The decision depends on three migration signals and a break-even cost calculation.
The Three Migration Signals
Are you going multi-warehouse? Dataform only works with BigQuery. If your organization is adopting Snowflake, Databricks, or Redshift alongside BigQuery, dbt’s adapter architecture becomes essential. This is the clearest migration signal: no amount of Dataform customization can make it talk to Snowflake.
Do you need the ecosystem? dbt’s package hub offers 200+ packages covering data quality testing (dbt_expectations, Elementary), marketing attribution, utility functions (dbt_utils), and source-specific transformations (Fivetran packages). Manually implementing functionality that exists as a dbt package adds ongoing maintenance cost.
Does career portability matter? dbt proficiency appears in most analytics engineering job postings. Dataform expertise remains valuable but concentrated in GCP-heavy organizations. dbt experience has broader market applicability, which affects hiring and retention.
When to Stay Put
Migration carries real costs. Stay with Dataform if any of these apply:
Your JavaScript includes are complex. Dataform’s ability to generate models programmatically with standard JavaScript has no direct dbt equivalent. Converting sophisticated .js files that dynamically create dozens of models requires substantial rewriting. See JavaScript vs Jinja in Analytics Engineering for the full scope of this gap.
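To make the gap concrete, here is a minimal sketch of the kind of programmatic generation Dataform’s JavaScript includes allow. The `dailyRollup` helper, the table names, and the returned name/SQL pairs are hypothetical illustrations; a real include would call Dataform’s publishing API rather than return plain objects.

```javascript
// Hypothetical sketch of a Dataform-style JavaScript include that
// generates one model per source table -- the kind of dynamic
// generation with no one-to-one Jinja equivalent in dbt.
// Table names and the dailyRollup helper are illustrative only.
function dailyRollup(table) {
  return `SELECT DATE(event_ts) AS day, COUNT(*) AS events
FROM raw.${table}
GROUP BY day`;
}

function generateModels(tables) {
  // A real Dataform include would register each model with the
  // framework; here we return name/SQL pairs to stay self-contained.
  return tables.map((table) => ({
    name: `${table}_daily`,
    sql: dailyRollup(table),
  }));
}

const models = generateModels(["orders", "payments", "sessions"]);
console.log(models.map((m) => m.name).join(", "));
```

Converting this to dbt typically means either unrolling the loop into individual model files or rebuilding the logic in Jinja, which is where the rewriting effort concentrates.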
ML pipelines depend on templating behavior. One team’s migration took two months, plus three additional weeks fixing issues where JavaScript and Jinja template behavior differed in ways that broke retraining of their fraud-detection models. The fraud that slipped through during that period cost more than years of licensing fees. Numerical precision, null handling, and timestamp formatting can all diverge in subtle ways.
Your use case is simple. If you’re running basic transformation models without complex incremental logic, Dataform’s free tier on BigQuery makes economic sense. Why add licensing costs and migration risk for the same outcome?
The Break-Even Calculation
The math is straightforward. Suppose migration takes three months of engineering time, and dbt Cloud licensing runs $100/user/month, which is $12,000 per year for a 10-person team, on top of Dataform’s free tier. On licensing alone, the switch never pays for itself: you are adding cost, not removing it.
But licensing is rarely the actual driver. The calculation changes immediately when you factor in:
- Multi-warehouse needs — there’s no Dataform alternative; the cost of not migrating is architectural lock-in
- Ecosystem requirements — every custom implementation you avoid is weeks of engineering time saved
- Hiring efficiency — finding Dataform-experienced engineers is harder than finding dbt-experienced engineers
- Feature gaps — snapshots, source freshness, comprehensive testing, and the semantic layer have no Dataform equivalents
The break-even calculation on pure licensing is a distraction. Focus on what capabilities you need and what it costs to build them yourself in Dataform versus getting them out of the box in dbt.
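That framing can be sketched as a toy calculation. Everything below is an illustrative assumption: the function name, the $60,000 migration cost, and the $10,000/month value assigned to avoided custom engineering are placeholders, not figures from the text (only the $100/user/month licensing rate comes from above).

```javascript
// Rough break-even sketch. All figures are illustrative assumptions.
// Compares a one-time migration cost against the monthly value of
// dbt capabilities, net of dbt Cloud licensing.
function monthsToBreakEven({ migrationCost, teamSize, pricePerUser, monthlyCapabilityValue }) {
  const netMonthly = monthlyCapabilityValue - teamSize * pricePerUser;
  if (netMonthly <= 0) return Infinity; // licensing alone never pays back
  return migrationCost / netMonthly;
}

// Licensing-only view: no capability value, so the switch never pays.
console.log(monthsToBreakEven({
  migrationCost: 60000, teamSize: 10, pricePerUser: 100, monthlyCapabilityValue: 0,
})); // Infinity

// Valuing avoided custom work at ~$10k/month flips the picture:
console.log(monthsToBreakEven({
  migrationCost: 60000, teamSize: 10, pricePerUser: 100, monthlyCapabilityValue: 10000,
})); // ~6.7 months
```

The point of the sketch is the structure, not the numbers: the decision hinges on `monthlyCapabilityValue`, which is exactly the term a licensing-only comparison sets to zero.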
Realistic Timelines
Based on project complexity:
| Project size | Timeline | Primary effort |
|---|---|---|
| Small (~20 models) | 1-2 weeks | Mostly mechanical conversion |
| Medium (50-100 models) | 2-4 weeks | Macro conversion |
| Large (100+ models, complex macros) | 2-3 months | JavaScript rewrite, validation |
| Enterprise with ML dependencies | 3-6 months | Parallel running, stakeholder sign-off |
What extends timelines beyond the estimate:
- Complex JavaScript includes requiring manual rewriting
- Custom incremental strategies beyond simple merge
- ML pipelines requiring regression validation
- Stakeholder approval processes
- Parallel running requirements for compliance
Available Tooling
Two open-source tools accelerate the mechanical parts:
dataform-to-dbt — Node.js tool that handles refs, assertions, and view materializations. Run with `npx dataform-to-dbt`. Limitations: it doesn’t handle JavaScript includes, incremental models, or complex pre-operations.
ra_dbt_to_dataform — Despite the name suggesting the opposite direction, its conversion patterns apply to Dataform-to-dbt moves as well. Uses GPT-4 for complex macro conversion.
These tools handle perhaps 60-70% of a typical project. Plan for the remainder to be manual work, concentrated in the areas that matter most: JavaScript-to-Jinja conversion, incremental model tuning, and validation.
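For the validation portion, a parallel run usually reduces to comparing aggregates from the old and new builds. The helper below is a hypothetical sketch, assuming per-day row counts have already been fetched from both tables by separate warehouse queries; it is not part of either tool.

```javascript
// Minimal parallel-run check: diff per-day row counts between the
// Dataform-built table and the dbt-built table. Inputs are
// illustrative arrays of { day, rows } objects.
function diffRowCounts(oldCounts, newCounts) {
  const byDay = new Map(oldCounts.map((r) => [r.day, r.rows]));
  return newCounts
    .filter((r) => byDay.get(r.day) !== r.rows)
    .map((r) => ({ day: r.day, old: byDay.get(r.day) ?? 0, new: r.rows }));
}

const mismatches = diffRowCounts(
  [{ day: "2024-01-01", rows: 10 }, { day: "2024-01-02", rows: 5 }],
  [{ day: "2024-01-01", rows: 10 }, { day: "2024-01-02", rows: 7 }]
);
console.log(mismatches);
```

Row counts are only a first pass; the templating divergences noted earlier (precision, nulls, timestamps) also warrant column-level checksums on the models that feed ML pipelines.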
The Decision Framework
Migrate when you’re going multi-warehouse, need the package ecosystem, or want enterprise features like the semantic layer and dbt Mesh. The 2-3 month investment pays dividends in reduced maintenance and expanded capabilities.
Stay put when your Dataform project is stable, you’re BigQuery-only with no plans to change, and you’re not missing functionality. The “if it ain’t broke” principle applies.
Reconsider the timeline when you have complex JavaScript generation, ML pipeline dependencies, or stakeholders requiring extended parallel running. The two-month estimate becomes six months in these scenarios.
Both tools transform SQL identically at the warehouse level. What differs is the ecosystem, the commercial model, and the career implications. The choice should be based on organizational trajectory, not syntax preference.