The dbt MCP server’s local mode gives an AI assistant full dbt CLI access. That’s the value proposition: “run my models through conversation.” It’s also the risk. Commands like run and build materialize models in your warehouse. If the server is configured with production credentials, the AI can modify production data.
This is not a theoretical concern. It’s the default behavior. The dbt MCP server executes commands with whatever profile and target your dbt installation defaults to. If your profiles.yml defaults to production — and many do, especially in environments where dbt Cloud handles development — every dbt run through MCP hits production.
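Before layering anything on top, it's worth confirming what the server will inherit. A quick check from the project directory with the standard dbt CLI (nothing MCP-specific here):

```bash
# Prints the resolved profile and connection details for the default target.
# Verify the project/dataset shown belong to dev, not production.
dbt debug
```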
CLI Commands Modify Data
The distinction is worth stating clearly:
- Metadata tools (`get_model_details`, `get_lineage`, `get_all_sources`) are read-only. They read the manifest and catalog. No warehouse writes.
- CLI tools (`run`, `build`, `test`, `show`, `compile`) execute against the warehouse. `compile` only generates SQL without running it, but `run` and `build` materialize tables and views. `test` runs queries. `show` runs arbitrary SQL.
When an AI assistant calls build, it executes exactly the same operation as typing dbt build in your terminal. There is no sandbox, no dry-run mode, no safety layer between the MCP tool call and the dbt command.
Mitigations
Use a Development Profile
The most direct mitigation: configure the MCP server to use a development target.
If your profiles.yml has separate targets for dev and prod, make dev the default target in profiles.yml or pass the choice through the MCP server’s environment:
```yaml
# profiles.yml
my_project:
  target: dev  # Default to dev, not prod
  outputs:
    dev:
      type: bigquery
      project: my-project-dev
      dataset: dbt_dev
    prod:
      type: bigquery
      project: my-project-prod
      dataset: analytics
```

When the MCP server launches dbt, it picks up the default target. If that’s dev, all CLI commands execute against your development environment. Production stays untouched.
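If changing the project-wide default isn’t practical, the same effect can be achieved through the environment the MCP server launches with. DBT_PROFILES_DIR is a standard dbt environment variable; the path below is a placeholder for a directory whose profiles.yml contains only a dev output:

```json
{
  "env": {
    "DBT_PROFILES_DIR": "/home/you/.dbt-mcp-dev"
  }
}
```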
Disable CLI for Production Access
For configurations where the server needs production credentials — say, for metadata discovery and Semantic Layer queries — but shouldn’t be able to run CLI commands:
{ "env": { "DISABLE_DBT_CLI": "true", "DBT_HOST": "cloud.getdbt.com", "DBT_TOKEN": "your-production-token", "DBT_PROD_ENV_ID": "12345" }}The DISABLE_DBT_CLI=true feature toggle removes all CLI tools from the server. The AI can still browse metadata and query the Semantic Layer, but run, build, and test don’t exist as options. This is the recommended configuration for any production-connected server.
Add Safety Hooks
The dbt Production Safety Hooks pattern uses Claude Code’s PreToolUse hooks to block specific dangerous commands before they execute. Even if the MCP server has CLI access, a hook can intercept dbt run --target prod and block it:
```bash
# In .claude/hooks/dbt-safety.sh
# $command is assumed to hold the shell command the assistant is about to run,
# extracted from the PreToolUse hook's input.
if echo "$command" | grep -q "dbt" && echo "$command" | grep -q "\-\-target.*prod"; then
  if echo "$command" | grep -qE "dbt (run|build)" && ! echo "$command" | grep -qE "\-\-(select|models)"; then
    echo "Blocked: dbt run on production needs explicit --select" >&2
    exit 2
  fi
fi
```

Hooks are a defense-in-depth layer. The primary defense is using a development profile or disabling CLI. Hooks catch what slips through.
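The script only fires if Claude Code knows about it. A minimal registration sketch, assuming the PreToolUse hook configuration lives in .claude/settings.json and matches on the Bash tool; verify the exact schema against the current Claude Code hooks documentation:

```json
{
  "hooks": {
    "PreToolUse": [
      {
        "matcher": "Bash",
        "hooks": [
          {
            "type": "command",
            "command": ".claude/hooks/dbt-safety.sh"
          }
        ]
      }
    ]
  }
}
```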
Scope Service Token Permissions
If you’re using dbt Cloud credentials, follow the principle of least privilege:
| Need | Grant |
|---|---|
| Metadata browsing only | Metadata Only |
| Metadata + metrics | Metadata Only + Semantic Layer Only |
| Full access including job management | Metadata Only + Semantic Layer Only + Job Admin |
Start with Metadata Only. Add permissions when the workflow requires them, not preemptively.
Copilot Credit Consumption
The text_to_sql tool, if enabled in your dbt Cloud configuration, consumes dbt Copilot credits. These credits are plan-specific and finite. An AI assistant that liberally converts natural language to SQL through this tool can burn through credits faster than expected, especially in exploratory sessions where the AI tries multiple query variations.
Monitor your Copilot credit usage in the dbt Cloud dashboard. If credits are a concern, DISABLE_SQL=true prevents the AI from using the text-to-SQL feature. The AI can still use the regular show tool to run explicit SQL, which doesn’t consume Copilot credits.
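If you’d rather enforce this in configuration than rely on dashboard monitoring, the toggle sits in the same env block as the other server settings (the dbt Cloud values here are placeholders from the earlier example):

```json
{
  "env": {
    "DISABLE_SQL": "true",
    "DBT_HOST": "cloud.getdbt.com",
    "DBT_TOKEN": "your-production-token"
  }
}
```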
Enterprise OAuth Complications
Organizations using OAuth authentication (rather than service tokens) may encounter issues with dbt MCP. The server requires static subdomains for authentication — dynamic OAuth redirect URIs that change per session aren’t supported. Check with your dbt Cloud admin for the correct hostname format.
This is primarily an issue for large enterprises with custom identity providers. If you’re using service tokens, it doesn’t apply.
The Experimental Status
dbt Labs marks the MCP server as experimental. In practice, this means:
- The tool API may change between releases. Tool names, parameters, and response formats are not guaranteed stable.
- New tools may appear and existing ones may be deprecated without long deprecation cycles.
- The `#tools-dbt-mcp` channel in dbt Community Slack is the canonical source for updates and breaking changes.
The experimental label doesn’t mean the server is unreliable — it works well for its intended purpose. It means you shouldn’t build deeply coupled automation on top of specific tool names or response formats without accepting the maintenance burden when things change.
A Sensible Default
For most individual data engineers, the safe starting configuration (a config sketch follows the list) is:
- Local server with development target as default
- dbt Cloud credentials for metadata and Semantic Layer (if available)
- Safety hooks blocking production-targeted run/build commands
- CLI enabled (because that’s where the value is)
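As an env block for the local server, that looks roughly like the sketch below. The dev-only profiles directory follows the DBT_PROFILES_DIR approach from earlier, the dbt Cloud values are placeholders, and DISABLE_DBT_CLI is deliberately absent because CLI access is the point of this setup:

```json
{
  "env": {
    "DBT_PROFILES_DIR": "/home/you/.dbt-mcp-dev",
    "DBT_HOST": "cloud.getdbt.com",
    "DBT_TOKEN": "your-metadata-scoped-token",
    "DBT_PROD_ENV_ID": "12345"
  }
}
```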
For team deployments with production access:
- Local server with `DISABLE_DBT_CLI=true`
- dbt Cloud credentials for read-only metadata and Semantic Layer
- Service token scoped to Metadata Only + Semantic Layer Only
- Separate development-targeted server for engineers who need CLI access
This mirrors the cost and safety patterns used for BigQuery MCP access — read-only by default, write access deliberately scoped.