
Elementary CLI profile configuration

How to configure the Elementary CLI (edr) profile for BigQuery, Snowflake, and Databricks -- including the gotchas that differ from your dbt profile.

Tags: dbt, elementary, bigquery, snowflake, databricks, data quality, data engineering

The Elementary CLI (edr) has its own connection profile, separate from your dbt profile. This is intentional: the CLI runs independently from dbt, on whatever schedule you choose, and may need different credentials or permissions than your dbt service account. The profiles live at ~/.edr/profiles.yml rather than ~/.dbt/profiles.yml.

The fastest way to get a correct starting template is to run the generator macro from within your dbt project:

dbt run-operation elementary.generate_elementary_cli_profile

This outputs YAML pre-filled for your adapter type. Copy it into ~/.edr/profiles.yml and fill in the actual values.

BigQuery

A complete BigQuery CLI profile using OAuth (suitable for local use):

elementary:
  outputs:
    default:
      type: bigquery
      method: oauth
      project: your-project-id
      dataset: your_schema_elementary
      location: US
      threads: 4

For production deployments where you’re running edr from a CI/CD system or scheduler, use a service account instead:

elementary:
  outputs:
    default:
      type: bigquery
      method: service-account
      project: your-project-id
      dataset: your_schema_elementary
      keyfile: /path/to/service-account.json
      location: US
      threads: 4

The location field is the most common source of errors on BigQuery setups. dbt infers the dataset location automatically, so many teams omit it from their dbt profiles and never notice. The Elementary CLI does not infer it — you must provide it explicitly. Use US, EU, or your specific region (e.g., europe-west1). If you’re seeing location or region errors from edr, this is almost always why.
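Since a missing location field is easy to ship by accident, a small pre-flight check can catch it before edr runs. This is a hypothetical sketch, not part of the edr CLI; the profiles dict below stands in for the parsed contents of ~/.edr/profiles.yml:

```python
# Hypothetical pre-flight check for the BigQuery gotcha above: fail fast
# when a bigquery target omits the explicit `location` field that edr needs.
# `profiles` stands in for the parsed contents of ~/.edr/profiles.yml.
profiles = {
    "elementary": {
        "outputs": {
            "default": {
                "type": "bigquery",
                "method": "oauth",
                "project": "your-project-id",
                "dataset": "your_schema_elementary",
                # "location": "US",  # <- commonly missing
                "threads": 4,
            }
        }
    }
}

def missing_location(profiles: dict) -> list[str]:
    """Return output names of bigquery targets without an explicit location."""
    bad = []
    for profile in profiles.values():
        for name, target in profile.get("outputs", {}).items():
            if target.get("type") == "bigquery" and "location" not in target:
                bad.append(name)
    return bad

print(missing_location(profiles))  # -> ['default']
```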

Required permissions

The service account (or user) running edr needs read access to the Elementary tables and metadata access to your dbt datasets. It does not need write access to your models. The minimal set:

  • BigQuery Data Viewer on the Elementary dataset
  • BigQuery Metadata Viewer on your dbt datasets
  • BigQuery Resource Viewer on your dbt datasets
  • BigQuery Job User on the project (for running read queries)

These are read-oriented roles. The CLI doesn’t modify your production data.

Snowflake

Snowflake supports password or keypair authentication. Keypair is preferred for service-level access:

elementary:
  outputs:
    default:
      type: snowflake
      account: your_account_id
      user: elementary_user
      role: elementary_role
      private_key_path: /path/to/private.key
      database: analytics
      warehouse: transforming
      schema: elementary
      threads: 4

The role needs access to the Elementary schema and nothing else. Create a minimal role rather than reusing an existing one with broader permissions:

CREATE ROLE elementary_role;
GRANT USAGE ON WAREHOUSE transforming TO ROLE elementary_role;
GRANT USAGE ON DATABASE analytics TO ROLE elementary_role;
GRANT USAGE ON SCHEMA analytics.elementary TO ROLE elementary_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.elementary TO ROLE elementary_role;
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.elementary TO ROLE elementary_role;

The FUTURE TABLES grant ensures that as Elementary creates new metadata tables over time, the role can read them without requiring permission updates.
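If you provision this role across several environments, the grant set above is easy to template. A small sketch; the function and its parameter names are illustrative, not an Elementary or Snowflake API:

```python
def elementary_grants(role: str, database: str, schema: str, warehouse: str) -> str:
    """Render the minimal Snowflake grant set for one environment."""
    return "\n".join([
        f"CREATE ROLE {role};",
        f"GRANT USAGE ON WAREHOUSE {warehouse} TO ROLE {role};",
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {database}.{schema} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {database}.{schema} TO ROLE {role};",
        f"GRANT SELECT ON FUTURE TABLES IN SCHEMA {database}.{schema} TO ROLE {role};",
    ])

# Render the grants shown above for the analytics/elementary environment.
print(elementary_grants("elementary_role", "analytics", "elementary", "transforming"))
```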

Databricks

For Unity Catalog, the profile requires a catalog parameter to handle the three-level namespace (catalog.schema.table):

elementary:
  outputs:
    default:
      type: databricks
      host: your-workspace.cloud.databricks.com
      http_path: /sql/1.0/warehouses/your-warehouse-id
      token: your-personal-access-token
      catalog: your_catalog
      schema: elementary
      threads: 4

For production deployments, use service principals rather than personal access tokens.

One Databricks-specific issue: running edr against a shared cluster can produce permission errors. The CLI attempts to write to package directories during initialization, and shared clusters restrict this. The fix is to use a single-user cluster, or pass --update-dbt-package false to any edr commands, which skips the package write:

edr report --update-dbt-package false
edr monitor --update-dbt-package false --slack-token $SLACK_TOKEN --slack-channel-name data-alerts

Installing the CLI

Install edr with the adapter-specific extras that match your warehouse:

pip install 'elementary-data[bigquery]'
# or
pip install 'elementary-data[snowflake]'
# or
pip install 'elementary-data[databricks]'

The adapter extras pull in the right driver dependencies. Installing without them results in import errors when edr tries to connect.
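A quick way to confirm the driver actually landed in the current environment is to probe for its module without importing it. The module names below are the ones each package is commonly understood to provide (google-cloud-bigquery, snowflake-connector-python, databricks-sql-connector); treat the mapping as an assumption to verify against your own install, not an official Elementary list:

```python
from importlib.util import find_spec

def have(module: str) -> bool:
    """True if the module can be located without importing it."""
    try:
        return find_spec(module) is not None
    except ModuleNotFoundError:  # parent package absent entirely
        return False

# Assumed module name per adapter extra -- verify against your environment.
drivers = {
    "bigquery": "google.cloud.bigquery",
    "snowflake": "snowflake.connector",
    "databricks": "databricks.sql",
}

for extra, module in drivers.items():
    print(f"[{extra}] {module}: {'ok' if have(module) else 'MISSING'}")
```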

If edr isn’t found after installation, you’re likely running it outside an activated virtual environment. Either activate the environment where you installed it, or run it through Python directly:

python -m edr report

Once the profile is configured and the CLI is installed, edr report generates an HTML file in ./edr_target/. The Elementary setup troubleshooting note covers what to check if the report shows no data.