
Getting Started with dbt on Snowflake

Run your first dbt project against Snowflake in minutes.

Why dbt and Snowflake together?

dbt (data build tool) turns your warehouse into a transformation engine using SQL and Jinja. You write models (SELECT statements), add tests and documentation, and dbt runs them in the right order. Snowflake is one of the most popular warehouses to pair with dbt: it’s fast, scales well, and supports features like incremental models and snapshots that dbt uses to keep pipelines efficient.

Together, Snowflake holds the data and compute, and dbt defines how raw data becomes analytics-ready tables and marts. That split keeps transformation logic in version control and makes it easy for analysts and engineers to collaborate.

Set up a dbt project for Snowflake

Install dbt Core with the Snowflake adapter (pip install dbt-snowflake) and run dbt init to create a project. In profiles.yml, point dbt at your Snowflake account: account, user, password (or key-pair authentication), database, schema, and warehouse. Run dbt debug to confirm the connection.
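As a sketch, a minimal profiles.yml for Snowflake might look like this (the account locator, role, database, and warehouse names below are placeholders for your own values):

```yaml
my_project:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: xy12345.us-east-1          # placeholder account identifier
      user: DBT_USER                      # placeholder user
      password: "{{ env_var('SNOWFLAKE_PASSWORD') }}"  # keep secrets out of the file
      role: TRANSFORMER                   # placeholder role
      database: ANALYTICS
      warehouse: TRANSFORMING
      schema: dbt_dev                     # your personal dev schema
      threads: 4
```

Reading the password from an environment variable keeps credentials out of version control; dbt debug will report exactly which of these fields fail if the connection can’t be made.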

Add your first model under models/, then run dbt run to build it in Snowflake. Use dbt test to run tests and dbt docs generate && dbt docs serve to view lineage and docs.
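A first model is just a SELECT statement saved as a .sql file under models/; the raw table below is a placeholder for whatever landed data you have in Snowflake:

```sql
-- models/stg_orders.sql
-- dbt builds this as a view by default in your target schema.
select
    id          as order_id,
    customer_id,
    order_date,
    status
from raw.jaffle_shop.orders   -- placeholder: your raw table
```

Tests are declared alongside models in YAML, so dbt test can check the model you just built:

```yaml
# models/schema.yml
version: 2
models:
  - name: stg_orders
    columns:
      - name: order_id
        tests:
          - unique
          - not_null
```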

Next steps

Organize models into staging, intermediate, and mart layers. Use sources and refs so dbt can build the DAG and run models in order. For large tables, use incremental models so only new or changed data is processed. When you’re ready for scheduling and collaboration, consider dbt Cloud, which integrates natively with Snowflake and adds CI/CD and job orchestration.
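An incremental model can be sketched like this (the source name and columns are placeholders); on the first run dbt builds the full table, and on later runs it only processes rows newer than what is already there, using Snowflake’s merge under the hood when a unique_key is set:

```sql
-- models/fct_events.sql
{{ config(materialized='incremental', unique_key='event_id') }}

select
    event_id,
    user_id,
    event_type,
    event_timestamp
from {{ source('app', 'events') }}   -- placeholder source

{% if is_incremental() %}
  -- this filter is only applied on incremental runs, not the first build
  where event_timestamp > (select max(event_timestamp) from {{ this }})
{% endif %}
```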

Need help with dbt on Snowflake?

Kundul helps teams set up dbt development workflows, Snowflake data models, Fivetran-fed pipelines, and production-ready analytics engineering practices.

Book a call

Learn more