Announcing the General Availability of dbt Projects on Snowflake

Data transformations are the core building blocks of any effective data strategy, crucial for constructing robust data pipelines. For years, data teams have relied on dbt (data build tool) to bring software engineering best practices — such as modularity, version control and testing — to SQL and Snowpark transformation workflows.

But the process hasn’t always been seamless. Data teams and platform owners often run into some common challenges:

  • Infrastructure overhead: Managing compute for an external orchestrator (such as Airflow), in addition to Snowflake, adds maintenance complexity and can reduce reliability across disparate systems.
  • Debugging challenges: Logs and performance data are spread across the orchestrator and Snowflake query logs, making it hard to find root causes and bottlenecks.
  • Governance gaps: Onboarding new teams to build and deploy pipelines is difficult when tooling has a steep learning curve and enforcing uniform security across systems is a challenge.
  • CI/CD setup: Setting up robust, automated continuous integration and continuous delivery (CI/CD) for data transformation code often requires significant custom engineering effort to ensure quality and rapid deployment. 

Now, the power of dbt is available natively on Snowflake. dbt projects on Snowflake enable your data team to build, run and monitor dbt projects directly in Snowflake. With the new Workspaces editor, the next generation of SQL authoring in Snowflake, teams can edit and debug projects. dbt projects on Snowflake are also fully supported in the Snowflake CLI, so teams can manage deployment and testing of dbt projects via CI/CD tools, such as GitHub Actions. These native options reduce context switching, simplify setup and accelerate the entire data pipeline development lifecycle.
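As a rough sketch of what that CI/CD integration can look like, the GitHub Actions workflow below deploys and runs a dbt project with the Snowflake CLI on every push to main. The project name is a placeholder, and connection configuration (account, credentials) is assumed to be handled via the CLI's connection settings and repository secrets, which are omitted here:

```yaml
# .github/workflows/deploy-dbt.yml -- illustrative sketch, not a prescribed setup
name: Deploy dbt project to Snowflake
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Install the Snowflake CLI
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install snowflake-cli

      # Deploy the project files to Snowflake, then run the models.
      # "my_dbt_project" is a placeholder; connection configuration
      # (account, user, authentication) is assumed to be set up separately.
      - run: snow dbt deploy my_dbt_project
      - run: snow dbt execute my_dbt_project run
```

Keeping deployment in a workflow like this means every merge to main is validated and shipped the same way, rather than depending on ad hoc manual runs.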

With dbt projects on Snowflake, teams collaborate to build modular and scalable data products to deliver downstream analytics, AI and applications. Customers previewing this feature reported increased confidence in their ability to build (+34%) and troubleshoot (+11%) their transformation pipelines within a single day.1

1 Between April and June 2025, we surveyed 17 first-time users before and after using dbt projects on Snowflake to measure improvements in build and troubleshooting speed.

Learn about dbt projects on Snowflake in a demo from Charlie Hammond.

 

Accelerate development with dbt projects on Snowflake

dbt projects on Snowflake streamline workflows, helping data engineers standardize and automate transformation pipelines by allowing for:

  • Development and testing: Create, upload and edit dbt projects in Workspaces, using a file-based IDE, which integrates with Git. Perform test runs for data quality and validate models.
  • Visualization and debugging: Compile and visualize directed acyclic graphs (DAGs) to inspect lineage and dependencies directly in the UI.
  • Deployment and orchestration: Deploy and schedule data pipelines using native Snowflake tasks, simplifying orchestration. Select from various dbt commands such as compile, test, run and more, right from the native Workspaces IDE.
  • Monitoring and tracing: Monitor execution history with fine-grained logging and tracing. 
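The deployment and orchestration step above can be sketched in SQL. Assuming a dbt project has already been deployed as a schema-level object (all names below are illustrative placeholders), it can be run on demand or scheduled with a native Snowflake task:

```sql
-- Illustrative sketch; database, schema, project, warehouse and task names
-- are placeholders.

-- Run the deployed dbt project once, on demand:
EXECUTE DBT PROJECT my_db.my_schema.my_dbt_project args='run';

-- Schedule it with a native Snowflake task instead of an external orchestrator:
CREATE OR REPLACE TASK my_db.my_schema.run_dbt_nightly
  WAREHOUSE = my_wh
  SCHEDULE = 'USING CRON 0 2 * * * UTC'
AS
  EXECUTE DBT PROJECT my_db.my_schema.my_dbt_project args='run';

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK my_db.my_schema.run_dbt_nightly RESUME;
```

Because the task runs entirely inside Snowflake, its execution history and logs land in the same place as the rest of your pipeline, which is what enables the fine-grained monitoring described above.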

Get started today

Whether you want to import an existing dbt project from Git or start from scratch, it’s easy to get started with dbt projects on Snowflake:

  1. Navigate to Snowsight Workspaces.
  2. Choose to create or import a dbt project from a Git repository.
  3. Run your project using an existing Snowflake virtual warehouse.

Try our getting started tutorial or grab code from Snowflake Labs. The operational efficiency, standardization and simplified developer experience this feature delivers will enable more teams to build and deploy modern data products. Learn more in Snowflake documentation or visit the Developer Guides page to get hands on.

 
