Building A DBT Model In Snowflake

We have written about DBT extensively on this site, explaining how it helps implement a repeatable developer lifecycle around database transformations executed within a data warehouse.

DBT and Snowflake are highly complementary and are frequently deployed together as part of modern cloud-based data stacks.

Instead of performing traditional ETL before data reaches Snowflake, the typical pattern is to move to an ELT model: relatively unprocessed data is loaded into Snowflake and then transformed within the warehouse using DBT.
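To sketch what the "T" of this ELT pattern looks like, a DBT model is simply a SQL SELECT statement in your project's models directory; DBT materializes its result as a view or table inside Snowflake. The table, column and source names below are hypothetical placeholders:

```sql
-- models/staging/stg_customer_orders.sql
-- Hypothetical example: aggregate a raw orders table that was
-- loaded into Snowflake, without any pre-load transformation.

select
    customer_id,
    count(*)         as order_count,
    sum(order_total) as lifetime_value
from {{ source('raw', 'orders') }}
group by customer_id
```

The `source()` reference assumes a `raw.orders` source has been declared in a sources YAML file; a plain fully-qualified table name works as well while getting started.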

We can also take advantage of Snowflake features such as virtual warehouses, multi-cluster warehouses and the ability to query external stages to build clean, high-performing transformation pipelines.

In the video below, we demonstrate the process of executing your first DBT transformation. It involves creating a new DBT model, configuring the DBT profile to point to the Snowflake instance, then executing the transformation to aggregate some data into a reporting view.
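For the profile configuration step, a minimal Snowflake entry in `~/.dbt/profiles.yml` looks roughly like the sketch below. The account locator, credentials and object names are placeholders you would replace with your own values:

```yaml
# ~/.dbt/profiles.yml
my_snowflake_project:          # must match the profile name in dbt_project.yml
  target: dev
  outputs:
    dev:
      type: snowflake
      account: xy12345.eu-west-1          # placeholder account locator + region
      user: dbt_user
      password: "{{ env_var('SNOWFLAKE_PASSWORD') }}"  # avoid hard-coding secrets
      role: transformer
      database: analytics
      warehouse: transforming
      schema: dbt_dev
      threads: 4
```

Running `dbt debug` after saving this file is a quick way to confirm the connection details are valid before executing any models.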

As you progress with DBT on Snowflake, this page includes a few useful tips, such as using transient tables, query tags and table clustering, to make your DBT transformations easier to administer and more performant.
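As an illustration, the dbt-snowflake adapter exposes all three of these as model configs. The sketch below (model, column and tag names are hypothetical) materializes a model as a transient, clustered table and tags its queries so they are easy to find in Snowflake's query history:

```sql
-- models/reporting/rpt_daily_sales.sql (hypothetical model)
{{ config(
    materialized = 'table',
    transient = true,            -- transient table: no Fail-safe storage costs
    query_tag = 'dbt_reporting', -- appears against these queries in query history
    cluster_by = ['sale_date']   -- clustering key to help prune large scans
) }}

select
    sale_date,
    sum(order_total) as total_sales
from {{ ref('stg_customer_orders') }}
group by sale_date
```

These options can also be set for whole folders of models at once under the `models:` block of `dbt_project.yml`, which keeps individual model files free of repeated configuration.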

Hands-On Training For The Modern Data Stack

Timeflow Academy is an online, hands-on platform for learning about Data Engineering and Modern Cloud-Native Database management using tools such as DBT, Snowflake, Kafka, Spark and Airflow.


Timeflow Academy is the leading online, hands-on platform for learning about Data Engineering using the Modern Data Stack. Brought to you by Timeflow CI

© 2023 Timeflow Academy. All rights reserved