We have written about DBT extensively on the site, explaining how it helps to implement a repeatable developer lifecycle around database transformations executed within a data warehouse.

DBT and Snowflake are very complementary, and are frequently deployed together as part of modern, cloud-based data stacks.

Instead of doing traditional ETL before data goes into Snowflake, the typical pattern is to move to an ELT model, where relatively unprocessed data is loaded into Snowflake and then transformed within the warehouse using DBT.
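As a concrete sketch of the transform step, a DBT model is simply a SQL SELECT statement that DBT materialises inside the warehouse. The model, source table and column names below are hypothetical:

```sql
-- models/reporting/orders_by_customer.sql (hypothetical model)
-- Materialise the result as a view in Snowflake
{{ config(materialized='view') }}

select
    customer_id,
    count(*)    as order_count,
    sum(amount) as total_amount
from raw.orders          -- assumed raw table loaded via ELT
group by customer_id
```

Running `dbt run` would compile this template and create the view in the target schema.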

Of course, we can also take advantage of Snowflake features such as virtual warehouses, multi-cluster warehouses and the ability to query external stages to build clean, high-performing transformation pipelines.

In the video below, we demonstrate the process of executing your first DBT transformation. It involves creating a new DBT model, configuring the DBT profile to point at the Snowflake instance, and then executing a transformation to aggregate some data into a reporting view.
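Pointing the DBT profile at Snowflake is done in profiles.yml. A minimal sketch, assuming a project named my_project; the account locator, credentials and object names are all placeholders:

```yaml
# ~/.dbt/profiles.yml (placeholder values throughout)
my_project:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: xy12345.eu-west-1   # your Snowflake account locator
      user: dbt_user
      password: "{{ env_var('SNOWFLAKE_PASSWORD') }}"
      role: TRANSFORMER
      database: ANALYTICS
      warehouse: TRANSFORMING_WH
      schema: REPORTING
      threads: 4
```

With the profile in place, `dbt debug` verifies the connection and `dbt run` executes your models against Snowflake.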

As you progress with DBT on Snowflake, this page offers a few useful tips, such as using transient tables, query tags and table clustering, to make your DBT transformations easier to administer and more performant.
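Those Snowflake-specific options can be set per model or project-wide in dbt_project.yml. A sketch assuming a project named my_project (transient, query_tag and cluster_by are the dbt-snowflake configuration keys; the project and schema names are placeholders):

```yaml
# dbt_project.yml (project and folder names are placeholders)
models:
  my_project:
    +transient: true      # transient tables skip Snowflake's Fail-safe, reducing storage cost
    +query_tag: dbt       # tag sessions so DBT work is easy to find in Snowflake query history
    reporting:
      +materialized: table
      +cluster_by: ['customer_id']   # clustering key to improve pruning on large tables
```

Transient tables suit DBT well because models are rebuilt from source on each run, so Fail-safe recovery adds cost without much benefit.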


© 2022 Timeflow Academy.