Nice explanation! For a specific DAG, how do I run a specific dbt command? E.g., how would you execute 'dbt run --select +third_day_avg_cost_run' for the project in the video?
@thedataguygeorge 8 months ago
You could use Cosmos's filtering mechanism to run just that specific step, but by default Cosmos will automatically render each individual dbt model as its own task in the DAG.
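In Cosmos, that filtering is typically expressed by passing a select list to RenderConfig when constructing the DbtDag, e.g. RenderConfig(select=["+third_day_avg_cost_run"]) (exact parameter names assume a recent astronomer-cosmos release). Outside Cosmos, the equivalent CLI invocation from the question can be sketched in plain Python; the project directory path here is a made-up placeholder:

```python
import shlex

def dbt_run_command(selector: str, project_dir: str = "/usr/local/airflow/dbt") -> list:
    """Build the dbt CLI invocation for a graph selector like '+model_name'."""
    # A leading '+' in the selector means: the model plus all of its
    # upstream parents, matching dbt's node-selection syntax.
    return ["dbt", "run", "--select", selector, "--project-dir", project_dir]

cmd = dbt_run_command("+third_day_avg_cost_run")
print(shlex.join(cmd))
```

The resulting list could then be handed to a BashOperator or subprocess call inside an Airflow task, if you wanted to bypass Cosmos's per-model rendering for a one-off run.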
@WiseSteps.D 8 months ago
All good, but where are we able to see the Snowflake connection details?
@thedataguygeorge 8 months ago
Go to the connection management UI and select the Snowflake connection there!
@Rajdeep6452 6 months ago
In Airflow, go to Admin > Connections and fill in the connection's Extra field with JSON like: { "account": "-", "warehouse": "", "database": "", "role": "", "insecure_mode": false }
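A quick way to sanity-check that the Extra blob is well-formed JSON before pasting it into the connection form (field names mirror the comment above; the values are placeholders to replace with your own Snowflake account details):

```python
import json

# The Snowflake connection "Extra" JSON from the comment above; every value
# here is a placeholder -- substitute your own account identifier, warehouse,
# database, and role before saving the connection.
extra = {
    "account": "-",
    "warehouse": "",
    "database": "",
    "role": "",
    "insecure_mode": False,
}

extra_json = json.dumps(extra)
print(extra_json)  # paste this string into the connection's Extra field
```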
@maxpatrickoliviermorin2489 a year ago
Thank you! Would you mind making a much more elaborate version please?
@thedataguygeorge a year ago
Sure! What would you like to see?
@ameyajoshi6588 5 months ago
Can we have a cyclic pre-hook applied for a model? If so, how do we achieve it using dbt + Airflow?
@thedataguygeorge 5 months ago
Yes, definitely. If the pre-hook is applied as part of your dbt model build process, it should still work!
@vijayjoshi-mw8cr 5 months ago
Hello, I have built an ETL pipeline using Python, pandas, Airflow, and Snowflake, but the problem is that when I run the task, it does not load the data into Snowflake. Could you please help us?
@thedataguygeorge 5 months ago
What errors are you getting?
@VijayJoshi-eg2zq 5 months ago
@thedataguygeorge When I run the task, I get a success message, but when I check the Snowflake warehouse, I am not able to see the table.
@Rajdeep6452 6 months ago
So are we supposed to create the dbt init project first and then create the databases in Snowflake? And what schema are you using?
@thedataguygeorge 5 months ago
The dbt init that's part of the project should create the databases for you, as long as you have the proper permissions and setup.