I don't understand where you define the target schema and the database :S since we changed them from the previous videos
@VishnuramNadarajan · a year ago
Hi Adam, with the incremental query, updates were not happening for dates earlier than the max date
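For context, the standard incremental pattern only selects rows newer than the current max date, which is likely why earlier dates were not being updated. A minimal sketch of that pattern (the source, table and column names here are illustrative, not from the video):

```sql
-- A typical incremental model filters on the current max date, so rows
-- with earlier dates are skipped on normal runs (illustrative names).
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    order_date,
    amount
from {{ source('shop', 'raw_orders') }}

{% if is_incremental() %}
  -- only rows newer than what already exists in the target are selected;
  -- late-arriving rows with earlier dates are never picked up here
  where order_date > (select max(order_date) from {{ this }})
{% endif %}
```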
@yc859 · 7 months ago
Hi Adam, how do you handle a schema change in the source with incremental loading? Thanks in advance. Your videos are awesome. Kr, Yahya
@mastering_snowflake · 7 months ago
By default, dbt will NOT populate new columns without a --full-refresh. The new column will be included in the query against the source model, but not in the query that updates the destination table. dbt provides an on_schema_change config for incremental models:
- append_new_columns: adds any new columns; only newly processed rows are populated.
- sync_all_columns: adds new columns and drops removed columns; only newly processed rows are populated.
To backfill new columns with historical data, a --full-refresh is required.
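A minimal sketch of how this is configured on an incremental model (the model and column names are illustrative):

```sql
-- Incremental model opting in to schema evolution (illustrative names).
{{
    config(
        materialized='incremental',
        unique_key='id',
        on_schema_change='append_new_columns'  -- or 'sync_all_columns'
    )
}}

select *
from {{ ref('stg_events') }}

{% if is_incremental() %}
  -- new columns are only populated for rows processed from this run
  -- onwards; run `dbt run --full-refresh` to backfill them historically
  where loaded_at > (select max(loaded_at) from {{ this }})
{% endif %}
```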
@mastering_snowflake · 7 months ago
Check out this great article for more details: medium.com/@aaronbannin/dbt-incremental-cookbook-d70331ef879a
@bhratkapoor5230 · 2 years ago
Hello Adam, can we run 2 model scripts separately and insert the data into one single table? Thanks, Bhrat Kapoor
@mastering_snowflake · 2 years ago
Hello Bhrat, good question. In this case you would have 3 dbt models: the first 2 populate your intermediate tables, and the 3rd model takes the intermediate data to create your target table, as sketched below. Alternatively, you could have one model joining tables 1 & 2 together, which outputs your target result set. As ever, it depends on your specific requirements, such as how many data transformations you need to apply and whether other downstream objects could also leverage the intermediate tables using the 3-model approach described above.
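A minimal sketch of the third model in that approach, assuming two hypothetical upstream models named intermediate_one and intermediate_two with matching columns:

```sql
-- models/target_table.sql: combines the two upstream models into a
-- single target table (model names are illustrative)
{{ config(materialized='table') }}

select * from {{ ref('intermediate_one') }}
union all
select * from {{ ref('intermediate_two') }}
```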
@avisrivastava8341 · a year ago
Hello Adam, I need to learn about dbt transformations between MongoDB and Redshift. Can you explain it in text?
@isaacodeh · a year ago
Hi Adam, is there a thing called stream materialization in dbt?
@mastering_snowflake · a year ago
Hi, the only ones I’m aware of are table, view, incremental and ephemeral: docs.getdbt.com/docs/build/materializations
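For reference, the materialization is chosen per model in its config block; for example, an ephemeral model is not built in the warehouse at all, dbt inlines it as a CTE wherever it is ref()'d. A sketch with illustrative names:

```sql
-- Ephemeral model: never created in the database; dbt interpolates this
-- SQL as a CTE into any model that ref()'s it (illustrative names).
{{ config(materialized='ephemeral') }}

select
    id,
    lower(email) as email
from {{ ref('stg_users') }}
```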
@isaacodeh · a year ago
@mastering_snowflake Thanks
@ucmanhao3285 · a year ago
Hello Adam, I have a question. Is there any difference between materializing as 'commit_table' and as 'table'?
@mastering_snowflake · a year ago
Hello, can you provide more context as I don’t understand your question?
@niki2098 · 2 years ago
How many records can we load from a CSV file using seeds?
@mastering_snowflake · 2 years ago
Hi, great question. Check out this article: discourse.getdbt.com/t/dbt-seed-for-large-csv-files/328/3