Get my Modern Data Essentials training (for free) and start building more reliable data architectures: www.ModernDataCommunity.com
@randolphdeline309 • 2 years ago
I think you just saved me weeks' worth of work. I was just put in charge of flattening dozens of tables with multiple levels, totaling hundreds of columns, and that macro is perfect.
@KahanDataSolutions • 2 years ago
Love to hear that!
@peytonbadura6808 • a year ago
I had to change a couple of things to get this to work with SQL Server, but this was a lifesaver! Thank you!
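(For anyone else adapting this to SQL Server: a rough, hedged sketch of the equivalent flatten step, assuming a hypothetical table my_table with an NVARCHAR JSON column json_data - OPENJSON with CROSS APPLY plays the role that LATERAL FLATTEN plays in the video.)

    select t.id, j.[key], j.[value], j.[type]
    from my_table as t
    cross apply openjson(t.json_data) as j;   -- one row per key/value pair in the JSON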
@KahanDataSolutions • a year ago
Nice! You bet.
@brodericksmith8617 • a year ago
Hey there! This video is awesome! Do you have any tips on how to turn a variable into a string? I'm trying to take this concept and apply it to Postgres, but I need the column names to be strings to make it work. Not sure how to get quotes around the variable name!
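(One hedged way to do this in a dbt macro: anything wrapped in single quotes inside the Jinja template is rendered as a SQL string literal, so a variable holding a field name can be emitted as a quoted string. The names below - col_name, json_column, source_model - are hypothetical placeholders, not from the video.)

    {% set col_name = 'birth_name' %}  {# hypothetical variable holding the field name #}
    select
        json_column ->> '{{ col_name }}' as {{ col_name }}   -- renders as ->> 'birth_name'
    from {{ ref('source_model') }}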
@jasonyeung2498 • 2 years ago
Might I ask if there is any package that can recursively do what you did in dbt to unwrap nested JSON and/or arrays? Thanks!
@KahanDataSolutions • 2 years ago
Hey Jason - There very well could be, but I'm not familiar with one off the top of my head. It seems like a common scenario, so I wouldn't be surprised if it exists somewhere.
@jasonyeung2498 • 2 years ago
@KahanDataSolutions Thanks for the kind response. Might I ask, for example: let's say birth_name is another JSON object and I simply want to reuse the macro again, e.g. intermediate_model as ( {{ flatten_json( model_name = 'source_model', json_column = 'birth_name' ) }} ). How can I do that here by passing the source_model into the macro again?
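(A hedged sketch of one way to chain it, assuming the macro resolves model_name with ref() the way it does in the video: materialize the first flatten as its own model, then point the macro at that model for the nested field. The model and column names here are hypothetical.)

    -- models/stg_customers_flattened.sql: flatten the outer JSON column
    {{ flatten_json(model_name='source_model', json_column='json_data') }}

    -- models/stg_customers_birth_name.sql: flatten the nested birth_name field from the first model
    {{ flatten_json(model_name='stg_customers_flattened', json_column='birth_name') }}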
@kirillmelnikov1700 • 9 months ago
Hi 👋 How can I write an if statement in a dbt macro that exits a loop, for example "if the table exists, continue; else, break"? Thanks in advance 🙏
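(A hedged sketch of one pattern: use adapter.get_relation to test whether a table exists and return() to exit the loop early. dbt's Jinja environment usually has loop controls enabled as well, so {% break %} may also work, but verify in your version. The macro name is hypothetical.)

    {% macro first_existing_table(table_names) %}
        {% for t in table_names %}
            {# get_relation returns none if the table does not exist in the target schema #}
            {% set rel = adapter.get_relation(database=target.database, schema=target.schema, identifier=t) %}
            {% if rel is not none %}
                {{ return(rel) }}  {# exits the macro (and therefore the loop) on the first match #}
            {% endif %}
        {% endfor %}
    {% endmacro %}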
@vlogwithkaran9078 • a year ago
How do we perform a lateral flatten in Redshift?
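(Redshift doesn't have Snowflake's LATERAL FLATTEN. The closest equivalent I'm aware of is PartiQL-style unnesting of a SUPER column - a rough, hedged sketch below with hypothetical table/column names; if the JSON arrives as varchar, you'd first convert it with json_parse.)

    select t.id,
           elem.name::varchar as name,     -- navigate object fields with dot notation
           elem.amount::int   as amount
    from my_table as t,
         t.payload as elem;                -- iterates over the array held in the SUPER column "payload"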
@linaelyakhloufi4677 • 2 years ago
Hello, is it the same if our data is in Amazon Redshift, please?
@henniedenooijer4348 • 10 months ago
Great vid!
@ihafidh • 2 years ago
Great video as always!
@KahanDataSolutions • 2 years ago
Thank you!
@kanthipavuluri3874 • 2 years ago
Nice video! Can you also share how we can determine the data types of the JSON fields? In this video everything is treated as a varchar.
@KahanDataSolutions • 2 years ago
Thanks, Kanthi! You can use the TYPEOF function in Snowflake to determine the data type - docs.snowflake.com/en/sql-reference/functions/typeof.html
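(A small hedged example of pairing TYPEOF with the flatten output, using hypothetical names raw_customers / json_data in place of the video's source table and VARIANT column.)

    select f.key,
           typeof(f.value) as value_type   -- e.g. VARCHAR, INTEGER, BOOLEAN, OBJECT, ARRAY
    from raw_customers,
         lateral flatten(input => json_data) as f;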
@summer_xo • 2 years ago
I started off by asking myself, "this is cool, but how could I convince my team to use this vs. flattening via ADF?" By the end, seeing how reusable it was totally sold me; it's so much quicker than spinning up a new ADF pipeline, creating the source, etc. (assuming the extract isn't taking place in ADF). I'm curious if you think there are any more benefits to using dbt/Snowflake to flatten vs. ADF?
@wallyflops • a year ago
That LATERAL keyword is really weird; the documentation doesn't mention its use anywhere, except when listing multiple tables, it seems.
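(For what it's worth, a hedged illustration with hypothetical names: in Snowflake, LATERAL is what lets FLATTEN reference columns from the table listed before the comma, and the TABLE(FLATTEN(...)) spelling from the docs behaves the same way in this comma-join form.)

    -- form used in the video
    select t.id, f.key, f.value
    from raw_customers t, lateral flatten(input => t.json_data) f;

    -- equivalent spelling from the Snowflake docs
    select t.id, f.key, f.value
    from raw_customers t, table(flatten(input => t.json_data)) f;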
@domfp133 • 2 months ago
TOP
@yslx740 • 2 years ago
Could you talk about getting proper data types, rather than just using varchar for everything? Would be super useful!
@KahanDataSolutions • 2 years ago
That is a bit more complex, but it is doable. Here is an example function in Snowflake that you can use with JSON data to get you there - docs.snowflake.com/en/sql-reference/functions/typeof.html
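(On the casting side, a hedged sketch with hypothetical field names: once you know the underlying type, you can cast each extracted value instead of defaulting everything to varchar.)

    select
        json_data:name::varchar    as name,
        json_data:birth_date::date as birth_date,
        json_data:age::number      as age
    from raw_customers;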
@yslx740 • 2 years ago
@KahanDataSolutions Thanks, I didn't know about TYPEOF.
@abdullahsiddique7787 • 2 years ago
How long does it take to learn dbt for a person who knows SQL?
@KahanDataSolutions • 2 years ago
That really depends on the individual. Like anything else, the advanced components will take time/experience to fully learn. But if you know SQL you should be able to start contributing to dbt projects pretty quickly once you learn the basic concepts.
@abdullahsiddique7787 • 2 years ago
@KahanDataSolutions Thanks bro, yes, I'm good with SQL.