Cannot thank you enough for your incredibly well laid out, thorough explanations. The world needs more folks like you :)
@satishutnal 3 years ago
You are the example of how teaching should be done. Just awesome 👍
@AdamMarczakYT 3 years ago
Wow, thank you!
@genniferlyon8577 2 years ago
Thank you Adam! I had been trying to follow some other written content to do exactly what you showed, with no success. Your precise steps and explanation of the process were so helpful. I am successful now.
@priyankapatel9461 3 years ago
You have in-depth knowledge of every service. I learned from scratch using your channel. Keep posting. Thank you and God bless you.
@AdamMarczakYT 3 years ago
Awesome, thanks!
@quyenpn318 3 years ago
I really, really like how you guide step by step like this; it is quite easy to understand. You are the best trainer I've seen. I really appreciate the time you put into creating these useful videos.
@albertoarellano1494 4 years ago
You're the best Adam! Thanks for all the help, I've been watching your tutorials on ADF and they're very helpful. Keep them coming!
@AdamMarczakYT 4 years ago
My pleasure! Thanks Alberto!
@apogeeaor5531 6 months ago
Thank you, Adam. I rewatch this video at least twice a year. Thank you for all you do.
@sreejeshsreenivasan2257 2 months ago
Super helpful. We were racking our brains over how to migrate 32,000 Oracle tables into ADL. This was so simple and helpful.
@shaileshsondawale2811 2 years ago
What wonderful content you have put out on social media. What a world-class personality you are. People certainly fall in love with your teaching.
@pratibhaverma7857 3 years ago
Your videos are great. This is the best channel on YouTube to learn about ADF. THANKS 🙏😊
@amoldesai4605 4 years ago
I am a beginner in Azure Data Engineering and you made it simple to learn all the tactics. Thanks!
@AdamMarczakYT 4 years ago
Glad to hear that!
@Vick-vf8ug 3 years ago
It is extremely hard to find information online about this topic. Thank you for making it easy!
@AdamMarczakYT 3 years ago
Glad it was helpful! Thanks!
@paullevingstone4834 3 years ago
Very professionally demonstrated and very clear to understand. Thank you very much.
@AdamMarczakYT 3 years ago
It's my pleasure Paul! :)
@Ro5ho19 3 years ago
Thank you! It's underappreciated how important it is to name things something other than "demoDataset", but it makes a big difference both for understanding concepts and for maintainability.
@AdamMarczakYT 3 years ago
Glad it was helpful! You are of course correct; if it's not a demo, take care of your naming conventions.
@gunturulaxmi8037 2 years ago
The videos are very clear for people who would like to learn and practice. Thanks a lot. Your hard work is appreciated.
@anderschristoffersen2513 3 years ago
Great and simple walkthrough, good job Adam.
@AdamMarczakYT 3 years ago
Thank you, I appreciate it! :)
@maimemphahlele1102 3 years ago
Hi Adam, your videos are just brilliant. This is a subscription I wouldn't mind paying for to support you. Your lessons are invaluable for learning.
@AdamMarczakYT 3 years ago
Awesome, thank you!
@jakirajam 2 years ago
The way you explain is super, Adam. Really nice.
@wouldyoudomeakindnes 3 years ago
Your skills are top-notch, thanks. I'd love to see your channel grow.
@AdamMarczakYT 3 years ago
I appreciate that!
@waseemmohammed1088 4 years ago
Thank you so much for the clear and nice explanation. I am new to ADF and learning a lot from your channel.
@AdamMarczakYT 4 years ago
Great to hear!
@naseemmca 4 years ago
Adam you are just awesome man! The way you are teaching is excellent. Keep it up... you are the best!
@AdamMarczakYT 4 years ago
Thanks! 😃
@vicvic553 3 years ago
Hey, one thing about English (please correct me if I am wrong, guys, but I am pretty sure about this): inside a sentence you shouldn't say "how does it work", but "what it works". Despite that, the content is awesome!
@AdamMarczakYT 3 years ago
You can if you ask a question. "How does it work" is a question structure, not a statement. It should be "how it works" if I'm stating a fact. You wrote "what it works", but I assume that's a typo. It's one of my common mistakes; my English teacher tries to fix it, but it is still a common issue for me ;) Thanks for watching!
@sumanthdixit1203 4 years ago
Fantastic, clear-cut explanation. Nice job!
@AdamMarczakYT 4 years ago
Glad it was helpful!
@xiaobo1134 3 years ago
Thanks Adam, your tutorials are very useful. Hope to see more in the future!
@AdamMarczakYT 3 years ago
Glad you like them! Will do more!
@ahmedroberts4883 2 years ago
Excellent, excellent video. This has truly cemented the concepts and processes you are explaining in my brain. You are awesome, Adam!
@markdransfield9520 1 year ago
Brilliant teaching style Adam. Very watchable. I particularly like how you explain the background. I've subscribed and will watch more of your videos.
@garciaoscar7611 1 year ago
This video was really helpful! You have leveled up my Azure skills. Thank you sir, you have gained another subscriber.
@CoolGuy 2 years ago
You are a legend. Next-level editing and explanation.
@amtwork5417 3 years ago
Great video, easy to follow and to the point; it really helped me to quickly get up and running with Data Factory.
@AdamMarczakYT 3 years ago
Glad it helped!
@avicool08 2 years ago
Very simple yet powerful explanation.
@AdamMarczakYT 2 years ago
Glad you think so!
@geoj9716 3 years ago
You are a very good teacher.
@AdamMarczakYT 3 years ago
Thank you! 😃
@RavinderApril 7 months ago
Incredibly simplified for learning... Great!!
@rajanarora6655 3 years ago
Awesome explanation; the way you teach, putting things in layman's terms, is pretty great. Thanks!!
@veerboot81 4 years ago
Hi Adam, very nice work. I built this for a client of mine and found out one important thing: within the ForEach, the blocks are not executed as if they work together atomically. What I mean is: if you start two items in parallel using the ForEach block, and within the ForEach you have two blocks (say block A and block B) connected using parameters (item()), then block A starting with item X will not necessarily be followed by block B using that same item X, even though they are connected! So I want to suggest one extra piece of advice: use at most one parameterized block inside a ForEach, or, if you need more than one block, start a separate pipeline within the ForEach that holds the multiple blocks. These pipelines will be started as separate children and do the work in the correct order. With kind regards, Jeroen
@AdamMarczakYT 4 years ago
Hey, I'm not sure I understood what you meant here. Using parameters does not make any connection between the actions.
@veerboot81 4 years ago
@@AdamMarczakYT I'm using a ForEach loop to load tables with dynamic statements. If I need more than one block (like a logging call to SQL Server, a copy block to load the data, and a logging block after the load is done), these blocks can live in the ForEach loop itself; but if you start multiple table loads in parallel, the blocks will not follow each other sequentially per iteration and will run interleaved with each other, so the logging will not belong to the matching copy block, for example. I will see if I can make an example if I find the time. To solve this, I always start another pipeline within the ForEach and put the blocks in that pipeline. This creates child pipelines in the ForEach loop, ensuring the right execution order of the blocks (logging start, copy, logging end).
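The interleaving Jeroen describes can be illustrated outside ADF. The sketch below is a simplified, hypothetical model, not ADF's actual scheduler (which gives no ordering guarantees at all): when each iteration's steps are scheduled independently, steps from different items interleave, while the Execute Pipeline workaround keeps each item's steps grouped and ordered.

```python
from itertools import chain, zip_longest

def interleaved_run(items, steps):
    # Round-robin one step at a time across all iterations: a simplified
    # stand-in for a parallel ForEach scheduling each activity independently.
    columns = [[(item, step) for step in steps] for item in items]
    return [pair for pair in chain.from_iterable(zip_longest(*columns)) if pair]

def child_pipeline_run(items, steps):
    # Execute Pipeline workaround: all steps for one item run inside a
    # child pipeline, so they stay grouped and correctly ordered per item.
    return [(item, step) for item in items for step in steps]
```

Running both over two tables and the steps ("log_start", "copy", "log_end") shows the difference: the first model mixes the two tables' steps together, the second keeps each table's three steps adjacent.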
@ericsalesdeandrade9420 4 months ago
Amazing video. Complex topic perfectly explained. Thank you Adam.
@frenamakenson9844 3 months ago
Hello Adam, thanks for this demo. Your channel is a blessing for new learners.
@amarnadhgunakala2901 4 years ago
Thanks Adam, I've been waiting for a video like this on ADF. Please post them regularly...
@AdamMarczakYT 4 years ago
You got it!
@anacarrizo2209 1 year ago
THANK YOU SO MUCH for this! The step-by-step really helped with what I needed to do.
@radiomanzel8570 2 years ago
It was so perfect, I was able to follow along and copy data on the first attempt. Thanks!
@anilchenchu1017 1 year ago
Awesome Adam, there can't be a better way to explain this.
@deoroopnarine6232 4 years ago
Your videos are awesome man. They gave me a firm grasp and encouraged me to get an Azure subscription and play around some more.
@AdamMarczakYT 4 years ago
That is amazing to hear! Thank you!
@rubensanchez6366 3 years ago
Very interesting video Adam. I found your idea of storing metadata quite enlightening. It could probably be maintained separately, tracking the last record loaded, so we could use it as an input for delta loads through queries instead of reloading the full table on each run.
@AdamMarczakYT 3 years ago
You can use either the watermark or the change tracking pattern. Check this out: docs.microsoft.com/en-us/azure/data-factory/tutorial-incremental-copy-overview?WT.mc_id=AZ-MVP-5003556
@eatingnetwork6474 3 years ago
Thanks Adam, amazing workshop, very clear and easy to follow. Thanks for helping; I am wiser now :)
@AdamMarczakYT 3 years ago
Perfect! Thank you!
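The watermark pattern from that tutorial boils down to: persist the highest modification timestamp you have already loaded, and on the next run copy only rows newer than it. A minimal sketch of that logic (the "ModifiedDate" field name is just an assumption for illustration):

```python
def incremental_load(rows, watermark):
    """Return the rows changed since the last stored watermark,
    plus the new watermark to persist for the next run."""
    delta = [r for r in rows if r["ModifiedDate"] > watermark]
    # If nothing changed, keep the old watermark instead of resetting it.
    new_watermark = max((r["ModifiedDate"] for r in delta), default=watermark)
    return delta, new_watermark
```

In ADF the same comparison would live in the source query (e.g. `WHERE ModifiedDate > @lastWatermark`), with a Lookup reading the stored watermark and a Stored Procedure activity writing the new one back.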
@hollmanalu 4 years ago
Adam, thanks for all your great videos! I appreciate your work very much! Keep up the great work!
@AdamMarczakYT 4 years ago
My pleasure! Thanks!
@wouldyoudomeakindnes 4 years ago
Thanks a lot for the videos. It's really gratifying to see all the dedication and attention to detail in each video; the explanation, supporting slides, code, and demo really cover the material well.
@AdamMarczakYT 4 years ago
Glad you like them! Thanks!
@AVADHKISHORE 4 years ago
Thank you Adam!! These videos are really very helpful and build the foundation for understanding ADF.
@AdamMarczakYT 4 years ago
My pleasure!
@santanughosal9785 2 years ago
I was looking for this video. Thanks for making it. It helps a lot. Thanks again.
@dev.gaunau 3 years ago
Thank you so much for sharing this valuable knowledge. It's very helpful for me.
@AdamMarczakYT 3 years ago
Glad it was helpful!
@pdsqsql1493 2 years ago
Wow! What a great video; very easy step-by-step tutorials and explanations. Well done!
@NewMayapur 4 years ago
Fantastic video Adam!! Really helpful for understanding parametrisation in ADF.
@AdamMarczakYT 4 years ago
Great to hear that!
@shivangrana02 4 years ago
You are really the best, Adam! Your tutorial helped me a lot. Thanks!
@AdamMarczakYT 4 years ago
Happy to hear that!
@shivangrana02 4 years ago
@@AdamMarczakYT You are welcome. Please keep up the good work.
@eversilver99 3 years ago
Excellent video and knowledge sharing. Great job!
@AdamMarczakYT 3 years ago
Glad you enjoyed it!
@elisehunter3424 3 years ago
Brilliant tutorial. Easy to follow and it all works like a charm. Thank you!!
@TheSQLPro 3 years ago
Great content, easy to follow!!
@AdamMarczakYT 3 years ago
Glad you think so!
@ElProgramadorOficial 3 years ago
Adam, you are the best! Thanks man!
@AdamMarczakYT 3 years ago
Thank you :)
@SealionPrime 3 years ago
These tutorials are so useful!
@AdamMarczakYT 3 years ago
Glad you like them!
@jacobklinck8011 4 years ago
Great session!! Thanks Adam.
@AdamMarczakYT 4 years ago
My pleasure!
@leonkriner3744 1 year ago
Amazingly simple and informative!
@szymonzabiello2622 4 years ago
Hey Adam. Great video! Two questions regarding the pipeline itself. 1. How do we approach source version control of the pipeline? In SSIS we could export a package and commit it to Git, or use TFS. How do we approach versioning in Azure? 2. What is the approach to deploying this pipeline to a higher environment? Assuming the pipeline was created in dev, how do I approach deployment to, e.g., UAT?
@AdamMarczakYT 4 years ago
I think this page describes and answers both of your questions: docs.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment?WT.mc_id=AZ-MVP-5003556 Thanks for watching :)
@mateen161 3 years ago
Very well explained. Thank you!
@AdamMarczakYT 3 years ago
Glad it was helpful!
@RajivGuptaEverydayLearning 4 years ago
Very nice video with a good explanation.
@AdamMarczakYT 4 years ago
Glad you liked it!
@aks541 4 years ago
Very well explained and succinct. One request: if possible, create a video on loading the ADW (Synapse) data warehouse with ADF.
@AdamMarczakYT 4 years ago
Thanks! I'm waiting for the new Synapse workspace experience to be released to make a video about it ;)
@nathalielink3869 3 years ago
Awesome. Thank you so much Adam!
@AdamMarczakYT 3 years ago
My pleasure!
@agnorpettersen 1 year ago
Very good explanation! I will try to read a list of tables, but instead of exporting them, mask certain columns. I guess I have to use a derived column inside the ForEach loop. Three parameters: schema_name, table_name and column_name. But how do I express: UPDATE <table> SET <column> = sh2(<column>) WHERE Key IN (SELECT Key FROM othertable) in a derived column context?
@agnorpettersen 1 year ago
I researched and tried one thing, and it seems to work. Keep the Lookup and ForEach like in this video, and inside the loop put the function. There I build a dynamic SQL statement, picking up the variables from the table I have in the Lookup.
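One way to assemble such a dynamic statement from the three Lookup parameters is sketched below. This is a hypothetical helper, written in Python purely to illustrate the string assembly (in ADF itself it would be a @concat() expression feeding a Script or Stored Procedure activity); HASHBYTES stands in for whatever masking function is actually used, and the identifiers are bracket-quoted because they come from a metadata table:

```python
def build_mask_statement(schema_name, table_name, column_name):
    # Bracket-quote identifiers pulled from the metadata table, escaping
    # any closing bracket, so unusual names cannot break the statement.
    def q(ident):
        return "[" + ident.replace("]", "]]") + "]"

    return (
        f"UPDATE {q(schema_name)}.{q(table_name)} "
        f"SET {q(column_name)} = HASHBYTES('SHA2_256', {q(column_name)})"
    )
```

A WHERE clause restricting the update to selected keys (as in the comment above) would be appended the same way, as another parameterized fragment.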
@e2ndcomingsoon655 2 years ago
Thank you! I really appreciate all you share, it truly helps me.
@ashokveluguri1910 4 years ago
You are awesome Adam. Thank you so much for the detailed explanation.
@AdamMarczakYT 4 years ago
My pleasure!
@Cheyenne9663 2 years ago
Wow, this was explained so well. Thank you!!!
@prasadsv3409 4 years ago
Really great stuff, sir. This is what I was looking for on YouTube.
@AdamMarczakYT 4 years ago
Thanks a ton!
@verakso2715 3 years ago
Thanks for your awesome video, it helped me out a great deal.
@AdamMarczakYT 3 years ago
Glad I could help!
@feliperegis9989 4 years ago
Hey Adam, awesome work and explanation! Do you have another video explaining how to deal with massive bulk data copies from tables using ADF, one that addresses the limits on the amount of data or number of rows? Can you make a video with a demo covering the kind of scenarios you mentioned were "a story for another day"? Thanks a lot in advance!! =D
@AdamMarczakYT 4 years ago
Thanks. Well, Lookup shouldn't be used for data but for a metadata-driven approach, so the 5000-row limit is fine here. It is rare that you will copy over 5000 tables/files with different structures. If you do, you can use different techniques, but in those cases I would probably shift the approach entirely. I will think about this.
@YenBui-dn2ek 7 months ago
Thanks for your great video. Are there any limitations in terms of the number of tables or table size when copying multiple tables in bulk, in your experience?
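If a metadata list ever does exceed the Lookup row limit, one workaround is to page the metadata query and run the ForEach once per page (for example with OFFSET/FETCH in the lookup query). The paging arithmetic is trivial; this sketch, with hypothetical names, just shows the split:

```python
def lookup_pages(table_names, page_size=5000):
    """Split a long metadata list into Lookup-sized pages. Each page
    corresponds to one lookup query, e.g.
    OFFSET {page_index * page_size} ROWS FETCH NEXT {page_size} ROWS ONLY."""
    return [table_names[i:i + page_size]
            for i in range(0, len(table_names), page_size)]
```

Each page can then drive its own ForEach iteration set, keeping every individual Lookup under the limit.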
@southernfans1499 2 years ago
👍👍👍 Very good explanation! 👍👍
@gouravjoshi3050 3 years ago
Good one, Adam sir.
@AdamMarczakYT 3 years ago
Thank you :)
@GaneshNaik-lv6jh 8 months ago
Thank you so much... Very good explanation. Just awesome!
@jozefmelichar5960 3 years ago
Fine tutorial. Thanks.
@AdamMarczakYT 3 years ago
Glad it was helpful!
@maranp6618 4 years ago
Please do a video on parameterized pipelines in detail, with different parameters.
@AdamMarczakYT 4 years ago
Did you have a chance to check my video on ADF parametrization? kzbin.info/www/bejne/pnq2c5qtp8mrhq8
@krzysztofrychlik9913 4 years ago
Thanks! Very helpful videos!
@AdamMarczakYT 4 years ago
Thanks a bunch!
@leelaagrawal6375 3 years ago
Great teacher!
@AdamMarczakYT 3 years ago
Thank you! 😃
@ivanovdenys 4 years ago
It is really cool that you make it so simple :)
@AdamMarczakYT 4 years ago
Thank you! 😊
@pavankumars9313 2 years ago
You are very good 👍 Explained well, thanks 😊
@kamalnathvaithinathan5737 4 years ago
Awesome Adam!! You are the best. Thank you so much.
@AdamMarczakYT 4 years ago
My pleasure! :)
@sameeranavalkar9352 1 year ago
Thank you so much for this. It helped a lot.
@sotos47 1 month ago
Thank you for the content. Is there a way to identify which copy activity ran for which table, e.g. through the pipeline output perhaps? I don't see that info.
@satyabratabarik49 2 years ago
Great explanation!!!!
@JD-xd3xp 3 years ago
Very nice video, I appreciate it. Do you have a follow-up video on incremental copy?
@AdamMarczakYT 3 years ago
Not yet, but check this tutorial from MS: docs.microsoft.com/en-us/azure/data-factory/tutorial-incremental-copy-overview?WT.mc_id=AZ-MVP-5003556 Thanks for stopping by :)
@valentinloghin4004 3 years ago
Very nice Adam!!! I haven't checked yet, but do you have a process that reads the CSV files from blob storage and transfers the data into a table? If you have it, can you please provide the link?
@AdamMarczakYT 3 years ago
Hey, thank you. I didn't make a video on the reversed scenario, as I thought it might be too similar to this one. Just use Get Metadata instead of Lookup to get the list of files and upload them to a table. Note that for this to work, the file names or paths should contain the name of the table, so you know which file goes to which table. :)
@valentinloghin4004 3 years ago
@@AdamMarczakYT I was able to create the pipeline that reads the files from the blob and transfers the data to a SQL database. I would like to trigger that pipeline when a file arrives in blob storage. Event Grid is activated and I created the trigger, but it didn't fire the pipeline. Any guidance? Thank you!
@marvinvicente3138 2 years ago
Hi Adam, I really appreciate your videos. Thanks for making them! I hope you can also create a video with ODBC as a data source.
@surafeltilahun7404 3 years ago
Excellent!!!!!
@AdamMarczakYT 3 years ago
Many thanks! Cheers!
@mosestes9417 4 years ago
Really helpful! You made it very easy!
@AdamMarczakYT 4 years ago
Glad you think so!
@jac94 4 years ago
Thanks Adam! Very clear!
@AdamMarczakYT 4 years ago
Glad it was helpful!
@sunkara2009 3 years ago
Best video I've watched today, crystal clear!! Thanks Adam. I am trying to copy multiple Azure Table storage tables, but I'm struggling to get a query that lists the table names for Azure Table storage in the Lookup activity. Any solutions please? Thanks in advance!!
@AdamMarczakYT 3 years ago
Excellent. Well, in this case replace the Lookup with a Web activity, query the Table storage REST API docs.microsoft.com/en-us/rest/api/storageservices/query-tables?WT.mc_id=AZ-MVP-5003556 and loop over the response.
@Dadwalfamilyadventures 4 years ago
Hi Adam, this is an awesome session. One question: where in the Azure documentation can I find information about all the possible output properties of a given activity? E.g. when you were explaining the Lookup activity, you talked about the property "firstRow". Where can I find the properties supported by all activities in the Azure documentation?
@AdamMarczakYT 4 years ago
Thanks! For details, always check the docs by googling "ADF" plus the activity name. For Lookup, this would pop up, which explains everything you asked: docs.microsoft.com/en-us/azure/data-factory/control-flow-lookup-activity
@shruthil7913 3 years ago
Hi Adam, is there a way to copy into different SQL tables? (For example: a cars CSV into the cars SQL table and a planes CSV into the planes SQL table, automatically, without manual intervention.)
@AdamMarczakYT 3 years ago
Hi there. Well, this video is all about that; just reverse the order and use the Get Metadata activity for blob storage instead of the Lookup :)
@simonscott1121 3 years ago
Reminds me so much of SSIS.
@akshaybhardwaj7626 2 years ago
Thanks for all your helpful videos, thank you so much. I have one query: how can we run a pipeline in parallel to copy data from 5 different sources to 5 different targets? 1. Is it possible by passing 5 pairs of connection strings (source and target)? 2. Can we have a master pipeline where, in a ForEach activity, we call this one pipeline and run it in parallel for the 5 different source-to-target movements?
@joncrosby1 3 years ago
Thank you for your great videos, they have been super helpful. I'm working on a proof of concept similar to this video, except it's SQL to Azure SQL. Any links or references you can offer to help me parameterize the Azure SQL sink side?
@shankysingh1141 2 years ago
First of all, thank you so much for a very useful video. Can you please create a video where we connect the ADF output file to Power BI for reporting, and show how we can auto-refresh it?
@mpramods 3 years ago
Awesome video Adam. I would like to understand the next step: how to loop through the files and load them into tables. Do you have a video on that, or could you point me to a link with that info?
@AdamMarczakYT 3 years ago
No video on this, but it's very similar; just use the Get Metadata activity instead of the Lookup :)
@arjunsaagi590 4 years ago
Thank you Adam, a very informative video.
@AdamMarczakYT 4 years ago
My pleasure!
@abhishektrivedi3406 4 years ago
You're awesome Adam, thanks for such a great tutorial. I also tweeted this video. Thanks!!
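For reference, the Lookup activity's output takes two shapes depending on the "First row only" setting. The sketch below mimics them with plain dictionaries (the column names are hypothetical) to show what `@activity('Lookup1').output.firstRow` and `.output.value` refer to:

```python
# With "First row only" checked, the output wraps a single record:
first_row_output = {
    "firstRow": {"TABLE_SCHEMA": "SalesLT", "TABLE_NAME": "Customer"},
}

# Unchecked, it returns a count plus the full row array:
full_output = {
    "count": 2,
    "value": [
        {"TABLE_SCHEMA": "SalesLT", "TABLE_NAME": "Customer"},
        {"TABLE_SCHEMA": "SalesLT", "TABLE_NAME": "Product"},
    ],
}

# A ForEach over full_output["value"] sees one row per iteration,
# so @{item().TABLE_NAME} resolves like this:
table_names = [row["TABLE_NAME"] for row in full_output["value"]]
```

The linked Lookup activity doc describes both shapes and their limits in detail.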