Hi Adam: I am 76 years old and completely new to this ETL technology, which is now clear to me. Keep on developing these kinds of professional projects that can be used in resume preparation.
@hovardlee 3 years ago
-1979 and ,12 — this is why complex logic is needed. Nice tutorial :)
@bifurcate-ai 4 years ago
Adam, I have been watching many of your videos. As someone new to Azure, I find your videos immensely valuable. Keep up your great work, really appreciate it!
@AdamMarczakYT 4 years ago
Awesome, thank you!
@waklop4384 5 years ago
Just discovered the channel. Your material is high quality. It's excellent work. I will go watch more. Thank you Adam!
@AdamMarczakYT 5 years ago
Thank you. This means a lot :)
@pradeeps3671 3 years ago
Hello Adam, please let me know how to connect to Dynamics CRM. Please send details to pradysg@gmail.com
@notonprem 4 months ago
This is quality stuff. Good for a quick upskill, especially when prepping for an interview.
@dimitarkrastev6085 2 years ago
Great video! Most videos seem to focus mostly on the advertisement material straight from Azure. At best they show you the very trivial step of copying data from a file to a DB. This is the first video I've seen where you actually show how to do something useful with the data, close to a real-life scenario. Thank you.
@joshuaodeyemi3098 a year ago
I love you, Adam! I have been struggling with the expression builder in Data Flow; I couldn't figure out how to write the code. This video just made it look less complex. I'll be devoting more time to it.
@lavanyay2767 6 months ago
Very detailed workflow. I tried this and was able to understand the Data Flow process so easily. Thank you for the wonderful session.
@eramitgoswami 4 years ago
Your way of explaining is outstanding; after watching, it feels like Azure is very easy to learn. Kindly keep sharing good videos. Thank you.
@AdamMarczakYT 4 years ago
Thanks a ton :)
@aarontian5979 3 years ago
Your channel is totally underrated, man.
@susanmyers1 4 years ago
At work I'm having to build out a data mart on my own, with no training. You are literally saving my hide with your videos. THANK YOU!
@AdamMarczakYT 4 years ago
Glad to help! :)
@subhodipsaha7608 4 years ago
I just found your videos while searching for ADF tutorials on YouTube. The materials are fantastic and are really helping me learn. Thank you so much!!
@AdamMarczakYT 4 years ago
Happy to help! :)
@alanzhao8074 4 years ago
The best video about Azure Data Flows I could find. Thank you Adam!
@AdamMarczakYT 4 years ago
Wow, thanks! :)
@fadiabusafat5162 3 years ago
Nice one, Adam. Cool one. Keep making fabulous videos, fella. Many thanks.
@gunturchilli767 4 years ago
Thank you so much, Adam. I was able to crack an interview with the help of your videos. I prepared notes according to your explanation, and 3 hours before the interview I watched your videos again. It helped me a lot.
@AdamMarczakYT 4 years ago
Fantastic!
@omarsantamaria6871 5 years ago
Awesome video. I've seen a lot of sites & videos and they are so complicated, but all of yours are crystal clear and anyone can understand them.
@AdamMarczakYT 5 years ago
Thanks Omar :)
@icici321 5 years ago
I am new to data and ETL, but your videos are very good. Excellent examples and very clear explanations, so anyone can understand. Thanks very much.
@AdamMarczakYT 5 years ago
Thank you, always happy to help!
@Lego-tech 4 years ago
Very crisp and clear information. I watched many videos, but Adam's content is awesome!! Thanks, dear!! All the best for future good work!!
@AdamMarczakYT 4 years ago
Thank you so much 🙂
@yashmeenkhanam3451 4 years ago
Outstanding! You just made Azure easy to learn. Thank you.
@AdamMarczakYT 4 years ago
Awesome, thank you!
@johnfromireland7551 3 years ago
ADF is just one of about 100 significant tools and services in Azure. :-(
@CallousCoder 3 years ago
Hi Adam, is it possible to create these pipelines as code as well? Or somehow create them from my actual Azure pipeline? It would be sheerly insane (but it is a Microsoft product) to have to create and maintain two pipelines: your Azure pipeline for CI and CD, and one for ADF. I really want the Azure pipeline to be able to fill/create the ADF pipeline, but I haven't found anything yet.
@Haribabu-zj4hd 3 years ago
So nice of you to explain Data Flow in such a simple way. Thank you so much, Mr. Adam.
@soumikdas7709 3 years ago
Your videos are very informative and practically oriented. Keep it up.
@AdamMarczakYT 3 years ago
Thank you, I will!
@Cool2kid 4 years ago
Your video content is awesome!!! It is very useful for understanding Azure concepts, especially for someone like me who has just started the Azure journey. I would like one video showing how to deploy code from Dev to QA to Prod, and how to handle connection strings, parameters, etc. during deployment. Thanks again for the wonderful video content.
@AdamMarczakYT 4 years ago
ADF CI/CD is definitely on the list. It's a complex topic to get right, so it might take time to prepare proper content around it. Thanks for watching and suggesting ;)
@mohmmedshahrukh8450 2 years ago
Best video on Azure I have ever seen ❤❤
@sameerdongare1113 5 years ago
Another awesome video. The best part of Mapping Data Flow was the optimization, where we could do partitioning.
@AdamMarczakYT 5 years ago
Thank you! Glad you like it :)
@achraferraji3403 2 years ago
Amazing video, we want more parts!
@Raguna 2 years ago
Very good explanation of the Data Flow. Thanks, Mr. Adam.
@balanm8570 5 years ago
As usual, another awesome video, Adam!!! Excellent. It was to the POINT!!! Keep up the good work you have been doing for plenty of users like me. Eagerly waiting for more videos like this from you!!! Can you please make some videos on Azure Search?
@AdamMarczakYT 5 years ago
Thank you so very much :) Azure Search is on the list, but there is so much news coming from Ignite that I might need to change the order. Let's see all the news :)
@SIDDHANTSINGHBCE 3 years ago
These videos are great and are helping me so much! Thanks Adam.
@AdamMarczakYT 3 years ago
Glad you like them!
@techBird-b2m 2 years ago
👍 It's amazing, a practical implementation of Data Flow.
@generaltalksoflife 4 years ago
Hi Adam, thanks for helping us learn new technologies. You are awesome 👌🏻👌🏻👌🏻👏👏.
@AdamMarczakYT 4 years ago
My pleasure!
@bharatruparel9424 5 years ago
Hello Adam, I just finished this video. Very well done indeed. Thanks and regards, Bharat.
@AdamMarczakYT 5 years ago
Thanks Bharat :)
@01sanjaysinha a year ago
Thanks!
@hrefname 4 years ago
Thank you so much for this. I subscribed immediately; very informative and straightforward Azure info. Will definitely recommend your channel. Keep up the great work!
@AdamMarczakYT 4 years ago
Awesome, thank you!
@shantanudeshmukh4390 4 years ago
Wow! Fantastic explanation.
@AdamMarczakYT 4 years ago
Glad you liked it!
@ngophuthanh 2 years ago
Thank you, Adam. As always, you rock.
@subhraz 2 years ago
Very well explained and demonstrated. Really helpful for getting started with Data Flows.
@grzegorzz4025 3 years ago
Adam, great tutorial! Kudos!
@AdamMarczakYT 3 years ago
Glad you liked it!
@avinashbasetty 4 years ago
The features are very interesting. I want to try the different partitioning techniques. Thank you for sharing such amazing stuff.
@AdamMarczakYT 4 years ago
My pleasure! Thanks! :)
@BijouBakson 2 years ago
It must be very challenging to do all of this in English, I imagine, Adam! Congratulations on pushing through despite the difficulty. 🙂
@MrVivekc 3 years ago
Very good explanation, Adam. Keep it up.
@AdamMarczakYT 3 years ago
Thanks, will do!
@MrVivekc 3 years ago
@@AdamMarczakYT Adam, do we have a trial version of Azure for learning purposes?
@abhijitk7363 4 years ago
Adam, thanks for this excellent video. You explained almost every feature available in data flows. Looking forward to a video on Azure SQL DWH. I know it will be great to learn about it from you.
@AdamMarczakYT 4 years ago
Glad it was helpful! I'm just waiting for the new UI to come to public preview, then the video will be done :)
@arulmouzhiezhilarasan8518 4 years ago
Impeccable; great to learn about Mapping Data Flow. Thanks Adam!
@AdamMarczakYT 4 years ago
My pleasure!
@chandrasekharnallam2578 3 years ago
Excellent explanation with a simple scenario. Thank you.
@AdamMarczakYT 3 years ago
Glad it was helpful!
@sapecyrille5487 a year ago
Great! You are the best, Adam.
@Montreal_powerbi_connect 3 years ago
Wow, I like your video. I did it today and got a good result. Thanks for your good explanation.
@AdamMarczakYT 3 years ago
Great job! Thanks!
@rqn2274 4 years ago
Nice video, Adam. Professional as always.
@AdamMarczakYT 4 years ago
Wow, thanks!
@isurueranga9704 3 years ago
Best tutorial ever... 💪🏻💪🏻💪🏻
@sarahaamir7457 4 years ago
Thank you so much, Adam! This was a very clear and great video, and a big help for my interview and my knowledge.
@AdamMarczakYT 4 years ago
Very welcome! Thanks for stopping by :)
@valentinnica6034 4 years ago
That was actually not so hard. Thanks man, you're awesome.
@AdamMarczakYT 4 years ago
No! You are awesome! :)
@GiovanniOrlandoi7 3 years ago
Great video! Thanks Adam!
@AdamMarczakYT 3 years ago
My pleasure!
@wojciechjaniszewski9086 4 years ago
Very well done explaining the principles of mapping data flows!!!
@AdamMarczakYT 4 years ago
Thanks a lot!
@JoeandAlex 3 years ago
Brilliant way of explaining.
@JoeandAlex 3 years ago
Subscribed to your channel.
@AdamMarczakYT 3 years ago
Thank you, appreciated 🙏
@big-bang-movies 3 years ago
Hi Adam, a few doubts. Please help me understand. 1. At 10:04, after running the data flow the first time, 9,125 rows get populated. There is no output sink or output dataset associated with the data flow yet, so where exactly are those ingested rows getting saved/populated? 2. At 15:04, after re-calculating "title" (by removing the year part), how come the previous original column (title) disappeared? The modified title column should appear in addition to the previous original column (title), right?
@AdamMarczakYT 3 years ago
Hey. 1. It's the amount of rows loaded. 2. If you create a new column with the same name, it will replace the old one. In this case we replaced the title column.
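For reference, this overwrite-by-name behavior is exactly what the derived column transformation does. A sketch of the two expressions in question (column names taken from the demo dataset shown in the video):

```
/* Derived column 'title': same name as an existing column, so it replaces it */
title = toString(left(title, length(title) - 7))

/* Derived column 'year': a new name, so it is added alongside the others */
year = toInteger(trim(right(title, 6), '()'))
```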
@lowroar5127 2 years ago
So helpful! Thank you very much, Adam!
@NehaJain-ry9sr 4 years ago
Awesome videos, Adam. Your videos are a great help for learning Azure. Keep it up :)
@AdamMarczakYT 4 years ago
Thanks, will do!
@KarthikeshwarSathya 3 years ago
This was explained very well. Thank you.
@AdamMarczakYT 3 years ago
You're very welcome!
@carlossalinas8497 4 years ago
Adam, you have a gift for explaining complex things. This tutorial made my day, thanks.
@AdamMarczakYT 4 years ago
Glad it helped! Thanks!
@eddyjawed 10 months ago
Thank you Adam, dziękuję. This is a great tutorial.
@549srikanth 4 years ago
I would say this is the best content I've seen so far!! Thank you so much for making it, Adam! Just wondering, is there a Ctrl+Z or Ctrl+Y command in case we make some changes in the data flow and want to restore it to a previous version?
@AdamMarczakYT 4 years ago
Awesome, thanks! Unfortunately not, but you can use versioning in Data Factory, which will allow you to revert to a previous version in case you break something. Highly recommended. Unfortunately there are no reverts for specific actions.
@549srikanth 4 years ago
@@AdamMarczakYT Excellent!! Thank you so much for your reply!
@johnfromireland7551 3 years ago
@@549srikanth I publish each time I create a significant new step in the pipeline, and I use data preview before moving on to the next step. Also, you can, I think, export the code version of the entire pipeline. Presumably you can then paste that into a new pipeline to resurrect your previous version.
@rajanarora6655 3 years ago
Your videos are really great and helped me understand a lot of Azure concepts. Can you please make one using an SSIS package and show how to use it within Azure Data Factory?
@dintelu 4 years ago
Wow, lucid explanation.
@AdamMarczakYT 4 years ago
Glad you think so!
@rosszhu1660 4 years ago
A quick question: Azure datasets seem to only support already-structured data, like CSV or JSON. What if my data source is an unstructured text file that must be transformed into CSV before being used? Is there a way to do this transformation (possibly with Python code) in Data Factory?
@AdamMarczakYT 4 years ago
Hey, you can call Azure Databricks, which can transform any file using Python/Scala/R, etc. But Data Factory itself can't do it.
@rosszhu1660 4 years ago
@@AdamMarczakYT Got it. Thanks a lot! It looks like I have to learn Spark :-)
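For readers in the same spot: Data Factory itself can't run Python, but a Databricks notebook (or an Azure Function) invoked from the pipeline can do the reshaping. A minimal sketch of the kind of parsing such a notebook might do; the `id|message` line format here is purely hypothetical, so adapt it to your real source file:

```python
import csv
import io


def raw_lines_to_csv(raw_text: str) -> str:
    """Turn loosely structured 'id|message' text into CSV.

    The input format is assumed for illustration; a real job
    would write the result back to Blob/Data Lake so a Copy
    activity or data flow can pick it up.
    """
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["id", "message"])  # header row
    for line in raw_text.splitlines():
        if not line.strip():
            continue  # skip blank lines
        ident, _, message = line.partition("|")
        writer.writerow([ident.strip(), message.strip()])
    return out.getvalue()
```

In a Databricks job the same logic would usually run over files in mounted storage rather than an in-memory string.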
@DrDATA-ep6mg 4 years ago
Very nice tutorial 👍
@AdamMarczakYT 4 years ago
Thank you! Cheers!
@nick6s 3 years ago
Excellent tutorials.
@skybluelearner4198 2 years ago
Good explanation there.
@horatiohe 4 years ago
Thanks for the great content!! You are the man :)
@AdamMarczakYT 4 years ago
I appreciate that!
@Lakshmi-y4x 5 months ago
Thank you, very helpful tutorials.
@dwainDigital 3 years ago
How do you delete from the target based on data from the source? I'm really struggling to understand what to do if I have a column with a value that I want to delete in the target table. Everything seems to be geared toward altering the source data coming in.
@RahulRajput_018 4 years ago
Thanks buddy, great work.
@AdamMarczakYT 4 years ago
My pleasure
@mohitjoshi1361 3 years ago
Have any of these options changed now? I am not able to see any data flow debug option to enable, nor can I preview data directly in the dataset itself.
@PicaPauDiablo1 4 years ago
Adam, is there a way to preserve the filename and just have it change the extension? For instance, I'm adding a column with a datetime, but at the end I would like it to have the same file name, just as parquet. Is there a way to do that?
@AdamMarczakYT 4 years ago
Use expressions :) That's what they are for.
@PicaPauDiablo1 4 years ago
@@AdamMarczakYT Sorry if it was a dumb question, I'm still new to ADF. Ignore it if it's too inane, but is filename in the @pipeline parameters? I found one online but couldn't get it to parse.
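For anyone else hitting this: inside a mapping data flow the @pipeline() syntax doesn't apply. One common approach, sketched here with assumed names (verify against your own setup), is to capture the source file name via the source's "Column to store file name" option and derive the sink name from it:

```
/* Derived column: swap the extension on the captured file name.
   'sourceFile' is the assumed name of the file-name column. */
sinkFileName = replace(sourceFile, '.csv', '.parquet')
```

The sink can then be configured to name output files from that column.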
@ahmedmj8729 2 years ago
Hello Adam, I followed these steps but I have a problem: I can't find the source columns when I go to the derived column component to write an expression based on an existing column. In your video the source component shows 3 total columns; for me it shows 0. I changed the source from CSV to a SQL table and still didn't find a solution.
@rahulkota9793 4 years ago
Very useful. Thank you so much.
@AdamMarczakYT 4 years ago
Glad it was helpful!
@niteeshmittal 4 years ago
Adam, your content is always easy to grasp; excellent work, mate. Could you please explain how to create a pipeline that has a copy activity followed by a mapping data flow activity?
@AdamMarczakYT 4 years ago
Thanks, just drag and drop the Copy activity and Data Flow blocks onto the pipeline, and drag a line from the Copy to the Data Flow activity.
@jagadeeshpinninti3456 4 years ago
Can you please explain how to connect a source dataset from Azure Data Lake Storage Gen2 tables in data flows of Azure Data Factory?
@AdamMarczakYT 4 years ago
It's the same as Blob Storage; just create a linked service, select Azure Table Storage, and create a dataset for it. Note that this is not supported for Mapping Data Flows.
@kirankumarreddykkr9606 a year ago
Can you use PySpark or SQL in the expression functions, or is it only Scala?
@mangeshxjoshi 4 years ago
Hi, can Azure Data Factory be used to replace IBM DataStage mapping transformations? IBM DataStage is an ETL tool, while Azure Data Factory is a managed data integration service in the cloud. Does Azure Data Factory support only Blob Storage, Azure Cosmos DB (SQL API), Azure Data Lake Storage, Azure SQL Data Warehouse, and Azure SQL Database? Apart from these, does Azure Data Factory connect to SAP HANA, SAP BW, or Oracle? Are there any connectors for pulling data from other sources like SAP HANA, Oracle, etc.?
@AdamMarczakYT 4 years ago
Hey, in general ADF has 80+ connectors, including SAP and Oracle. You use those to copy data from those sources to Blob Storage, then trigger a mapping data flow pipeline to get the data from Blob Storage (or Data Lake), transform it, and output it back to Blob (or one of the supported output systems). ADF then copies it to the designated place.
@tenghover 3 years ago
Would you plan to make a video introducing each of the transformation components? Thanks.
@arun06530 4 years ago
Nice & detailed video.
@AdamMarczakYT 4 years ago
Thank you!
@biswajitsarkar5538 5 years ago
Thanks Adam!! Very informative video. Liked it a lot.
@AdamMarczakYT 5 years ago
Thanks, and you are most welcome! Glad to hear it.
@RC-nn1ld 4 years ago
Love these videos, so easy to understand. Do you have a video on the new XML connector?
@AdamMarczakYT 4 years ago
Great, thanks! Not yet, maybe in the near future :)
@samb9403 2 years ago
Great video. Question: under "New Datasets", is there a capability to drop data into Snowflake? I see S3, Redshift, etc. I appreciate the video and feedback!
@javm7378 2 years ago
I really like your tutorials. I have been looking for a "table partition switching" tutorial but haven't found any good ones. Maybe you could do one for us? I am sure it would be very popular, as there aren't any good ones out there and it is an important topic in certifications :-)
@abhim4nyu 2 years ago
Will it work with a pipe ("|") separated value file instead of CSV?
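It should: delimited-text datasets let you choose the column delimiter. A sketch of the relevant dataset JSON fragment (property values assumed for illustration, so check them against your dataset definition):

```
{
  "type": "DelimitedText",
  "typeProperties": {
    "columnDelimiter": "|",
    "firstRowAsHeader": true
  }
}
```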
@joyyoung3288 2 years ago
I get an error message (e.g. handshake_failure) when the data flow source retrieves data from an API. Can anyone help? Thanks.
@jayong2370 2 years ago
Thank you Adam.
@yashgemini4024 4 years ago
Appreciate your content. Thanks.
@AdamMarczakYT 4 years ago
My pleasure! :)
@paulnelson1623 4 years ago
For anyone wondering how to make the year check (or any check) in the second step more robust, you can exchange the following expressions using the 'case' expression, which says: if this expression evaluates as true, do this, else do something else. Worth noting that in the first expression only a true branch is provided, while the second expression has both true and false branches. As per the documentation on the 'case' expression: "If the number of inputs are even, the other is defaulted to NULL for last condition."

/* Year column expression */
/* If the title contains a year, extract the year, else set to NULL */
case(regexMatch(title, '([0-9]{4})'), toInteger(trim(right(title, 6), '()')))

/* Title column expression */
/* If the title contains a year, strip the year from the title, else leave the title alone */
case(regexMatch(title, '([0-9]{4})'), toString(left(title, length(title)-7)), title)
@AdamMarczakYT 4 years ago
Thanks Paul :) I used as simple an example as possible for people who aren't fluent in Scala, but of course you always need to cover all possible scenarios. Sometimes I like to fail the transformation rather than continue with fallback logic, as I expect some values to be present.
@paulnelson1623 4 years ago
@@AdamMarczakYT Of course. I just wanted to see if I could take it a step further, to align more closely with what would be needed in a production data engineering scenario, and thought others might have the same idea. Thanks for the content! :)
@AdamMarczakYT 4 years ago
Thanks, I bet people will appreciate this :)
@davidakoko3308 4 years ago
Hi Mr. Adam, how are you? I've been trying to use the add function to add two columns of numeric values, but the result is wrong. E.g. add(COLUMN_A, COLUMN_B) gives COLUMN_AB instead of adding the values. Say column_a has a value of 334 and column_b has a value of 4; the result is 3344 instead of 338. Please can you help? Nice video BTW, thanks.
@AdamMarczakYT 4 years ago
Check out the concat function: docs.microsoft.com/en-us/azure/data-factory/data-flow-expression-functions#concat
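The pointer to concat above hints at the likely cause: if the two columns were inferred as strings, add() concatenates them instead of summing. A hedged sketch, assuming string-typed columns and reusing the column names from the question:

```
/* Cast to integers first so add() performs numeric addition:
   334 and 4 then give 338 rather than "3344" */
add(toInteger(COLUMN_A), toInteger(COLUMN_B))
```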
@Rafian1924 3 years ago
Lovely, bro!!
@AdamMarczakYT 3 years ago
Thanks 🔥
@mohmedaminpatel4427 3 years ago
Around 20:20, we can see there is just one partition. Does Azure automatically decide the number of partitions to divide the dataset into? Also, is it done at some cost, i.e. do more partitions cost more, or is it complementary? Thank you for all the tutorials. I've been binge watching them for 3 days now and thoroughly enjoying them! Would love to see some tutorials for Synapse as well :)!
@mohmedaminpatel4427 3 years ago
Wow, I should have waited before making the comment, as you explain it later in the video itself. Thank you Adam!
@AdamMarczakYT 3 years ago
Glad it helped, thanks! :)
@nidhisharma-rb7nx 3 years ago
Adam, great video. I'm new to Data Flow and I have one doubt: I want to implement file-level checks in Data Flow but am not able to do it. All the transformations perform data-level checks, like exists or conditional split. Is it possible to implement a file-level check, such as whether a file exists in the storage account?
@seb6302 4 years ago
I have an issue with the column 'title' not being found in the derived column, despite being able to see all the columns in the source beforehand. Very confused!
@seb6302 4 years ago
When attempting to aggregate, no columns are found, again despite seeing them in the source.
@seb6302 4 years ago
I've rebuilt the whole thing and still face the same issue. Google yields no results either. Does anyone know what I'm doing wrong?
@seb6302 4 years ago
Just tried again and it works! The only difference this time around was that I didn't enable data flow debug. No idea why it worked this time.
@seb6302 4 years ago
Also, 'Actions' no longer exists under the pipeline. Is there a new way to view the details pane? I can't seem to find one.
@seb6302 4 years ago
These actions can now be found if you hover over 'Name'!
@eshaandevgan312 4 years ago
I have a question, please help. I am not able to understand why Data Flows need to have their own datasets. Why not use the pipeline datasets? This will help me a lot. Thanks in advance.
@AdamMarczakYT 4 years ago
It can use pipeline datasets, but not all types/source systems are supported.
@eshaandevgan312 4 years ago
@@AdamMarczakYT Thanks Adam. Your videos are very nice; keep it up.
@gursikh133 5 years ago
Adam, for using transformations do I need to learn Scala? Or can I just refer to the documentation you mentioned for the Scala functions and write the transformation?
@AdamMarczakYT 5 years ago
The documentation should be enough. MDF targets simple transformations, so in most cases the documentation alone will suffice.
@JohnJohnson-bs4cw 3 years ago
Great video. Can you use data from a REST API as a source for a Mapping Data Flow, or does the source have to be a dataset on Azure?
@AdamMarczakYT 3 years ago
Here is the list of supported data sources for MDF: docs.microsoft.com/en-us/azure/data-factory/data-flow-source?WT.mc_id=AZ-MVP-5003556. Just copy data from the REST API to Blob and then start the MDF pipeline using that blob path as a parameter.
@yashnegi9473 2 years ago
The video is excellent. I want to know: what problem statement does Data Flow solve?
@anubhav2020 3 years ago
Hello Adam, thanks a bunch for this excellent video. The tutorial was very thorough, and anyone new can easily follow it. I do have a question though. I am trying to replicate a SQL query in the Data Flow, but have had no luck so far. The query is as follows: SELECT ZipCode, State FROM table WHERE State IN ('AZ', 'AL', 'AK', 'AR', 'CO', 'CA', 'CT', ... LIST OF 50 STATES); I tried using the Filter, Conditional Split, and Exists transformations, but could not achieve the desired result. Being new to the cloud platform, I am having a bit of trouble. Might I request that you cover topics like data subsetting/filtering (WHERE and IN clauses, etc.) in your tutorials? Appreciate your time and help in putting together these practical implementations.
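One way to express that SQL IN list is a Filter transformation whose condition uses the array membership function. A sketch (column name taken from the question, the state list abbreviated; check the exact in() signature against the expression-function reference):

```
/* Filter condition: keep rows whose State appears in the array */
in(['AZ', 'AL', 'AK', 'AR', 'CO', 'CA', 'CT'], State)
```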
@harshapatankar484 4 years ago
Amazing videos.
@AdamMarczakYT 4 years ago
Glad you think so! :)
@JoeandAlex 3 years ago
Adam, a question on Data Flow: is there a feature in ADF where we can call data flows dynamically in a single pipeline?
@AdamMarczakYT 3 years ago
Not sure I understand; the video shows how to execute an MDF from an ADF pipeline.
@JoeandAlex 3 years ago
@@AdamMarczakYT Thank you. My question was: if I have 5 data flows and want to execute all 5 using a single pipeline, is it possible? If yes, do we need to create 5 different Data Flow activities in the same pipeline, or is there another way?