The most complete ADF course I've seen so far!! Thanks a lot!!
@WafaStudies 4 years ago
Thank you 🙂
@cristianbautistamartinez7280 3 years ago
@Juan Sergio is this a YouTube channel?
@kuladeepk1255 3 years ago
Hi Maheer, can you explain how to develop a pipeline/logic to recursively search for files in a directory, regardless of the number of sub-folders? It has to get details of all the files in the directory.
@WafaStudies 3 years ago
The best way is to use an Azure Function for it. But still, let me give some thought to implementing it with ADF activities.
@kuladeepk1255 3 years ago
@@WafaStudies it would be better if I use ADF activities. Could you make a video about the implementation?
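For what it's worth, the recursive traversal such an Azure Function (or any custom code) would perform can be sketched in plain Python with `pathlib` — the folder path in the usage comment is hypothetical:

```python
from pathlib import Path

def list_files_recursive(root: str) -> list[str]:
    """Return the path of every file under root, no matter how deep the sub-folders go."""
    # rglob("*") walks all sub-folders; is_file() filters out the folders themselves
    return sorted(str(p) for p in Path(root).rglob("*") if p.is_file())

# Hypothetical usage:
# for f in list_files_recursive("/mnt/datalake/input"):
#     print(f)
```

Within ADF alone, the usual workaround is a pipeline that reads `childItems` with Get Metadata and calls itself via Execute Pipeline for each sub-folder, since Get Metadata's `childItems` is not recursive and ForEach activities cannot be nested directly.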
@narenn3237 1 year ago
I have a scenario where I read a set of identically structured files from a folder (each file is sales data from a different store across the country). My pipeline fails every now and then because the structure of the files from some stores changes, so the mapping fails. After watching this video I am thinking of combining Get Metadata with an If Condition: a ForEach loop to iterate over each file, read the columnCount metadata of the current item using @item(), and if columnCount equals the expected value, run the Execute Pipeline activity that actually copies the data; ELSE log the bad file's details to a file for post-execution investigation. Will that work? Thanks
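That routing idea — compare each file's column count to the expected value, copy on match, log otherwise — can be sketched locally in Python. The expected column count of 12 is a made-up example:

```python
import csv

EXPECTED_COLUMNS = 12  # hypothetical: the column count your well-formed sales files share

def column_count(path: str) -> int:
    """Read just the header row and count its columns (roughly what
    Get Metadata's columnCount reports for a delimited file)."""
    with open(path, newline="") as f:
        return len(next(csv.reader(f)))

def route_files(paths: list[str]) -> tuple[list[str], list[str]]:
    """Split files the way the If Condition would: matching files to copy, the rest to log."""
    to_copy, to_log = [], []
    for p in paths:
        (to_copy if column_count(p) == EXPECTED_COLUMNS else to_log).append(p)
    return to_copy, to_log
```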
@khana04 4 years ago
What if I have multiple files under the folder and I want to alter or derive a column in all of them? Can you do that using the Get Metadata activity?
@WafaStudies 4 years ago
No. You can do that using the Derived Column transformation in a Data Flow.
@viveknimmagadda2397 1 year ago
Is there a way to recursively go inside the subfolders and extract the files from those subfolders? Or is there any other activity with which this can be achieved?
@sonamkori8169 2 years ago
Amazing ADF Course ....
@WafaStudies 2 years ago
Thank you 😊
@debasissarangi5169 2 years ago
I don't understand how you use @equals in the activity; when I enter it, it shows an error.
@UmerPKgrw 3 years ago
Is there a way to store the contents of a file in variables? For example, read a value from a file and store it in a variable/parameter.
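In ADF this is commonly done with a Lookup activity reading the file, and a Set Variable activity taking an expression such as `@activity('Lookup1').output.firstRow` (the activity name is hypothetical). The equivalent logic, sketched in plain Python against a made-up JSON config file:

```python
import json

def read_config_value(path: str, key: str) -> str:
    """Read one value from a JSON file -- the same idea as ADF's Lookup
    activity feeding a Set Variable activity."""
    with open(path) as f:
        return json.load(f)[key]

# Hypothetical usage: a config file {"watermark": "2021-01-01"} on disk
# watermark = read_config_value("config.json", "watermark")
```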
@absspaceinformation7467 1 year ago
@Maheer - I like the way you relate all the basic ADF concepts to real-life scenarios. I have one query, if possible; kindly help me with it. Is there any way in ADF to know who uploaded a file to the data lake container? I have a real-time scenario where, in case of a pipeline failure, I need to notify the person who uploaded the file.
@sabithaa2465 3 years ago
Hey, could you please help me get the document count from a MongoDB collection? How can we get that?
@theratipallyphanikumar77 3 years ago
Hi Maheer, I like your explanation, but in addition to explaining, could you please execute the pipelines for better understanding? In many videos you explain how to do it but don't execute the pipeline. Seeing it run would make the impact much easier to grasp.
@caesardutta9394 1 year ago
Very true... but a good explanation... and maybe he wants us to get our fingers agile on the keyboard 😃
@focuswallah 6 months ago
Agreed
@sgnaneswari2902 3 years ago
8:24 I guess it is not MDS, it is MD5 (Message Digest 5).
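For context: Get Metadata's contentMD5 field returns the MD5 digest that Blob storage keeps for the file, Base64-encoded. A sketch of computing the same value locally, useful for verifying a file wasn't corrupted in transit:

```python
import base64
import hashlib

def content_md5(path: str) -> str:
    """Compute the Base64-encoded MD5 digest of a file -- the format
    Blob storage uses for its Content-MD5 property."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        # Read in chunks so large files don't need to fit in memory
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return base64.b64encode(h.digest()).decode()
```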
@venkateshedunuri7038 3 years ago
Hello brother, where do we use Power BI with ADF? Please make a video on that.
@ukumar1657 10 months ago
Could you please explain the metadata attributes as well?
@balamanikanta7685 4 years ago
Hi Maheer, your series is really helpful. Can we pass the activity name to a subsequent activity? If so, what are the main activities we can use? Just please give me a hint... Thanks in advance. :)
@shashikant2322 2 years ago
I agree! It would be more beneficial if we actually executed the pipelines and saw the results.
@gauravpratap4482 14 hours ago
Please do not discuss User Properties again and again in every video.