Now it is asking "the wildcard file name is required" for all cases.
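This error usually appears when the copy activity source has the "Wildcard file path" option selected but no wildcard file name filled in; giving it a pattern such as *.csv (or just *) clears it. Below is a minimal sketch of the source settings, assuming an Azure Blob Storage store and delimited-text files; the folder name and pattern are placeholders, not taken from the video.

    "source": {
        "type": "DelimitedTextSource",
        "storeSettings": {
            "type": "AzureBlobStorageReadSettings",
            "recursive": true,
            "wildcardFolderPath": "input",
            "wildcardFileName": "*.csv"
        }
    }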
@sivaram9654 · 5 hours ago
Excellent video!
@azurecontentannu6399 · 3 hours ago
@@sivaram9654 Thank you so much 😊
@DinakaranB-ij6cv · 2 days ago
Can it be used to send notifications to multiple users?
@azurecontentannu6399 · 2 days ago
@@DinakaranB-ij6cv Yes
@jardila7701 · 4 days ago
Hi, I am working with an Excel file instead of CSV, and when debugging it asks me to select a sheet name.
@azurecontentannu6399 · 4 days ago
@@jardila7701 Yes, you need to provide a sheet name or sheet index for Excel.
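For reference, a minimal sketch of an Excel dataset definition with the sheet name set, assuming an Azure Blob Storage location; the linked service, container, file name, and "Sheet1" are placeholders, and sheetIndex can be used instead of sheetName.

    {
        "name": "ExcelSourceDataset",
        "properties": {
            "type": "Excel",
            "linkedServiceName": {
                "referenceName": "AzureBlobStorageLS",
                "type": "LinkedServiceReference"
            },
            "typeProperties": {
                "location": {
                    "type": "AzureBlobStorageLocation",
                    "container": "input",
                    "fileName": "sample.xlsx"
                },
                "sheetName": "Sheet1",
                "firstRowAsHeader": true
            }
        }
    }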
@akashtribhuvan8124 · 6 days ago
Why should one use PowerShell to create ADF? I mean, what is the benefit?
@this_is_me-t5u · 7 days ago
Hi, is your ADF playlist complete?
@azurecontentannu6399 · 7 days ago
@@this_is_me-t5u Not yet
@this_is_me-t5u · 6 days ago
@@azurecontentannu6399 Okay 👍
@sanjeevdamle890 · 7 days ago
Thank you Annu... very nicely done. It helped me immensely.
@azurecontentannu6399 · 7 days ago
@@sanjeevdamle890 Welcome 😊
@jitendrasharma911 · 7 days ago
Hi, I'm doing this activity the same way, but the Set Variable returns the last modified date for all files. Does this flow only work for CSV?
@azurecontentannu6399 · 7 days ago
@@jitendrasharma911 Which file format are you using?
@jamieashton660 · 8 days ago
This is the example you're looking for.
@dheerajkumark5602 · 9 days ago
Nested ForEach is not supported for me.
@azurecontentannu6399 · 8 days ago
@@dheerajkumark5602 Use a child pipeline.
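ADF does not allow a ForEach directly inside another ForEach, so the usual workaround is to place an Execute Pipeline activity inside the outer loop and keep the inner loop in the child pipeline. A rough sketch, with the pipeline, parameter, and activity names below being placeholders:

    {
        "name": "OuterForEach",
        "type": "ForEach",
        "typeProperties": {
            "items": {
                "value": "@activity('Get Metadata1').output.childItems",
                "type": "Expression"
            },
            "activities": [
                {
                    "name": "RunChildPipeline",
                    "type": "ExecutePipeline",
                    "typeProperties": {
                        "pipeline": {
                            "referenceName": "ChildPipelineWithInnerForEach",
                            "type": "PipelineReference"
                        },
                        "parameters": {
                            "itemName": {
                                "value": "@item().name",
                                "type": "Expression"
                            }
                        },
                        "waitOnCompletion": true
                    }
                }
            ]
        }
    }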
@maheshreddydandu4764 · 13 days ago
Good work, very informative.
@ArsalanMehmood-f6t · 17 days ago
Thanks for producing good content. Looking forward to more informative videos around different concepts and use cases in Spark.
@rohantagrawal1985 · 17 days ago
Great content, very useful for beginners. Is there any video which can help me do REST API POST calls with the payload as JSON data fetched from Azure Blob in ADF?
@rohantagrawal1985 · 17 days ago
Hello Annuji, do you have any pointers for REST API POST calls?
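One possible approach (not something shown in this video, so treat it as a sketch): read the JSON payload from blob with a Lookup activity (firstRowOnly set to true) and pass its output to a Web activity that performs the POST. The URL, header, and activity names are placeholders; keep in mind the Lookup activity has a payload size limit.

    {
        "name": "PostToApi",
        "type": "WebActivity",
        "dependsOn": [
            { "activity": "LookupPayload", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
            "url": "https://example.com/api/endpoint",
            "method": "POST",
            "headers": { "Content-Type": "application/json" },
            "body": {
                "value": "@activity('LookupPayload').output.firstRow",
                "type": "Expression"
            }
        }
    }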
@manikantabandreddy9096 · 27 days ago
Please, can you do one video on ADF pipeline optimization?
@DhavalPujara-f6w · 28 days ago
Valuable content
@soumyaranjanbiswal2734 · 29 days ago
50
@karthikinu · 1 month ago
Thanks a lot for this info. I use Debug mode a lot, and I removed the Execute Pipeline in the ForEach, which is not required in Debug mode, and was able to run all activities in parallel.
@kotireddy8554 · 1 month ago
Really good content.
@BikashKumarPradhan-k7c · 1 month ago
Can we not use wildcards to handle this scenario, as you have explained earlier?
@azurecontentannu6399 · 1 month ago
Yes, we can.
@AdenilsonFTJunior · 1 month ago
Sensational video! One question: why in ForEach1 did you have to create a dataset that points to the parameterised files, and not use the result of Get Metadata1, which already returns a list with each of the names and dates?
@azurecontentannu6399 · 1 month ago
@@AdenilsonFTJunior The one outside the ForEach points to folder level, so we get the file names within the folder using child items. The one inside the ForEach is parameterized to process those files one by one through iteration and get the last modified date of each file.
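A rough sketch of the inner Get Metadata activity described above, assuming the parameterized dataset exposes a fileName parameter (the dataset and activity names are placeholders). The ForEach items would be @activity('Get Metadata1').output.childItems, and the per-file result is then read as @activity('Get Metadata2').output.lastModified.

    {
        "name": "Get Metadata2",
        "type": "GetMetadata",
        "typeProperties": {
            "dataset": {
                "referenceName": "ParameterizedFileDataset",
                "type": "DatasetReference",
                "parameters": {
                    "fileName": {
                        "value": "@item().name",
                        "type": "Expression"
                    }
                }
            },
            "fieldList": [ "lastModified" ]
        }
    }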
@AdenilsonFTJunior · 1 month ago
@@azurecontentannu6399 Thanks!
@napoleanbonaparte9225 · 1 month ago
Thank you very much. What if it is a folder of big data?
@madhanrommala2883 · 1 month ago
Great explanation! Thanks for this content 😀
@rajashekar4171 · 1 month ago
Hi Annu, could you please make a video explaining the Degree of Copy Parallelism in the Copy activity? It would be really helpful! Thanks!
@azurecontentannu6399 · 1 month ago
@@rajashekar4171 Sure
@rajashekar4171 · 1 month ago
@@azurecontentannu6399 Thanks
@atlanticoceanvoyagebird2630 · 1 month ago
A beautiful and simple way of systematically teaching students. You are great, explaining everything clearly from the heart without holding anything back. Your teaching is greatly appreciated, by Muhammad Khan from North America.
@azurecontentannu6399 · 1 month ago
@@atlanticoceanvoyagebird2630 Thank you so much for your kind words
@rajashekar4171 · 1 month ago
Wow, your content and explanations are absolutely fantastic! Really appreciate the clarity and detail you put into every video. Keep up the amazing work! 🙌 Thanks
@atlanticoceanvoyagebird2630 · 1 month ago
I never understood this confusing parameterized value passing from other instructors, but your teaching method and clear explanations made it click. I am learning a lot of things from you that I could not get from other sources. Muhammad Khan from North America.
@atlanticoceanvoyagebird2630 · 1 month ago
Your English pronunciation is very clear, unlike other instructors who speak so that only 5% of the audience can understand them; with you the audience can understand 100%. Keep up the great pronunciation.
@meghachouksey3248 · 1 month ago
If we want to skip files that contain strings like "fact" or "aggregated" in the file name, such as data567-fact.csv and data13144-aggregated-1456.csv, how can we exclude them from copying?
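One way to do this (a sketch, not something covered in the video) is to add a Filter activity on the Get Metadata child items before the ForEach, keeping only the names that contain neither substring; the activity names are placeholders.

    {
        "name": "FilterOutFactAndAggregated",
        "type": "Filter",
        "typeProperties": {
            "items": {
                "value": "@activity('Get Metadata1').output.childItems",
                "type": "Expression"
            },
            "condition": {
                "value": "@not(or(contains(item().name, 'fact'), contains(item().name, 'aggregated')))",
                "type": "Expression"
            }
        }
    }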
@rajendrayegireddi3429 · 1 month ago
Great video, ma'am. I have a doubt: after using a Filter activity, why don't we use a Copy activity directly to load data into the destination? Please clear my doubt.
@rajendrayegireddi3429 · 1 month ago
Good explanation, Annu. I have a small question: why don't we use a Lookup activity instead of a Filter activity to filter the last updated file?
@rohigt5745 · 1 month ago
Awesome and inspiring. I like your analytical approach and good communication skills.
@rohigt5745 · 1 month ago
Excellent analytical approach in the explanation 👍
@CodeVeda · 1 month ago
I watched a few of your videos on ADF, and honestly, they are much better than the paid ones on Udemy 😀
@azurecontentannu6399 · 1 month ago
@@CodeVeda Thank you so much 😊
@PurviMakanee · 1 month ago
Both the filename and filepath parameters point to the file name only... I guess that is a bug in ADF. Microsoft needs to improve it.
@aperxmim · 2 months ago
Great explanation of the tools and features before going into the demonstration examples. A good sneak preview.
@Uda_dunga · 2 months ago
What did you do at the end? 😢
@SAIKUMAR03 · 2 months ago
256?
@azurecontentannu6399 · 29 days ago
@@SAIKUMAR03 No, it's 50
@SAIKUMAR03 · 2 months ago
4-256
@SAIKUMAR03 · 2 months ago
Excel is also not supported as a sink. Iceberg is only supported as a sink, but not as a source.
@virbhadramule6088 · 2 months ago
Nice, thanks for making this content.
@rajveerdhumal3143 · 2 months ago
Please improve the audio quality and reduce the background noise. You have a long journey ahead here. Thanks a lot!
@azurecontentannu6399 · 1 month ago
Sure.. Noted
@rajveerdhumal3143 · 2 months ago
Thanks for this content, really helpful.
@madhua8892 · 2 months ago
Good explanation and easy to understand. Soon I will follow your Azure content. Hope you will clear all doubts there.
@HETALVYAS-j2c · 2 months ago
Can you teach me how to use Azure Data Factory to retrieve data from the Azure Log Analytics API, using the Log Analytics query language (KQL)? The sink will be a SQL DB, with conditions like error/fail.