1. Introduction to Pyspark
5:08
6 months ago
Comments
@sivaram9654
@sivaram9654 1 hour ago
Now it is asking "the wildcard file name is required" in all cases.
@sivaram9654
@sivaram9654 5 hours ago
Excellent video.
@azurecontentannu6399
@azurecontentannu6399 3 hours ago
@@sivaram9654 Thank you so much 😊
@DinakaranB-ij6cv
@DinakaranB-ij6cv 2 days ago
Can it be used to send notifications to multiple users?
@azurecontentannu6399
@azurecontentannu6399 2 days ago
@@DinakaranB-ij6cv yes
@jardila7701
@jardila7701 4 days ago
Hi, I am working with an Excel file instead of CSV, and when debugging it asks me to select a sheet name.
@azurecontentannu6399
@azurecontentannu6399 4 days ago
@@jardila7701 Yes, for Excel you need to provide a sheet name or a sheet index.
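The same requirement shows up outside ADF as well. As a minimal sketch (assuming pandas; the file and sheet names below are placeholders, not from the video), reading an Excel source always needs the sheet identified, unlike CSV:

```python
import pandas as pd

# Excel needs a sheet name (or a 0-based sheet index), unlike CSV where the
# file path alone is enough. "sales.xlsx" and "Sheet1" are placeholders.
df = pd.read_excel("sales.xlsx", sheet_name="Sheet1")  # or sheet_name=0
print(df.head())
```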
@akashtribhuvan8124
@akashtribhuvan8124 6 days ago
Why should one use PowerShell to create ADF? I mean, what is the benefit?
@this_is_me-t5u
@this_is_me-t5u 7 days ago
Hi, is your ADF playlist complete?
@azurecontentannu6399
@azurecontentannu6399 7 days ago
@@this_is_me-t5u not yet
@this_is_me-t5u
@this_is_me-t5u 6 days ago
@@azurecontentannu6399 okay 👍
@sanjeevdamle890
@sanjeevdamle890 7 days ago
Thank you Annu... very nicely done. It helped me immensely.
@azurecontentannu6399
@azurecontentannu6399 7 days ago
@@sanjeevdamle890 Welcome 😊
@jitendrasharma911
@jitendrasharma911 7 days ago
Hi, I'm doing this activity the same way, but Set Variable returns the last modified date for all files. Does this flow only work for CSV?
@azurecontentannu6399
@azurecontentannu6399 7 days ago
@@jitendrasharma911 Which file format are you using?
@jamieashton660
@jamieashton660 8 days ago
This is the example you're looking for.
@dheerajkumark5602
@dheerajkumark5602 9 days ago
Nested ForEach is not working for me.
@azurecontentannu6399
@azurecontentannu6399 8 days ago
@@dheerajkumark5602 Use a child pipeline: ADF does not allow a ForEach inside another ForEach, so run the inner loop through an Execute Pipeline activity.
@maheshreddydandu4764
@maheshreddydandu4764 13 days ago
Good work, very informative.
@ArsalanMehmood-f6t
@ArsalanMehmood-f6t 17 days ago
Thanks for producing good content. Looking forward to more informative videos on different concepts and use cases in Spark.
@rohantagrawal1985
@rohantagrawal1985 17 days ago
Great content, very useful for beginners. Is there any video that can help me make REST API POST calls in ADF with a JSON payload fetched from Azure Blob?
@rohantagrawal1985
@rohantagrawal1985 17 days ago
Hello Annu ji, do you have any pointers for REST API POST calls?
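No reply appears in the thread. In ADF this is typically a Lookup feeding a Web activity; as a rough sketch of the equivalent logic in Python (the storage account, container, blob name, and endpoint URL below are placeholders), it could look like this:

```python
import json
import requests
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient

# Placeholder account/container/blob names for illustration only.
blob = BlobClient(
    account_url="https://mystorageaccount.blob.core.windows.net",
    container_name="payloads",
    blob_name="request-body.json",
    credential=DefaultAzureCredential(),
)

# Download the JSON payload from Blob storage and POST it to a REST endpoint.
payload = json.loads(blob.download_blob().readall())
response = requests.post("https://example.com/api/items", json=payload, timeout=30)
response.raise_for_status()
print(response.status_code)
```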
@manikantabandreddy9096
@manikantabandreddy9096 27 days ago
Please, can you do one video on ADF pipeline optimization?
@DhavalPujara-f6w
@DhavalPujara-f6w 28 days ago
Valuable content.
@soumyaranjanbiswal2734
@soumyaranjanbiswal2734 29 days ago
50
@karthikinu
@karthikinu 1 month ago
Thanks a lot for this info. I use Debug mode a lot; I removed the Execute Pipeline in the ForEach, which is not required in Debug mode, and was able to run all the activities in parallel.
@kotireddy8554
@kotireddy8554 1 month ago
Really good content.
@BikashKumarPradhan-k7c
@BikashKumarPradhan-k7c 1 month ago
Can we not use wildcards to handle this scenario, as you explained earlier?
@azurecontentannu6399
@azurecontentannu6399 1 month ago
Yes, we can.
@AdenilsonFTJunior
@AdenilsonFTJunior 1 month ago
Sensational video! One question: why, in ForEach1, did you have to create a dataset that points to the parameterized files rather than use the result of GetMetadata1, which already returns a list with each of the names and dates?
@azurecontentannu6399
@azurecontentannu6399 1 month ago
@@AdenilsonFTJunior The one outside the ForEach points to the folder level, so we get the file names within the folder using childItems. The one inside the ForEach is parameterized to process those files one by one through iteration and get the last modified date of each file.
@AdenilsonFTJunior
@AdenilsonFTJunior 1 month ago
@@azurecontentannu6399 thanks!
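For readers who want to sanity-check the two-step pattern described above outside ADF, a minimal sketch with the Azure Storage SDK does the same thing: list the folder, then read each file's last-modified timestamp. The account, container, and prefix names are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import ContainerClient

# Placeholder account/container/prefix for illustration only.
container = ContainerClient(
    account_url="https://mystorageaccount.blob.core.windows.net",
    container_name="input",
    credential=DefaultAzureCredential(),
)

# Step 1 (outside the loop): list the files under the folder prefix.
# Step 2 (inside the loop): read each file's last-modified timestamp,
# mirroring the Get Metadata (childItems) + parameterized dataset pattern.
latest_name, latest_time = None, None
for blob in container.list_blobs(name_starts_with="daily-loads/"):
    if latest_time is None or blob.last_modified > latest_time:
        latest_name, latest_time = blob.name, blob.last_modified

print(f"Latest file: {latest_name} (modified {latest_time})")
```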
@napoleanbonaparte9225
@napoleanbonaparte9225 1 month ago
Thank you very much. What if it is a folder of big data?
@madhanrommala2883
@madhanrommala2883 1 month ago
Great explanation! Thanks for this content 😀
@rajashekar4171
@rajashekar4171 1 month ago
Hi Annu, Could you please make a video explaining the Degree of Copy Parallelism in Copy Activity? It would be really helpful! Thanks!
@azurecontentannu6399
@azurecontentannu6399 1 month ago
@@rajashekar4171 sure
@rajashekar4171
@rajashekar4171 1 month ago
@@azurecontentannu6399 Thanks
@atlanticoceanvoyagebird2630
@atlanticoceanvoyagebird2630 1 month ago
A beautiful and simple way of systematically teaching students. You are great, explaining everything clearly from the heart without holding anything back. Your teaching is greatly appreciated by Muhammad Khan from North America.
@azurecontentannu6399
@azurecontentannu6399 1 month ago
@@atlanticoceanvoyagebird2630 Thank you so much for your kind words.
@rajashekar4171
@rajashekar4171 1 month ago
Wow, your content and explanations are absolutely fantastic! Really appreciate the clarity and detail you put into every video. Keep up the amazing work! 🙌 Thanks
@atlanticoceanvoyagebird2630
@atlanticoceanvoyagebird2630 1 month ago
I never understood this confusing parameterized value passing from other instructors; only your teaching methods gave consistently clear explanations. I am learning a lot of things from you that I could not get from other sources. Muhammad Khan from North America.
@atlanticoceanvoyagebird2630
@atlanticoceanvoyagebird2630 1 month ago
Your English pronunciation is very clear, unlike other instructors whom only 5% of the audience can understand; with you, the audience can understand 100%. Keep up the great pronunciation.
@meghachouksey3248
@meghachouksey3248 1 month ago
If we want to skip copying files whose names contain strings like "fact" or "aggregated", e.g. data567-fact.csv and data13144-aggregated-1456.csv, how can we do this?
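No reply appears in the thread. One way to express the check is a simple substring test over the file names; in ADF this condition would normally live in a Filter activity expression over the Get Metadata childItems, but the logic is sketched in Python below with a made-up file list.

```python
# Hypothetical file list, e.g. the names returned by Get Metadata childItems.
files = ["data567-fact.csv", "data13144-aggregated-1456.csv", "data001.csv"]

# Keep only files whose names contain none of the excluded markers.
excluded = ("fact", "aggregated")
to_copy = [f for f in files if not any(marker in f for marker in excluded)]

print(to_copy)  # ['data001.csv']
```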
@rajendrayegireddi3429
@rajendrayegireddi3429 1 month ago
Great video, ma'am. I have a doubt: after using a Filter activity, why don't we use a Copy activity directly to load the data into the destination? Please clear my doubt.
@rajendrayegireddi3429
@rajendrayegireddi3429 1 month ago
Good explanation, Annu. I have a small question: why don't we use a Lookup activity instead of a Filter activity to filter the last updated file?
@rohigt5745
@rohigt5745 1 month ago
Awesome and inspiring, I like your analytical approach and good communication skills.
@rohigt5745
@rohigt5745 1 month ago
Excellent analytical approach in the explanation👍
@CodeVeda
@CodeVeda 1 month ago
I watched a few of your videos on ADF, and honestly, they are much better than the paid ones on Udemy😀
@azurecontentannu6399
@azurecontentannu6399 1 month ago
@@CodeVeda Thank you so much 😊
@PurviMakanee
@PurviMakanee 1 month ago
The filename and filepath parameters both point to the file name only... I guess that is a bug in ADF; Microsoft needs to improve it.
@aperxmim
@aperxmim 2 months ago
Great explanation of the tools and features before going into the demonstration examples. A good sneak preview.
@Uda_dunga
@Uda_dunga 2 months ago
What did you do at the end? 😢
@SAIKUMAR03
@SAIKUMAR03 2 months ago
256?
@azurecontentannu6399
@azurecontentannu6399 29 days ago
@@SAIKUMAR03 No, it's 50.
@SAIKUMAR03
@SAIKUMAR03 2 months ago
4-256
@SAIKUMAR03
@SAIKUMAR03 2 months ago
Excel is also not supported as a sink. Iceberg is supported only as a sink, not as a source.
@aswinvenkatesh9840
@aswinvenkatesh9840 2 months ago
=
@virbhadramule6088
@virbhadramule6088 2 months ago
Nice, thanks for making this content.
@rajveerdhumal3143
@rajveerdhumal3143 2 months ago
Please improve the audio quality and reduce the background noise; you have a long journey ahead here. Thanks a lot.
@azurecontentannu6399
@azurecontentannu6399 1 month ago
Sure, noted.
@rajveerdhumal3143
@rajveerdhumal3143 2 months ago
Thanks for this content, really helpful.
@madhua8892
@madhua8892 2 months ago
Good explanation and easy to understand. Soon I will follow your Azure content; I hope you will clear all my doubts there.
@HETALVYAS-j2c
@HETALVYAS-j2c 2 months ago
Can you teach me how to use Azure Data Factory to retrieve data from the Azure Log Analytics API, using the Log Analytics query language (KQL)? The sink will be a SQL DB, with conditions like error/fail.
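No reply appears in the thread. In ADF this would usually be a REST/Web source copied into Azure SQL; as a rough, non-authoritative sketch of the same flow in Python with the azure-monitor-query SDK, it might look like the following. The workspace ID, KQL query, table name, and connection string are all placeholders, and the error/fail condition is only illustrative.

```python
from datetime import timedelta

import pandas as pd
import sqlalchemy
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Placeholder workspace ID and connection string for illustration only.
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"
SQL_CONN = "mssql+pyodbc://user:password@myserver.database.windows.net/mydb?driver=ODBC+Driver+18+for+SQL+Server"

client = LogsQueryClient(DefaultAzureCredential())

# Illustrative KQL with an error/fail condition, as asked in the comment.
query = """
AzureDiagnostics
| where Level == "Error" or ResultType == "Failed"
| project TimeGenerated, Resource, OperationName, ResultType
"""

# Run the KQL query against the Log Analytics workspace for the last day.
response = client.query_workspace(WORKSPACE_ID, query, timespan=timedelta(days=1))

# Flatten the first result table into a DataFrame and append it to the SQL sink.
table = response.tables[0]
df = pd.DataFrame(data=table.rows, columns=table.columns)
df.to_sql("LogAnalyticsErrors", sqlalchemy.create_engine(SQL_CONN), if_exists="append", index=False)
print(f"Loaded {len(df)} rows")
```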