Thanks WafaStudies, you are making ADF easy for us by sharing these videos.
@vinod02791 · 3 years ago
Hi, these real-time scenarios helped me a lot, added to my experience in Azure, and helped me get a job along with other technologies. Thank you for your videos!! :) :)
@WafaStudies · 3 years ago
Welcome 🤗
@mohitupadhayay1439 · 1 year ago
Hi Wafa, here are some interview questions I was asked. Please make some videos on them as a request. 1. How do you create a GENERIC pipeline that can be reused? 2. Write a single-line query to delete data from 10 tables. 3. What is encoding in the Copy activity? 4. What are the limitations of the Lookup activity? 5. How do you validate whether your SINK data in ADF is right or wrong?
@kumarpolisetty3048 · 4 years ago
Thanks WafaStudies. Please start a tutorial on Azure Databricks. Your videos are really helpful.
@jaymakam9673 · 2 years ago
This is exactly what is needed to understand ADF better. Thank you so much for the hard work _/\_.
@prekshajain5174 · 4 years ago
Amazing explanation, WafaStudies. It cleared up so many things; great work.
@adityachary2478 · 3 years ago
Fantastic, bro, your explanation is awesome. I learned a lot from your channel.
@vivek05117gece · 1 year ago
Hi Wafa, very well explained video. I have watched multiple videos on ADF, but after watching yours I feel confident for the interview.
@akashputti · 3 years ago
Thanks, it's definitely going to be helpful; better than Udemy and Coursera courses.
@WafaStudies · 3 years ago
Thank you 😊
@dvsubbaraonerella1449 · 2 years ago
Awesome. Thanks a lot for all the effort and your time.
@WafaStudies · 2 years ago
Welcome 😊
@mohammedshoaib1769 · 3 years ago
This is just an awesome explanation, covering real-time scenarios perfectly and giving good insight into ADF. Thank you, and keep up the good work. Also expecting more on Databricks.
@AmitSharma-mv5xe · 1 year ago
Really appreciate your contribution, brother.
@soumyaprasanna6606 · 3 years ago
Thanks for all the effort you have put into creating the ADF videos. Could you please share the sample CSV files you used here?
@ootec1 · 3 years ago
I really loved it. Thanks for the video, buddy.
@WafaStudies · 3 years ago
Thank you 😊
@Anonymous-cj4gy · 3 years ago
Hi, if you want to check for an error in another column as well, use the condition below: ErrorRows = isNull(toDate(salesDate, 'dd-MMM-yyyy')) || lesser(toInteger(quantity), 0) || isNull(salesItem) || isNull(country). If any one of these conditions is true, the row goes to the bad table. Also remember that if you use dd-MMM-YYYY instead of dd-MMM-yyyy, it will give you the wrong output.
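As a quick illustration, the same multi-column check can be sketched in Python (a rough analogue of the Data Flow expression in the comment above, not ADF itself; the column names follow that comment, everything else is hypothetical):

```python
from datetime import datetime

def is_error_row(row):
    """Flag a row as bad if any validation fails, mirroring the ErrorRows
    condition: unparseable date, negative quantity, or null fields."""
    try:
        # analogue of toDate(salesDate, 'dd-MMM-yyyy'): raises on bad input
        datetime.strptime(row.get("salesDate") or "", "%d-%b-%Y")
    except ValueError:
        return True
    try:
        # analogue of lesser(toInteger(quantity), 0)
        if int(row.get("quantity")) < 0:
            return True
    except (TypeError, ValueError):
        return True
    # analogue of isNull(salesItem) || isNull(country)
    return row.get("salesItem") is None or row.get("country") is None

good = {"salesDate": "01-Jan-2020", "quantity": "5",
        "salesItem": "Pen", "country": "India"}
bad = {"salesDate": "01-------2020", "quantity": "5",
      "salesItem": "Pen", "country": "India"}
```

In a Data Flow, the same predicate would drive a Conditional Split, routing `bad`-style rows to the error sink and the rest to the main table.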
@mrrathore55 · 2 years ago
Perfect!
@NehaJain-ry9sr · 4 years ago
Awesome video... many lessons in one video, including common errors... Thanks Wafa, keep it up, great work!
@WafaStudies · 4 years ago
Thank you 🙂
@akkuhome9760 · 4 years ago
Keep up the good work, mate. Good coverage of topics with hands-on experience.
@pawanreddie2162 · 3 years ago
If we don't know which column has error rows, how do we write the split condition?
@sriharig9096 · 2 years ago
Really helpful... thank you for your efforts.
@saivaraprasadarao6990 · 3 years ago
Excellent explanation. Thank you so much 😊
@WafaStudies · 3 years ago
Welcome 🤗
@ArunKumar-kb7fr · 4 years ago
Thanks for uploading... the explanation is very good.
@pavankumarvarmadendukuri4665 · 1 year ago
Great work, brother... keep it up...
@bhaskarnukala902 · 2 years ago
I have one question: what if the bad row "01-------2020" doesn't show up in the data preview? It might be a decimal. Then how will the 'varchar' datatype be useful in the DB where we store the bad data?
@thiagocustodio8177 · 4 years ago
How do you validate that the sales item is not null and not an empty string? I could not find a way to write an if statement. Adding one condition per validation will make the flow pretty complex when working with large datasets. Also, is it possible to redirect to the same sink?
@jesseantony1223 · 9 months ago
Hello, thank you for the videos. Is there any way to get the CSV files mentioned in the video so that we can practice?
@tandaibhanukiran4828 · 2 years ago
Thank you, brother. Lots of love.
@WafaStudies · 2 years ago
Welcome 😊
@annekrishnavinod5482 · 3 years ago
Can we use the fault tolerance option to skip the bad rows?
@karthikeyana2120 · 4 years ago
Hi, any idea how to pick the latest file from blob storage?
@vaibhavkumar38 · 3 years ago
Tip: in the error table, keep the same column names as the normal table but set every data type to nvarchar, because only that datatype can store any kind of data.
@rahulkr5694 · 3 years ago
Nice explanation. I have one question: I have a CSV file with a column of length 50, but somehow some records came in with more than 50 characters. How do I reject those records in ADF?
@arun06530 · 4 years ago
Very nice and informative video.
@WafaStudies · 4 years ago
Thank you 🙂
@sraoarjun · 1 year ago
Excellent video!!
@ramum4684 · 2 years ago
If multiple columns in the same row have errors, how do we build the expression in the visual expression builder?
@ranjansrivastava9256 · 10 months ago
Appreciated!!! If you could show this concept using the Copy activity, that would be good.
@jayalakshmia6647 · 1 year ago
Is it possible to run multiple pipelines in parallel? If yes, with which activity? Please tell me.
@shubhammanjalkar8591 · 10 months ago
Amazing. Thank you so much.
@ManojSairamChandP · 1 year ago
Hi Maheer, I am a new learner of ADF. While doing this scenario, I found that a self-hosted IR cannot be used in Data Flows; can you clarify? I am using SQL Server as an on-prem application. Connection failed: "Linked service with Self-hosted Integration runtime is not supported in data flow." Also, can you clarify whether we need to configure the Azure DB in ADF as well, so as to get the DB table details?
@meelookaru2888 · 3 years ago
Thank you so much for your time; it's really helpful. Could you please add some more real-time scenarios on ADF?
@SantoshSingh-ki8bx · 4 years ago
Thanks WafaStudies. I have just one request: if you could share the data used in the tutorial, it would be very helpful for following along with the video.
@ashishsinha5306 · 4 years ago
WafaStudies, could you provide us with the files? It would be really helpful. Great videos, by the way.
@ShaziyaKhan-xx4uj · 2 years ago
Is the real-time playlist for testing?
@gokulajith762 · 1 year ago
Can we use the same method if we are using a self-hosted integration runtime?
@vishnum4892 · 2 years ago
I got an error saying "linked service with self-hosted IR is not supported in data flows". With a self-hosted IR it's not possible to use data flow debugging. Is there any other way?
@Adub333 · 3 years ago
Hello... I am running into an issue where my blob storage input is returning null values in a column I have not transformed... Any recommendations?
@arijitmitra8585 · 3 years ago
Great content. Kudos to you.
@srinubathina7191 · 1 year ago
Thank you
@WafaStudies · 1 year ago
Welcome
@ssbeats677 · 2 years ago
Hi, can you please make a video on the errors we can face, the types of errors, and how to resolve them?
@sivareddy157 · 3 years ago
What about checking the isNull function on multiple columns, bro?
@karthikram1625 · 4 years ago
Can you explain how you connected those SQL queries to the linked service?
@sonamkori8169 · 2 years ago
Thank you, sir.
@WafaStudies · 2 years ago
Welcome 😊
@vidyatechtalks9175 · 2 years ago
Where did you create the tables?
@ambatikarthik6822 · 2 years ago
Hi Maheer, can you help me out with how to connect SQL Server to ADF? We are facing an error while connecting.
@venkatpr9692 · 1 year ago
Thank you so much, sir.
@sravankumar1767 · 3 years ago
Nice explanation, bro.
@WafaStudies · 3 years ago
Thank you 🙂
@parthasaradireddy6793 · 7 months ago
Is this useful for an ETL tester or not?
@shalakapowar0707 · 2 years ago
Hi, thanks a lot for the great video, but I have a question: how are you able to access this DB from SQL Server Management Studio, when Data Flow will not allow this?
@vijaybodkhe8379 · 1 year ago
Thanks for sharing.
@diwakarnrp2092 · 1 year ago
How do we download the files for practice purposes?
@prasadrasal5001 · 3 years ago
Thank you so much.
@WafaStudies · 3 years ago
Welcome 😀
@UmerPKgrw · 3 years ago
I have a simple data flow copying data from on-prem SQL Server to blob storage, and it gives the following error: "Linked service with Self-hosted Integration runtime is not supported in data flow. Please utilize the Azure IR with managed vnet" while following this tutorial.
@balakrishnaganasala5811 · 1 year ago
Thank you, Wafa, for the excellent videos. Could you please share the scripts you used to create the DB tables and the files you have used here?
@shubhamsonune1490 · 3 years ago
Very helpful.
@WafaStudies · 3 years ago
Thank you 🙂
@sudheerkumar7100 · 5 months ago
Sir, please tell us about BANKNIFTY.
@annekrishnavinod5482 · 3 years ago
One question, @wafastudies: here you are showing one column. For example, if I don't know in which column the data is not coming properly or is null, how do I handle that?
@mahihi251 · 3 years ago
Add one expression per column to validate.
@indhumathid1095 · 2 years ago
Really, thank you so much for your videos. I am trying to fetch only the required records from an Excel file and load them into Azure SQL Database using ADF. Can you give me any ideas?
@rajashekerreddydommata537 · 2 years ago
We can add an expression to a variable and use it to list only the records we want; or else list all the required records in an array and, using a ForEach activity, iterate and fetch only the required ones from the Excel sheet. Hope this works.
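The second approach in that reply can be sketched loosely in Python (all names here are hypothetical; in ADF this would be a ForEach activity iterating an array variable, not Python):

```python
# Rows as read from the sheet (e.g. what a Lookup activity would return)
sheet_rows = [
    {"id": 1, "item": "Pen"},
    {"id": 2, "item": "Book"},
    {"id": 3, "item": "Ink"},
]

# Array of required ids, analogous to an ADF array variable
required_ids = [1, 3]

# The ForEach-style filter: keep only the required records
filtered = [row for row in sheet_rows if row["id"] in required_ids]
```

Only the filtered rows would then be passed on to the Azure SQL Database sink.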
@amanahmed6057 · 1 year ago
Could you upload the files you use in every video, or make a Google Drive?
@hariomrajpoot966 · 4 years ago
What if the number of columns is different?
@sharadgawade9408 · 4 years ago
Thanks for the video on a real-time scenario. If I have multiple files in my source location and one of them contains a bad record, how do I identify which file holds the bad records? Here you are passing the file name hardcoded.
@WafaStudies · 4 years ago
Great question. Let's do one thing: I will create a second video on this scenario, showing how to get the filenames from a folder and loop through each file. Please stay connected.
@sharadgawade9408 · 4 years ago
@@WafaStudies thank you so much.
@dhanyamenon1485 · 4 years ago
Thanks for creating videos on real-time scenarios; they're really helpful. I have a question here: how do I validate files in an Azure Storage container against a validation schema (already in the SQL database used for the ETL validation scenario) using the Copy Data activity, then capture the error rows and send an email notification to the user with the error-row data? I want to use the Copy Data activity here; I have some limitations preventing the use of a Mapping Data Flow, so I am not able to use that. Please let me know. Thanks in advance.
@gokulajith762 · 1 year ago
Hello, do you know how to carry out the same functionality using the Copy Data activity?
@UmerPKgrw · 4 years ago
How much does it cost to run Azure Data Factory?
@paramkusamsaikiran1018 · 2 years ago
Maheer, if possible, can you provide us with the files you use in your lectures? It would be helpful for practicing.
@subbaraochereddy7089 · 3 years ago
This works for only one column. How do you handle truncation errors and files with special-character issues? Please upload the file.
@anujgupta-lc1md · 4 years ago
Make more and more!
@harishpentela1234 · 3 years ago
Hi Wafa, I have a doubt about loading tables from staging to the data warehouse using Data Flow: if we have 25 dim tables, do I need to create 25 workflows, or is there a dynamic way to do that? Thanks in advance.
@DataWithNagar · 1 year ago
Create a single workflow template that can be parameterized. Parameters can include source table names, destination table names, transformation logic, etc.
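The metadata-driven pattern described in that reply can be sketched like this (Python as pseudocode for the pattern, not ADF itself; every table name and function here is hypothetical): one parameterized load routine is invoked once per dim table instead of building 25 separate flows.

```python
def load_table(source_table, target_table):
    """Stand-in for one run of a parameterized pipeline/Data Flow:
    copy a staging table into its warehouse counterpart."""
    return f"loaded {source_table} -> {target_table}"

# Metadata drives the loop: one entry per dim table, not one pipeline each.
# In ADF this list would come from a Lookup and feed a ForEach activity.
dim_tables = [
    ("stg.DimCustomer", "dw.DimCustomer"),
    ("stg.DimProduct", "dw.DimProduct"),
]

runs = [load_table(src, tgt) for src, tgt in dim_tables]
```

Adding a 26th table then means adding one metadata row, not one more workflow.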
@pranaypandu9268 · 3 years ago
Love you, bro.
@WafaStudies · 3 years ago
Thank you 😊
@Anas_ILoveMyIndia · 4 years ago
Hi Maheer, we have a scenario where we need to fetch a password-protected zip file from an FTP server. How do we extract it and get the data into blob storage or into a SQL DB? Please make a video. Thanks a lot!
@vigneshmurali1994 · 4 years ago
Please do one on Spark.
@dasarianandchandra6890 · 3 years ago
Bro, you're cool.
@WafaStudies · 3 years ago
Thank you 🙂
@aravindreddyramidi5543 · 1 year ago
Please provide the CSV file. It would be helpful.
@nallavellivenkatesh9479 · 2 years ago
Thank you.
@WafaStudies · 2 years ago
Welcome ☺️
@manasmohanty5754 · 2 years ago
Kindly provide some other real-time scenarios.
@WafaStudies · 2 years ago
Kindly check the Azure Data Factory real-time scenarios playlist on my channel; there are 35+ real-time scenarios covered.