What an explanation, madam. It's simply superb and amazing. The way you have explained things in this video once again proves that women have a lot of patience. I am expecting more videos like this. Thanks once again.
@learnnrelearn7553 · 2 years ago
Thanks a lot
@gulchehraamirjonova2339 · 4 months ago
Thank you so much. You can't imagine how helpful it was.
@alladamk · 4 years ago
Very clear explanation, thanks for the video
@learnnrelearn7553 · 4 years ago
thank you
@sagar_patro · 4 years ago
Thanks for keeping the explanation so simple, it has given me a clear concept. Waiting for your new videos.
@learnnrelearn7553 · 4 years ago
This weekend another video will be uploaded. Please keep watching this space
@desparadoking8209 · 3 years ago
Very good explanation! 👍🙂
@vemarajulasya9585 · 4 years ago
Very informative. Please upload more videos on loading the data. I'm working on Azure, so this is really helpful for me; I hope you will do more on Azure Data Factory.
@learnnrelearn7553 · 4 years ago
This weekend I will upload more videos.
@kids58tv13 · 4 years ago
Informative video
@learnnrelearn7553 · 4 years ago
Thanks
@arupnaskar3818 · 2 years ago
Great, ma'am, excellent. 🧚💐💐
@kajalcraftandart924 · 4 years ago
Wow, such good information ❤️❤️❤️❤️ keep it up ❤️❤️❤️❤️
@learnnrelearn7553 · 4 years ago
Thanks
@MonirHossain-qm8hh · 4 years ago
Excellent. We need more incremental load videos, e.g. loading more than one table from an on-premises SQL Server into an Azure SQL Database. Please upload.
@learnnrelearn7553 · 4 years ago
Thanks for this positive feedback. I will surely make a video on incremental load from multiple tables to one table.
@priyadarshinichandrasekara4656 · 3 years ago
Hi... in this case, suppose we don't have any firstrow.newwatermarkvalue; what parameter should we pass to the copy activity?
@kollikr · 3 years ago
Do you have any recommended approach to deal with deleted rows? In your example, inserts and updates are handled well, but I am also looking for a solution to find any hard deletes in the source table.
@nandinisugandhi748 · 3 years ago
Let me know if you find any solution for this; I am also looking to handle deleted rows.
@learnnrelearn7553 · 2 years ago
To find the deleted delta records, you need to do a lookup against the source to check whether any record is missing. Alternatively, the source can carry a delete flag that identifies which records are deleted (soft delete).
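The lookup approach described in the reply above amounts to a set difference between the target's keys and the source's keys. A minimal, hypothetical sketch using Python's built-in sqlite3 as a stand-in for the real databases (table and column names are illustrative, not from the video):

```python
import sqlite3

# Hypothetical demo: the "source" has lost a row that the "target" still holds.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_orders (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE target_orders (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO source_orders VALUES (1, 10.0), (3, 30.0);
    INSERT INTO target_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

# Keys present in the target but missing from the source are hard deletes.
deleted = conn.execute("""
    SELECT id FROM target_orders
    EXCEPT
    SELECT id FROM source_orders
""").fetchall()
print(deleted)  # [(2,)]
```

In a real pipeline the same EXCEPT (or LEFT JOIN ... WHERE NULL) query would drive either a physical DELETE on the target or an update of a soft-delete flag.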
@rahulpathak868 · 3 years ago
That's a good one, but how do we set the watermark value for the first run?
@ZaikaUnlimited · 4 years ago
Nice one, thanks for sharing.
@learnnrelearn7553 · 4 years ago
Thanks
@Mukesh_Sablani · 3 years ago
@learnnrelearn7553 I was given the same task and followed your video, but I am stuck at the copy activity. I also want to join your personalized training; please contact me at my email mk24sablani@gmail.com. I really need your guidance, madam.
@tapas6574 · 3 years ago
Hi, good content for learning ADF pipelines. Can you please also share how you set up the on-premises SQL Server on Windows and Mac systems, to show it end to end? Thanks in advance.
@raocoachingclasses598 · 4 years ago
Excellent, good job
@MoAnwar · 3 years ago
Can the watermark table and stored proc be created in either the destination DB or a separate DB from the source DB? We have a scenario where the source DB is locked down against changes like these.
@learnnrelearn7553 · 2 years ago
The watermark will be in the destination.
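For readers asking about the watermark table itself: the standard incremental-copy pattern keeps a small table in the destination database with one row per source table, holding the last modified value that was successfully copied. A hedged sketch of that pattern, using sqlite3 as a stand-in for Azure SQL (all names here are illustrative, not taken from the video):

```python
import sqlite3

# Hypothetical watermark table: one row per source table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE watermarktable (
        TableName      TEXT PRIMARY KEY,
        WatermarkValue TEXT NOT NULL      -- last LastModifiedDate copied
    )
""")
# Seed with a very old date so the first run copies everything.
conn.execute(
    "INSERT INTO watermarktable VALUES ('orders', '1900-01-01 00:00:00')"
)

# After a successful copy, a stored procedure (or an UPDATE like this)
# advances the watermark to the max modified date that was just copied.
conn.execute(
    "UPDATE watermarktable SET WatermarkValue = ? WHERE TableName = ?",
    ("2024-05-01 12:00:00", "orders"),
)

value = conn.execute(
    "SELECT WatermarkValue FROM watermarktable WHERE TableName = 'orders'"
).fetchone()[0]
print(value)  # 2024-05-01 12:00:00
```

Seeding with a minimal date also answers the common "what about the first run?" question: the first load simply behaves as a full load.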
@williamtenhoven8405 · 11 months ago
Hi, it looks like this doesn't work with Parquet. If I do the same in the pipeline expression builder in the Copy Data sink and press "Validate all", it says "Syntax error: Opening brace has no closing brace." The thing is, there is a closing brace...
@anandk4759 · 3 years ago
Can you share the DDL of the watermark table?
@MaheshBabu-lu9uq · 4 years ago
Excellent
@learnnrelearn7553 · 4 years ago
Thanks
@lucamel8766 · 2 years ago
Thanks for explaining a complex topic, but please check the audio :(
@learnnrelearn7553 · 2 years ago
Sure, I will take care of the audio. Thanks for the feedback.
@sabastineade2115 · 3 years ago
Thanks for the great information, but I have one question: can we not implement the incremental load so that the new data is appended to the same file, rather than creating a new file for the incremental records?
@manojreddy8360 · 3 years ago
I guess in the sink expression we have to use append instead of concatenating with the run ID and creating a new file.
@learnnrelearn7553 · 2 years ago
Yes, we can append the content to a single file using a Data Flow.
@investmentlifetime · 4 years ago
Great
@learnnrelearn7553 · 4 years ago
Thanks
@cantonschool · 2 years ago
How do we write the copy activity query from Kusto (Azure Data Explorer)? order_payload | where Last_modified_date > "@{activity('Lookup_max_watermark').output.FirstRow.maxdate}"
@lucaslira5 · 2 years ago
Thanks for the video. Is it possible to keep just one file and add the new records to it, so you don't have to create multiple files in the blob?
@Michael_Sam_05 · 4 years ago
Nice
@learnnrelearn7553 · 4 years ago
thanks
@priyab9225 · 3 years ago
Hi, thanks for this video. Could you please tell me how to sync/incrementally load two Azure databases dynamically using a primary/incremental key, since adding a lastmodifieddate column to existing tables is not feasible for us? Please correct me if I'm wrong, but this is for one table; how can we go for multiple tables? TIA
@learnnrelearn7553 · 2 years ago
Yes, we can do it using a parameterized database/table.
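A parameterized setup like the reply above describes usually means one watermark row per table and a loop over the table list (in ADF, a ForEach activity with a parameterized dataset). A rough sketch of the idea in Python with sqlite3; the table names and dates are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE watermarktable (TableName TEXT PRIMARY KEY, WatermarkValue TEXT);
    INSERT INTO watermarktable VALUES
        ('customers', '2024-01-01 00:00:00'),
        ('orders',    '2024-02-01 00:00:00');
    CREATE TABLE customers (id INTEGER, modified TEXT);
    CREATE TABLE orders    (id INTEGER, modified TEXT);
    INSERT INTO customers VALUES (1, '2023-12-31 00:00:00'), (2, '2024-03-01 00:00:00');
    INSERT INTO orders    VALUES (1, '2024-01-15 00:00:00'), (2, '2024-02-15 00:00:00');
""")

copied = {}
for table in ("customers", "orders"):
    # Look up this table's own watermark.
    wm = conn.execute(
        "SELECT WatermarkValue FROM watermarktable WHERE TableName = ?", (table,)
    ).fetchone()[0]
    # Table names cannot be bound as SQL parameters, hence the f-string here;
    # in ADF this is where the parameterized dataset/query takes over.
    rows = conn.execute(
        f"SELECT id FROM {table} WHERE modified > ?", (wm,)
    ).fetchall()
    copied[table] = [r[0] for r in rows]

print(copied)  # {'customers': [2], 'orders': [2]}
```

This also answers the later question in the thread about sharing one watermark table across tables: keying the table by TableName is exactly what makes that work.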
@pallarajesh · 2 years ago
Hi, can you take mock interviews for Azure Databricks and Data Factory aspirants?
@rohitsethi5696 · 1 year ago
What is a watermark table? I have worked with staging tables.
@musketeers3344 · 2 years ago
Can we use the same watermark table for different tables, so that multiple tables' watermark values get updated in the same table?
@dileeprajnarayanthumula4652 · 2 years ago
Hi ma'am, the explanation is very good. I would like to know if there is any chance of you providing personal training on ADF?
@chinmaykshah · 3 years ago
Hi, can you provide the script you have used in this video?
@kishorkumar007 · 3 years ago
Thank you for such a well-explained video on incremental load. One question: I have exhausted my 30-day Azure trial and don't have any subscription left to create resources in Azure. Could you please advise whether there are any other methods to get some free credits in Azure?
@learnnrelearn7553 · 3 years ago
Yes, you can always go for the pay-as-you-go option anytime you need, with the same account. If you are still practicing with the features, you can create one more free account with a different mail ID.
@souranwaris142 · 1 year ago
Hello, I have a problem with the incremental load. I want to create an incremental pipeline from an on-premises Oracle server to Azure Data Lake (Blob storage); I don't have Azure SQL. I just want to push the data into Blob storage as a CSV file. In my case, I am confused about where I should create the watermark table. Someone told me that in my case I have to use Parquet data. Please help me with this; I have been stuck for many days.
@zahidalam7831 · 8 months ago
+1
@UmerAzeem · 4 years ago
Is this method valid from AWS Redshift to Blob storage?
@learnnrelearn7553 · 4 years ago
I'm not sure about AWS Redshift, but this concept can be used for incremental data loads in any ETL process.
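The source-agnostic concept the reply refers to (look up the old watermark, look up the new maximum, copy the delta, advance the watermark) can be sketched end to end. This is an illustrative Python/sqlite3 mock of the four pipeline steps, not the video's actual ADF pipeline; all names and dates are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER PRIMARY KEY, modified TEXT);
    CREATE TABLE dst (id INTEGER PRIMARY KEY, modified TEXT);
    CREATE TABLE watermark (value TEXT);
    INSERT INTO src VALUES (1,'2024-01-01'), (2,'2024-02-01'), (3,'2024-03-01');
    INSERT INTO watermark VALUES ('2024-01-15');  -- last successful load
""")

# 1. Look up the old watermark (in ADF: the first Lookup activity).
old_wm = conn.execute("SELECT value FROM watermark").fetchone()[0]
# 2. Look up the new watermark = MAX(modified) in the source (second Lookup).
new_wm = conn.execute("SELECT MAX(modified) FROM src").fetchone()[0]
# 3. Copy only the rows that changed in between (Copy activity).
conn.execute(
    "INSERT INTO dst SELECT * FROM src WHERE modified > ? AND modified <= ?",
    (old_wm, new_wm),
)
# 4. Advance the watermark (Stored Procedure activity).
conn.execute("UPDATE watermark SET value = ?", (new_wm,))

loaded = [r[0] for r in conn.execute("SELECT id FROM dst ORDER BY id")]
print(loaded)  # [2, 3]
```

Bounding the copy with both the old and the new watermark is what keeps a rerun from picking up rows that arrive mid-copy.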
@anonymous-254 · 1 year ago
You just keep proceeding and proceeding ahead... Please tell us what an incremental load is and in which real-life scenarios we can use it. You haven't explained any background info.
@arihantsurana3671 · 2 years ago
I followed all the steps, but while debugging I am facing a problem running the pipeline. My lookups are running, but the Copy Data one is failing... Please help.
@sharavananp5570 · 2 years ago
Hi, awesome explanation. I did the same for my pipeline and found an error due to the T and Z appearing in the output of the lookup values, whereas in SQL the datetime format has no T and Z. I tried using formatDateTime('2000-07-07T23:34:32Z', 'yyyy-MM-ddTHH:mm:ssZ') and am still facing issues. Could you kindly suggest the best approach to solve this?
@ushneshadaripa1163 · 2 years ago
I think you need to use the "timestamp" data type instead of the "datetime" data type.
@sharavananp5570 · 2 years ago
@ushneshadaripa1163 Hi, thanks for the reply. But I have now solved it using datetime itself.
@sidsan000 · 1 year ago
@sharavananp5570 Hi, can you please tell me how you resolved this issue without timestamp, using datetime only?
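The T/Z problem discussed in this thread is a mismatch between the ISO 8601 string a lookup returns and the 'yyyy-MM-dd HH:mm:ss' literal SQL Server datetime expects; in ADF's expression language a format string without the T and Z (roughly formatDateTime(value, 'yyyy-MM-dd HH:mm:ss')) is the usual fix, though that exact expression is an assumption, not from the video. The same conversion sketched in Python, using the sample value from the comment:

```python
from datetime import datetime, timezone

# ISO 8601 value with 'T' separator and 'Z' (UTC) suffix, as a lookup returns it.
iso_value = "2000-07-07T23:34:32Z"

# Parse it, mark it as UTC, then re-render without the T and Z.
dt = datetime.strptime(iso_value, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
sql_literal = dt.strftime("%Y-%m-%d %H:%M:%S")
print(sql_literal)  # 2000-07-07 23:34:32
```

Note the conversion drops the explicit UTC marker, so it is only safe when both sides of the pipeline agree on the time zone.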
@surendrag7537 · 4 years ago
Nice training. Please take one full solution and use this in it; that would be great.
@learnnrelearn7553 · 4 years ago
Thank You
@learnnrelearn7553 · 4 years ago
Yes, I am now implementing a mini project using Azure data and AI components. I will upload the videos soon.
@MaheshBabu-lu9uq · 4 years ago
Ok
@ayaansk99 · 2 months ago
Why have we used the stored procedure activity here? Anyone?
@aadityasharma63353 · 8 months ago
Ma'am, can you please provide the source code that appears at the start of the video?
@SatishBurnwal · 2 years ago
The video is really informative. It's just that your voice is very faint and hard to hear.
@PersonOfBook · 3 years ago
That is too much work for what should be a straightforward task.