What if we don't have an id / unique column? How can we achieve this with a composite key (adding multiple columns in Alter Row)?
@sreeg9662 • 2 years ago
What is the stage you used to copy the xls? The beginning step is missing.
@thesoftwarefarmer • 3 years ago
The Alter Row transformation only works with Excel files and RDBMS, but what if we have to deal with CSV or TSV files?
@RodrigoZenga • 3 years ago
This is awesome! Thanks for teaching us. What if my data source is a paginated API JSON file?
@ksowjanya4488 • 2 years ago
How to implement this with a Dataverse dataflow?
@anudipray4492 • 2 months ago
Why did you remove the space from the sink? There is no space.
@Lihka080 • 3 years ago
Hi Mam, suppose I have 1 lakh records in the initial load, and later I add 2,000 records to the source data with no changes to the original 1 lakh. What will the result of your pipeline be? In the second run, will it perform an insert/update on all 1 lakh records, or will it ignore them and perform an insert only for the latest 2,000 records?
@vrukshalikapdekar4627 • 4 years ago
Superb 👍
@AllAboutBI • 4 years ago
Thanks 🙏
@codeworld8981 • 3 years ago
Can I implement SCD1 and SCD2 in the same dataflow? Pls let me know
• 3 years ago
Please, if you find the solution, let me know
@AllAboutBI • 3 years ago
I don't think it's possible in one data flow.
@codeworld8981 • 3 years ago
@@AllAboutBI If we implement these in separate dataflows, pls let me know how to do it.
@sivaramkodali8282 • 3 years ago
Can we pass sno as a parameter? I want to upsert multiple files with parameters.
@muhammadehsanullah • 16 days ago
Thanks!
@anudipray4492 • 2 months ago
Simple: if it matches on the key column, update; if it doesn't match, insert.
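The match-then-update / no-match-then-insert behaviour described in that comment can be sketched outside ADF as a plain SQL upsert. A minimal SQLite illustration (the table and column names here are invented for the example; this is not the ADF implementation itself, which uses the Alter Row transformation and the sink's key columns):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (sno INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'Alice'), (2, 'Bob')")

# sno 1 matches an existing row -> update; sno 3 does not -> insert
incoming = [(1, "Alicia"), (3, "Carol")]
conn.executemany(
    "INSERT INTO customers (sno, name) VALUES (?, ?) "
    "ON CONFLICT(sno) DO UPDATE SET name = excluded.name",
    incoming,
)
rows = conn.execute("SELECT sno, name FROM customers ORDER BY sno").fetchall()
print(rows)  # [(1, 'Alicia'), (2, 'Bob'), (3, 'Carol')]
```

The key column (`sno` here) plays the same role as the "Key columns" setting in the ADF sink: it decides whether an incoming row is treated as an update or an insert.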
@RajeevKumar-zf8ox • 4 years ago
Nice SCD Type-1 implementation. Could you please upload a video on SCD Type-2 implementation and how to maintain the history?
@AllAboutBI • 4 years ago
Sure.
@RajeevKumar-zf8ox • 4 years ago
In the case of an incremental load with, say, 70 columns, how will you compare them and find the changes? When I answered in an interview that I would compare each column in the WHERE clause to detect changes, the interviewer was not satisfied. Could you please help with how to handle such scenarios?
@AllAboutBI • 4 years ago
Hashing could be the expectation in that case. A hash function can find out whether the columns have changed or not.
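The hashing idea in that reply can be sketched as: concatenate the non-key column values, hash them, and compare one hash instead of 70 columns. A minimal Python illustration (the column names are invented; in an ADF data flow the equivalent would be a derived column using an expression hash function such as `sha2`):

```python
import hashlib

def row_hash(row: dict, columns: list) -> str:
    """Collapse the listed columns into one comparable hash value."""
    joined = "|".join(str(row.get(c, "")) for c in columns)
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()

COLUMNS = ["name", "city", "phone"]  # imagine 70 of these
stored   = {"name": "Alice", "city": "Pune",  "phone": "123"}
incoming = {"name": "Alice", "city": "Delhi", "phone": "123"}

# one comparison instead of a 70-column WHERE clause
changed = row_hash(stored, COLUMNS) != row_hash(incoming, COLUMNS)
print(changed)  # True: the city changed, so the row needs an update
```

Storing the hash alongside the row means the next load only has to compare two hashes to decide update vs skip.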
@vivekk9564 • 3 years ago
I am trying to create a parameterized data flow where the source and target are dynamic, and I achieved that with the simple Allow insert and Recreate table options in the target settings. But I need to implement upsert logic for the same, and I believe that requires the key columns to be parameterized as well, which I am unable to do. Can you please suggest?
@rodrigonicolastamarizcasti5279 • 2 years ago
Did you find an answer to this?
@tipsandhacksbygaurav • 1 year ago
Have you used dynamic join fields? It works in exactly the same manner. Refer to this one - kzbin.info/www/bejne/eX6ygYOfmp6Vjq8
@pamilad5473 • 2 years ago
Assume there is no primary key column in the source data... then how should we perform the upsert logic?
@overlord7096 • 7 months ago
You need to make a composite key, which will be a concatenation of 2 or 3 columns.
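A minimal sketch of that composite-key idea, with a separator so values like `"ab"+"c"` and `"a"+"bc"` don't collide (the row and column names are made up; in an ADF data flow you would build the same thing in a derived column with an expression like `concat(col1, '|', col2)` and use it as the sink's key column):

```python
def composite_key(row: dict, key_cols: list, sep: str = "|") -> str:
    """Build a single surrogate key from several natural-key columns."""
    return sep.join(str(row[c]) for c in key_cols)

row = {"region": "south", "cust_id": 42, "amount": 99}
key = composite_key(row, ["region", "cust_id"])
print(key)  # south|42
```

The separator matters: without it, two different column combinations can produce the same key and cause wrong updates.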
@srikanthbachina7764 • 3 years ago
Hi, instead of rule-based mapping we can use auto mapping for drifting columns, right?
@AllAboutBI • 3 years ago
Yes. Rule-based mapping helps with type casting.
@edwinraj2652 • 4 years ago
Hi, thanks for the video. Does this upsert logic work for Cosmos DB? When choosing Cosmos in the sink, the Key columns property is not available.
@AllAboutBI • 4 years ago
It does support Cosmos DB as a sink. Haven't you set a key while creating the Cosmos DB container/table?
@edwinraj2652 • 4 years ago
@@AllAboutBI Yeah, it is working. Previously I didn't have an id field defined; now it's working fine. Can you post a video about implementing SCD Type 2 in Cosmos?
@balamuralipati8604 • 3 years ago
@@edwinraj2652 Where is the option for defining the key? I can see only unique key. But I want to upsert based on a composite key.
@edwinraj2652 • 3 years ago
@@balamuralipati8604 Upserts in Cosmos work based on the 'id' property. It's the default behaviour; we don't need to choose the 'id' property in the ADF sink.
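What "upsert based on the 'id' property" means can be sketched as a key-value replace: if a document with the same `id` already exists it is overwritten wholesale, otherwise a new document is created. A toy in-memory illustration (a plain dict standing in for a container; this is not the Cosmos SDK):

```python
container = {}  # toy stand-in for a Cosmos container, keyed by 'id'

def upsert(doc: dict) -> None:
    """Replace the document with the same 'id', or insert it if absent."""
    container[doc["id"]] = doc

upsert({"id": "1", "col1": "a", "col2": "b"})
upsert({"id": "1", "col1": "a", "col2": "changed"})  # same id -> overwrite
upsert({"id": "2", "col1": "x", "col2": "y"})        # new id  -> insert

print(len(container))          # 2
print(container["1"]["col2"])  # changed
```

This also answers the question below about an Excel source with only `col1` and `col2`: without an `id` value in the incoming data, the store has nothing to match on, so every row would be treated as an insert.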
@balamuralipati8604 • 3 years ago
@@edwinraj2652 How would Data Factory know which rows to update? If I am trying to upsert an Excel file that has two columns, col1 and col2, and Cosmos has id, col1, col2, then how does the update happen?
@harikrishna-el7so • 4 years ago
Hi, it's worth watching. Pls can you share the Azure Synapse Analytics videos?
@AllAboutBI • 4 years ago
Sure, I will do it once the ADF series is done.
@overlord7096 • 7 months ago
You are doing the upsert directly in the sink, so why are you using the Alter Row activity?
@AllAboutBI • 7 months ago
It's not possible to implement without the Alter Row transformation if we have to update something.
@preetijaiswal9089 • 3 years ago
Hi, I need to implement upsert into SQL Server using ADF. How do I implement that, with a Copy activity or a data flow? In mapping data flows SQL Server is not supported, so how to do this?
@Mayank612722 • 3 years ago
If your source and sink are both SQL, why don't you just implement it using a SQL query?
@preetijaiswal9089 • 3 years ago
Yeah, I can surely do that, but I need to automate the process and migtate around 70-80 tables, and that too using ADF, so that's why.
@Mayank612722 • 3 years ago
@@preetijaiswal9089 If you're talking about moving data from on-prem to cloud SQL, you can use a Lookup to get all the table names in the database and then, on its output, run SELECT * INTO. I'm pretty sure this might not be the best way, though.
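The lookup-then-copy idea in that reply can be sketched as: enumerate the table names from the catalog, then run one per-table copy statement. A minimal SQLite sketch (SQLite's catalog is `sqlite_master` and it uses `CREATE TABLE ... AS SELECT` rather than `SELECT * INTO`; in SQL Server you would query `INFORMATION_SCHEMA.TABLES`, and in ADF the Lookup output would feed a ForEach):

```python
import sqlite3

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
src.execute("INSERT INTO orders VALUES (1, 9.5)")

# 'Lookup': enumerate source tables from the catalog
tables = [r[0] for r in src.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]

# 'ForEach': copy each table under a staging prefix
for t in tables:
    src.execute(f"CREATE TABLE staging_{t} AS SELECT * FROM {t}")

copied = src.execute("SELECT COUNT(*) FROM staging_orders").fetchone()[0]
print(sorted(tables), copied)  # ['customers', 'orders'] 1
```

Note this copies data but, as the commenter says below, it does not give you an update/upsert; for that each iteration would need a MERGE-style statement instead of a plain copy.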
@preetijaiswal9089 • 3 years ago
@@Mayank612722 This is for migration; I need to do an update or upsert.
@Mayank612722 • 3 years ago
@@preetijaiswal9089 Oops, I read mitigate as migrate.
@Myachnik • 3 years ago
Thanks a lot! You helped me :)
@raghunandan3068 • 4 years ago
Thanks for this video! As per my understanding, when doing an UPDATE it will update all the data points in the record even though they have not changed. Is there a way not to UPDATE the unchanged data points? Thanks.
@AllAboutBI • 4 years ago
Yes, correct 👍
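One common way to avoid rewriting unchanged rows is to add a change check to the update statement itself, so rows whose values already match are never touched. A minimal SQLite sketch (the table is invented; in SQL Server you would typically compare stored hashes or use EXCEPT, and in an ADF data flow you would filter unchanged rows out before the Alter Row step):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a"), (2, "b")])

incoming = [(1, "a"), (2, "B")]  # row 1 unchanged, row 2 changed
cur = conn.cursor()
changed = 0
for rid, val in incoming:
    # the extra predicate skips rows whose value is already identical
    cur.execute("UPDATE t SET val = ? WHERE id = ? AND val <> ?",
                (val, rid, val))
    changed += cur.rowcount

print(changed)  # 1: only the genuinely changed row was touched
```

With millions of rows, skipping the no-op updates avoids the needless write and log traffic the commenter is worried about.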
@raghunandan3068 • 4 years ago
Could you pls put up a video for this? Thanks in advance. When there are few records it may not have an impact, but updating millions of records with the same values again may be too costly. Request you to post a video on how to overcome this.