#49. Azure Data Factory - Implement Upsert logic in Mapping data flow.

  19,707 views

All About BI !


A day ago

Comments: 55
@monibodhuluru6267
@monibodhuluru6267 2 years ago
What if we don't have an ID or other unique column? How can we achieve this using a composite key (adding multiple columns in the Alter Row transformation)?
@sreeg9662
@sreeg9662 2 years ago
What is the step you used to copy the XLS file? The beginning step is missing.
@thesoftwarefarmer
@thesoftwarefarmer 3 years ago
The Alter Row transformation only works with Excel files and RDBMS sources, but what if we have to deal with CSV or TSV files?
@RodrigoZenga
@RodrigoZenga 3 years ago
This is awesome! Thanks for teaching us. What if my data source is a paginated API JSON file?
@ksowjanya4488
@ksowjanya4488 2 years ago
How to implement this with a Dataverse dataflow?
@anudipray4492
@anudipray4492 2 months ago
Why did you remove the space from the sink? There is no space.
@Lihka080
@Lihka080 3 years ago
Hi Ma'am, suppose I have 100,000 (1 lakh) records in the initial load, and later I add 2,000 records to the source data with no changes to the original 100,000. What will your pipeline do in the second run: will it perform an insert/update on all 100,000 records, or will it ignore them and only insert the latest 2,000 records?
@vrukshalikapdekar4627
@vrukshalikapdekar4627 4 years ago
Superb 👍
@AllAboutBI
@AllAboutBI 4 years ago
Thanks 🙏
@codeworld8981
@codeworld8981 3 years ago
Can I implement SCD1 and SCD2 in the same data flow? Please let me know.
3 years ago
Please, if you find the solution, let me know
@AllAboutBI
@AllAboutBI 3 years ago
I don't think it's possible in one data flow
@codeworld8981
@codeworld8981 3 years ago
@@AllAboutBI If we implement these in separate data flows, please let me know how to do it.
@sivaramkodali8282
@sivaramkodali8282 3 years ago
Can we pass sno as a parameter? I want to upsert multiple files with parameters.
@muhammadehsanullah
@muhammadehsanullah 16 days ago
Thanks!
@anudipray4492
@anudipray4492 2 months ago
Simple: matching rows are updated and non-matching rows are inserted, based on the key column.
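The rule described in this comment — update on a key match, insert on no match — can be sketched in plain Python. This is just an illustration of the semantics, not ADF code; the column names are made up:

```python
# Minimal sketch of upsert-by-key semantics (illustrative, not ADF itself):
# rows whose key already exists in the sink are updated, new keys are inserted.

def upsert(sink, incoming, key="id"):
    """Update matching rows, insert non-matching ones, keyed on `key`."""
    indexed = {row[key]: row for row in sink}
    for row in incoming:
        indexed[row[key]] = row  # overwrite on match (update), add on miss (insert)
    return list(indexed.values())

sink = [{"id": 1, "name": "old"}, {"id": 2, "name": "keep"}]
incoming = [{"id": 1, "name": "new"}, {"id": 3, "name": "added"}]
result = upsert(sink, incoming)
# id 1 is updated, id 2 is untouched, id 3 is inserted
```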
@RajeevKumar-zf8ox
@RajeevKumar-zf8ox 4 years ago
Nice SCD Type-1 implementation. Could you please upload a video for the SCD Type-2 implementation, i.e. how to maintain history?
@AllAboutBI
@AllAboutBI 4 years ago
Sure.
@RajeevKumar-zf8ox
@RajeevKumar-zf8ox 4 years ago
In the case of an incremental load with, say, 70 columns, how will you compare them and find the changes? When I answered in an interview that I would compare each column in the WHERE clause to detect changes, the interviewer was not satisfied. Could you please explain how to handle such scenarios?
@AllAboutBI
@AllAboutBI 4 years ago
Hashing could be the expectation in that case. A hash function can detect whether any of the columns have changed.
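The hashing idea suggested in this reply — hash all columns once instead of comparing 70 of them individually — can be sketched in Python. In an ADF data flow you would do the equivalent with a hash expression in a Derived Column; the column names below are hypothetical:

```python
# Sketch of hash-based change detection: hash the concatenation of all
# column values once, then compare a single hash instead of 70 columns.
import hashlib

def row_hash(row, columns, sep="|"):
    """SHA-256 over all column values joined with a separator."""
    joined = sep.join(str(row[c]) for c in columns)
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()

cols = ["name", "city", "salary"]          # stand-ins for the 70 columns
stored = {"name": "Ram", "city": "Chennai", "salary": 100}
incoming_same = {"name": "Ram", "city": "Chennai", "salary": 100}
incoming_changed = {"name": "Ram", "city": "Chennai", "salary": 200}

unchanged = row_hash(stored, cols) == row_hash(incoming_same, cols)
changed = row_hash(stored, cols) != row_hash(incoming_changed, cols)
```

One design note: the separator matters — without it, ("ab", "c") and ("a", "bc") would hash identically.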
@vivekk9564
@vivekk9564 3 years ago
I am trying to create a parameterized data flow where the source and target are dynamic, and I achieved that with the simple Allow insert and Recreate table options in the target settings. But now I need to implement upsert logic for the same flow, and I believe that requires the key columns to be parameterized as well, which I am unable to do. Can you please suggest something?
@rodrigonicolastamarizcasti5279
@rodrigonicolastamarizcasti5279 2 years ago
Did you find an answer to this?
@tipsandhacksbygaurav
@tipsandhacksbygaurav A year ago
Have you tried dynamic join fields? It works in exactly the same manner. Refer to this one - kzbin.info/www/bejne/eX6ygYOfmp6Vjq8
@pamilad5473
@pamilad5473 2 years ago
Assume there is no primary key column in the source data... then how should we perform the upsert logic?
@overlord7096
@overlord7096 7 months ago
You need to make a composite key, which is a concatenation of 2 or 3 columns.
@srikanthbachina7764
@srikanthbachina7764 3 years ago
Hi, instead of rule-based mapping we can use auto mapping for drifted columns, right?
@AllAboutBI
@AllAboutBI 3 years ago
Yes. Rule-based mapping helps with type casting.
@edwinraj2652
@edwinraj2652 4 years ago
Hi, thanks for the video. Does this upsert logic work for Cosmos DB? When choosing Cosmos in the sink, the Key columns property is not available.
@AllAboutBI
@AllAboutBI 4 years ago
It does support Cosmos DB as a sink. Haven't you set a key while creating the Cosmos DB container/table?
@edwinraj2652
@edwinraj2652 4 years ago
@@AllAboutBI Yeah, it is working. Previously I didn't have an id field defined; now it's working fine. Can you post a video about implementing SCD Type 2 in Cosmos?
@balamuralipati8604
@balamuralipati8604 3 years ago
@@edwinraj2652 Where is the option for defining the key? I can see only the unique key, but I want to upsert based on a composite key.
@edwinraj2652
@edwinraj2652 3 years ago
@@balamuralipati8604 Upserts in Cosmos work based on the 'id' property. It's the default behaviour; we don't need to choose the 'id' property in the ADF sink.
@balamuralipati8604
@balamuralipati8604 3 years ago
@@edwinraj2652 Then how would Data Factory know which rows to update? If I am trying to upsert an Excel file that has two columns, col1 and col2, and Cosmos has id, col1, and col2, how does the update happen?
@harikrishna-el7so
@harikrishna-el7so 4 years ago
Hi, it's worth watching. Please can you share the Azure Synapse Analytics videos?
@AllAboutBI
@AllAboutBI 4 years ago
Sure, I will do it once the ADF series is done.
@overlord7096
@overlord7096 7 months ago
You are doing the upsert directly in the sink, so why are you using the Alter Row transformation?
@AllAboutBI
@AllAboutBI 7 months ago
It's not possible to implement without the Alter Row transformation if we have to update something.
@preetijaiswal9089
@preetijaiswal9089 3 years ago
Hi, I need to implement upsert into SQL Server using ADF. How do I implement that, either with the Copy activity or a data flow? SQL Server is not supported in mapping data flows, so how do I do this?
@Mayank612722
@Mayank612722 3 years ago
If your source and sink are both SQL, why don't you just implement it with a SQL query?
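The pure-SQL approach this reply suggests is typically a `MERGE` statement in SQL Server. As a runnable stand-in, the sketch below uses Python's built-in sqlite3 and SQLite's equivalent `INSERT ... ON CONFLICT DO UPDATE` syntax; the table and column names are made up:

```python
# Sketch of doing the upsert entirely in SQL (SQL Server would use MERGE;
# SQLite's upsert syntax stands in here so the example is self-contained).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO emp VALUES (1, 'old'), (2, 'keep')")

# Upsert: id 1 already exists (update), id 3 does not (insert).
conn.executemany(
    "INSERT INTO emp (id, name) VALUES (?, ?) "
    "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
    [(1, "new"), (3, "added")],
)
rows = sorted(conn.execute("SELECT id, name FROM emp").fetchall())
# rows == [(1, 'new'), (2, 'keep'), (3, 'added')]
```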
@preetijaiswal9089
@preetijaiswal9089 3 years ago
Yeah, I can surely do that.. but I need to automate the process and migrate around 70-80 tables, and that too using ADF, so that's why.
@Mayank612722
@Mayank612722 3 years ago
@@preetijaiswal9089 If you're talking about moving data from on-prem to cloud SQL, you can use a Lookup to get the names of all the tables in the database and then run SELECT * INTO over its output. I'm pretty sure this might not be the best way though.
@preetijaiswal9089
@preetijaiswal9089 3 years ago
@@Mayank612722 This is for migration; I need to do an update or upsert.
@Mayank612722
@Mayank612722 3 years ago
@@preetijaiswal9089 Oops, I read mitigate as migrate.
@Myachnik
@Myachnik 3 years ago
Thanks a lot! You helped me :)
@raghunandan3068
@raghunandan3068 4 years ago
Thanks for this video! As I understand from it, an UPDATE will rewrite every data point in the record even when nothing has changed. Is there a way not to UPDATE the unchanged data points? Thanks.
@AllAboutBI
@AllAboutBI 4 years ago
Yes, correct👍
@raghunandan3068
@raghunandan3068 4 years ago
Could you please put up a video for this? Thanks in advance. When there are few records it may not matter, but updating millions of records with the same values again may be too costly. Request you to post a video on how to overcome this.
@ash3rr
@ash3rr 3 years ago
Count how many times: "ohhh k."
@AllAboutBI
@AllAboutBI 3 years ago
Ok 😀
@mohammadfahim2002
@mohammadfahim2002 2 жыл бұрын
that's annoying