21. Dynamic Column mapping in Copy Activity in Azure Data Factory

55,006 views

WafaStudies

A day ago

In this video, I discuss how to perform column mapping dynamically in the Copy Activity in Azure Data Factory.
Link for Azure Synapse Analytics playlist:
• 1. Introduction to Azu...
Link for Azure Databricks playlist:
• 1. Introduction to Az...
Link for Azure Functions playlist:
• 1. Introduction to Azu...
Link for Azure Basics playlist:
• 1. What is Azure and C...
Link for Azure Data Factory playlist:
• 1. Introduction to Azu...
Link for Azure Data Factory Real-time Scenarios playlist:
• 1. Handle Error Rows i...
Link for Azure Logic Apps playlist:
• 1. Introduction to Azu...
#Azure #AzureDatafactory #DataFactory
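For reference, the dynamic mapping demonstrated in the video is passed to the Copy Activity's `mapping` property as a TabularTranslator object. A minimal sketch of building that JSON programmatically (the column names here are illustrative, not taken from the video):

```python
import json

def build_tabular_translator(column_pairs):
    # Build the TabularTranslator object that the Copy Activity's
    # dynamic "mapping" property expects: one entry per source/sink pair.
    return {
        "type": "TabularTranslator",
        "mappings": [
            {"source": {"name": src}, "sink": {"name": dst}}
            for src, dst in column_pairs
        ],
    }

# Illustrative column names (assumptions, not from the video):
mapping = build_tabular_translator([("EmpId", "EmployeeId"), ("EmpName", "EmployeeName")])
print(json.dumps(mapping))
```

In the pipeline, this JSON typically comes from a Lookup over a control table and is fed to the Copy Activity via `@json(...)` in a dynamic expression.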

Comments: 117
@vijaybodkhe8379 · 10 months ago
Thanks
@WafaStudies · 10 months ago
Thank you 😊
@papachoudhary5482 · 3 years ago
May God grant you a long life... Sir, long live! Long live!
@WafaStudies · 3 years ago
Thank you so much for such kind words 🙂☺️☺️
@Funwithkiddoss · 3 years ago
Good scenario and superb demonstration. Thanks for your videos.
@WafaStudies · 3 years ago
Thank you 🙂
@generaltalksoflife · 3 years ago
Hi Maheer, it would be very nice if, at the end of the video, you recapped all the activities you used in the pipeline. Thanks. Keep it up. Best of luck.
@WafaStudies · 3 years ago
Thank you for the feedback 😊
@deepureddy6567 · a year ago
Great video, sir. These seem to be real-time scenarios that are faced in projects.
@WafaStudies · a year ago
Thank you 😊
@vijaybodkhe8379 · 10 months ago
This is a really useful video. Thanks for sharing and explaining in easy-to-understand language.
@roshankumargupta46 · 3 years ago
Great, Maheer! It would be great if you could make a video on how to perform unit testing on the Copy Activity, to check whether the rows and columns were copied correctly.
@WafaStudies · 3 years ago
Thank you. Sure, I will plan it.
@sureshthippani5163 · 3 years ago
Hi, this tutorial is very good. Can you please prepare a video on creating dynamic datasets for different environments, and please create some videos on Cosmos DB (NoSQL)? Once again, thanks sir.
@himanshutrivedi4956 · 3 years ago
Wonderful... you rock as always, my Azure rockstar!
@WafaStudies · 3 years ago
Thank you 😊
@RahulKumar-jg5ly · 3 years ago
Great explanation and presentation 👍👍👍... it is really helpful.
@WafaStudies · 3 years ago
Thank you 😊
@vijaymulimath6519 · 11 months ago
I have one doubt: you already hardcoded the mapping, and through that you got the JSON file. So my question is, is it really dynamic mapping?
@priyankapatimidi2392 · 3 years ago
Great content, very helpful for real-time implementation.
@WafaStudies · 3 years ago
Thank you so much ☺️
@battulasuresh9306 · 3 years ago
Masterpiece. Add more videos to this playlist.
@WafaStudies · 3 years ago
Thank you 🙂
@battulasuresh9306 · 3 years ago
@@WafaStudies Brother, could you tell us the prerequisites for learning ADB? I'm currently working at Deloitte on ADF.
@HanumanSagar · 3 years ago
@@battulasuresh9306 Hi bro, which project are you working on at Deloitte?
@avirozenboim6446 · 2 years ago
Thanks for this - it was so helpful. If I may: instead of manually creating the JSON, I used the STRING_AGG function to generate it automatically from a simple mapping table where you map a source column to a target column for each entity:

SELECT EntityId, '"type": "TabularTranslator", "mappings": [ ' + STRING_AGG('{"source":{ "name":"' + SourceColumn + '"},"sink":{"name":"' + TargetColumn + '"}}', ',') + ' ] } ' AS ColumnMapping FROM dwh_control.entity_column_mapping em GROUP BY em.EntityId
@WafaStudies · 2 years ago
Great, nice way.
@SQL4ALL · 2 years ago
@@WafaStudies Great video. Thank you.
@SQL4ALL · 2 years ago
Thanks Avi, this is really helpful. You might need to add a leading { in the query... tested, it works for me.
@anshuldubey9395 · 11 months ago
Great finding, Avi.
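The approach in the comment above — generating the TabularTranslator JSON per entity from a control table of source/target column pairs — can be sketched like this (the table contents and entity IDs are assumptions that mirror the T-SQL in the comment):

```python
import json
from collections import defaultdict

# Hypothetical rows as they might come from dwh_control.entity_column_mapping:
# (EntityId, SourceColumn, TargetColumn)
rows = [
    (1, "emp_id", "EmployeeId"),
    (1, "emp_name", "EmployeeName"),
    (2, "dept_id", "DepartmentId"),
]

# Group the mappings per entity, mirroring GROUP BY EntityId in the T-SQL.
by_entity = defaultdict(list)
for entity_id, src, dst in rows:
    by_entity[entity_id].append({"source": {"name": src}, "sink": {"name": dst}})

# One ready-to-use TabularTranslator JSON string per entity, equivalent to
# the ColumnMapping column the STRING_AGG query produces (with the braces balanced).
column_mapping = {
    entity_id: json.dumps({"type": "TabularTranslator", "mappings": maps})
    for entity_id, maps in by_entity.items()
}
print(column_mapping[1])
```

Building the object with a JSON serializer rather than string concatenation sidesteps the missing-brace and quoting issues discussed in the thread.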
@Benwooduk85 · a year ago
Amazing tutorial, very well explained, and it works a treat for me using CRM as a source. My only issue is that I have to match the source and sink field names, including case. Very odd, as you clearly don't have that issue in your demonstration.
@0shaan0 · 3 years ago
Hello Maheer, all your videos are simply superb, and anyone can learn Azure very easily from them. I would like to request that you please create some videos on Azure architecture.
@gurumoorthy5321 · a month ago
Hi Maheer, thank you. The solution is very interesting. I have 2 questions: 1. Is it possible to run the ForEach in parallel (can we uncheck "Sequential")? 2. Is this JSON the only possible solution for dynamically mapping the columns, or is there any alternate logic?
@geoffchaddock · 11 months ago
Great tutorial, proved very helpful in my real-time implementation.
@mayurkrish75 · a year ago
Kudos to this tutorial! Very well explained.
@nivedabaskar316 · 2 years ago
Thank you very much, sir. Very useful content 👍
@WafaStudies · 2 years ago
Thank you ☺️
@udaychodagiri252 · 6 months ago
Nice video. Please keep doing this kind of stuff.
@neha1075 · 3 years ago
How do I pass these dynamic mappings if my JSON is an array type?
@sudheermattapally2791 · 3 years ago
Awesome... very useful.
@WafaStudies · 3 years ago
Thank you 🙂
@Sandeep.Gupta27 · 3 years ago
Thanks for the video, it's very informative. I have one doubt: suppose we execute the pipeline once, and for the 2nd execution the source file has some additional or missing columns. Will this give an error? Thanks.
@ajaykhedkar5940 · 3 years ago
Yes, I have the same doubt. If my files are coming daily and one of the files has some additional columns, how can we cope with that? Second thing: if I wanted to omit those extra columns, can I do that? If yes, then how? @maheer
@SairamV17 · 2 years ago
Did you get any answer for this?
@SairamV17 · 2 years ago
#Wafastudios please answer this?
@satyaraj21 · 2 months ago
I have a scenario of handling an empty string in the source that needs to be populated into a DateTime column in the target using TabularTranslator logic.
@venkatrajak6277 · 3 years ago
Hi Sir, thank you so much for this video. I have one doubt: we are already mapping manually, so it will be fixed for multiple runs, right? Then what is the use of passing this dynamically? Because we have to map and capture the JSON anyhow for the first time. Please help me with this.
@vickym3193 · 3 years ago
One reason I can think of is: if in the future you get more columns, you can directly add more items to the JSON instead of modifying your ADF pipeline.
@MaheshReddyPeddaggari · 3 years ago
Great content. Thanks, Maheer.
@WafaStudies · 3 years ago
Welcome 🤗
@rodrigoalejandronunezcabre3150 · 3 years ago
Hello master!! Thanks for the video. Is it possible to perform a dynamic mapping from a web source with the destination in a data lake? Thank you!
@parthgovekar8693 · 6 months ago
Hi, did you find any solution for dynamic mapping from a web source?
@shekareddy5146 · 3 years ago
I was looking for the same 👍
@WafaStudies · 3 years ago
😊👍
@nitagawade3330 · 2 years ago
Outstanding, Maheer!!!
@WafaStudies · 2 years ago
Thank you ☺️
@AnandKumar-dc2bf · 3 years ago
Excellent video...
@WafaStudies · 3 years ago
Thank you 😊
@rk-ej9ep · a year ago
Very nice 👍
@WafaStudies · a year ago
Thank you ☺️
@bagamanocnon · a year ago
Hi. How do I change the data type in the sink/destination dataset for a certain column? Say I want to change Emp_Id from int to nvarchar; I can't see any button to change the data type. Thanks.
@acharjyadebasis · 3 months ago
If the Excel column name keeps changing (like Emp_name/emp_name/EMP_NAME), how do we handle that scenario?
@annukumari9629 · 3 years ago
Wow! Very nicely presented. Thank you so much for your constant effort. May you receive many more accomplishments and blessings; you deserve them all. Keep going. :)
@WafaStudies · 3 years ago
Thank you 🙂
@sonamkori8169 · 2 years ago
Thank you, sir.
@WafaStudies · 2 years ago
Welcome 😊
@nuwanmenuka · 10 months ago
Amazing explanation.
@yashaswinividwanmani2143 · a year ago
The data read and data written are not the same, i.e. my source has details of 20 employees, but when loaded into the SQL DB only the first employee's details are loaded. Could you please help me resolve this issue?
@prasangisreenwas9088 · a year ago
Please cover the same task, dynamically setting column names, using mapping data flows.
@ragharaj3367 · 2 years ago
This is a great video! Thanks for that :) I have a scenario where I have to map one column from the source to multiple sink columns. Could you please let me know if there is a way to do this? I would really appreciate your help.
@nagasandeepkumarkapa1561 · 2 years ago
Hi, can I know if you have found any resolution for this one?
@ragharaj3367 · 2 years ago
@@nagasandeepkumarkapa1561 Microsoft says there is no way we can do this, but we made tweaks to the JSON to get it working.
@nagasandeepkumarkapa1561 · 2 years ago
Thanks for that. Any chance you have done it? If so, please share the idea.
@skselva403 · 3 years ago
Super, brother 👌👌👌
@WafaStudies · 3 years ago
Thank you 😊
@user-wr3mt5yv5o · a year ago
Hi, can you guide me, please? I'm loading a CSV file, but once the data is in SQL it is in a different order. I need the data in the same order as the source. What can I do about this?
@dibassimohamed5514 · 2 years ago
It's magical. Thanks a lot.
@WafaStudies · 2 years ago
Thank you 😊
@priyankadp5777 · 5 months ago
Hi Maheer, another wonderful video. I have a scenario: read from an on-prem DB and write to an Azure SQL DB (upsert operation). I am able to do this using the Copy Activity. I have 2 new columns only in the sink (create date and update date). The create date should be set to the current date when inserting a new record, and the update date column should be updated only when updating a record (not inserting). How do I achieve this using the upsert Copy Activity? I can't find any way to distinguish between a new record and an existing record in Copy.
@jebakanifamilyjyojay33 · 2 years ago
Hi, how can we manipulate/transform the source column in the Copy data activity's "Additional columns" mapping?
@sujitunim · a year ago
Hi Maheer, I have one use case in which I am getting data from a REST API, but the order of the columns is not the same every time, and I need to write the response to a CSV file with the columns in a specific order. Is there any way to handle this in the ADF Copy Activity?
@GrowthMindsetGlobal · 3 years ago
Really useful.
@WafaStudies · 3 years ago
Thank you ☺️
@ReelsVibe1 · a year ago
Sir, if we have done the mapping manually to copy the JSON object, why do we need to do it dynamically? Please reply, sir.
@gpriyanka811 · 3 years ago
How do I concatenate two column values from the source into one column value in the destination, in the Copy Activity under mapping?
@sanketchakane7745 · a year ago
@WafaStudies I am still confused about whether this is truly a dynamic way to accept data types, because we are storing the data types in a physical table and accepting them as input during the load. If the input data type changes, table_mapping still has the old data. How do we update that table before the load starts?
@shubhampawade2933 · 2 years ago
Instead of a SQL table, can we store the mappings in a file? If yes, how do we query the data from that file? Any response is appreciated! By the way, thanks for the amazing video series!
@vasdecabeza2 · 2 years ago
Yes, it can be a file stored in Blob Storage; use the Get Metadata activity to retrieve that.
@shubhampawade2933 · 2 years ago
@@vasdecabeza2 But the Get Metadata activity gives us information about the file, right? Not the content of the file?
@ayushibansal7947 · 4 days ago
Awesome.
@anuragkhare3821 · 3 years ago
It's a very helpful video for everyone, thanks for sharing. I have a similar kind of requirement, but my data is in JSON format: I have JSON data in ADLS Gen2 and need to load it into an Azure SQL table. I have a collection reference, under which I need to take all the data points. Challenge: the data points (number of columns) may vary over time. How can we do this dynamically? Example: under the collection reference we have 90 columns; after a year it may be 95 or 100.
@sanketchakane7745 · a year ago
In Data Flow there are options for this: schema drift and infer schema.
@shanavajshanu7196 · a year ago
Hi Sir, is it possible to add an if condition on the source column in the JSON mapping?
@vasavig4612 · 3 years ago
Hi, thanks for this video. Can we read Excel sheets dynamically in Azure Data Factory?
@WafaStudies · 3 years ago
You need to maintain the file and sheet names somewhere as configuration, and then look up that data to dynamically point to the sheet names.
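The configuration-driven pattern described in this reply can be sketched as a small control structure. All names below (file names, sheet names, parameter names) are illustrative assumptions, not from the video:

```python
# Hypothetical configuration rows, shaped like the output of an ADF Lookup
# activity over a control table or file of Excel file/sheet names.
config_rows = [
    {"fileName": "sales.xlsx", "sheetName": "Q1"},
    {"fileName": "sales.xlsx", "sheetName": "Q2"},
]

# A ForEach over the Lookup output would pass each row's values into
# parameterized Excel dataset properties; this comprehension just mimics
# the values each iteration would supply.
dataset_runs = [
    {"p_fileName": row["fileName"], "p_sheetName": row["sheetName"]}
    for row in config_rows
]
print(dataset_runs)
```

Each `dataset_runs` entry corresponds to one Copy Activity execution against a dataset whose file and sheet names are parameters rather than hardcoded values.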
@deepjyotimitra1340 · 3 years ago
Hi, I have a requirement of loading all Excel files (they may be CSV or XLS) into a SQL DB. The file names are not known because they are dynamic every time; only the file extension is fixed. Can we read files from Blob using *.xls or *.csv?
@deepjyotimitra1340 · 3 years ago
For my requirement, I think the answer will be the Get Metadata activity.
@KostikV · a year ago
Many thanks!
@vijaysagar5984 · 2 years ago
Hi bro, any workaround for CSV files which have multiple headers, so we can merge them into one header? The source is FTP; some files are good and some have multiple headers.
@manasam7777 · a year ago
Hi @vijaysagar5984, any solution to this question?
@karnatimanikantareddy9969 · a year ago
Brother, I am able to add only up to 19 columns. If I add one more column, I get an error like "Failed to run pipeline1". Can you please tell me if there is any limitation on adding columns? Please suggest, brother.
@suktvm · 4 months ago
Will it fail if the 2 tables contain similar column names?
@snvp786 · 2 years ago
Nice
@WafaStudies · 2 years ago
Thanks 🙏
@santhidhanuskodi8668 · 2 years ago
Do you have the JSON code for the overall pipeline and the DB changes? We could download and refer to it.
@nightfury5967 · 2 years ago
Thanks. :)
@WafaStudies · 2 years ago
Welcome 😊
@jolyjuju · 8 months ago
It's really a very useful video for many, no doubt. Just one more thing: for example, if the source file has DOB or BirthDate, which needs to sink to the BirthDate column, and I give multiple source tags, it doesn't work. Does anyone have an idea how to achieve this?
@potrunaresh2241 · 3 years ago
Requirement: create a PoC for dynamic pipelines with a config table and stored proc. The attached datasets are to be loaded into tables in SQL, and pipelines should be added to load the data dynamically with the config table. Please tell me the better approach. Thanks. I have 8 Excel files.
@dbasalo · 11 months ago
Excellent video, you got a like from me, and I would add a subscription if I weren't subscribed already!
@hindi-english1664 · a year ago
Does it work for delta loading?
@anjireddy5931 · 3 years ago
Maheer, if you don't mind, please tell me how to create an Azure free account. I created an account, but it showed me "you're not eligible for an Azure free account".
@WafaStudies · 3 years ago
kzbin.info/www/bejne/p6uro61thppnbZY
@parthpatwardhan3450 · a year ago
Good work, but most of the things are already created; how should we get them, yaar?
@cloudfitness · 3 years ago
👍🏻
@srinubathina7191 · a year ago
Thank you, sir.