@HariArayakeel-m2y 9 days ago
I'm working with two tables in an Azure Synapse data flow. Both tables, table1 and table2, have the same columns: [A], [B], [C], and [D]. I've applied a left outer join using [D] as the key, with table1 as the left table. In the join output, I'm getting separate columns such as [table1@A], [table2@A], [table1@B], [table2@B], and so on. I'd like to merge the corresponding columns (e.g., [table1@A] and [table2@A]) into a single column [A]. Is there an easier way to achieve this than creating a derived column with a condition like if(!isNull([table1@A]), [table1@A], [table2@A])? It is taking a lot of time to write code for each column this way. I have tried the 'Exists' transformation, but it only offers 'Exists' and 'Not exists' conditions, which is not convenient. Looking forward to your suggestions. Best regards, Harry
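One approach worth trying here (a sketch only, not verified in the product): instead of writing one Derived Column expression per column, a single Derived Column with a rule-based column pattern can coalesce every table1/table2 pair at once. Assuming the join output exposes the two stream names as table1 and table2, the pattern would look roughly like:
  Matches:     stream == 'table1'
  Column name: $$
  Expression:  coalesce($$, byName(name, 'table2'))
coalesce() returns the first non-null argument, which is the same logic as if(!isNull(...), ...), and byName(name, 'table2') looks up the same-named column in the table2 stream. Since byName() returns type 'any', a cast such as toString() may still be needed depending on the column types.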
@Alicia_BPHC A month ago
Thank you 🦋
@arpitshah6804 A month ago
Just a note: as of Oct 2024, it did not allow me to store a complex object, and said I could do it with a sink as JSON or move to Azure Snowflake, etc. The trick is to convert the final result to a string if you want to save it to a SQL DB. Hope that helps someone.
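For anyone hitting the same wall, a minimal sketch of that trick (the column name complexColumn is hypothetical): add a Derived Column just before the SQL sink, e.g.
  flattened = toString(complexColumn)
Whether toString() serializes your particular complex type cleanly is worth verifying; mapping data flows also offer a dedicated Stringify transformation for turning complex types into JSON strings ahead of a relational sink, which may be the cleaner route.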
@philsvibe9179 A month ago
Is there a way I can also do the same for inserts? I notice the condition also works for inserts, but there will be conflicts.
@DrDiabolical000 2 months ago
Thanks a lot man!
@adilmajeed8439 2 months ago
Thanks for sharing, but I always have an issue with the majority of these videos: the presenter never shares the source files, so how will novice people be able to understand the workflow behind them?
@jayasurya756 4 months ago
Thanks for the video. My source database is an on-premises PostgreSQL database, which is not supported in data flows. Is there any workaround for this? I am planning to create a staging table in the sink database that is truncated before the copy activity; then, in a second copy activity, I will copy the data with a pre-copy script that deletes the records whose IDs are found in the staging table. Is this a correct approach?
@nagendrasrinivas-cj7sr 4 months ago
What if I want to do dynamic data masking but then unmask for a few roles? Is there a way, when getting data from on-premises, to save a Parquet file in ADLS with the SSN field masked, keep the same masking when loading into the Delta table, but unmask at reporting time for the Admin role?
@sandroquinta1125 4 months ago
Will this work on Fabric?
@HabariYaMere 4 months ago
How can I get the 'key' value, i.e. the value that doesn't have a label associated with it? This is what some of my JSON looks like:
{"eytNvV2D0Qj": {"value": "2", "created": "2022-05-19T08:08:10.863", "lastUpdated": "2022-05-19T08:08:10.863", "createdByUserInfo": {"id": 46131, "uid": "IMcMexD14Yg"}}, "jmp5oJG2Yp8": {"value": "2021-11-19", "created": "2022-05-19T08:07:59.513"}, "r9aBsXCiVss": {"value": "2022-03-14"}}
What I want to pull out are what we are calling the keys, and their corresponding 'value' property:
key          value
eytNvV2D0Qj  2
jmp5oJG2Yp8  2021-11-19
r9aBsXCiVss  2022-03-14
I have tried the expression (key as string, value as string) but nothing is returned. Any advice, please?
@AlejandroAlvarez-ms6rs 4 months ago
Is there a way to add comments to the data flow script?
@shuaibsaqib5085 4 months ago
Great!
@HabariYaMere 5 months ago
THANK YOU!!! I can't believe there are no comments; you saved us loads of time!
@jorglang7883 6 months ago
I really like this generic approach. How would I handle the case where the source column name is different from the sink column name?
@kojo. 6 months ago
I have a date column with dates in this format: "2024-03-11T11:23:37.0000000". How do I read it? It appears as nulls for me.
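That usually means the string doesn't match the default timestamp format, so the cast silently fails to null. A hedged sketch of a Derived Column expression, assuming the incoming column is a string named dateCol (exact escaping of the literal T may need adjusting):
  toTimestamp(dateCol, 'yyyy-MM-dd\'T\'HH:mm:ss.SSSSSSS')
The format string has to account for all seven fractional-second digits; once it parses, the value can be cast or truncated to whatever precision the sink needs.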
@jaspreetmodi 6 months ago
Good one!
@Asearles404 7 months ago
Are data sources read sequentially or in parallel? And can we control the order in which sources are read?
@gothickmatt 7 months ago
Let's say your table has just two text fields, column1 and column2. When one row has column1 = "AB" and column2 = "C", won't this generate the same hash as a row where column1 = "A" and column2 = "BC"? The concatenation of either is "ABC", so hashing that result produces the same hash.
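That collision is real with plain concatenation: concat('AB', 'C') and concat('A', 'BC') both produce 'ABC' and therefore the same hash. The usual fix is to join the columns with a separator before hashing, for example (the '|' separator is an assumption; choose a character your data never contains):
  sha2(256, concatWS('|', column1, column2))
With the separator in place the two rows become 'AB|C' and 'A|BC', which hash to different values.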
@eddyjawed 7 months ago
First time I've seen how you look. You are handsome; you should show yourself more :D
@hutchm92 8 months ago
The important part of this video is what's in the scope field. Please share.
@AbhinayKacham 8 months ago
This video is a life saver.
@Bunbon7780 8 months ago
What is "source2" as the incoming stream in "lookupPK"?
@rajatpai5048 8 months ago
How can you capture deletes with this process?
@ideasystemsmexico 8 months ago
Thanks
@AleAndreatta 9 months ago
It helped A LOT! Thank you!
@harithamarripudi5622 9 months ago
Hi, sir. Could you please help me with connecting/sending the ADLS Delta Parquet data to MySQL Server as a SQL table using a data flow in Azure?
@bananaboydan3642 9 months ago
Didn't show how to configure Event Grid at all, wow.
@Aussified 9 months ago
Any plans to support this natively in the future? This actually makes the configuration harder to set up compared to ADF / Synapse pipelines.
@MSDataFactory 9 months ago
Yes! This demonstrates a technique that can be used today in Fabric with what is in the product now. We will continue to add features to make use cases like pipeline triggers from events and time slices native in Fabric, which will make this easier.
@MSFT_ScottSewell 10 months ago
🎉 Thanks for sharing this!
@ahvallentin 10 months ago
Will you introduce the VNET Data Gateway to the Data Factory and Synapse services and/or Data Factory in Fabric, so that we can avoid having to use a self-hosted IR VM? We have a large number of VMs to maintain that are used for ingesting data from on-prem data sources. I would like to avoid the VMs and just integrate ADF with a VNET.
@RajeshPhanindra 10 months ago
As ADF v2 made v1 obsolete, do you foresee Fabric Data Factory making ADF obsolete? Would this mean that Microsoft will slowly stop rolling out updates to ADF?
@keen8five 10 months ago
Hi, the provided link does not work.
@chsrkn 11 months ago
Can we have parameters in Fabric dataflows and assign their values from the pipeline, the way we do in Synapse dataflows? If we cannot pass parameters, we cannot make a Fabric dataflow metadata-driven. Can you make a video on building generic Fabric dataflows?
@MSDataFactory 10 months ago
Not yet, but that is coming soon!
@chsrkn 11 months ago
Can we build metadata-driven dataflows in Fabric? For example, in Synapse dataflows we have schema drift / map drift in the Derived Column transformation and auto column mapping on the sink; with these features we can make a dataflow metadata-driven. Do we have the same in Fabric?
@ashi145 11 months ago
Hi, I have set up an Azure SQL Database source and a SQL Managed Instance target, but it only captures inserts, not deletes or modifications of rows. So it just replicates inserts.
@MariusS-h2p 11 months ago
Output to a separate file doesn't work; the produced file is always empty. I had to build this feature myself using a split.
@nsulliivan 11 months ago
This framework looks fantastic! How are you handling the unknown source metadata (consider if this were the FIRST run) in the mapping tab of the parameterized Copy activities at the end of your loop?
@keerthicr4758 11 months ago
How do we maintain historical data in the target system? For example, if 3 records are deleted, the deleted-record flag needs to be updated to 0 or N in ADF; if new data is added, the flag needs to be updated to 1.
@Spiderman-hb7oc 11 months ago
Thank you very much!
@MohanPalaniappan-h9v A year ago
Thank you!! Is there any option to specify the ODP context as a query, so the data can be fetched with a dynamic query against any S/4HANA table rather than by specifying a static object?
@EurenPL A year ago
Three years old, but still relevant :) Thank you!
@BidyadharBarik-m7m A year ago
How do we connect from on-premises SQL to Azure SQL using ADF CDC?
@BidyadharBarik-m7m A year ago
How do we connect from on-premises SQL to Azure SQL Database? We are unable to see the integration runtime.
@SilvanoPaxia A year ago
Great video, Mark! As always!
@SilvanoPaxia A year ago
We are using Dataverse, so we can use a model-driven app to maintain the metadata.
@NeumsFor9 A year ago
Also, are Gen2 data flows fully inclusive of all M functions? Are mapping data flows getting deprecated?
@MSDataFactory A year ago
Yes they are! The approach we took with Dataflows in Fabric was to include the entire PQ/M implementation rather than translate to Spark on the fly as in ADF.
@NeumsFor9 A year ago
Are the event based triggers there yet?
@MSDataFactory A year ago
Not yet. We are working on this feature and plan to have more announcements in CY24.