Hey all! How's it going with Dataflows and Fabric? Have you started using either yet?! Share your experiences below 👇😃
@AmritaOSullivan A year ago
Thanks for this fab video. It is so so easy to understand and so useful to actually go through use cases end to end!
@LearnMicrosoftFabric A year ago
Hey thanks for watching and for the feedback! I’ll be doing more end-to-end projects in the future for sure ☺️💪
@rizvinazish 8 months ago
Very nice, easy way you explained it, very helpful!!
@benshi1975 5 months ago
Hey, awesome vid, thanks, I follow you! With a dataflow, can I target a specific schema on the warehouse destination, or should I combine a pipeline and a dataflow for that? Thx
@LaZyBuM999 10 months ago
Thanks for the wonderful videos on Microsoft Fabric. I see that all the data imports are "one-time activities" from a source. How can we get the delta data (e.g. new records, deleted records) in the source synced with the lakehouse periodically? I.e., what about the continuous CRUD operations being done on the source (e.g. a SQL DB)? How can that be synced with the data in the lakehouse?
@LearnMicrosoftFabric 10 months ago
Thanks for your comment! Yes, you're right, they are one-time activities (but you can schedule them to run every day/hour etc.), so it's never going to be 'real-time'. The feature you are describing sounds more like OneLake shortcuts, which let you create a real-time link to FILES in external locations (ADLS and Amazon S3). Microsoft is also releasing a feature called Database Mirroring soon, which will do the same thing but for databases (Snowflake, Cosmos DB, Azure SQL etc.) - this feature is currently in private preview, and I believe they will release it for public preview soon!
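One practical stopgap with the scheduled-refresh approach mentioned above is to land each extract as a staging table and then upsert it into the lakehouse Delta table from a notebook with a MERGE. The sketch below is only an illustration under assumptions not covered in the video: it presumes a Fabric notebook with Spark attached to the lakehouse, and the table and column names (staged_customers, lakehouse_customers, customer_id, is_deleted) are hypothetical placeholders for your own source extract.

```python
# Minimal sketch only - assumes a Fabric notebook with Spark attached to the lakehouse.
# Table and column names below are hypothetical placeholders, not from the video.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Upsert the latest scheduled extract into the lakehouse Delta table:
# - delete rows the source has flagged as deleted
# - update rows that already exist
# - insert rows that are new
spark.sql("""
    MERGE INTO lakehouse_customers AS target
    USING staged_customers AS source
    ON target.customer_id = source.customer_id
    WHEN MATCHED AND source.is_deleted = true THEN DELETE
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED AND source.is_deleted = false THEN INSERT *
""")
```

If your source extract carries no deletion flag, drop the is_deleted clauses and the statement becomes a plain insert/update upsert; capturing deletes then needs something like the shortcut or mirroring features described above.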
@eniolaadekoya5623 3 months ago
Hi Will, for the report, can I import a background design, maybe from Figma or Canva?
@LearnMicrosoftFabric 3 months ago
yes
@sanishthomas2858 8 months ago
Nice. Quick question: why don't we have the dataflow UI that we had in ADF and Synapse?
@LearnMicrosoftFabric 8 months ago
Because it’s a different product!
@jevonzhu A year ago
Dataflow Gen2 cannot be deployed in a Fabric pipeline on my side, why is that?
@LearnMicrosoftFabric A year ago
Hey! Could be a number of reasons! Does the dataflow run ok outside of the pipeline? I would look in Monitoring Hub and analyze the error message. Good luck!
@arnabgupta4391 A year ago
Thanks for the awesome video. How do you add the folder names like BronzeLayer/SilverLayer? It all got created in the same workspace for me.
@LearnMicrosoftFabric A year ago
Hey thanks for the question, which part of the video are you referring to?
@rdbdebeer9085 A year ago
The folders refer to data lakes, so he has a data lake for Bronze data and a data lake for Silver data.
@scarabic5937 A year ago
Great video, thanks
@LearnMicrosoftFabric A year ago
Thanks for watching!
@pphong A year ago
Thank you for sharing! I didn't load my data into the Azure storage account - I used DFg2 to read/upload the CSV file. Do you guys experience slow Delta writing to the lakehouse? Can I do something to speed it up?
@LearnMicrosoftFabric A year ago
Hi, yes, in general DFg2 is quite slow at the moment during the public preview. I'm sure the write speeds will increase as we move closer to GA (general availability) of Fabric.