Hi there! This video has been super helpful for understanding the migration process. I'm working on migrating a workspace (24 workspaces actually) with power bi reports inside them, from a Premium P1 environment to Fabric (F24). I'm having a bit of trouble with migrating the visuals since the customer wants everything to remain unchanged. Any additional tips or resources on this would be greatly appreciated. Thanks so much for your help! 😊
@Gandalf1969Guy · 2 months ago
Agreed, super helpful. One common use case is like you have described: an existing Power BI report environment (complete with a potentially complex model) where you just want to start pulling data through Dataflows. Is there any way to maintain the existing model (transferring the Power Query steps to the Dataflows) and therefore not break the reports?
@ainaajidian6554 · 9 months ago
Can I use a dynamic path for Dataflow Gen2 to insert into one table, appending from multiple files and removing duplicate rows? So if a new file is uploaded to the folder I choose, the Dataflow Gen2 uses the newer file(s). If you know a way, please enlighten me.
@austinlibal · 9 months ago
Something like this could possibly be achieved, though currently Dataflow Gen2 does not support parameters for dynamic content. You could set up a source connection to a folder and append its files using the "From Folder" option in the Get Data hub. Those files will then run through whatever applied steps you have set up for that Dataflow Gen2, and it can load the result to a Lakehouse every day.
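For anyone curious, the append-and-dedupe pattern described above looks roughly like this as a Power Query M sketch. This is a minimal illustration, not the exact query from the video — the folder path is a hypothetical placeholder, and it assumes the source files are CSVs with matching headers:

```
let
    // List every file in the watched folder (placeholder path)
    Source = Folder.Files("C:\Data\Incoming"),
    // Keep only CSV files (assumption: all sources are CSV)
    CsvFiles = Table.SelectRows(Source, each [Extension] = ".csv"),
    // Parse each file's binary content into a table
    Parsed = Table.AddColumn(CsvFiles, "Data",
        each Table.PromoteHeaders(Csv.Document([Content]))),
    // Append all per-file tables into one table
    Combined = Table.Combine(Parsed[Data]),
    // Remove duplicate rows so re-uploaded data isn't double counted
    Deduped = Table.Distinct(Combined)
in
    Deduped
```

New files dropped into the folder are picked up automatically on the next refresh, since `Folder.Files` re-enumerates the folder each run, and `Table.Distinct` keeps overlapping rows from being loaded twice.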
@myusrn · 3 months ago
I needed to compute some metrics (e.g. counts of records with foo, bar, and foobar present) on the 1st and middle of every month. I found that I could use a legacy Power BI dataflow to schedule execution of a SQL statement that produced a single record with each of those count-based metrics in a column, which it stuffed into an Azure Data Lake Storage Gen2 container. Then I was able to use the Power BI data source connector for Azure Data Lake Storage Gen2 to pull in the set of all those individual one-record files and create a trend-line chart for those metrics as the months ticked by. Now with the Data Factory Dataflows Gen2 solution, dataflows no longer rely on an Azure Data Lake Storage Gen2 container, so this maneuver for getting at all the scheduled metric computations as a dataset isn't going to work. Any insights as to how I should accomplish this metrics computation, and display the set over time, in the current Power BI experience?
@DavidLiLove888 · 4 months ago
Hi Austin, how can we get a copy of your PBIX file for practice? Thanks! Do you have a GitHub download area for that file?
@austinlibal · 4 months ago
Hello! This is the PBIX from the Dashboard in a Day class that you build throughout that one-day training. You can really use ANY PBIX, though, to accomplish this!
@Scorpian22k · 9 months ago
👍 Getting data through Import mode vs. using Dataflow Gen2 — which one is better in terms of report refresh time? Or is Direct Lake much faster? Any reference?
@austinlibal · 9 months ago
The purposes of the two are different from one another, to be honest. Dataflow Gen2 is more of an ETL tool, whereas using Power BI Desktop, or building a report in the service with Direct Lake, is going to be your best option when leveraging Fabric. Nothing is ever going to be as fast as Import, but Direct Lake can match the speed of Import while still giving you DirectQuery-like functionality.
@Scorpian22k · 9 months ago
@austinlibal, sounds good!
@tranmanh26 · 5 months ago
Once you migrate from Power BI Desktop to Fabric Dataflows Gen2, you no longer need a gateway, right?
@austinlibal · 4 months ago
That’s not necessarily true. If you’re using a source that is on-premises, you will still need a gateway.