How do you even use VertiPaq Analyzer with Power BI???

  34,909 views

Guy in a Cube

A day ago

Comments: 64
@JackOfTrades12 5 years ago
Thanks for the follow-up, I was really confused when I was reviewing my models. To the people asking about performance: if you have to use a filter within your measures over a large dataset, I would make a calculated column that returns 1 if the logic is met, then have a measure that just sums the calc column. If you are only doing arithmetic, the calc column doesn't make much sense, unless you are going to present the value in a raw data table in your report. With a raw data table, the measure will execute over each row, and you may exceed resources or run into performance issues. As Marco points out, it's all a balance of storage, performance, and customer requirements. Sometimes an extra column isn't an option if the table is too large.
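The flag-column pattern described here can be sketched in DAX; the table and column names (Sales, Quantity, Net Price) and the threshold are hypothetical, not from the video:

```dax
-- Hypothetical calculated column: evaluate the filter logic once, at refresh time
Is Large Order =
IF ( Sales[Quantity] * Sales[Net Price] > 100, 1, 0 )

-- Measure: a plain SUM over the flag, so no FILTER runs at query time
Large Orders = SUM ( Sales[Is Large Order] )
```

The trade-off is the one discussed throughout this thread: the flag column costs storage (cheap in this case, since it holds only two distinct values and compresses well) in exchange for faster queries.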
@princecena 5 years ago
Thanks Marco and Patrick for the insights...really nice seeing both of you together
@guillermochez5531 3 years ago
An effective engine for packing data; fabulous, keeping only what's necessary. Thanks Marco and Patrick for the power of VertiPaq and the session with this data sample.
@SamuelRoshan 5 years ago
Mind blown! Thank you Patrick and Marco!
@praveenpatil282 3 years ago
Awesome video. Marco, I'm your fan after seeing this. You do a much better job explaining than Alberto. A productive 12 minutes of my life.
@yigitkadioglu9210 4 years ago
Simple and effective, thank you Patrick & Marco
@mathew9665 5 years ago
This is one of the most important items when building a report. Thank you!
@MAbdullah47 3 years ago
Thank you Marco for this useful introduction
@hasnain280 2 months ago
Thank you Marco & Patrick, very helpful 👍
@kasunmanchanayake8464 5 years ago
Wow, Marco and Patrick together!
@DIGITAL_COOKING 5 years ago
First, nice t-shirt. Second, your videos with Marco are great ones, Patrick 👍👍👍
@Prodoshdutta 5 years ago
Great Optimization
@EricaDyson 5 years ago
Great session. Nicely explained (for us lesser mortals) .. more please!
@robsonnascimento5935 5 years ago
Awesome, thank you for this video!!!
@GregKramerTenaciousData 5 years ago
nicely done fellas....for what it's worth, 'yo Marco' is much more approachable than 'professor Marco' :-)
@luti4689 4 years ago
Awesome video! Managed to get a recent report down from 200 to 40. Seeing the impact of a single column is really useful: do I really need it, can I use a measure instead of a column? It really makes a difference :) Thanks!
@JorgeSegarra 5 years ago
Fantastic job, guys, thank you!
@nikolaidedenkov8414 4 years ago
Thank you guys, great as usual!
@tomaskavicky 3 years ago
great help guys! thanks
@GHRichardson 5 years ago
I have what I think could be a common problem: my millions of rows of data have a unique identifier in them. I need to keep the identifier despite its terrible cardinality :-(. However, it's worse than that. I need to add a string prefix to the unique ID field because it ends up being a lengthy URL, which I display as a link icon in a table in the UI. This makes the problem much worse, because the size of the column and its unique values prevent any significant compression and the file becomes huge. Is there something I can do to improve this in DAX instead, maybe in a similar way to the SUMX function described in the video? My concern is that a measure that 'calculates' the eventual URL string would run for every row in my display table whenever a filter changed, and that would take several seconds...
@WordBaron 3 years ago
I second this comment. This video frustrated me a little, because how can we remove IDs when we want our users to be able to identify and return to records at the Order or OrderLine level of granularity? Is this only for summary-level models that don't require drill-down? And I agree that it's even worse in most data models because of the hyperlinks that are created for these IDs to be used on the query string: it's not only one column for each ID, but two, because of the URL. These ID and URL-to-ID-record columns take up the vast majority of space in our model.

After thinking about this some more, I was able to make the hyperlink into a measure instead of a calculated column www.screencast.com/t/9o4sT9jV1 which saves a little space, but not nearly as much as I was expecting based on what DAX Studio showed. Also, you can't just get rid of an ID that is used to link tables together, can you? Granted, the SUMX function was a true size-saver in that you didn't sacrifice any functionality and could still save some space, but I left feeling that the ID removal was completely impractical. Maybe I'm just missing something... I actually hope that I am.
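For reference, the hyperlink-as-a-measure idea can look roughly like this; the table, column, and base URL below are made up for illustration:

```dax
-- Hypothetical measure: build the URL only for the row in the current
-- filter context, so the long string is never stored in the model
Order Link =
IF (
    HASONEVALUE ( Sales[Order Number] ),
    "https://example.com/orders/" & SELECTEDVALUE ( Sales[Order Number] )
)
```

Setting the measure's data category to Web URL lets a table visual render it as a link. The underlying ID column still has to stay if a relationship depends on it; only the derived URL column can be dropped.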
@PedroCabraldaCamara 4 years ago
awesome video!!!!! Thank you so much guys!!!
@GuyInACube 4 years ago
Thanks for watching! 👊
@kurtdevotto 5 years ago
Hi there, how did you get the Advanced mode in DAX Studio?
@arklur3193 5 years ago
You need to enable it in the Options, as Marco said --> imgur.com/GnFxG1c
@TestPBI 1 year ago
Hello, thank you for the video. In the last part of the video, using SUMX in a measure instead of a calculated column reduces the model size, granted. But by doing this, won't you put additional strain on the formula engine, and won't user interactions that use the SUMX measure be significantly slower?
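For context, the swap discussed in the video replaces a stored column with a row-level computation at query time; in sketch form (Sales, Quantity, and Net Price are assumed names for the sample model):

```dax
-- Before: calculated column, stored and poorly compressed due to high cardinality
-- Line Amount = Sales[Quantity] * Sales[Net Price]

-- After: measure, nothing stored, computed per query
Sales Amount =
SUMX ( Sales, Sales[Quantity] * Sales[Net Price] )
```

A simple multiplication like this is typically resolved by the storage engine during the scan, so the formula-engine cost stays small; the concern is more justified when the row-level expression involves filters or context transitions.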
@alpesh417rt 3 years ago
Great value in this video - implemented!
@ianstuart2505 5 years ago
Nice tips, thank you. I think there is a balance to be struck as well though... I generally advise creating calculated columns as far back in the process as possible - ideally on the database prior to connecting Power BI. This is to centralise the "single version of the truth" and minimise maintenance. This is an approach I would stick with unless performance proved to be an issue. Thoughts?
@marcorusso7472 5 years ago
Correct - but the memory consumption described in the video is not caused by the calculated column, it is caused by the cardinality of the column that produces low compression.
@arklur3193 5 years ago
What I'm thinking of is: could you mention a few common use cases where you would sacrifice memory by using a calculated column rather than an iteration function? Let's say summing up 5, 10, 20, 50 columns, or when the calculated column's logic is just "complex" (I know, I know, "What is complex?" :D), using conditions, multiplications, etc. How do you decide not to use a measure, but rather store the value in a column so you can just sum over it, making the measure much faster and less resource-intensive? Thanks in advance.
@WmRod 5 years ago
I had a similar question. There has to be a trade-off somewhere between file size and execution speed.
@marcorusso7472 5 years ago
It depends. For example: with 10 billion rows, a column with Line Amount could be 30% faster than multiplying the two columns. However, the storage for the additional column could be 20-30 GB. In a report, the difference can be 1-2 seconds, something like 3 seconds instead of 4.5 seconds. Memory is not free, CPU is not free. It's the customer's choice, but the real issue is that you cannot keep adding memory, so you have to make choices. However, if you have 10 million rows, the difference is smaller in percentage, it could actually be faster, and in any case the difference (if any) would be in milliseconds. So I would always save memory in those cases. For Power BI, it's a no-brainer: save memory - you cannot have 10 billion rows.
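Marco's 20-30 GB figure is consistent with back-of-the-envelope arithmetic, assuming the high-cardinality column ends up around 2-3 bytes per row after compression (an assumption for illustration, not a number from the video):

```
10,000,000,000 rows × 2-3 bytes/row ≈ 20-30 GB
```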
@arklur3193 5 years ago
@@marcorusso7472 Thanks for the detailed answer, I'll keep it in mind!
@marcmaurice8942 4 years ago
Hi! Is there an easy way to see which columns in the data model are not used? Thanks!
@rapatel1642 1 year ago
Great video.
@rukiatukijancollier6061 5 years ago
Very helpful! Thank you
@nicolassavignat6091 5 years ago
Thanks all
@nandukrishnavs 5 years ago
Informative 👍
@noahhadro8213 1 year ago
Is there a way to use PowerShell to export the vpax file from the service automatically?
@malikakunseitova7088 3 years ago
Amazing
@WmRod 5 years ago
What happens to rendering speed when executing the new measure? Maybe it doesn't matter so much in this example, but in general I would think you would embed computation-heavy formulas in a calculated column, especially if there are dependencies.
@marcorusso7472 5 years ago
It depends - see similar comments above
@spilledgraphics 4 years ago
Amazing!!!!!
@Lixia123 4 years ago
love it!
@vog5197 5 years ago
Thanks for the video, always love the performance optimization videos! I was just wondering though: is it always better to replace calculated columns with measures? Isn't a calculated column calculated once (on load of the initial report), whereas measures are recalculated constantly as you navigate the visuals pane? I'd imagine there would be some cases where I'd rather wait a bit longer initially to have snappier performance navigating the dashboard once loaded.
@dbszepesi 5 years ago
Calculated columns are not compressed as efficiently as the regular data, due to when they are created in the build process. Therefore, from a compression standpoint, they are not preferred.
@arklur3193 5 years ago
@@dbszepesi This is very specific to DAX calculated columns; if you create the column in PQ, you are "fine". Your model will still be bigger, obviously, but the engine can compress it, making it smaller than with a DAX calculated column.
@marcorusso7472 5 years ago
@@arklur3193 Not really, the biggest factor in compression is the number of unique values in the column, which is bigger for Line Amount than for Quantity / Net Price in this example. The difference between computed columns and calculated columns is minimal, and relevant only for columns that have a small number of unique values (up to 10/100, certainly not 1,000 or more).
@MAbdullah47 3 years ago
But if the Order Number is the foreign key, how did you delete it?
@alexbehrmann9557 5 years ago
Do the same rules apply with direct query models?
@marcorusso7472 5 years ago
No, DirectQuery doesn't import data and you cannot use aggregations for measures that have row-level calculations like in this demo.
@DarkOri3nt 5 years ago
So basically Marco recommends not using calculated columns unless completely necessary, because of their memory consumption. What's to stop you from computing this in the source and importing it ready to use? I assume you can use this against tabular models as well? 12 million rows is not big in Power BI.
@arklur3193 5 years ago
If you create your calculated column in PQ rather than DAX, your model will still be bigger, but not as big as when using DAX. And yes, 12M rows is not that much, but this is just a video explaining this stuff; you only need to worry about these things if your model is much bigger.
@marcorusso7472 5 years ago
@@arklur3193 See comment above - the difference would be minimal in this case, computed columns in PQ are equally expensive to calculated columns if there is a medium-high cardinality.
@pratikfutane8131 5 years ago
Genius!!
@ashutoshready 5 years ago
Hey, how do you enable the VertiPaq Analyzer Preview?
@DreeKun 5 years ago
Quoting a comment from below: "I had the same problem. But then downloaded and installed 2.9.5 from ci.appveyor.com/project/DarrenGosbell/daxstudio/builds/28512060/artifacts and it worked as expected."
@ashutoshready 5 years ago
@@DreeKun Great, worked for me as well. thanks.
@stephenbrincat7124 5 years ago
Following Marco's instructions but cannot view the result, i.e. the VertiPaq Analyzer Preview is not showing...
@marcorusso7472 5 years ago
Please read this thread and check possible solutions or post more details: github.com/DaxStudio/DaxStudio/issues/235
@ianpollard4138 5 years ago
I had the same problem. But then downloaded and installed 2.9.5 from ci.appveyor.com/project/DarrenGosbell/daxstudio/builds/28512060/artifacts and it worked as expected.
@pabeader1941 5 years ago
Is it just me or are they blurry?