Great video! Could you please not use the background music? It's quite distracting while trying to grasp what you're saying. Thanks for the vids again.
@BI.Insight 9 days ago
Thanks Shazeb for the comment. I will definitely work on this to improve the future videos.
@mirrrvelll5164 15 days ago
This is a great and valuable video; I haven't seen anyone here explain it in such detail =) I have a question: I am using it exactly the way you showed in the video. I am a developer and report creator, and we have "golden datasets" that our colleagues use. But I sometimes struggle with RLS groups and rights. Do I need to grant Build permission at the Manage Permissions level so that someone can use that golden dataset in Analyze in Excel and as an App? I define RLS in the Desktop file, then add RLS groups in the Service for that specific report and for the golden dataset. It works correctly, but am I configuring it redundantly on both datasets?
@BI.Insight 9 days ago
Thank you for your comment and questions, Mirr.

About your question regarding Analyze in Excel: please note that RLS does NOT apply to connections made via this method. In other words, when users connect to a semantic model using Excel, they are effectively bypassing the RLS rules defined in Power BI, which means they have access to the entire semantic model.

Re. the question about granting Build permissions: if you want users to be able to create reports or dashboards on top of the "golden datasets", you do indeed need to grant them Build permission. This permission allows them to build new content based on the semantic model published to the Power BI service. Note, however, that it does not affect RLS; it simply allows users to create content based on the data accessible to them.

In terms of your setup, if you have defined RLS in the Desktop file and correctly assigned the desired security groups or accounts to the RLS roles in the Service, it sounds like you are on the right track. Again, you don't need any additional RLS configuration for the semantic model when it is used with Analyze in Excel, since RLS won't be enforced there. Good luck!
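To make the Build-permission part concrete, here is a minimal sketch of granting Build (the "ReadExplore" access right) on a semantic model through the Power BI REST API's "Post Dataset User" endpoint. The dataset id, UPN, and token handling are placeholders, not values from this thread, so treat it as an illustration rather than a ready-made script.

```python
# Sketch: granting Build permission on a shared semantic model via the
# Power BI REST API ("Datasets - Post Dataset User" endpoint).
# Access-token acquisition is assumed to happen elsewhere (e.g. via MSAL).

def build_grant_build_request(dataset_id: str, user_upn: str):
    """Return the URL and JSON body that grant Build (ReadExplore) access."""
    url = f"https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/users"
    body = {
        "identifier": user_upn,       # UPN of the user (or object id of a group)
        "principalType": "User",
        "datasetUserAccessRight": "ReadExplore",  # Read + Build
    }
    return url, body

# Example with hypothetical ids; POST `body` to `url` with an
# "Authorization: Bearer <token>" header to apply the permission.
url, body = build_grant_build_request(
    "11111111-2222-3333-4444-555555555555", "someone@contoso.com"
)
```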
@jamilhamed8377 15 days ago
Great presentation! Please write a blog or record a video on performing historical range partition refreshes using PowerShell. P.S. Love the book!
@BI.Insight 9 days ago
I’m glad you like the book. And I appreciate the suggestion, I'll see what I can do!
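Pending a full write-up, a hedged sketch of what such a historical range refresh could look like: build a TMSL "refresh" command covering one partition per historical year, then execute the resulting JSON from PowerShell with Invoke-ASCmd against the XMLA endpoint. The database, table, and partition naming scheme below are entirely hypothetical.

```python
import json

def tmsl_refresh_partitions(database, table, years):
    """Build a TMSL 'refresh' command that fully reprocesses one
    partition per historical year (partition names are hypothetical:
    assumed to follow a '<table>_<year>' convention)."""
    return json.dumps({
        "refresh": {
            "type": "full",
            "objects": [
                {"database": database, "table": table,
                 "partition": f"{table}_{year}"}
                for year in years
            ],
        }
    }, indent=2)

# The resulting JSON can be executed from PowerShell, e.g.:
#   Invoke-ASCmd -Server "powerbi://api.powerbi.com/v1.0/myorg/WS" -Query $tmsl
print(tmsl_refresh_partitions("SalesModel", "FactSales", range(2019, 2022)))
```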
@vanikomakula8640 23 days ago
This is a really great session. I have a couple of questions: 1. Can we implement auto-scale up and down based on consumption, so it scales up when a threshold is met and scales down to the reserved or default F SKU when consumption drops? 2. Can we implement the same logic across multiple Fabric capacities using a logical naming convention, for example applying a rule to a set of capacities whose names share a prefix?
@BI.Insight 9 days ago
Hi Vani, thanks for your question, and sorry for the late reply. I'm glad you found the video valuable. There are a couple of points in your question that require a lengthier response.

1-1: Auto-scaling based on consumption: While Microsoft Fabric has built-in features like bursting and smoothing, these do not provide full auto-scaling up and down based on real-time consumption. Bursting allows your capacity to exceed its licensed limit temporarily to handle spikes in demand, but it does NOT automatically scale down. Smoothing helps distribute long-running workloads across a 24-hour period to avoid sudden spikes, but it does not dynamically adjust capacity levels. Read more here: blog.fabric.microsoft.com/en-NZ/blog/fabric-capacities-everything-you-need-to-know-about-whats-new-and-whats-coming/?WT.mc_id=DP-MVP-5003466#BurstSmooth

If you want a true auto-scaling solution, it is possible to automate it with Azure Logic Apps or other automation tools such as runbooks. You would need to monitor Fabric capacity consumption (for example via Log Analytics) and trigger scaling actions when predefined thresholds are met. However, this approach introduces complexity and potential reliability challenges, as scaling decisions may not always be instant or optimal due to API limitations and processing delays. A more reliable and controlled approach is to schedule upscaling and downscaling at fixed times when peak and off-peak usage are predictable, ensuring a balance between performance and cost efficiency.

1-2: Reservation vs. pay-as-you-go licensing: It is important to consider that Fabric capacities can be purchased under two different licensing models:
- Reservation: requires a one-year commitment but offers a significant discount (around 40% off compared to PAYG).
- Pay-as-you-go (PAYG): charged hourly, providing flexibility at a higher cost.

So, switching between Reservation and PAYG dynamically through automation is not possible and doesn't make sense in practice. Since Reservation locks you in for a year, it is not an option for short-term scaling needs. Auto-scaling makes sense on PAYG to adjust the Fabric SKU (for example, moving between F2 and F4). In the last part of my latest video, I explained some scenarios and use cases where these kinds of automation make sense. You can find it here: kzbin.info/www/bejne/rInGgHZ5qtyBppo

2. Scaling logic based on multiple Fabric capacities with naming conventions: Could you clarify your specific use case? If the goal is to apply scaling rules to a set of capacities based on their name prefixes, this can be implemented with Azure Logic Apps: fetch the list of Fabric capacities from the API, filter them by naming convention, and apply the scaling rules accordingly. However, since Fabric capacities are independent, each one would still need to be managed individually within your automation logic.
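As a rough illustration of the threshold-based scaling and prefix filtering described above, here is a minimal Python sketch. The SKU ladder, thresholds, and capacity names are all hypothetical; in practice the decision would run inside a Logic App or runbook against live consumption metrics.

```python
def pick_sku(current_sku, cu_utilisation_pct,
             skus=("F2", "F4", "F8", "F16", "F32", "F64"),
             up_at=80, down_at=30):
    """Decide the next Fabric SKU from average CU utilisation (%).
    The thresholds and SKU ladder are illustrative, not official guidance."""
    i = skus.index(current_sku)
    if cu_utilisation_pct >= up_at and i < len(skus) - 1:
        return skus[i + 1]   # scale up one step
    if cu_utilisation_pct <= down_at and i > 0:
        return skus[i - 1]   # scale down one step
    return current_sku       # stay put inside the band

def capacities_matching(capacities, prefix):
    """Filter a capacity list (shape as returned by the ARM API) by name prefix."""
    return [c for c in capacities if c["name"].startswith(prefix)]

# Hypothetical capacity names, just to show the filtering step:
caps = [{"name": "prod-sales"}, {"name": "prod-finance"}, {"name": "dev-lab"}]
assert [c["name"] for c in capacities_matching(caps, "prod-")] == ["prod-sales", "prod-finance"]
assert pick_sku("F4", 85) == "F8"   # above threshold: step up
assert pick_sku("F4", 20) == "F2"   # below threshold: step down
assert pick_sku("F4", 50) == "F4"   # inside the band: no change
```

Each capacity returned by the filter would then get its own `pick_sku` decision, matching the point above that capacities are managed individually.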
@blakemorphis1262 3 months ago
This is incredibly helpful, thank you for posting this!
@BI.Insight 3 months ago
Glad it was helpful!
@pavlahrabcova391 4 months ago
Hi, thanks for your videos! Please, what is the API version, and where did you get this number?
@BI.Insight 4 months ago
Thanks for bringing this to my attention 👍🏼. Just updated the description.
@SleeplessSwan7 4 months ago
Exactly what I needed. Amazing explanation, short and sweet. Thank you so much!
@BI.Insight 4 months ago
Glad it was helpful!
@jonahviakeyboard 4 months ago
To save you 10 minutes: set the parameter data type to text.
@BI.Insight 4 months ago
Maybe you need to watch the video again 🙂. It's on me if the message didn't come across clearly: you must avoid using type 'any', not set the parameter to type 'text'. We must always set the data type correctly.
@qasimali-gu3oz 5 months ago
Hi Soheil, I have earned "Rock Me". Can I get the book, please?
@BI.Insight 5 months ago
Hi Qasim, thanks for the comment. I have already replied to all the "Rock Me" messages asking winners to follow me and message me on LinkedIn for the instructions. Can you please do so? Thanks!
@qasimali-gu3oz 5 months ago
@@BI.Insight I did send you a message on LinkedIn, but it's still pending, maybe because I don't have a premium account.
@BI.Insight 4 months ago
Sorry to hear that. Just follow me on LinkedIn, make a post, and tag me in it. I will message you as soon as I notice your tag; that way may work better.
@MatíasEzequielRivara 5 months ago
I have a problem with the Short Resource Id. I enter the "/" character, but it returns a BadRequest error. Looking at the log, I see a wrong path = /Microsoft.Fabric/capacities%2Fabric/resume" where "/" has been changed to "%2F".
@BI.Insight 5 months ago
Hi Matias, thanks for your comment. On which part of the solution do you get the error, "Read a resource" or "Invoke resource operation"?
@handschriftonderwijs 4 months ago
I noticed something weird: every time I fill in the Short Resource Id as in the blog and YT video (capacities/fabricname), I consistently get a 404 Not Found. When I look at the logs and check the Inputs, I see that it replaces the "/" with "%2F". Why is this? I think this is why the action can't find the resource and returns a 404.
@BI.Insight 4 months ago
@@handschriftonderwijs Thanks for the comment. I cannot replicate the issue you face. Can you please clarify which operation gives you the error, "Read a resource" or "Invoke resource operation"?
@handschriftonderwijs 4 months ago
@@BI.Insight In this video at 8:03: I followed it exactly as-is (with my own fabric name, of course), and it failed when I checked the run history as at 9:03; the "Invoke resource operation" was marked with a red cross and returned 404 Not Found.
@BI.Insight 4 months ago
@@handschriftonderwijs Please ensure your capacity is running. Encoding the "/" as "%2F" is normal and will not cause any issues. I suspect your "Recurrence" interval is short and runs the workflow before you saved it, and that while it runs, your capacity is not in the "Active" state; that is just a possibility. I suggest you use the "Read a resource" operation before the "Invoke resource operation" to get the status of the capacity first. This is explained in detail in the second episode of this series: kzbin.info/www/bejne/ZmrJh4V8Zb-kprMsi=jRbpU43nK8gwnaUz
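For anyone else puzzled by the "%2F": it is just standard percent-encoding of the "/" character, which Logic Apps applies when it places the Short Resource Id into the request path. A quick Python check (the capacity name is hypothetical):

```python
from urllib.parse import quote

# What the Logic Apps connector does internally with "capacities/<name>":
short_resource_id = "capacities/fabricname"   # hypothetical capacity name
encoded = quote(short_resource_id, safe="")   # encode everything, including "/"

assert quote("/", safe="") == "%2F"
assert encoded == "capacities%2Ffabricname"
print(encoded)
```

So the encoding itself is harmless; a 404 or BadRequest points to the resource path or capacity state, not to the "%2F".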
@vijaydoradla3229 5 months ago
Which role do we need to assign to the Logic App to manage Fabric capacity pause or resume through the Logic App?
@BI.Insight 5 months ago
Thanks for your comment. The user used in the Azure Resource Manager (ARM) operations must have at least the Fabric Administrator role. Read more here: learn.microsoft.com/en-us/fabric/enterprise/pause-resume?WT.mc_id=DP-MVP-5003466
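For reference, the pause and resume calls behind such a Logic App are plain ARM POST actions on the Microsoft.Fabric/capacities resource. This sketch only builds the URL; the subscription id, resource group, capacity name, and api-version below are placeholders and assumptions, so verify them against the current provider documentation before use.

```python
def arm_capacity_action_url(subscription_id, resource_group, capacity_name,
                            action, api_version="2023-11-01"):
    """Build the ARM URL for pausing ('suspend') or resuming ('resume') a
    Fabric capacity. The api-version here is an assumption; check the
    current Microsoft.Fabric resource provider reference."""
    assert action in ("suspend", "resume")
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.Fabric/capacities/"
        f"{capacity_name}/{action}?api-version={api_version}"
    )

# Hypothetical names; POST this URL with a bearer token for an identity
# that has the required rights on the capacity.
url = arm_capacity_action_url("<sub-id>", "rg-fabric", "myfabriccap", "suspend")
print(url)
```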
@DoThePoint 6 months ago
Very well done series. Very captivating and straight to the point.
@carloscantu75 8 months ago
Quite useful and interesting, thanks for sharing!
@qasimali-gu3oz 8 months ago
"Rock Me!"
@BI.Insight 7 months ago
Congratulations. You won a free electronic copy of my book. Please message me on LinkedIn to receive your copy. Cheers
@whitejames01 8 months ago
Rock Me
@nishantkumar9570 8 months ago
Rock Me!🎉
@gpltaylor 8 months ago
Hi, at marker &t=510 you are able to select "External" as a data source and then select your connection. In my setup, the External/Sample-data options are missing. Is this a setting I need to enable somewhere?
@BI.Insight 8 months ago
Things have changed since I recorded the video. You can now use the "Connection" dropdown to select the connections available to you, click "More" within the dropdown to create a new connection, or use "Sample data", which appears on the left pane. You can see it in the following image: biinsight.com/wp-content/uploads/2024/05/Snag_112a39.png The related blog on my website has updated content: www.biinsight.com/microsoft-fabric-connections-demystified/
@AnandDwivedi 8 months ago
Very useful, I was looking for something like this.
@workstuff5253 11 months ago
Another loss for MS Pro licensing.
@BI.Insight 10 months ago
Well! Not necessarily. As explained in the video, we have options. The only thing Pro users miss is Fabric integration with Azure DevOps.
@Nalaka-Wanniarachchi 11 months ago
Excellent Presentation !!!
@tomaszk9388 1 year ago
Great course, thanks for sharing it!
@TainuiaKid1973 1 year ago
Great demo, Soheil. Just bought a copy of your Expert Data Modeling book.
@soheilpalermo 2 years ago
Soheil, you are the best, mate. Thanks heaps for your informative book and educational videos.