Do we need to perform all of these steps for every database we restore, or is it a one-time process?
@DataBar 1 day ago
Granting your SQL Server instance access to your key vault should be a one-time process.
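For reference, that one-time setup usually boils down to registering the Key Vault EKM provider and mapping a credential to the restoring login. A rough T-SQL sketch, where the provider name, vault name, and login are placeholders you would swap for your own (verify the exact steps against the SQL Server Connector for Microsoft Azure Key Vault documentation):

```sql
-- Enable EKM and register the Azure Key Vault provider
-- (requires the SQL Server Connector for Microsoft Azure Key Vault to be installed).
EXEC sp_configure 'EKM provider enabled', 1;
RECONFIGURE;
GO

CREATE CRYPTOGRAPHIC PROVIDER AzureKeyVault_EKM
FROM FILE = 'C:\Program Files\SQL Server Connector for Microsoft Azure Key Vault\Microsoft.AzureKeyVaultService.EKM.dll';
GO

-- Credential holding the identity that was granted access to the vault,
-- mapped to the login that will perform the restore.
CREATE CREDENTIAL AzureKeyVault_Cred
WITH IDENTITY = 'contoso-vault',        -- your key vault name (placeholder)
     SECRET = '<clientId><clientSecret>' -- deliberately elided; supply your own
FOR CRYPTOGRAPHIC PROVIDER AzureKeyVault_EKM;
GO

ALTER LOGIN [RestoreLogin] ADD CREDENTIAL AzureKeyVault_Cred;
```

Once this is in place, subsequent restores can reuse the same provider and credential.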
@2225761 8 days ago
Hi, quick question: at 3:38 you added users. How did SSMS authorize them? Were they already given access to the database, and from Active Directory? Thanks in advance.
@DataBar 7 days ago
Yes, these users are both from my Entra tenant and they were already provided the necessary permissions to access the database.
@naturevibezz 8 days ago
Please provide the documentation reference page.
@DataBar 8 days ago
You can start out with this: learn.microsoft.com/en-us/azure/azure-sql/managed-instance/restore-database-to-sql-server?view=azuresql&tabs=managed-identity
@naturevibezz 7 days ago
@@DataBar For me the asymmetric key is giving an error: "Key with name 'tdekey' does not exist in the provider or access is denied. Provider error code: 2058. (Provider Error - No explanation is available, consult EKM Provider for details)"
@naturevibezz 7 days ago
@@DataBar I am not able to create the asymmetric key; it keeps failing with the same error.
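Error 2058 here typically means either that the key name doesn't exactly match the key's name in the key vault (it is case-sensitive), or that the identity behind the EKM credential lacks the get/wrapKey/unwrapKey key permissions on the vault. Assuming a provider registered under the placeholder name `AzureKeyVault_EKM`, the creation statement itself looks like:

```sql
-- PROVIDER_KEY_NAME must exactly match the key's name in the key vault.
CREATE ASYMMETRIC KEY tdekey
FROM PROVIDER AzureKeyVault_EKM
WITH PROVIDER_KEY_NAME = 'tdekey',
     CREATION_DISPOSITION = OPEN_EXISTING;  -- the key already exists in the vault
```

If this still fails, double-check the vault access policy for the app registration or managed identity used in the credential.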
@benshi1975 15 days ago
Hi, awesome vid! I need to connect my Fabric data with SAP, which is behind a VPN. What can I use? Thanks.
@DataBar 7 days ago
Have you looked into using an on-prem data gateway with the SAP connectors?
@afzaalawan 15 days ago
Excellent!
@DataBar 8 days ago
Glad you enjoyed it!
@patsickboy743 25 days ago
How do you do RLS in a Lakehouse?
@DataBar 25 days ago
Have a look at this and it should help you. RLS is only supported on a Lakehouse through the SQL analytics endpoint. learn.microsoft.com/en-us/fabric/data-warehouse/tutorial-row-level-security
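As a rough illustration of the pattern the tutorial walks through (the schema, table, and column names here are invented for the example):

```sql
-- A predicate function that only returns rows belonging to the connected user,
-- plus a security policy that applies it as a filter on the table.
CREATE SCHEMA Security;
GO

CREATE FUNCTION Security.fn_securitypredicate(@SalesRep AS nvarchar(128))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS fn_securitypredicate_result
    WHERE @SalesRep = USER_NAME();
GO

CREATE SECURITY POLICY SalesFilter
ADD FILTER PREDICATE Security.fn_securitypredicate(SalesRep)
ON dbo.Sales
WITH (STATE = ON);
```

Note this only takes effect for T-SQL access through the SQL analytics endpoint, not for Spark access to the underlying files.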
@benshi1975 1 month ago
Awesome video, man!! I've been looking for this solution the whole day, since Manage Roles in my Power BI Desktop report is blocked, I guess because I connected my report to the Warehouse semantic model. This is the solution for applying RLS for my different viewers, isn't it? My main sales table doesn't have a column with the email; if I upload a table and relate it to the main sales table, is that enough? Thanks, bro!!
@DataBar 1 month ago
Thanks. If your semantic model is based on a star schema, you can filter your fact tables and slice based on your dimensions. So yes, relationships are key in the scenario you described.
@user-ky9rv2kl6j 1 month ago
It would be helpful if you added a video on how to load data from on-prem Oracle to Fabric using a notebook. Thanks!
@DataBar 1 month ago
We recently launched a new Oracle connector. You can use this connector together with an on-premises gateway to access on-prem Oracle data. support.fabric.microsoft.com/en-ca/blog/announcing-new-data-factory-connectors-released-in-may-2024?ft=All
@apollobell105 1 month ago
Great info and very informative 💪🏽. I'm currently a production SQL DBA on the road to becoming a Data Engineer, so I enjoy your content. You gained a new subscriber 😊💪🏽
@DataBar 1 month ago
I appreciate it!!! 💪🏽
@aakanshaverma1310 2 months ago
Can we access SAP data in Microsoft Fabric through on-prem data gateways? I don't see any information online confirming that SAP is a source accessible through MS Fabric. If you know, please reply. Thanks!
@aakanshaverma1310 2 months ago
At 5:31 in the video, can you tell us what is under "more"?
@DataBar 2 months ago
Yes, SAP data can be accessed through an on-prem gateway. Check out this link for more details. blog.fabric.microsoft.com/en-us/blog/integrate-your-sap-data-into-microsoft-fabric/
@apramit1 3 months ago
Does the CMK work with an auto-failover group? If yes, how will the key vault be accessible to the secondary database? Please help me understand. Any documentation would be much appreciated.
@DataBar 3 months ago
Just like we provided the primary instance with access to the key vault, we need to provide any secondary instances with that same access to the key vault.
@sarthakkharehcl142 3 months ago
The best and easiest explanation of elastic pools.
@DataBar 3 months ago
Glad it helped!
@escapetothesky 3 months ago
Very useful and well-paced video.
@DataBar 3 months ago
Thank you! Please like and share!
@preyebuowari9129 4 months ago
Excellent video!
@DataBar 4 months ago
Thank you!!
@datasqlai 4 months ago
What's the latency of moving databases out of an elastic pool, or of increasing their service tier? How do you deal with the noisy-neighbour problem, and how do you mitigate the risk of moving the noisy neighbour out of the pool without disruption or latency?
@DataBar 4 months ago
Increasing a service tier can vary depending on whether the current underlying VM has the resources to allocate to your instance, or whether it needs to be moved to a different VM. However, the downtime is minimal: some of the work involved with potentially moving to a new VM happens in the background while your current instance continues to run, and then we do, in essence, a DNS change at the end. Moving databases into a new elastic pool is a relatively seamless experience. Within elastic pools, you can limit the number of vCores that a database can consume from the pool, helping to put guardrails around noisy neighbours.
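For the database-move part, the T-SQL is a single online operation (the database and pool names are placeholders):

```sql
-- Move an existing database into an elastic pool;
-- the database stays online while the move happens.
ALTER DATABASE [SalesDb]
MODIFY (SERVICE_OBJECTIVE = ELASTIC_POOL(name = [MyPool]));
```

The per-database min/max vCore guardrails mentioned above are configured on the pool itself, via the portal, PowerShell, or the CLI.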
@datasqlai 4 months ago
@@DataBar My main concern is moving the noisy neighbour out of the elastic pool to a dedicated Azure SQL instance.
@DataBar 4 months ago
Understood. That's why it's important to set up good monitoring and alerting procedures, so you can identify things like that and take appropriate action!
@loucaspapaspyrou5120 4 months ago
So you either need to be using a Warehouse in Fabric or a SQL endpoint. Therefore, if I go into the Warehouse as my user I will only see my user's information, but if I go into a Lakehouse then this filtering will not be applied?
@DataBar 4 months ago
From the Lakehouse you are either using the SQL endpoint to access data with T-SQL, in which case the row-level security will apply, or you are accessing the data via Spark and hitting the underlying Delta Parquet files, and RLS will not apply in that scenario.
@culpritdesign 4 months ago
Is nvarchar not a supported data type?
@DataBar 4 months ago
That was the case at the time of this recording. nvarchar should be supported now as long as it's not nvarchar(max), since LOB data over 1 MB is unsupported.
@lerbous4633 4 months ago
Can I just install one on-prem data gateway on a local PC to connect to multiple on-prem databases, like Oracle, SQL Server, etc.?
@DataBar 4 months ago
One gateway can be used to connect to multiple on-prem databases.
@user-ji2si8ic4p 5 months ago
Great video, Tamarick, at a pace I can follow and with sufficient clarity throughout the process. Thanks!
@DataBar 5 months ago
Thank you. Glad it was helpful!
@franciscoanalytics 5 months ago
Hey! Great video! Is it possible to use dynamic data masking for contributor users?
@DataBar 5 months ago
Any user with the Admin, Contributor, or Member role on the workspace will be able to see unmasked data. If you need data to be masked for a user, either assign the Viewer workspace role, or grant more granular SQL permissions using the GRANT statement.
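A small sketch of what that looks like in T-SQL (the table and principal names are invented for the example):

```sql
-- Mask the email column for everyone by default...
CREATE TABLE dbo.Customers
(
    Id    int IDENTITY PRIMARY KEY,
    Email varchar(100) MASKED WITH (FUNCTION = 'email()')
);

-- ...then selectively control who sees real values.
GRANT UNMASK TO [TrustedAnalyst];    -- sees real email addresses
REVOKE UNMASK FROM [ReportViewer];   -- sees masked values like aXX@XXXX.com
```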
@awadelrahman 5 months ago
Can you do it the other way around: accessing a Databricks service (for example, a search index) from an app on Azure? And what is the difference between creating a service principal on Azure and one on Databricks?
@clvc699 5 months ago
Could I access an on-premises SQL Server with a notebook in Fabric?
@DataBar 5 months ago
I am not aware of a way to do this via notebooks. The ideal solution would be to use Fabric Mirroring to mirror your SQL Server database into OneLake. Currently the mirroring feature is in private preview and only supports Azure SQL DB, Cosmos DB, and Snowflake, but SQL Server support will be coming soon.
@NeutronStar9 6 months ago
Please add more tutorials.
@DataBar 5 months ago
Will do!
@user-sm7js3pt3i 6 months ago
What if more than one user needs to access the on-prem data? Does every user need to install their own data gateway on the server?
@DataBar 6 months ago
No, each user does not need to install their own gateway. The admin of the gateway, usually the person who installs it, can grant additional users permission to use the gateway. See the doc below for more info: learn.microsoft.com/en-us/data-integration/gateway/manage-security-roles
@kuto1 6 months ago
Really a great video; it helped a lot.
@DataBar 6 months ago
Thanks! Glad it helped!!!
@escapetothesky 6 months ago
Easy to follow along. Thank you. Subscribed.
@DataBar 6 months ago
Thank you!
@apurvgaykhe715 6 months ago
Very well explained!
@DataBar 6 months ago
Glad it was helpful!
@vijaykundanagurthi8585 7 months ago
Hey Tamarick, do you have a reference for how to configure the credentials in the Databricks cluster configuration?
@jammuarun 7 months ago
This video helped me a lot. I struggled for four days to mount the Data Lake on Databricks, and this video explained exactly what I needed. Thanks a lot for creating it.
@DataBar 7 months ago
Thank you! Glad it was able to help!!
@rasmusandreasson1548 7 months ago
Would love to see more Fabric content. Great job!
@ramonpacheco42 8 months ago
OK, but can I use an on-prem gateway in Data Factory, not just in Dataflow Gen2? Or in KQL?
@DataBar 8 months ago
On-prem gateways are used to access on-prem data and are currently only supported within Dataflow Gen2.
@shashipaul6279 8 months ago
That's amazing! I wanted someone to explain this in a better way, and luckily I found this video. I hope you continue it and explore everything related to data science that can empower data analysts. 🎉
@DataBar 8 months ago
Glad this was able to help you!!!
@vishwassee 9 months ago
I am getting this error after I publish: GatewayCannotAccessSqlError, "Couldn't refresh the entity because of an internal error". Any idea how to fix it?
@DataBar 8 months ago
I would need to review it to really understand what's going on. I recommend you open a support case for assistance.
@efficiencygeek 9 months ago
Well done. You show a lot of good detail. Thanks much!
@DataBar 9 months ago
Thank you! I appreciate you watching!!
@theministerbo7293 9 months ago
Very good video. I'm learning. I still need hands-on practice, but that's coming.
@BudgetBill 9 months ago
I agree, I need hands-on practice as well.
@DataBar 7 months ago
Best of luck!
@larryarrington9395 9 months ago
East Coast Sr here. I plan to complete all seven videos in this series and will have questions, so I plan to reach out. I started with your video on Fabric and am looking to take a deep dive into the Azure ecosystem. Thanks for what you do!
@DataBar 9 months ago
Thank you!!!
@larryarrington9395 9 months ago
Just found your site today. Where have I been? 70 years old and retired, and now out of retirement to learn new technologies, starting with Fabric. I will check out your vast collection of Microsoft-related videos. Thanks for sharing and for being so passionate about what you love. It shows! I will have questions and plan to reach out. 😅 East Coast Sr.
@DataBar 9 months ago
Thank you for watching, sir! I appreciate it!!!
@theministerbo7293 9 months ago
I will keep watching and learning, and as I have questions I will reach out. Keep doing what you do! I greatly appreciate it! 😊 ❤
@DataBar 9 months ago
Thank you! I appreciate you!
@pawelkostadinow 9 months ago
This video was very helpful to me as well! Thank you!
@DataBar 9 months ago
Thanks for watching!!
@ayocs2 9 months ago
Is a clustered columnstore index appropriate for real-time replication? I saw blogs on why you shouldn't use indexes on columns that get updated frequently. Could you explain briefly, please?
@DataBar 4 months ago
Indexes help with data retrieval, but they slow down data writes. In a real-time replication scenario you have lots of write activity taking place. Each time you write new data, any indexes you have will need to be updated, thus impacting performance.
@Southpaw07 10 months ago
Since the solution is centered around Business Critical and zone redundancy, could you use a load balancer as opposed to Traffic Manager to reduce cost?
@DataBar 10 months ago
Azure SQL only allows one replica to serve both read and write workloads, so load balancing isn't really a concept here. You can redirect read-only workloads to a secondary replica. Does this answer your question?
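For anyone wanting to try that redirection: it is driven by the connection string rather than a load balancer. A quick way to confirm where a session landed (server and database names are placeholders):

```sql
-- Connect with ApplicationIntent=ReadOnly in the connection string, e.g.:
--   Server=tcp:myserver.database.windows.net,1433;Database=mydb;ApplicationIntent=ReadOnly;
-- Then check which kind of replica served the session:
SELECT DATABASEPROPERTYEX(DB_NAME(), 'Updateability') AS Updateability;
-- READ_ONLY on the secondary replica, READ_WRITE on the primary.
```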
@BudgetBill 10 months ago
Great explanation to help make decisions.
@DataBar 10 months ago
Glad it helped!
@BudgetBill 10 months ago
Very helpful information.
@DataBar 10 months ago
Glad it helped!
@windyearle 10 months ago
The thing is, the replica is still in the same region, right? So you're probably better off using GP and creating your own auto-failover group with a second MI in a second region? That way you've got multi-region HA, which is better than what Business Critical offers.
@DataBar 10 months ago
It just depends on your use case. If your main goal is offloading read-only workloads, then using the included replica of the BC tier meets that purpose. The replica is there for high-availability purposes and to ensure Microsoft can maintain its SLAs. But if you are looking for a disaster-recovery solution then, as you stated, auto-failover groups would be the best option for that use case.
@grimmersnee 10 months ago
Great explanation!
@DataBar 10 months ago
Glad it was helpful!
@CoopmanGreg 11 months ago
Fantastic video. Well explained and very detailed. Thanks!
@DataBar 11 months ago
Glad it was helpful!
@nickgovorun5384 11 months ago
Great! But how did you do that? At 1:50 you have Windows 10 inside Windows 11, which is inside macOS. Do you have three VMs?
@DataBar 11 months ago
macOS is my host OS. I then have an RDP connection to a Windows 11 VM. At the 1:50 mark I am running the SSMS app connected to a SQL Server 2022 instance hosted on Windows Server 2022.
@andreyjimenez4204 1 year ago
Great video! When are they going to add support for more external sources, like Google Cloud Storage?
@DataBar 1 year ago
We don't have a date to announce publicly yet, but other providers are in the works.
@nickgovorun5384 1 year ago
Thanks a lot. You really helped me!
@DataBar 1 year ago
Thanks for watching, and glad I could help you!!
@joseronnieranile1712 1 year ago
Thanks for the video. I successfully linked my Azure SQL database to Synapse, but it's not replicating the table successfully; the status always says 'WaitingForSnapshot'. Can you please help?
@DataBar 1 year ago
It's kind of hard to troubleshoot without actually looking into your environment. Have a look at this and perhaps it can help you. learn.microsoft.com/en-us/azure/synapse-analytics/synapse-link/troubleshoot/