Can we use this for a 200GB database? How much time would it take? Would we face any timeout issues? Please share your input.
@ajoy1203 10 days ago
Hi, how can we pass a specific environment to a notebook dynamically in Fabric?
@Learn2Share786 10 days ago
Thanks, please share the notebook code.
@Learn2Share786 10 days ago
Thanks, is it possible to share the notebook?
@mdhruv1 10 days ago
It would be really cool if you could show the steps to use a bacpac backup file to restore in SQL Server. I think you have to use the Azure CLI for that to work. Store the bacpac file in the lakehouse and show us how to retrieve it.
@Tales-from-the-Field 9 days ago
DBABULLDOG from the channel here. So you would like to see how to back up & restore a bacpac to SQL Server on a VM using the lakehouse? Looking for some clarification to create the content.
@mdhruv1 9 days ago
Need to be able to restore a bacpac file stored in OneLake to a Fabric SQL database.
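In the meantime, for anyone exploring this: a bacpac is imported with SqlPackage rather than a native RESTORE. A minimal sketch, assuming SqlPackage is installed, the bacpac has already been copied out of OneLake to the local machine, and hypothetical server/database names:

```python
import subprocess

# Minimal sketch: import a bacpac with SqlPackage. Assumes the file was already
# copied out of OneLake (e.g., azcopy or OneLake file explorer); the endpoint
# and database names below are hypothetical placeholders.
subprocess.run(
    [
        "sqlpackage",
        "/Action:Import",
        "/SourceFile:./mydb.bacpac",
        "/TargetConnectionString:Server=<fabric-sql-endpoint>;Database=mydb;"
        "Authentication=Active Directory Interactive;",
    ],
    check=True,  # raise if SqlPackage exits with an error
)
```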
@CarlintVeld 13 days ago
Would it be possible to have zero downtime during the migration?
@CarlintVeld 13 days ago
What would be your suggestion for migrating from one Azure SQL Database to another?
@rajamahesharavapalli 15 days ago
Nice video... You should have also included connecting to the kernel, executing cells, etc. Great help though, thanks for sharing :)
@kvh4932 15 days ago
I keep getting a "Forbidden" (403) error on the write operation. I am using a working, active secret with the correct value. Do I need to set specific API permissions to write to the Fabric lakehouse?
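For anyone else hitting the same 403: a valid secret alone usually isn't enough; the service principal generally also needs to be a member of the workspace, and the tenant setting that lets service principals call Fabric APIs must be enabled. A minimal sketch of a OneLake write with the Azure SDK, using hypothetical IDs and workspace/lakehouse names:

```python
from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Hypothetical IDs/names. The service principal must have access to the Fabric
# workspace (e.g., Contributor); otherwise writes come back 403 even with a
# perfectly valid secret.
credential = ClientSecretCredential(
    tenant_id="<tenant-id>", client_id="<client-id>", client_secret="<secret>"
)
service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com", credential=credential
)
file_system = service.get_file_system_client("MyWorkspace")  # workspace name
file_client = file_system.get_file_client("MyLakehouse.Lakehouse/Files/hello.txt")
file_client.upload_data(b"hello from a service principal", overwrite=True)
```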
@oliverfried5818 16 days ago
Hey Bradley, if you have a min, could you help me out? I'm getting the error: Error: ErrorCode=UserErrorInvalidPluginType,'Type=Microsoft.DataTransfer.Common.Shared.PluginNotRegisteredException,Message=Invalid type 'FabricSqlDatabase' is provided in 'linkedServices'. Please correct the type in payload and retry.,Source=Microsoft.DataTransfer.ClientLibrary,' It seems like my connection to my Fabric SQL Database has the wrong type; however, I can't edit it. @SQLBalls, do you know what the right value should be?
@cre700 19 days ago
Thank you! Right to the point. Appreciate your video.
@robcarrol 22 days ago
Great video, thank you. I deployed #FabSQL in a client environment yesterday. They were originally going to use a Warehouse, but now that they have the option of SQL DB inside Fabric, they are going with that instead 😊
@ifeadeniyi4143 23 days ago
Can we look at how to subset and anonymize/sanitize data during the import process from on-prem SQL Server to FabSQL? This helps in certain compliance cases where real PII needs to be protected (or not replicated as-is).
@retr0isagod 23 days ago
It would be nice if the Fabric pipeline had CDC built in for on-prem SQL -> "FabSQL". ADF does this (in preview?) but it won't work with an IR. This would be very useful instead of having to roll your own CDC handling methodology across the gateway.
@kbaig6651 23 days ago
CDC and schema or feature conflicts between an older on-prem version and a new Azure Fabric database would be a great addition for data migrations.
@jeromedupourque6067 23 days ago
Great video, thank you!
@RameshReddy-l2k 24 days ago
It is very useful. Is it possible to pass dynamic parameters from a Copy Data activity to a Notebook activity in the Data Pipeline?
@robcarrol 24 days ago
Great session, thanks! Love the #FabSQL name, going to try and drop it into a client call tomorrow 😂
@Tales-from-the-Field 24 days ago
Boom!
@dom2114 25 days ago
Is it possible to upload PDF documentation for AI Skills to be aware of?
@alwayslearnsomethingforgood 1 month ago
Thank you, it worked like a charm!!!
@Tales-from-the-Field 24 days ago
@dbabulldog here. Glad it worked as expected.
@alwayslearnsomethingforgood 1 month ago
Thank you for sharing and the demo! Database Watcher is cool and seems easier to implement than SQL Insights.
@Tales-from-the-Field 24 days ago
@dbabulldog here. Once you get familiar with the deployment, I absolutely agree. Plus, the low-latency collection utilizing RTI gives us near-real-time data.
@bladbimer 1 month ago
Thank you for your video. By any chance, do you know how to do it with a SQL Server instance created inside a VM? I tried to create a Managed Identity attached to the VM with the Storage Blob Data Contributor role for the Azure blob container, but this didn't work. Thanks.
@javokhirilkhamboyev4092 1 month ago
How do we use this parameter when we want to use a DELETE statement instead of INSERT?
@jamesphillips1756 1 month ago
Amen Josh on no pineapple on pizza
@robcarrol 1 month ago
I've seen issues with dropping databases on an MI if there are other backups running in the background. The MI has to take a tail-log backup (in case you want to restore the dropped database) and this has to wait until the other backup has completed, which can take a while for large databases. The database drop appears to hang and you can't restore another backup with the same name until it has completed. The workaround I use is to rename the database before you drop it, then restore the point-in-time backup with the original name. You don't need to wait for the drop to complete, but you do need to make sure you have enough storage space for 2 copies of the database (until the original drop completes).
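For readers who want to try Rob's workaround, a minimal sketch of the rename step (hypothetical connection string and database names; pyodbc assumed), with the point-in-time restore then issued from the portal or the Azure CLI:

```python
import pyodbc

# Minimal sketch of the rename-before-drop workaround on a Managed Instance.
# Connection string and database names are hypothetical placeholders.
conn = pyodbc.connect("<managed-instance-connection-string>", autocommit=True)
conn.execute("ALTER DATABASE [MyDb] MODIFY NAME = [MyDb_pre_restore];")
# Then restore the point-in-time copy under the original name, e.g.:
#   az sql midb restore -g <rg> --mi <instance> -n MyDb_pre_restore \
#       --dest-name MyDb --time "2025-01-01T03:00:00"
# Drop the renamed original once the restore completes and space allows.
```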
@Tales-from-the-Field 24 days ago
@dbabulldog here. Great call out, Rob, and another solid way to recover our database within our Managed Instance. Having these plans in our recovery plan makes things easier at 3 AM.
@robcarrol 24 days ago
@Tales-from-the-Field Hi Daniel! Yes, the less thinking required at 3am the better 😁
@DevopsWithNauman 1 month ago
Azure Database Watcher connected to SQL Managed Instance does not show any managed instance in the dashboard. I am following your steps but have not had any luck yet. Can you help me, please?
@Tales-from-the-Field 24 days ago
@dbabulldog here. I am sorry you are having issues with setup. Three things that I would look at first: 1) When you look at the data store, either in ADX or in the Eventhouse, do you see a folder of Managed Instance tables? 2) Have you validated that security has been applied to the Managed Instance and to the data store? 3) When you look at dm_exec_requests or dm_exec_sessions, do you see connectivity to the MI from Database Watcher?
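For check 3, something along these lines may help (a sketch; the program_name filter is an assumption, so drop it and inspect the unfiltered output if nothing comes back):

```python
import pyodbc

# Minimal sketch: look for Database Watcher sessions on the Managed Instance.
# The connection string is a placeholder and the LIKE filter is an assumption.
conn = pyodbc.connect("<managed-instance-connection-string>")
rows = conn.execute(
    """
    SELECT session_id, login_name, program_name, status
    FROM sys.dm_exec_sessions
    WHERE program_name LIKE '%watcher%';
    """
).fetchall()
for row in rows:
    print(row)
```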
@paulshackleton3594 1 month ago
Hi, how can I update the parameter using Spark SQL in the notebook with (for example) a SELECT statement, e.g. the number of rows added to a table?
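One common pattern, sketched here for a Fabric notebook with a hypothetical table name: compute the value with Spark SQL and return it to the calling pipeline as the notebook exit value.

```python
# Minimal sketch inside a Fabric notebook; `spark` and `mssparkutils` are
# pre-defined in the notebook session, and the table name is hypothetical.
row_count = spark.sql("SELECT COUNT(*) AS c FROM my_table").collect()[0]["c"]
mssparkutils.notebook.exit(str(row_count))
```

The pipeline can then read the value from the activity output, typically with an expression like @activity('Notebook1').output.result.exitValue (activity name hypothetical).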
@reynaldomorillo4271 1 month ago
THIS IS REALLY BIG!!! I HAD BEEN WONDERING ABOUT THAT. Great and simple video instructions! Very POWERFUL !!!!
@sushantyadav7639 1 month ago
Thank you for the wonderful series of videos. I have a few questions. 1. How do we automate the creation of containers in the storage account when migrating a large number of databases (100+)? Is there possibly a script that can create the containers all at once? 2. How do we create credentials on an on-premises SQL Server instance for database backups to the newly created containers, for all databases at once? 3. How do we modify the Ola Hallengren script to back up SQL databases to their respective containers?
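For question 1 (and the per-database credentials in question 2), a minimal sketch, assuming hypothetical connection strings and one container per database:

```python
import pyodbc
from azure.storage.blob import BlobServiceClient

# Minimal sketch: create one blob container per user database. Connection
# strings are hypothetical placeholders; container names must be lowercase.
blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")
sql = pyodbc.connect("<on-prem-sql-connection-string>", autocommit=True)
databases = sql.execute(
    "SELECT name FROM sys.databases WHERE database_id > 4"  # skip system DBs
).fetchall()
for (name,) in databases:
    blob_service.create_container(name.lower())
    # Question 2 could be folded into the same loop by executing a SAS-based
    # CREATE CREDENTIAL statement per container against the instance.
```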
@amadoubari7041 1 month ago
Nice. Do I have to set up a private endpoint in Azure? I am using a public endpoint.
@Tales-from-the-Field 24 days ago
@dbabulldog here. In this case I was using VNet peering, since this was all in Azure. At this time, only a VNet-local endpoint is supported to establish a link with SQL Managed Instance: learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-feature-overview?view=azuresql#limitations Hope this helps.
@zhasktherealman 1 month ago
Thanks for the article and video :) It solves the problem of including a 10-minute sleep in the pipeline. And "According to the documentation, automatic synchronization between the Lakehouse and SQL endpoint in Microsoft Fabric can take a few seconds." Wow!!! For me, it takes at least 3-5 minutes every time.
@SharonPay-b8q 1 month ago
Exactly what I was trying to do. Thank you.
@saharshjain3203 1 month ago
Hey, at 3:24, while creating the connection for the trigger, the loading window never seems to end. What should I do? (I left it for 6 hours yesterday.)
@gamingguru1553 27 days ago
Did you find a solution for this? I'm facing the same issue.
@Boo-rp8kn 1 month ago
Hi guys, my name is Dary, I'm from the Dominican Republic, I'm 19 years old, and I study data science. In my country, Spanish-language content doesn't cover these subjects. Good job, don't give up, new follower here. I hope you continue to speak clearly. Many YouTube content creators don't realize that they can improve their views and followers just by speaking clearly. My English isn't great, but I hope my message reaches you.
@GuillaumeBerthier 2 months ago
Excellent! Thank you for surfacing this blog post in this video, with this clever way to use AI Skills in a Notebook.
@Tales-from-the-Field 2 months ago
Glad to be of service. It's good to get those AI Skills out in the wild, and I hope you find them helpful in your work @GuillaumeBerthier!!
@martijn8530 2 months ago
My scenario: an on-prem gateway is needed to extract the data. The first step is a complex notebook that populates variables. Then, in a copy job, I use these variables to make sure only the needed data is extracted. Do you have a better option to achieve this?
@martijn8530 2 months ago
It's not possible to parse the two variables using only one 'Set variable'... I need about 5 variables, and it can become a bit messy.
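One way to avoid a chain of Set variable activities, sketched for a Fabric notebook with hypothetical field names: return everything as a single JSON exit value and parse it with one expression per use.

```python
import json

# Minimal sketch: bundle all notebook outputs into one JSON exit value so the
# pipeline needs one expression per field instead of many Set variable hops.
mssparkutils.notebook.exit(json.dumps({
    "start_date": "2025-01-01",
    "end_date": "2025-01-31",
    "table_list": ["dbo.orders", "dbo.customers"],
}))
```

In the pipeline, each field can then be read directly, e.g. @json(activity('Notebook1').output.result.exitValue).start_date (activity name hypothetical).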
@martijn8530 2 months ago
Awesome video... Thanks a lot!
@Tales-from-the-Field 2 months ago
Thank you for watching @martijn8530!!
@DataBard255 2 months ago
Really good content! I have a question. I encountered the scenario (at least 6+ months ago) where I tried using Fabric features in a region that did not support Fabric. I then tried creating the capacity in a region that supports Fabric, but because the capacity wasn't in the same region as the Power BI tenant (which you show in this video), the capacity wouldn't work. Microsoft support told me that the Fabric capacity has to be in the same region as the Power BI Tenant. So here's my question: Is it still a requirement that Fabric capacity must be provisioned in the same region as the Power BI tenant? If so, and you are in a region that doesn't support something like Mirroring, does that mean you'd have to move both the capacity and the tenant to a different region?
@Tales-from-the-Field 2 months ago
Hi @DataBard255, per Bradley: "Jared!! I'm not sure if this has changed over the last 6 months, but if someone in support told you that today, then I would say they are wrong. You can have a Fabric Capacity in a region outside of the Power BI / Fabric tenant. However, a couple of things to keep in mind: if it is cross-region, then there could be some additional fees for storage transfer that may not be there if they are all in the same region, but that overall cost would be minimal. Hope you are well my friend!"
@DataBard255 2 months ago
@Tales-from-the-Field , thanks for the clarification! That makes the region situation much easier to manage than I expected. Also, thanks for the mirroring tips. I've stumbled on these steps a few times. Great to have a concise place to review these considerations!
@zhiyingwang1234 2 months ago
I'm confused. In the Power BI gateway, there is already an Azure Data Lake Storage Gen2 connection created. This kind of gateway connection was not mentioned in other similar tutorials online. Is it really necessary? How do you create this connection? It was not mentioned earlier; it just suddenly appeared in the video.
@adefwebserver 2 months ago
Another great job on this video.
@orcun.iyigun 2 months ago
Why would you go with Azure Data Factory for such a task? What is the benefit of using ADF here rather than Azure Automation? Pricing-wise, ADF can be more expensive and overkill.
@Tales-from-the-Field 24 days ago
@dbabulldog here. Good call out. ADF can be expensive, yes, but for most folks I have seen deploy this, ADF cost has not been a concern. Azure Automation could absolutely be a choice here. We could also utilize Elastic Jobs to perform the automation of our statistics and index maintenance (which is my first choice these days, with its current improvements). The team I was working with preferred ADF over Automation since they were already using ADF. It's all about choice and what works best within your environment.
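For anyone curious what the Elastic Jobs route looks like, a minimal sketch driven through pyodbc against the job database, with hypothetical names and assuming Ola Hallengren's IndexOptimize is installed in each target database and the target group already exists:

```python
import pyodbc

# Minimal sketch: schedule index/statistics maintenance with Elastic Jobs.
# Connection string, job name, and target group are hypothetical; assumes
# dbo.IndexOptimize (Ola Hallengren) exists in every target database.
conn = pyodbc.connect("<elastic-jobs-database-connection-string>", autocommit=True)
conn.execute("EXEC jobs.sp_add_job @job_name = N'NightlyMaintenance';")
conn.execute(
    """
    EXEC jobs.sp_add_jobstep
        @job_name = N'NightlyMaintenance',
        @step_name = N'IndexAndStats',
        @command = N'EXEC dbo.IndexOptimize
                         @Databases = ''USER_DATABASES'',
                         @UpdateStatistics = ''ALL'';',
        @target_group_name = N'AllUserDatabases';
    """
)
```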
@CindyofTheShire 2 months ago
This looks like a great option! Question: I only see my VM name for my SQL Server in the managed identity select list. Would that work, or does it have to be listed as a SQL managed instance?
@Tales-from-the-Field 2 months ago
Hi @CindyofTheShire, per Bradley: "Hi Cindy! If I was in the Azure Portal, the way I would find this is to go to my SQL MI instance; under Security I would then go to Identity, and the Object ID should be there for the Entra ID. As long as that is there, you've got a system managed identity. If the system managed identity is not turned on, you can turn it on. The user name for the managed identity should be the name of the service; for example, in the video my Azure SQL Managed Instance was named sqlmiinsiders. I hope this helps, please let me know if you have more questions."
@geirforsmo8749 2 months ago
Hi! Great video. I am actually trying to do something very similar to this. I need the communication line to not be connected to the public internet, and the data flow is only allowed from on-prem SQL to Fabric as a push mechanism using ADF. In your solution, do I have to do anything to make the line completely secure? The data is highly sensitive, which is why I need to do it this way. By the way, are you using Synapse or ADF in your example?
@CathyCui-r5b 2 months ago
Can we use managed identity instead of SAS?
@Tales-from-the-Field 24 days ago
@dbabulldog here. I wish we could, but with the current versions of the SQL Server box product we must use SAS tokens or storage account keys. From our docs: "Backup to Azure Storage account only supports authentication with Shared Access Signature (SAS) tokens or storage account keys. All other authentication methods, including authentication with Microsoft Entra ID (formerly Azure Active Directory), are not supported." I am looking forward to seeing whether this changes with the release of SQL Server 2025. If you are using Azure SQL DB or Managed Instance, you can use a managed identity to back up to URL, and I highly recommend it. If you are looking at the latter scenario, Brad does a video on it here on the channel.
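For the box-product scenario, a minimal sketch of the SAS-based pattern, with hypothetical storage account, container, database, and token values (note the credential name must match the container URL):

```python
import pyodbc

# Minimal sketch: SAS-based credential plus BACKUP TO URL for on-prem SQL Server.
# Storage account, container, database, and token are hypothetical placeholders.
conn = pyodbc.connect("<sql-server-connection-string>", autocommit=True)
conn.execute(
    """
    CREATE CREDENTIAL [https://mystorageacct.blob.core.windows.net/backups]
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
    SECRET = 'sv=...';  -- the SAS token, without the leading '?'
    """
)
conn.execute(
    "BACKUP DATABASE [MyDb] TO URL = "
    "'https://mystorageacct.blob.core.windows.net/backups/MyDb.bak';"
)
```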
@dipeshvora8621 2 months ago
Hey, what if I want to execute the Fabric pipeline only when a blob is created in a specific container and folder? I don't want to trigger it for all blob uploads.
@kudakwashemujiri6621 2 months ago
Thank you for the above, sir, very informative and helpful. I have a similar scenario with encrypted SPs; would you know anything about how to do this?
@JamesCollins90 2 months ago
*Customer wants a test run of upgrading an application and needs a PITR snapshot in SQL MI in case of a need to roll back.* Googles how to... finds this video. PHEW! Thank you for dropping this, very handy!
@KevinBillings 3 months ago
I am using it now... and it works. Reminder: set a batch limit on the downstream ForEach activity that is consuming the .json output.
@rafaelalexandrou-t2y 3 months ago
The SID comes back as an encrypted value. Has anyone had the same issue?
@SandeepRajput-g9i 3 months ago
Nice video. I want to know if this service is free or do we have to pay to use it? I mean, is it billed in my monthly billing?