Comments
@peterdaniels3428
@peterdaniels3428 1 day ago
Cool! Does this work with ADLS Gen2 configured as a shortcut in Fabric?
@adisbey-ef
@adisbey-ef 5 days ago
Very interesting! Can you please provide the T-SQL that you used in the video? I know it can be downloaded from Ola Hallengren, but you made some changes to it, and it would be much easier to get it from you. Thank you.
@Tales-from-the-Field
@Tales-from-the-Field 2 days ago
DBABulldog here. Thank you for reaching out. I've been a bit busy, so I do apologize for the delay. I will work on gathering this up and follow up with a GitHub location.
@shanthanpaladi231
@shanthanpaladi231 5 days ago
How can I parameterize the database name in a notebook from another notebook that uses Spark SQL?
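One common pattern for the question above, sketched here under the assumption that the database name arrives via a parameter cell or the arguments of a `notebookutils.notebook.run` call: since a database name is an identifier rather than a value, it cannot be bound as a query parameter, so it is validated and then interpolated into the Spark SQL text. The helper below is illustrative, not a Fabric API.

```python
import re

def build_query(db_name: str, table: str) -> str:
    """Validate identifiers, then build the Spark SQL text.

    Interpolating identifiers is only safe after validation,
    because identifiers (unlike values) cannot be bound parameters.
    """
    for ident in (db_name, table):
        if not re.fullmatch(r"[A-Za-z_][A-Za-z0-9_]*", ident):
            raise ValueError(f"invalid identifier: {ident!r}")
    return f"SELECT * FROM {db_name}.{table}"

# In the child notebook, db_name would come from a parameter cell, e.g.
# populated by notebookutils.notebook.run("child", 60, {"db_name": "sales"}).
query = build_query("sales", "orders")
# spark.sql(query)  # run inside the Fabric notebook
print(query)  # SELECT * FROM sales.orders
```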
@phillipdataengineer
@phillipdataengineer 8 days ago
awesome vid, man! Thanks for the detailed information
@Tales-from-the-Field
@Tales-from-the-Field 8 days ago
Thank you for watching @phillipdataengineer!
@zacharythatcher7328
@zacharythatcher7328 8 days ago
I also still don't have the AI skills option. I have a paid subscription but I don't currently have the F64 or higher. Can anyone confirm whether this is why?
@Tales-from-the-Field
@Tales-from-the-Field 8 days ago
Hi @zacharythatcher7328 per Bradley, “Hi Zachary, yes sir this requires an F64 and it must be enabled from the admin portal. I will make sure to call this out in a video soon.”
@brianszeto3418
@brianszeto3418 10 days ago
Great video! How were you able to get AI skills working in a Fabric trial capacity? I tried it myself and I'm able to set up an AI skill, but upon asking questions in the chat window I get an "FT1 SKU Not Supported" error.
@Tales-from-the-Field
@Tales-from-the-Field 8 days ago
Hi @brianszeto per Bradley, "Hi Brian, this only works on a paid F64. I will make sure to update that in a future video. I'm on a paid F64 but it is showing the subscription countdown; not really sure why."
@rcb3921
@rcb3921 10 days ago
Very helpful video! Thanks so much, this really clears up some of my questions.
@Tales-from-the-Field
@Tales-from-the-Field 7 days ago
Thank you for watching @rcb3921
@Chathumal2011
@Chathumal2011 11 days ago
You just saved my day. Copied 5GB data within 1 minute :)
@Tales-from-the-Field
@Tales-from-the-Field 9 days ago
Daniel here. I am so glad this was able to help ya.
@singamsgr
@singamsgr 11 days ago
Nice and informative demo. I would like to know the other way around, I mean reading files/tables from the lakehouse into other applications with the REST API.
@AndyLeonard
@AndyLeonard 11 days ago
Good morning, Gentlemen! :{>
@SQLBalls
@SQLBalls 11 days ago
Good morning @AndyLeonard!
@Tales-from-the-Field
@Tales-from-the-Field 11 days ago
Good morning Andy!
@rafaelcn27
@rafaelcn27 13 days ago
Thank you for solving my doubt 🙌🙌
@Tales-from-the-Field
@Tales-from-the-Field 7 days ago
Our pleasure! Thank you for tuning in @rafaelcn27
@martijnolivier4553
@martijnolivier4553 13 days ago
@SQLBalls Can this also be done without a virtual machine?
@joevenegasiii8716
@joevenegasiii8716 17 days ago
Have you encountered an error in creating a trigger? The Microsoft community doesn't seem to have a good answer. I successfully connect to my subscription and storage account and select the events, but when I get to the "Review and Connect" step, I consistently get the following error: "Create Azure Blob Storage system events Failed. Cannot read properties of undefined (reading 'type')"
@stevenwheeler541
@stevenwheeler541 19 days ago
Daniel, I have an MS Tech Bits topic for you on Azure SQL MI failover options: is it needed, what is included for the different tiers including Business Critical, and what options are available. Thanks, Steven
@Tales-from-the-Field
@Tales-from-the-Field 16 days ago
Daniel here from the channel. Great suggestion; I have added it to the short list. I need to work through how I want to, or whether I want to, break it up into multiple videos.
@syednayyar
@syednayyar 20 days ago
Great, amazing! Can you please help with getting data into the Lakehouse or Data Warehouse in Fabric from Android app code (native Java)? I am frustrated because I can find no help or guidance on how to ingest/upload a JSON file from a native Java app that creates one. It would be great if you could share some ideas on whether this is even possible (the official documentation only lists Python methods); there is no Java or Kotlin way shown to ingest/upload JSON to OneLake. Your help would be amazing, please.
@adefwebserver
@adefwebserver 24 days ago
Thank you for this.
@Tales-from-the-Field
@Tales-from-the-Field 19 days ago
Thank you for watching @adefwebserver!!
@icarusrising728
@icarusrising728 24 days ago
I wish this concept existed with the LakeHouse Shortcuts.... I have to keep dropping and re-adding shortcuts :(
@rishivarma4479
@rishivarma4479 25 days ago
While defining the parameters, what value did you provide for APIKey? I am not able to generate the data because of the missing field.
@user-iu8xl7gh4m
@user-iu8xl7gh4m 12 days ago
Same issue here; how did you resolve it?
@iamrohitdube
@iamrohitdube 27 days ago
Great Tutorial!
@Tales-from-the-Field
@Tales-from-the-Field 4 days ago
Thank you for watching @iamrohitdube!
@Rider-jn6zh
@Rider-jn6zh 1 month ago
How can we connect directly to SQL Server from Fabric and bring data into the Fabric workspace, not into a lakehouse?
@robcarrol
@robcarrol 1 month ago
Just tried this out for a client today, works a treat. Thanks!
@Tales-from-the-Field
@Tales-from-the-Field 1 month ago
Thanks for watching everyone! Remember to follow us on X @TalesftField and subscribe to our channel here on YouTube! See you next time!!
@robcarrol
@robcarrol 1 month ago
Great feature and very easy to use. I've used it in client projects to copy databases to a dev subscription for testing.
@Tales-from-the-Field
@Tales-from-the-Field 20 days ago
@dbabulldog here from the channel. Awesome to hear! I really enjoy the simplicity it provides to do just that.
@varundhawan45
@varundhawan45 1 month ago
Big thanks to @Tales-from-the-Field for the shoutout! Keep the awesome content coming ❤
@Tales-from-the-Field
@Tales-from-the-Field 4 days ago
Always!
@sushantjadhav9525
@sushantjadhav9525 1 month ago
Great demo. 👍🏻
@peterdaniels3428
@peterdaniels3428 1 month ago
Awesome video. Thank you! I'm still confused about the region aspects/limitations for Managed Private Endpoints. Are you saying that the azure resource region and the fabric region have to be the same?
@bumdinh9911
@bumdinh9911 1 month ago
Sir, can we pass notebook parameters to a Lookup activity in a Data Pipeline?
@Tales-from-the-Field
@Tales-from-the-Field 1 month ago
Hi @bumdinh9911 per Bradley, "Hello! That is a great question; sadly I could not find a way to do it at this time. The closest way I found was to trigger the Job Scheduler API for Microsoft Fabric: you can run a job from a pipeline, but there are limitations, and one listed was passing a parameter via an API. Now, just to play the other side: while it is not possible directly, you could write the notebook information to a DW or to an Azure SQL Database table, and then do a Lookup operation to get the data. So we can accomplish the task of getting data from a notebook back into a pipeline, but it is not as straightforward as I was hoping. If that changes, or I should say 'when that changes', I promise you I will make a video on that!"
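The workaround Bradley describes (the notebook writes its results to a database table, and the pipeline's Lookup activity reads them back) can be sketched like this. Stdlib sqlite3 stands in for the Azure SQL DB or warehouse table, and the table and column names are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for an Azure SQL DB / Fabric DW
conn.execute(
    "CREATE TABLE notebook_results (run_id TEXT, param_name TEXT, param_value TEXT)"
)

# Step 1: the notebook publishes its output values to the table.
def publish_result(run_id: str, name: str, value: str) -> None:
    conn.execute(
        "INSERT INTO notebook_results VALUES (?, ?, ?)", (run_id, name, value)
    )
    conn.commit()

# Step 2: the pipeline's Lookup activity reads them back by run id.
def lookup(run_id: str, name: str):
    row = conn.execute(
        "SELECT param_value FROM notebook_results "
        "WHERE run_id = ? AND param_name = ?",
        (run_id, name),
    ).fetchone()
    return row[0] if row else None

publish_result("run-42", "row_count", "1875")
print(lookup("run-42", "row_count"))  # 1875
```

The design point is simply that the table acts as the hand-off channel the pipeline and notebook otherwise lack.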
@dubbocom
@dubbocom 1 month ago
Thank you! Great help/advice. Cheers!
@Godzillastinkydiaper
@Godzillastinkydiaper 1 month ago
#beforetheguywhocomment
@garimasharma2558
@garimasharma2558 1 month ago
Hello, nice explanation. Can you please tell me, when we create a capacity from the Azure portal, under which type of Azure subscription can we create it? Is it PAYG, CSP, or CSP (NCE), or can we create it under all of these subscription types?
@Tales-from-the-Field
@Tales-from-the-Field 1 month ago
Hello Garima, thank you for reaching out. You can go to Fabric Capacity in your Azure portal to create one under the same subscription and tenant. You can create it under any of those subscription types.
@dankboii
@dankboii 2 months ago
Is it possible to trigger the pipeline when a file is added to the lakehouse?
@Tales-from-the-Field
@Tales-from-the-Field 1 month ago
Hi @dankboii per Bradley, "Not yet. Yet being the key word. But as of right now this is specifically for Azure Blob Storage only."
@User-k3z
@User-k3z 2 months ago
Hi, thanks for the video. I've created a VNet and subnet and connected them with a Virtual Network data gateway as per your video. Now I'm trying to connect to ADLS Gen2, but it is showing an error like this: "Invalid credentials. (Session ID: 7e3a7126-ca0c-4f1f-b762-c40a284fcc5a, Region: us)". Please let me know; that would be really helpful for me.
@robcarrol
@robcarrol 2 months ago
Love this, thanks!
@Tales-from-the-Field
@Tales-from-the-Field 2 months ago
Thank you for watching @robcarrol!
@gonzamole
@gonzamole 2 months ago
How we've complicated our lives
@DBABulldog
@DBABulldog 1 month ago
We do, don't we? I did a more recent video on Elastic Jobs that was a wee bit lighter when it comes to moving pieces.
@Nalaka-Wanniarachchi
@Nalaka-Wanniarachchi 2 months ago
Wow, this seems like magic.
@Tales-from-the-Field
@Tales-from-the-Field 2 months ago
Gotta love that Fabric Magic!
@SQLTalk
@SQLTalk 2 months ago
This video is SUPER helpful. Thank you very much for letting us know how to incorporate Fabric Eventhouse with Database Watcher. Super helpful.
@Tales-from-the-Field
@Tales-from-the-Field 2 months ago
Thank you, Kirby! We really appreciate you and all your great work!
@SQLTalk
@SQLTalk 2 months ago
Wonderful video. Thank you for making this.
@sheaerickson537
@sheaerickson537 2 months ago
Great video, thank you!
@DBABulldog
@DBABulldog 1 month ago
Glad you liked it, my friend. Keep coming back to the channel for more great content by the team.
@rhyslewis7922
@rhyslewis7922 2 months ago
Do you know if you can use Spark streaming from a mirrored Azure SQL data warehouse? If you try to enable change data feed via a notebook in the lakehouse the mirrored database is shortcutted from, it errors out saying it is unavailable from a shortcut. And yet SET TBL PERMISSIONS is not supported when trying to enable change data feed via the SQL endpoint in the data warehouse.
@TheSQLPro
@TheSQLPro 3 months ago
Great demo!!
@Tales-from-the-Field
@Tales-from-the-Field 1 month ago
Thank you @TheSQLPro!
@clvc699
@clvc699 3 months ago
How can you pass parameters to the WHERE clause in a SELECT statement?
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
Hi @claudiovasquezcampos9558 are you looking for this in a Spark SQL Statement in a Notebook or a T-SQL Statement as part of a data pipeline task against an Azure SQL DB, Azure SQL MI, SQL Server, or Fabric DW?
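Whichever engine the question is about, the usual answer is a bound parameter rather than splicing the value into the SQL text. A minimal runnable sketch using stdlib sqlite3 (table and data invented for illustration); recent Spark (3.4+) supports a similar named-parameter form via `spark.sql(query, args={...})`.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, "east"), (2, "west"), (3, "east")],
)

# Pass the value as a bound parameter; the driver handles quoting/escaping.
region = "east"
rows = conn.execute(
    "SELECT id FROM orders WHERE region = ?", (region,)
).fetchall()
print(rows)  # [(1,), (3,)]

# The rough equivalent in a Fabric Spark notebook (Spark 3.4+) would be:
# spark.sql("SELECT id FROM orders WHERE region = :region",
#           args={"region": "east"})
```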
@fazizov
@fazizov 3 months ago
Great video, thanks, Bradley. Based on my understanding, mirroring doesn't allow cross-tenant connections. I have a trial Fabric account but can't add any other Azure services there, like Azure SQL (the pay-as-you-go option is disabled for some reason). Any ideas on how to overcome that problem?
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
Hi @fazizov per Bradley, "Hello sir! If I'm correct in my understanding, you have a Fabric / Power BI tenant and the Azure account is not associated with it through the M365 tenant. The Azure account that you want to mirror must be associated with the M365 account (I'm not an M365 expert, but this is what I've been told). If that is the case, Mirroring should not be limited by the trial capacity; you should be able to mirror into it. The trial capacity is equivalent to an F64, so you should have the functionality of an F64 other than those items that have been called out as not included in the trial, like private endpoints and managed identity. If your Azure account is associated with the M365 account your tenant is in, please let me know; Mirroring should not be facing any other restriction."
@fazizov
@fazizov 3 months ago
Thank you very much for the detailed explanation, I will definitely try that advice!
@ismailbartolo9741
@ismailbartolo9741 3 months ago
Hi, can you make a tutorial on exploiting the capabilities of Delta tables? Specifically, how previously stored data can be accessed using versioning to analyze changes in our data over time. Should we store the data from our Delta table version 1 in another Delta table, or should we create a view for subsequent visualizations in Power BI? Thank you again for your videos!
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
Absolutely love this suggestion @ismailbartolo9741 , we will get Bradley to start working on this!
@ismailbartolo9741
@ismailbartolo9741 3 months ago
@Tales-from-the-Field 🔥🔥🔥 thanks 😊
@abeerahmed5634
@abeerahmed5634 3 months ago
I want to use the output of Notebook 1 in another notebook (it is using SMTP to send mail, and the mail should contain the output). How do I do it?
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
Hi @abeerahmed5634, could we get a little bit more information? Are you building text from Lakehouse fields, or is it plugging in numbers? You may not be able to get too specific, but we are trying to understand what parameters we need to define to send from Notebook 1 into the child notebook.
@peterjacobs3749
@peterjacobs3749 3 months ago
Hello Daniel - thank you for the Video! I am teaching and want to bring my beginner SQL Admins to ADS, so this is very helpful! May I ask you which video software you are using for making this presentation? I like how you highlight the commands/areas to focus on. Is it Camtasia?
@danieltaylor6623
@danieltaylor6623 3 months ago
You're very welcome. Thank you for coming to the channel. The team does utilize Camtasia for all our video editing. If you keep an eye out, they drop good deals around holidays.
@peterjacobs3749
@peterjacobs3749 3 months ago
@danieltaylor6623 Thank you very much, Daniel!
@777bashir
@777bashir 3 months ago
What if I need to use a pipeline in Fabric itself, not ADF? Is that possible?
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
Hi @777bashir Yes this is completely possible! Bradley released a video on this just 11 days ago, check it out here: kzbin.info/www/bejne/oX65pXuqlNZrpqs
@terryliu3635
@terryliu3635 3 months ago
Thanks for sharing!! The current Copilot is related to Dataflow Gen2, right? Are there any Copilot capabilities associated with pipeline orchestration?
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
Hi @terryliu3635 per Bradley, "Yes Terry, you are correct; this is just for the Dataflow Gen2 process in this video. Ok, scratch my previous comment. I realized that link goes to the Data Factory page. On the public roadmap it says 'In the future (Q2 CY2024), we'll also introduce Copilot for Data Factory in Data Pipelines.' Here's the link to that: learn.microsoft.com/en-us/fabric/release-plan/data-factory#copilot . The good news is we are currently in Q2, so hopefully we will have this soon!"
@zongyili569
@zongyili569 3 months ago
Hi, I am able to create the VNet gateway, and the connection status is online. However, the data gateway is not loaded when I try to create the connection; the Data Gateway dropdown list is empty. Would you advise what could be the issue?
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
Hi @zongyili569 per Bradley, "Just want to make sure I understand this correctly. We've registered the Microsoft.PowerPlatform resource provider, provisioned a virtual network, set up a subnet for the Microsoft Fabric tenant to use, and delegated the subnet to the service Microsoft.PowerPlatform/vnetaccesslinks; we went to our Fabric tenant, and when we selected Virtual network data gateways we had a license capacity, you could select the Azure subscription and the resource group the VNet was provisioned in, and it still didn't show up? My initial thought would be that it could be a tenant where the Azure subscription is not attached to the Fabric tenant, but if you can see the subscription and the resource group, then you may need to call support. I'd love to hear from you on this to see what the resolution is."
@zongyili569
@zongyili569 3 months ago
@Tales-from-the-Field Hi, it was my mistake. Fabric pipelines don't support the VNet data gateway right now; that's why I can't see those VNet data gateway connections.
@OneNI83
@OneNI83 3 months ago
With this method, can we bulk import tables (let's say we want tables that are filtered using a query, and those are the tables we need to import)? What would be the maximum that can be exported in one go?
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
Hi @OneNI83 per Bradley, "Look at this much like Azure Data Factory: use it for your initial forklift, based on your comfort with this and other technologies. If you are a T-SQL warehouse person, land it in files and do a CTAS to load, or load it directly to a warehouse and then use T-SQL to transform your data. If you are a Spark person, land it in files and use a notebook to transform and load your data. This is a powerful tool, but there are multiple ways to ingest data after it lands. So, a super long answer for a short question: yes, you can use this to bulk load. There's not really a limit, and you could scale the parallelism to increase throughput for VLDB workloads."
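The "scale the parallelism" idea above can be sketched as a minimal parallel forklift using a thread pool; the paths, file names, and worker count below are arbitrary stand-ins, and in Fabric the pipeline's Copy activity handles this for you.

```python
import shutil
import tempfile
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def bulk_copy(files, dest_dir, max_workers=8):
    """Copy many files concurrently; raising max_workers increases
    throughput up to whatever the source and destination can sustain."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(shutil.copy2, f, dest / Path(f).name)
                   for f in files]
        return [fut.result() for fut in futures]  # .result() surfaces copy errors

# Demo with throwaway temp files standing in for the landed source files.
src = Path(tempfile.mkdtemp())
dst = Path(tempfile.mkdtemp()) / "landed"
files = [src / f"part_{i}.csv" for i in range(3)]
for p in files:
    p.write_text("id,val\n")

copied = bulk_copy(files, dst, max_workers=3)
print(len(copied))  # 3
```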
@adilmajeed8439
@adilmajeed8439 3 months ago
Thanks for sharing. When you created the connection in the Fabric service and then clicked to see the properties again, after scrolling to the bottom you showed a staging part in the properties dialog box; I can't see that staging part in my connection. The Power BI Gateway has already been updated. Any pointers?
@Tales-from-the-Field
@Tales-from-the-Field 3 months ago
Hi @adilmajeed8439 per Bradley, "Hello sir, it's a little hard to troubleshoot this on YouTube. Could you hit me up on LinkedIn and we could DM with screenshots?" www.linkedin.com/in/sqlballs/