Comments
@user-lj9fk8dg9h 3 days ago
Hello sir, thank you so much for providing these productive videos. Today I faced a challenge whose solution I couldn't find elsewhere: how to extract data from SAP HANA Cloud to Microsoft Fabric (cloud-to-cloud connectivity). Could you please help me here?
@user-ph1km5vk9l 3 days ago
What is the name of the episode mentioned where shortcuts are explained? Thank you.
@keen8five 12 days ago
Status "running" just says that the vCore "did something", right? But there is no way to tell whether the cores were running at 1% or at 100% load, correct?
@jennyjiang6301 11 days ago
Yes, you are right. The resource utilization chart currently only indicates that the vCore is running and does not indicate CPU or memory utilization. What kind of information are you specifically looking for?
@RSCHAB 12 days ago
Hi, how do I add a table to the lakehouse? I don't have one. BR, R.
@ashanw 12 days ago
Great explanation and good content. Can you kindly share the yml file with me? Thanks
@naimuddinsiddiqui9249 13 days ago
Great explanation! If we make any changes to a dataset that lives on our local PC, how would the data changes be reflected in KQL? Do we have to establish a bridge like an integration runtime or a virtual/cloud gateway?
@olegkazanskyi9752 16 days ago
I get this error when I'm trying to clone a table: "Feature 'DISCOVERED TABLE' is not supported by table clone." Any hints on how to resolve it?
@vishwanathvt7701 16 days ago
I have created the Synapse workspace. What are the username and password? How do I set them?
@MrLee1334 17 days ago
Hey, while working with Parquet files I've noticed that, depending on SQL query complexity, running the exact same SQL query multiple times against the exact same Parquet file can produce different results. Has anyone noticed the same behavior before?
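One common cause of this (not necessarily what is happening here) is that result sets have no guaranteed order unless the query specifies one, so anything that uses TOP/LIMIT or relies on implicit ordering can legitimately return different rows across runs. A minimal PySpark sketch of the effect; the file path and column names are illustrative:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Illustrative path -- replace with your own Parquet file.
df = spark.read.parquet("Files/events.parquet")

# Without an ordering, which 10 rows come back is not guaranteed;
# it can vary with partition/row-group scan order between runs.
df.limit(10).show()

# An explicit, unique ordering pins the result down and makes it repeatable.
df.orderBy("event_id", "event_time").limit(10).show()
```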
@juanm555 18 days ago
Excellent video, Abhishek explains everything in a wonderful way. Eagerly expecting more videos with him!
@moeeljawad5361 20 days ago
Thanks for this video. I am currently using the notebook activity in Fabric pipelines. My notebook is mature now and runs very well. I was thinking of moving the notebook code into a Spark job definition, for the sake of saving execution time. Would replacing the notebook with a job definition make the code execution faster? Another question is about job definitions themselves: if you have defined some helper functions in the notebook, can I move them to a side file that is called from the main job definition? If yes, how? Thanks
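On the helper-function question, a Spark job definition takes a main definition file and can also carry additional reference files, so helpers can live in a separate .py module that the main script imports. A minimal sketch under that assumption; all file names, paths, and the table names are illustrative, not the method shown in the video:

```python
# helpers.py -- uploaded as a reference file of the Spark job definition
def clean_column_names(df):
    """Normalize column names to lower_snake_case."""
    for old in df.columns:
        df = df.withColumnRenamed(old, old.strip().lower().replace(" ", "_"))
    return df
```

```python
# main.py -- the main definition file of the Spark job definition
from pyspark.sql import SparkSession
from helpers import clean_column_names  # resolved from the uploaded reference file

if __name__ == "__main__":
    spark = SparkSession.builder.getOrCreate()
    df = spark.read.format("delta").load("Tables/raw_sales")      # illustrative path
    clean_column_names(df).write.format("delta").mode("overwrite").save("Tables/clean_sales")
```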
@keen8five 26 days ago
I'd love to see the Capacity Unit consumption of a Notebook execution in the Monitoring Hub
@Elizabeth-st4yk 26 days ago
Noted. This request is in our backlog.
@rankena 26 days ago
Is there a way to generate a Bearer token programmatically?
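One common approach is to request a token with the azure-identity library and pass it as the Authorization header when calling the REST API. A minimal sketch, assuming the Fabric REST scope `https://api.fabric.microsoft.com/.default` and an illustrative endpoint; adjust the scope and credential type for the API and auth flow you actually use:

```python
from azure.identity import InteractiveBrowserCredential  # or DefaultAzureCredential
import requests

credential = InteractiveBrowserCredential()
token = credential.get_token("https://api.fabric.microsoft.com/.default").token

# Use the bearer token against the REST API (endpoint below is illustrative).
resp = requests.get(
    "https://api.fabric.microsoft.com/v1/workspaces",
    headers={"Authorization": f"Bearer {token}"},
)
print(resp.status_code, resp.json())
```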
@SumitArora-zf3of 29 days ago
What are the options for building a Power BI report on a large dataset if it contains, let's say, 500 million records?
@user-dy8xu7uj8k 1 month ago
Hi, good morning! I have to convert existing SQL Server stored procedures to the Fabric environment. My stored procedures contain CURSOR commands, but Fabric doesn't support them. How do I proceed in this case? Is there any alternative?
@peterlapic6761 1 month ago
Is there a way to apply a lifecycle management policy to Dataverse data using Synapse Link? I want to pull all data from Dataverse into the data lake the way Synapse Link does, delete old data in Dataverse, but still retain it in the data lake. I want the data in the data lake to run through the Azure lifecycle management policy so that it ends up in the cooler tiers to save cost, while remaining reportable in Power BI using serverless SQL.
@thepakcolapcar 1 month ago
Hello @amit, when I follow the steps, in Power BI I see the "Trial" option disabled, and by default it has selected my "Pro" license. However, at the top it shows me "PPU trial: 59 days left". Is that how it is supposed to be? Further, as I proceed and try to create a lakehouse, it gives me a message asking to upgrade to the free Fabric trial capacity.
@mcquiggd 1 month ago
Unfortunately, the audio is very bad, and also the screen resolution of these recordings makes it very difficult to read - the occasional zooming in just makes it confusing. It's a pity as the content is pretty good - perhaps include the example files so people can try this themselves. This series could really use an Editor to make sure the content is uniformly presented.
@adatalearner 1 month ago
May I request a session on what an enterprise Fabric environment should look like, including DevOps CI/CD pipelines?
@adatalearner 1 month ago
Does this require a separate Copilot license?
@johnfromireland7551 1 month ago
Requires >= P1 capacity.
@i.k.986 1 month ago
Maybe a few questions: what does the burndown do? Smoothing always takes place, at least for specific activities, right? And when the capacity is turned off while activities are still being smoothed, the capacity is still charged during the off period, right?
@omerturkoglu4259 1 month ago
Can we say that Delta is based on Parquet? So Delta is nothing but an advanced Parquet?
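Roughly, yes: a Delta table stores its data as ordinary Parquet files and adds a `_delta_log` folder containing a JSON transaction log, which is what layers ACID transactions, time travel, and schema enforcement on top of Parquet. A minimal sketch that makes this visible, assuming a Fabric notebook with a default lakehouse attached (where `spark` and `mssparkutils` are pre-defined; the path is illustrative):

```python
# Write a tiny Delta table, then list what is actually on disk.
df = spark.range(5)
df.write.format("delta").mode("overwrite").save("Files/delta_demo")

# Expect ordinary part-*.parquet data files plus a _delta_log/ folder
# holding the JSON transaction log -- i.e. Delta = Parquet + log.
for item in mssparkutils.fs.ls("Files/delta_demo"):
    print(item.name)
```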
@adilmajeed8439 1 month ago
Thanks for sharing. Why is Copilot not generating SynapseML code instead of using the scikit-learn library? Once the data volume becomes large, scikit-learn will not work efficiently, since in the end the DataFrame is pandas-based, not an Apache Spark DataFrame. Any suggestions on that?
@rajrik 1 month ago
Excellent question. We're working on improving our integration and native awareness of Fabric-capable libraries such as SynapseML, and you will continue to see those improvements emerge as Copilot progresses this year. Watch this space!
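For readers who want to hand-write the distributed path today, SynapseML's estimators follow the standard Spark ML pipeline API, so swapping scikit-learn for them mostly means working with Spark DataFrames. A minimal sketch, assuming SynapseML is available in the Spark runtime (as in Fabric notebooks); the tiny dataset is purely illustrative:

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from synapse.ml.lightgbm import LightGBMClassifier

spark = SparkSession.builder.getOrCreate()

# Tiny illustrative dataset; in practice this would be a large Spark DataFrame.
train_df = spark.createDataFrame(
    [(1.0, 2.0, 0), (2.0, 1.0, 0), (8.0, 9.0, 1), (9.0, 8.0, 1)],
    ["f1", "f2", "label"],
)

assembler = VectorAssembler(inputCols=["f1", "f2"], outputCol="features")
train_vec = assembler.transform(train_df)

# Trains across the Spark cluster rather than on a single node, unlike scikit-learn.
model = LightGBMClassifier(labelCol="label", featuresCol="features").fit(train_vec)
model.transform(train_vec).select("label", "prediction").show()
```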
@EsteraKot 1 month ago
Clarification: we were planning to switch to the Microsoft Fabric www.youtube.com/@MicrosoftFabric channel, but we have finally decided to stay here. We will continue delivering more content for our nearly 13k loyal viewers. Thank you!
@gpltaylor 1 month ago
short and to the point! nice.
@gpltaylor 1 month ago
I like this style of demo breakdown where we're not treated like morons :) We can all read the Microsoft Learn website; here we dig into each section. From this alone I feel I am able to get work done. Thank you
@gauravdevgan79 1 month ago
Does it provide features comparable to those offered by a Shiny app?
@vijaybodkhe8379 1 month ago
Thanks for sharing
@04mdsimps 1 month ago
I tried Fabric last summer when it came out and deemed it a beta at that point. Now it's a year on; why should I move from Azure Synapse and Power BI to Fabric?
@bitips 1 month ago
Question: if I'm paying separately for storage, why does my storage become inaccessible when my capacity is paused?
@up_0078 1 month ago
Data is accessible as long as you have compute available. You can create a Databricks cluster or any other compute and access the data stored in OneLake.
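For example, OneLake exposes ADLS-compatible `abfss://` paths, so any Spark engine that can authenticate to it with Azure AD can read a lakehouse table directly. A minimal PySpark sketch; the workspace, lakehouse, and table names are illustrative, and the cluster is assumed to already be configured for OneLake authentication:

```python
# In Databricks (or any external Spark), `spark` is the active session.
# Illustrative OneLake path: abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Tables/<table>
path = (
    "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
    "MyLakehouse.Lakehouse/Tables/sales"
)

df = spark.read.format("delta").load(path)
df.show(5)
```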
@Milhouse77BS 1 month ago
Looking forward to this.
@mattstainsby4542 1 month ago
I want this to work so badly, but the config is turning out to be really difficult. For my kernel I can see fabric-synapse-runtime; however, when I run it, spark is not recognized.
@yashub9580 1 month ago
I am running a SARIMA model, but whenever I run it, it gives me 100+ experiments. I should be getting only one.
@i.k.986 1 month ago
Thank you for this clear explanation!
@sabarivel4555 1 month ago
How do we reuse common dimensions across semantic models, like the OneLake shortcut created in this demo? It would be really useful to reuse shared dimensions across subject areas.
@knuckleheadmcspazatron4939 1 month ago
This is really awesome! For some files this is a great method. Use it when it works kinda thing.
@sanishthomas2858 1 month ago
Nice. If I save the files from the source into the lakehouse Files section as CSV and JSON, will they be saved as Delta Parquet? If not, then why do we say data is saved in OneLake as Delta Parquet?
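Files dropped into the Files section keep their original format (a CSV stays a CSV); it is the Tables section of a lakehouse that is managed as Delta/Parquet, so the usual pattern is to land raw files under Files and then load them into a Delta table. A minimal sketch, assuming a Fabric notebook with a default lakehouse attached; the path and table name are illustrative:

```python
# A CSV landed under Files/ stays a CSV -- nothing converts it automatically.
raw = spark.read.option("header", True).csv("Files/landing/orders.csv")

# Writing it to the Tables section creates a managed Delta table
# (Parquet data files plus a _delta_log transaction log).
raw.write.format("delta").mode("overwrite").saveAsTable("orders")
```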
@sam910312 1 month ago
Could this block D365 transactions?
@bloom6874 2 months ago
great series
@MrSatishc84 2 months ago
I need the video on Synapse deployment using a YAML file.
@bloom6874 2 months ago
Learning is really quick with your videos. Why did you stop posting videos in this series? Please continue.
@bloom6874 2 months ago
Please add more videos to this series
@saivadurai 2 months ago
Good to see the flow
@hariprakash326 2 months ago
Can you share the YAML file?
@riazahmedshaik 2 months ago
How do I set up multi-select in a parameter?
@patrickshahrouz3705 2 months ago
This was amazing! However, it felt like there was more Abhishek wanted to cover on the topic of medallion architecture alone. Hope you can set up a follow-up!
@TomFrost33 2 months ago
There are many video options about loading data into a lakehouse. How do we manage/edit the data once it is in there?
@DanielWillen 2 months ago
We will stick to Azure SQL for now