Comments
@DanielSvendsen86 2 days ago
Thank you for a nice video at a really nice pace. Not too short and not too long. Thanks.
@tiago5a 8 days ago
Will it work with an on-prem PostgreSQL database?
@chrismettler136 2 months ago
This is very nice!
@Milhouse77BS 2 months ago
This could be nice, if only I could use it with iSeries/AS400.
@samial2341 3 months ago
Great Video!
@Himanshubishnoi-td3nh 4 months ago
But why convert to Delta format? Couldn't I directly use a serverless pool or Spark on the dataframe?
@baklava2tummy 6 months ago
What I don’t understand is why you would create the lake database in the Serverless SQL pools (i.e. not in the Spark notebook). Love your videos btw!
@harryakb11 6 months ago
Your tutorial is really helpful, mate. Big thanks!
@mkeii 7 months ago
Useful description, but most people want to know when to use one versus the other (which problems each is designed to solve), and this isn't discussed.
@marvinalmarez4458 7 months ago
Why is the Spark pool not being picked up in our setup? It's in the same workspace and resource group.
@bilalshafqat1634 8 months ago
Great explanation.
@Mahmoudalgindy 8 months ago
Thanks so much Andy. Unfortunately 😊 with D365 Finance & Operations, Append Only is ON and cannot be modified. Please advise what SQL query should be used to avoid duplicates and just take the latest version of each row.
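(For illustration, a minimal T-SQL sketch of one common de-duplication approach over an append-only export, assuming the table exposes a row identifier, a modification timestamp, and a soft-delete flag; the table and column names below are illustrative, not the actual Synapse Link schema:)

    -- Keep only the latest version of each row, excluding soft-deleted rows
    WITH Ranked AS (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY Id
                   ORDER BY SinkModifiedOn DESC
               ) AS rn
        FROM dbo.MyD365Table
    )
    SELECT *
    FROM Ranked
    WHERE rn = 1
      AND (IsDelete IS NULL OR IsDelete = 0);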
@sukumarm5926 9 months ago
Thanks for the great video. I have a requirement to get this data into Azure SQL DB: CRM --> Synapse Link --> Microsoft Fabric --> SQL. Does this make sense?
@DatahaiBI 9 months ago
There are a lot of steps there. If you just want it in Azure SQL DB then you can configure the Dataverse export to Azure Data Lake and then import into Azure SQL DB: learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-pipelines?tabs=synapse-analytics
@Blade17534 11 months ago
When I select my Spark pool, the storage account dropdown is empty; otherwise, the dropdown is populated. Any ideas?
@janpoulsenskrubbeltrang5957 11 months ago
Super useful, Andy. Thank you!
@hellmutmatheus2626 11 months ago
Which one do you think is worth taking first, DP-203 or DP-600? I already have PL-300. I see that Fabric is the next step in my Power BI career, but I don't know if the market is growing in this direction yet. What are your thoughts?
@JonathanCrotz 11 months ago
Great explanation! Subscribed immediately
@raghuramsharma2603 1 year ago
Hi Andy, we have this requirement in our current project, but we are facing an issue. We are using "Select Enterprise Policy with Managed Service Identity". Are any pre-configurations needed when using Managed Service Identity? Can you please help? Thanks.
@rwnemocsgo2542 1 year ago
Very nice video! I was looking at your channel to see if I could find a way to set up the "BYOL" concept through a Synapse link. According to Microsoft TechTalks, it should be possible to export Dynamics tables into a data lake without a Synapse Analytics workspace. I even saw it briefly during one of the TechTalks; however, they never explained it in detail. When I tried it, my Finance and Operations tables aren't visible to me unless I choose the Analytics workspace and a Spark pool. I'm finding the Microsoft documentation extremely confusing regarding this. Any ideas?
@michaeldemarco82 1 year ago
Just a tangential comment: he has the same vocal intonations as George Michael.
@marcosmartin3148 1 year ago
Good afternoon, I am having problems developing this process. I have done everything, but the sync status in my Azure Synapse Link goes from "Initial sync in progress" to "Error" without giving any further information. If I go to my data lake, the selected table is inside it, but in CSV format, not Delta. The only difference is that the connection is from Dynamics F&O. Do you think the problem could come from LCS Dynamics F&O? Thanks in advance.
@artem77788324 10 months ago
I have exactly the same problem. The CSVs are loaded successfully to the data lake, but the Spark job is failing when converting to Delta format.
@jeanfabrice9159 1 year ago
When should we use a Lake Database and when a SQL Database in Synapse?
@DatahaiBI 1 year ago
By “SQL Database” do you mean Serverless SQL Pools or Dedicated SQL Pools? Lake databases are used when your data transformation workloads are done via Spark.
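(For illustration, a minimal Spark SQL sketch of creating a lake database table from a Synapse notebook; the database and table names are hypothetical. Tables created this way surface automatically in the serverless SQL endpoint:)

    -- Run in a Synapse Spark notebook (%%sql cell)
    CREATE DATABASE IF NOT EXISTS sales_lake;

    CREATE TABLE IF NOT EXISTS sales_lake.orders (
        OrderId INT,
        Amount  DECIMAL(10, 2)
    ) USING DELTA;

    INSERT INTO sales_lake.orders VALUES (1, 99.95);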
@tyronefrielinghaus3467 1 year ago
Had to change speed to 1.25... you talk too slowly.
@ezmix 1 year ago
I think he talks at just the right speed.
@JohnYoung-p3f 1 year ago
Hi Andy, thanks for this walkthrough. My basic question is: why is it acceptable to have to run compute (additional cost) and make copies (additional cost) of data from Lakehouse to Warehouse, when with Azure Databricks the Lakehouse is one compute execution, one security model, one copy of the data, etc.? Fabric has separate capacity and security models depending on where you are coming from (and they don't carry through, i.e. the security model doesn't move with the data). Fabric Shortcuts break the security model.
@DatahaiBI 1 year ago
Good question. It's all about which technology you choose based on skill set. Yes, you can land your data in a Fabric Lakehouse and, if you don't want to move that data again, then as long as it's modelled, clean, and prepared how you need it, that's fine. But if you need to transform the data further, you have the choice of using the Warehouse service if you're a SQL-focused developer or team. IMHO it's the same with Databricks: you still need to transform your data into what you need for analysis and reporting.
@VictorHugo-bd3bf 1 year ago
Very useful. Thanks for sharing
@graymccarthy685 1 year ago
Looking forward to this one - all your resources were the backbone for me getting to grips with DP-500.
@DatahaiBI 1 year ago
Thank you. I’ve put what I believe to be relevant learning links against all the individual skills being measured in this blog: www.serverlesssql.com/dp-600-fabric-analytics-engineer-skills-measured-guide/
@mehmetbekirbirden6858 1 year ago
My understanding from the Azure documentation is a bit different. As I understand it, the Fabric warehouse is essentially the same as the Fabric lakehouse on Spark. The difference is that, to make the warehouse so-called ACID compliant, they restricted the Spark side of things to give more capability to the SQL endpoint part. It is not the SQL Server we know.
@DatahaiBI 1 year ago
The Warehouse service uses the enhanced Synapse Serverless SQL Pools engine (as does the Lakehouse SQL Endpoint), not Spark.
@MarnixLameijer 1 year ago
In the documentation Microsoft mentions: "For the Dataverse configuration, append-only is enabled by default to export CSV data in appendonly mode. But the delta lake table will have an in-place update structure because the delta lake conversion comes with a periodic merge process." Does that mean that when we delete a row in Dataverse, the latest version of the Delta table has no record of that row? If so, do older versions of the Delta files still contain the deleted row, or does the once-per-day optimize job remove that history?
@DatahaiBI 1 year ago
In Append-only mode there is a flag added to the destination table which indicates if the source row has been deleted. It is not hard-deleted from the Delta tables.
@nishantshah38 10 months ago
@DatahaiBI Does this mean that if we export data in Delta Lake format, we won't have a history of records available in Delta Lake? If something is deleted, can I still query it from Delta Lake? How can I use the time-travel feature of Delta Lake? My requirement is to query all the historical data. Will exporting to Delta Lake format provide this feature or not?
@DatahaiBI 9 months ago
@nishantshah38 Yes, exporting to Delta will give you the features of Delta out of the box. However, part of the Synapse Link process is to run daily OPTIMIZE and VACUUM jobs to remove "old" data; this defaults to a 7-day retention period.
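(For illustration, a minimal Spark SQL sketch of Delta time travel; the table name is hypothetical, and the SQL syntax assumes a Delta version that supports it. Note that VACUUM with the default 7-day retention permanently removes older versions, so time-travel queries beyond that window will fail:)

    -- Query the table as of an earlier version
    SELECT * FROM my_dataverse_table VERSION AS OF 3;

    -- Or as of a point in time
    SELECT * FROM my_dataverse_table TIMESTAMP AS OF '2024-01-01';

    -- List the versions still available
    DESCRIBE HISTORY my_dataverse_table;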
@LearnMicrosoftFabric 1 year ago
"Throwing in a Power BI custom report theme - that's interesting" -> I thought exactly the same 🤣 Great video Andy - very well laid out 👊 exciting times ahead!
@c2c538 1 year ago
You're providing great content. Please continue your good work, and kindly provide longer explanation videos with a practical example covering a complete pipeline from start to end.
@Suna1988be 1 year ago
Great video, thanks! There are still some bugs with serverless SQL pools though... Also, how will it go during deployment of external tables? It requires a valid path in the data lake to create the external table, so if it's a new external table whose data isn't yet available in the target environment, the deploy will fail.
@BaijuThakkar 1 year ago
Will this work when we have views dependent on views in other databases? In our setup we have a logical data warehouse; dim and fact views depend on views in other databases that are built over Delta Lake files.
@DatahaiBI 1 year ago
It should do. AFAIK Azure Data Studio supports database references, but I haven't tested yet.
@BaijuThakkar 1 year ago
@DatahaiBI I tried, and it seems that at the moment it fails on validating the view itself. We are using an OPENROWSET view, and the dacpac fails to resolve this against the Delta Lake, with an error message like the one below:

    Error SQL71561: Computed Column: [Dim].[Product].[VATCode] has an unresolved reference to object [$(database)].[dbo].[Product2].[VAT_Code__c].
    Project: Serverless Synapse Physical Test
    File: C:\Users\SourceControlFolder\Dim.Product.sql, line 23
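(For context, a minimal sketch of the kind of serverless OPENROWSET view involved here; the storage path and object names are illustrative. Because the columns are resolved from the external Delta files at query time, a dacpac build has no schema against which to validate references to them:)

    CREATE VIEW dbo.Product2 AS
    SELECT *
    FROM OPENROWSET(
        BULK 'https://mydatalake.dfs.core.windows.net/data/product/',
        FORMAT = 'DELTA'
    ) AS src;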
@germanareta7267 1 year ago
Great video, thanks.
@DatahaiBI 1 year ago
Thanks. Is there anything I can clarify or expand on?
@VeganSmasher 1 year ago
Very, very helpful info. Thank you for showing the Fabric side of domains. This is exactly the info I was looking for. Liked & subscribed. :-)
@trgalan6685 1 year ago
Good content but long-winded; take out the local weather report and the 'maybe' technology and an hour+ video is reduced to half. There's definite value here but people's time is important to them.
@DatahaiBI 1 year ago
This was a live stream rather than a curated video hence the length. Glad you found the content useful
@russellbrown6784 1 year ago
Great video
@AIDataLearnings 1 year ago
Thank you for such good content. The way you calmly explain the basic concepts is fabulous; I'd like to binge on your other videos.
@AIDataLearnings 1 year ago
Very nicely explained... you now have a new subscriber. Keep creating more content.
@krypton0125 1 year ago
Nice video! When should we use a Lake Database and when a SQL Database in Synapse?
@jeanfabrice9159 1 year ago
I would have asked the same question too!
@krypton0125 1 year ago
@jeanfabrice9159 Did you get the answer?
@jasoncysiu 1 year ago
This tute is amazing - thank you!
@timroberts_usa 1 year ago
Do you have any pre-built infrastructure scripts to set up resources for the examples? It would be great to include them.
@ShangKheiShek 1 year ago
Was a hard exam, but well worth it!
@germanareta7267 1 year ago
Great video, thanks.
@gallardorivilla 1 year ago
Thanks for the demo!! Great resource videos.
@rjh560 1 year ago
Hi Andy, thanks for the useful video! Just a quick question: when your workspaces are in a deployment pipeline, do you know if they can still be put on different capacities as you describe? Or do all workspaces in a pipeline have to be on the same capacity? Don't worry if you don't know, I can try it out and comment the answer here!
@DatahaiBI 1 year ago
Hiya, at the moment Fabric items are not supported in deployment pipelines, so we don't know the full story yet. I can deploy Power BI items to workspaces assigned to different capacities, but of course those aren't dependent on Fabric capacities.
@radekou 1 year ago
Thanks for the great explanation - does this mean that 15 minutes is as low as we can get in terms of latency? What solutions would you recommend if the requirement is to detect data changes in the sub-minute (or ideally couple-of-seconds) range? Thanks
@DatahaiBI 1 year ago
Yes, 15 minutes is the lowest latency here for Delta merging. For sub-minute latency you could look at the normal CSV export process, but even then Microsoft states "near-real time", which could mean up to a few minutes before any changed data in Dynamics is available in Synapse for querying.
@germanareta7267 1 year ago
Great video. Thanks.
@peterdaniels3428 1 year ago
The SQL endpoint is definitely like the Synapse serverless SQL pool. I'm a little surprised that we don't have something like CETAS, though. Maybe it's coming.
@DatahaiBI 1 year ago
Yep, built on the Polaris engine and enhanced, so it shares a lot of similarities. Not sure CETAS will come to the Lakehouse; I'm betting the SQL endpoint will stay read-only (but you never know!)
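(For reference, a minimal sketch of CETAS as it works today in Synapse serverless SQL pools, i.e. the feature being discussed as missing from the Lakehouse SQL endpoint; the data source, file format, and object names are illustrative and assumed to already exist:)

    CREATE EXTERNAL TABLE dbo.SalesAgg
    WITH (
        LOCATION = 'curated/sales_agg/',
        DATA_SOURCE = MyDataLake,       -- an existing EXTERNAL DATA SOURCE
        FILE_FORMAT = ParquetFormat     -- an existing EXTERNAL FILE FORMAT
    )
    AS
    SELECT Region, SUM(Amount) AS TotalAmount
    FROM dbo.Sales
    GROUP BY Region;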
@fernandogarcia408 1 year ago
Nice video. One suggestion: a video covering an end-to-end data warehouse, from loading data from the source, to creating the layers and what happens in each layer, and then incrementally loading facts and dims in the gold layer with surrogate keys. I mean, for people like me who come from "traditional" BI, it's not that easy to understand. This could also be done as a video for Synapse using serverless. Cheers, and thank you for your videos; they help us a lot.
@DatahaiBI 1 year ago
Hi, I’ll be doing a “Let’s Build a…” video in the next few weeks. It’s early days for the data warehouse functionality; lots of features are missing, including identity, etc.
@bigglesharrumpher4139 1 year ago
Very succinct and valuable introduction to the new Fabric Lakehouse and Warehouse. Good stuff!