Comments
@pragatisharma6036
@pragatisharma6036 17 days ago
If my file gets new data, will it also work for the updated data, or do I need to create the dataflow again?
@DataVerse_Academy
@DataVerse_Academy 16 days ago
You can implement incremental load inside Dataflow Gen2 as well.
@GiancarloBlaze
@GiancarloBlaze 20 days ago
Very informative! Thank you.
@kiranj8769
@kiranj8769 28 days ago
Excellent, great video! 🎉
@nickfaleev9233
@nickfaleev9233 1 month ago
I just finished watching this video on using PySpark in Microsoft Fabric, and it was incredibly helpful! The explanations were clear, and the step-by-step walkthrough made complex concepts easy to understand. This is definitely a must-watch for anyone looking to get started with PySpark in Microsoft Fabric. Great job, and thank you for sharing this valuable content! I highly recommend subscribing to this channel. The content is consistently high-quality, with tutorials that are not only informative but also easy to follow. Whether you're a beginner or an experienced professional, this channel offers valuable insights and tips that can help you stay ahead in the data world. Don't miss out on the great content they’re putting out!
@DataVerse_Academy
@DataVerse_Academy 1 month ago
Thank you! 🙏
@gopalammanikantarao593
@gopalammanikantarao593 1 month ago
Nice video, it helped me understand the Microsoft Fabric flow.
@DataVerse_Academy
@DataVerse_Academy 1 month ago
Thank you! 😊
@Otto_at_work
@Otto_at_work 1 month ago
Very useful. Thank you.
@surerakase5017
@surerakase5017 1 month ago
Can we pass parameters in Dataflow Gen2?
@DataVerse_Academy
@DataVerse_Academy 1 month ago
No, it’s not supported yet.
@meashhh1134
@meashhh1134 1 month ago
Very good explanation, thank you!
@balajikrishnamoorthy7352
@balajikrishnamoorthy7352 1 month ago
Will this work if we delete or update data in the source table?
@DataVerse_Academy
@DataVerse_Academy 1 month ago
It will work in the case of an update, but not a delete, since it's an incremental approach, not CDC.
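The update-vs-delete difference can be sketched in plain Python. A watermark-based incremental load re-reads any row whose modified timestamp moved past the last watermark, so updates flow through, but a deleted row simply never appears in the extract. The `modified_at` column and row shapes below are illustrative assumptions, not the video's actual pipeline:

```python
# Illustrative sketch of watermark-based incremental extraction:
# updates are captured, hard deletes are invisible.

def incremental_extract(source_rows, last_watermark):
    """Return rows whose modified_at is newer than the stored watermark."""
    return [r for r in source_rows if r["modified_at"] > last_watermark]

# Source state after the first load (watermark = 10): row 2 was updated
# (its timestamp moved to 11), row 3 was deleted outright.
source = [
    {"id": 1, "value": "a", "modified_at": 5},
    {"id": 2, "value": "b-updated", "modified_at": 11},
]

delta = incremental_extract(source, last_watermark=10)
# The update to row 2 is picked up...
assert delta == [{"id": 2, "value": "b-updated", "modified_at": 11}]
# ...but nothing signals that row 3 is gone; detecting that would need CDC.
```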
@conexaomotorizada
@conexaomotorizada 1 month ago
Hi, how can I edit a table from a semantic model in Power BI's Power Query?
@namangarg7023
@namangarg7023 1 month ago
Can you please explain how you get the connection path for the source here, like we did in ADF?
@moeeljawad5361
@moeeljawad5361 1 month ago
Thanks for your video, very intuitive and impressive. I have a question: if your Table_List table is saved as a delta table from a Python script, and I would like to have the Max_value updated, I believe I will not be able to keep the stored procedure, right? If not, should I replace it with a notebook activity? Thanks.
@DataVerse_Academy
@DataVerse_Academy 1 month ago
Yes, you are right: you will not be able to use a stored procedure to update a delta table. You need to create a notebook for that.
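A minimal sketch of what that notebook cell could look like. `Table_List` and `Max_value` come from the question; the `Table_Name` column and the values are assumptions. Delta tables support SQL `UPDATE`, so bumping the watermark is one statement:

```python
# Hypothetical notebook replacement for the stored procedure: build the
# UPDATE statement for the watermark row, then run it via spark.sql.

def build_watermark_update(table_name: str, new_max: str) -> str:
    """Build the UPDATE statement for one Table_List watermark row."""
    return (
        "UPDATE Table_List "
        f"SET Max_value = '{new_max}' "
        f"WHERE Table_Name = '{table_name}'"  # Table_Name column is assumed
    )

sql = build_watermark_update("dbo.Orders", "2024-01-31 23:59:59")
# In a Fabric notebook attached to the lakehouse this would then run as:
# spark.sql(sql)
print(sql)
```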
@Giraffe-j3h
@Giraffe-j3h 1 month ago
How do I create new columns and a date dimension table in this model?
@vinaypratapsingh5815
@vinaypratapsingh5815 1 month ago
When I click New Semantic Model, I am not able to see the tables to select (a single table or all tables). Because of that I am not able to create the semantic model. Could you please help me here? Thanks.
@DataVerse_Academy
@DataVerse_Academy 1 month ago
What's the error you are getting?
@vinaypratapsingh5815
@vinaypratapsingh5815 1 month ago
@@DataVerse_Academy Thanks for your response. I am not getting any error, but I am not able to select any table to create my semantic model. Under "Select all", it's not giving me any table names to choose from.
@DataVerse_Academy
@DataVerse_Academy 1 month ago
Please try the following once: Settings -> Admin portal -> Tenant settings -> Information protection -> enable "Allow users to apply sensitivity labels for content". Then you will be able to create a semantic model through the lakehouse.
@priyanthakarunathilake8030
@priyanthakarunathilake8030 2 months ago
Really helpful. Thanks.
@ceciliaayala3923
@ceciliaayala3923 2 months ago
Great explanation, thank you!
@sridharnallagatla7442
@sridharnallagatla7442 2 months ago
How is the job market for Microsoft Fabric? Are there any openings?
@DataVerse_Academy
@DataVerse_Academy 2 months ago
Not right now, but you will see a lot of movement towards Fabric in the upcoming 1-2 years.
@tv.TheDogFather
@tv.TheDogFather 2 months ago
Thanks for the video... Gold_Product is still not included in the code zip file. Can you please include it? Less important, but can you also include the Run_Load notebook?
@BrundaAS-e8r
@BrundaAS-e8r 2 months ago
I'm not able to access the Azure Blob Storage link you shared.
@pragatisharma6036
@pragatisharma6036 2 months ago
The product script is missing from the code file; please upload it.
@andreaskoblischke8186
@andreaskoblischke8186 2 months ago
Nice one. Useful for me. Thanks!
@lucasamorim5823
@lucasamorim5823 2 months ago
Thank you, this short video helped me a lot.
@seetha110378
@seetha110378 2 months ago
How do I read data from a REST API that uses authentication tokens (refresh tokens) into a Fabric lakehouse?
@priyankaparida8423
@priyankaparida8423 2 months ago
Hello sir, after line no. 23 it jumps directly to line no. 77; the middle part is skipped, so I'm not getting the code in between. Can you help with it?
@dhheanom10d
@dhheanom10d 2 months ago
Thank you so much!!
@ADhuidv
@ADhuidv 3 months ago
Hello sir, thank you so much for providing these productive videos. Today I faced a challenge whose solution I couldn't find elsewhere: how to extract data from SAP HANA Cloud to Microsoft Fabric (cloud-to-cloud connectivity). Could you please help me here?
@heyrobined
@heyrobined 3 months ago
So I did the same with on-prem SQL Server to the warehouse, and checked that every connection previews fine, but when validating the activity I got this error: "Copying data from on-premises connection with staging storage in workspace is not supported. Please use external staging storage instead." What could be the solution?
@vamshisamineniz5905
@vamshisamineniz5905 3 months ago
Is there a way to connect to ADLS directly without creating a shortcut in the lakehouse?
@DataVerse_Academy
@DataVerse_Academy 3 months ago
Yes, you can use the service principal method.
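A sketch of what the service principal method involves: setting the standard Hadoop ABFS OAuth configs on the Spark session, then reading `abfss://` paths directly. The account, container, and credential values below are placeholders:

```python
# Sketch: Spark session configs for reading ADLS Gen2 with a service
# principal (standard Hadoop ABFS client-credentials OAuth settings).

def adls_oauth_conf(account: str, client_id: str,
                    secret: str, tenant: str) -> dict:
    """Build the spark.conf key/value pairs for one storage account."""
    suffix = f"{account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant}/oauth2/token",
    }

# In the notebook:
# for k, v in adls_oauth_conf("myacct", app_id, app_secret, tenant_id).items():
#     spark.conf.set(k, v)
# df = spark.read.parquet("abfss://container@myacct.dfs.core.windows.net/path")
```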
@ADhuidv
@ADhuidv 3 months ago
Sir, how can we build a JDBC/pyodbc connection between a Fabric data warehouse and a Fabric notebook? I have been searching for a long time, but without success.
@DataVerse_Academy
@DataVerse_Academy 3 months ago
But why do you need it? What is the use case you are trying to implement?
@ADhuidv
@ADhuidv 3 months ago
1. Initially, we get data from multiple sources and sink it into one warehouse (raw data). 2. Now we want to extract data from this warehouse into another warehouse (transformed data) through a notebook, where we will perform our transformation logic. Hence, I want to build the connection between the warehouse and the notebook using only JDBC or pyodbc.
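For readers exploring the pyodbc route: one possible shape (a sketch under assumptions, not a confirmed Fabric recipe; the endpoint host, database, and credentials are placeholders) is to point the SQL Server ODBC driver at the warehouse's SQL endpoint and authenticate with a service principal:

```python
# Sketch: building a pyodbc connection string for a Fabric warehouse
# SQL endpoint, using Azure AD service principal authentication.

def warehouse_conn_str(endpoint: str, database: str,
                       client_id: str, secret: str) -> str:
    """Assemble an ODBC connection string for the warehouse SQL endpoint."""
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={endpoint};"
        f"Database={database};"
        "Authentication=ActiveDirectoryServicePrincipal;"
        f"UID={client_id};PWD={secret};"
        "Encrypt=yes;"
    )

# Usage in a notebook (pyodbc and the ODBC driver must be installed):
# import pyodbc
# conn = pyodbc.connect(warehouse_conn_str(
#     "<endpoint>.datawarehouse.fabric.microsoft.com",  # placeholder host
#     "RawData", app_id, app_secret))
# rows = conn.cursor().execute("SELECT TOP 10 * FROM dbo.Orders").fetchall()
```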
@Shreekanthsharma-t6x
@Shreekanthsharma-t6x 3 months ago
Hi, I have some complex scalar user-defined functions defined in MySQL and I have to migrate them to Fabric, but as of now Fabric doesn't support creating scalar user-defined functions in the warehouse. In this scenario, please let me know what alternative options I can use. Thanks.
@DataVerse_Academy
@DataVerse_Academy 3 months ago
You can build that logic inside a procedure. I know you will not be able to return a value the way a function does, but you can build whatever logic you are trying to build. If you give me the context, I will provide the code as well.
@ADhuidv
@ADhuidv 3 months ago
Sir, can we extract data directly from the warehouse into a notebook, transform it, and then finally save it back to the same warehouse?
@DataVerse_Academy
@DataVerse_Academy 3 months ago
You can transform warehouse data inside a notebook, but you can't write the data back into the warehouse.
@Shreekanthsharma-t6x
@Shreekanthsharma-t6x 3 months ago
Hi, good morning! I have to convert an existing SQL Server stored procedure to the Fabric environment. My stored procedures contain CURSOR commands, but Fabric doesn't support them. How do I proceed in this case? Is there any alternative?
@DataVerse_Academy
@DataVerse_Academy 3 months ago
You can use a WHILE loop for that.
@Bharathkumar-l3n
@Bharathkumar-l3n 4 months ago
Hi sir, can we use the Get Metadata activity instead of the Lookup activity to perform the same operation and get the same result? Can the Get Metadata activity do the same work the Lookup activity does?
@DataVerse_Academy
@DataVerse_Academy 4 months ago
No, you will not be able to write a query inside the Get Metadata activity.
@EduInquisitive
@EduInquisitive 4 months ago
Thanks for the great explanation, but I didn't get one thing: where is the metadata for a managed table created? I can't see any files created in the Files folder, while for an external table we can see them in the path provided.
@DataVerse_Academy
@DataVerse_Academy 3 months ago
For managed tables, the metadata and data live inside the Tables folder itself.
@Shreekanthsharma-t6x
@Shreekanthsharma-t6x 4 months ago
This is a great video, thanks!
@Shreekanthsharma-t6x
@Shreekanthsharma-t6x 4 months ago
I have a SQL Server stored procedure which updates, deletes, and merges data into a table. How do I convert the stored procedure into a PySpark job? Is it possible to update a table in Fabric using PySpark? Please make a video on this topic.
@DataVerse_Academy
@DataVerse_Academy 4 months ago
It's very easy to do the same things in PySpark; everything you mentioned is possible. I am on a break for a couple of months, but I will start creating videos again very soon.
@Shreekanthsharma-t6x
@Shreekanthsharma-t6x 4 months ago
@@DataVerse_Academy Please do create a video when you are back from your break. Thanks!
@AnisurRahman-wm2ys
@AnisurRahman-wm2ys 4 months ago
Excellent! Do you have this type of video for SCD2?
@tv.TheDogFather
@tv.TheDogFather 2 months ago
I think for the dimension merges you can just wrap the MERGE inside an INSERT INTO and change the UPDATE branch of the MERGE accordingly.
@yveshermann
@yveshermann 4 months ago
This guy is a champion!! Thanks so much :):)
@IEYdel
@IEYdel 4 months ago
Super helpful! Do you have a video that shows the silver layer, with an example of joining related data from heterogeneous data sources plus data cleansing and deduplication? :D You are still my hero, Vishnu! Thank you for this video!
@gagansingh3481
@gagansingh3481 4 months ago
Sir, could you please make a video on Azure Log Analytics for semantic models and Azure mirroring in Fabric?
@DataVerse_Academy
@DataVerse_Academy 4 months ago
Definitely 💯
@rajudasari8482
@rajudasari8482 4 months ago
Which type of table is faster at retrieving data?
@sanishthomas2858
@sanishthomas2858 4 months ago
Nice. Quick question: is the architecture slide shown in the presentation made with PowerPoint or some other software?
@DataVerse_Academy
@DataVerse_Academy 4 months ago
It's PowerPoint.
@ROHITKUMARGUJAR-k8t
@ROHITKUMARGUJAR-k8t 4 months ago
Very good explanation.
@DataVerse_Academy
@DataVerse_Academy 4 months ago
Thank you 🙏
@John.Wick.221
@John.Wick.221 4 months ago
Where can I get more data sources like this?
@samuel_t_chou
@samuel_t_chou 4 months ago
Thank you, random Indian YouTuber. Very clear and useful.
@DataVerse_Academy
@DataVerse_Academy 3 months ago
So nice of you
@orasha4846
@orasha4846 5 months ago
Can you read multiple files if they have the same structure?
@DataVerse_Academy
@DataVerse_Academy 5 months ago
Yes, we can do that. It's just like reading the data from a folder in Power BI.
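When the files share a schema, Spark can read a whole folder (or a wildcard pattern) in one call; the paths below are placeholders, and the small pure-Python helper just illustrates which files such a pattern would match:

```python
# Sketch: reading many same-structured files at once in a Fabric notebook.
#
# df = (spark.read
#       .option("header", True)
#       .csv("Files/current/*.csv"))   # every CSV in the folder, one DataFrame
#
# A quick pure-Python equivalent of the wildcard match:
import fnmatch

def matching_files(names, pattern="*.csv"):
    """Return the file names that the wildcard pattern would pick up."""
    return [n for n in names if fnmatch.fnmatch(n, pattern)]

print(matching_files(["a.csv", "b.csv", "notes.txt"]))  # ['a.csv', 'b.csv']
```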
@kel78v2
@kel78v2 2 months ago
Gen2 works fine with one file but throws a data-type error from the transformation folder when the dataflow combines multiple files.
@JatinHingorani-jm3hc
@JatinHingorani-jm3hc 5 months ago
I followed this process but am getting an error loading data in import mode: "Unable to open the physical file. Operating system error 5: '5 (Access is denied.)'". Any idea why I get an access-denied error? I have a Pro license and Member access to the workspace.
@longphamminh5804
@longphamminh5804 5 months ago
Why did you create the two folders "current" and "archive" in Files?
@DataVerse_Academy
@DataVerse_Academy 5 months ago
To move each processed file from the current folder into the archive folder.
@longphamminh5804
@longphamminh5804 5 months ago
Thank you for the answer.
@longphamminh5804
@longphamminh5804 5 months ago
@@DataVerse_Academy I have one more question: when should I use spark.read.table() versus spark.sql()?
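For reference, the two calls differ mainly in what they accept: `spark.read.table` takes a table name and returns the whole table as a DataFrame, while `spark.sql` runs an arbitrary SQL statement. The table name below is a placeholder:

```python
# Sketch (run inside a Fabric notebook); both calls return a DataFrame:
#
#   df_all      = spark.read.table("lakehouse.sales")          # whole table
#   df_filtered = spark.sql("SELECT * FROM lakehouse.sales WHERE year = 2024")
#
# Rule of thumb, expressed as a tiny helper:

def choose_read_api(needs_sql_logic: bool) -> str:
    """Prefer the simpler call when just loading a whole table."""
    return "spark.sql" if needs_sql_logic else "spark.read.table"

print(choose_read_api(False))  # spark.read.table
print(choose_read_api(True))   # spark.sql
```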
@sonyvijai
@sonyvijai 5 months ago
Great video, crisp and clear.
@SiddeshSawarkar
@SiddeshSawarkar 5 months ago
Can I write data to the warehouse using a notebook?
@DataVerse_Academy
@DataVerse_Academy 5 months ago
For now we can only analyse the data; notebooks don't have data-manipulation functionality against the warehouse in Microsoft Fabric. Maybe they will add it in the future.