Microsoft Fabric - II - OneLake
34:21
Azure Synapse Analytics - Part I
40:17
Azure Databricks - What, Why & How
29:18
Comments
@Learn2Share786 6 days ago
Thanks! It would be really helpful if you uploaded the notebook to GitHub or similar; that would help with practicing.
@AsangaRamanayake 17 days ago
Great work Dinesh...!!!
@Learn2Share786 24 days ago
Thanks for the explanation. Could you please share the ADF code/template?
@DineshPriyankara 18 days ago
Sorry, I generally discard the code once it has been used in a video :(. Anyway, it is a simple one; just follow the steps for adding items to the pipeline.
@Nalaka-Wanniarachchi 1 month ago
Nice video! From my experience, the starter pool works well without HC mode, while custom Spark pools can benefit from it. I've noticed HC mode enhances performance for heavy-load concurrent tasks, but if we are cost-conscious, it may be prudent to avoid HC mode for smaller tasks, as underutilized resources could increase CU costs.
@DineshPriyankara 1 month ago
Good point, Nalaka! Since the custom pool takes time to start, enabling HC for pipelines might indeed highlight its value; definitely worth testing and seeing the results. Thanks for sharing this. The one we saw with 1.7 million CUs only includes code for updating an Azure SQL DB, but it's part of a pipeline with many activities that typically take around 20-30 minutes to complete.
@MorsiMasmoudi 1 month ago
Hello, and thanks for the video! Can you share the code for the database sample and the stored procedure that sends the email?
@saharshjain3203 3 months ago
Hey Dinesh, I want to create a dataflow that extracts all attachments in CSV or Excel format and saves them in the lakehouse, in table format, in Microsoft Fabric. Can you guide me on how to execute this task?
@KishoreLingala 5 months ago
Nice one. I tried this and got an error like this: "[Open Browser Console for more detailed log - Double click to close this message] Failed to load model class 'ReportModel' from module 'powerbi-jupyter-client' Error: No version of module powerbi-jupyter-client is registered"
@KishoreLingala 5 months ago
Nicely explained. Do we have the option of authenticating using a service principal (bearer token) in PowerBiClient?
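A minimal sketch of how service-principal authentication could be wired up here, assuming a hypothetical Entra ID app registration that has been allowed to use Power BI APIs in the tenant settings; the tenant/client IDs, secret, and workspace/report IDs are placeholders, and whether Report accepts a raw token directly depends on your powerbiclient version:

    import msal
    from powerbiclient import Report

    # Placeholders: a hypothetical app registration allowed to call
    # Power BI APIs in the tenant settings.
    tenant_id = "<tenant-id>"
    client_id = "<client-id>"
    client_secret = "<client-secret>"

    # Acquire a bearer token with the client-credentials flow via MSAL.
    app = msal.ConfidentialClientApplication(
        client_id,
        authority=f"https://login.microsoftonline.com/{tenant_id}",
        client_credential=client_secret,
    )
    token = app.acquire_token_for_client(
        scopes=["https://analysis.windows.net/powerbi/api/.default"]
    )["access_token"]

    # Hand the token to the embedded report; check your powerbiclient
    # version for the exact parameter (access_token vs. an auth object).
    report = Report(
        group_id="<workspace-id>",
        report_id="<report-id>",
        access_token=token,
    )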
@syednayyar 5 months ago
Can I connect and ingest data using Java (uploading a JSON file) to a lakehouse? I am not able to find any Java snippet for it. The use case is that I want to ingest data directly from my Android app (using Java); it may be HTTP or REST, I don't mind, but I want to ingest files (say, GPS data or sensor data) saved in JSON format into my lakehouse in Fabric. Is it even possible? Can you please make a video on it? (I don't want to use an Azure subscription.)
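OneLake exposes an ADLS Gen2-compatible (DFS) endpoint, so any client that can call that REST API, including the Azure Storage SDK for Java or plain HTTP from an Android app, can upload files to a lakehouse. A minimal Python sketch of the pattern, with the workspace, lakehouse, and credential values as placeholders (a Fabric workspace is still required, but no separate Azure storage account):

    from azure.identity import ClientSecretCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    # Placeholders: a service principal with access to the Fabric workspace.
    credential = ClientSecretCredential("<tenant-id>", "<client-id>", "<client-secret>")

    # OneLake speaks the ADLS Gen2 API through this endpoint.
    service = DataLakeServiceClient(
        account_url="https://onelake.dfs.fabric.microsoft.com",
        credential=credential,
    )

    # In OneLake the "file system" is the workspace, and paths start
    # with the lakehouse item.
    fs = service.get_file_system_client("<workspace-name>")
    file_client = fs.get_file_client("<lakehouse-name>.Lakehouse/Files/sensors/gps.json")

    with open("gps.json", "rb") as data:
        file_client.upload_data(data, overwrite=True)

The same calls map onto the azure-storage-file-datalake Java SDK, so the Android use case should be possible without writing raw HTTP.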
@hasnaamanaa1729 8 months ago
Good job, and thanks for the clarification.
@pratikpophali107 8 months ago
Hi! Thank you for this wonderful video. I am doing a similar thing, in which I am consuming data from a Kafka topic and sending it to Power BI. I want to replicate what you have done in order to understand it. Can you please share the code or a GitHub repository link for it?
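For the Kafka-to-Power-BI part, a rough sketch of the pattern, assuming a push (streaming) dataset and its table already exist in the service and the kafka-python package is used; the dataset ID, table name, topic, and token are placeholders:

    import json
    import requests
    from kafka import KafkaConsumer  # pip install kafka-python

    ROWS_URL = (
        "https://api.powerbi.com/v1.0/myorg/datasets/"
        "<push-dataset-id>/tables/RealTimeData/rows"
    )
    TOKEN = "<bearer-token>"

    consumer = KafkaConsumer(
        "sensor-topic",
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )

    # Each Kafka message becomes one row pushed to the dataset.
    for message in consumer:
        resp = requests.post(
            ROWS_URL,
            headers={"Authorization": f"Bearer {TOKEN}"},
            json={"rows": [message.value]},
        )
        resp.raise_for_status()

In practice rows would be batched rather than posted one at a time, since the push API has rate limits.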
@superfreiheit1 11 months ago
Can you show us how to configure Machine Learning Services on SQL Server 2022?
@manjunathkampali8942 11 months ago
Nicely explained. Do we have the option of authenticating using a service principal (bearer token) in PowerBiClient?
@wendzbrand 1 year ago
Where can I get the Table Name?
@cheatvideos3015 1 year ago
Wow, this is awesome! Did you share the code, so I can get it and play with it? Thanks in advance for everything; I'm looking forward to watching all your videos. Also, can you cover permissions in dashboards? For example, if you have cities and want each user to see only his own city, like what we do in the Power BI Desktop version?
@FilippoBesana 1 year ago
Hello, thanks for this video! I'm trying to implement Push and Delete using a capability of the latest SQL Server: it can call REST web services by creating an MSXML2.ServerXMLHttp object with the sp_OACreate/sp_OAMethod/sp_OADestroy system stored procedures. The main advantage is working directly on the database where the data is stored, bypassing Visual Studio and a lot of unnecessary complexity generated by writing, compiling, and releasing source code, and so on. Calling web services from the database, there is a very strange behaviour: the Push method works fine (it returns 200 as the status code, and I see values in real time in my dashboards on the Power BI service), but the Delete method returns 200 as the status code and does not delete anything! If I delete rows on the Microsoft web page you showed in the video, all rows are deleted. Do you have any suggestions?
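One thing worth checking in that scenario: the call that clears a push-dataset table is an HTTP DELETE on the same rows endpoint used for pushing, so if the sp_OAMethod "open" call passes POST (or the verb is otherwise ignored), the service can still answer 200 without deleting anything. A minimal Python equivalent for comparison, with the IDs and token as placeholders:

    import requests

    # Same endpoint as for pushing rows; the HTTP verb decides the action:
    # POST appends rows, DELETE removes all rows from the table.
    url = (
        "https://api.powerbi.com/v1.0/myorg/datasets/"
        "<dataset-id>/tables/<table-name>/rows"
    )
    resp = requests.delete(url, headers={"Authorization": "Bearer <token>"})
    print(resp.status_code)  # 200 on success; the table should now be empty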
@saadhafeez9171 1 year ago
Hi, I have a question. You used a static access token to remove the rows, which expires in two hours I think. How can we make it dynamic? I tried registering an app in Azure, but I think that does not have permission. Kindly let me know. Thanks.
@weronikakrol5812 1 year ago
Hello! It's a very interesting video, thanks a lot! Can you tell me if it is possible to create a push dataset and connect it to usual data sources like an Oracle database?
@DineshPriyankara 1 year ago
Yes. When it comes to a push dataset, you are responsible for connecting to the source, getting the required data, and pushing it to Power BI. If your code gets the right dataset from Oracle, it works just as shown in the video.
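A short sketch of that fetch-and-push pattern against Oracle, assuming the python-oracledb driver and a push dataset whose table matches the column names; all connection details, the query, and the IDs are placeholders (the push call itself is the same rows endpoint shown in the Kafka sketch above):

    import oracledb  # pip install oracledb
    import requests

    # Fetch the data you want to push; query and credentials are placeholders.
    conn = oracledb.connect(user="<user>", password="<password>", dsn="<host>/<service>")
    cursor = conn.cursor()
    cursor.execute("SELECT city, sales_amount FROM sales")
    rows = [{"City": c, "Sales": float(s)} for c, s in cursor]

    # Push the rows to the Power BI push dataset.
    requests.post(
        "https://api.powerbi.com/v1.0/myorg/datasets/<dataset-id>/tables/Sales/rows",
        headers={"Authorization": "Bearer <token>"},
        json={"rows": rows},
    )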
@susanoyekanmi861 1 year ago
It's a very nice video, but I have a question: can you just upload the dataset from a Google Sheet, create a streaming dataset with it, and then analyse it? Will it work the same way as seen in the video?
@BISimplifier 1 year ago
Note: you can connect to a Direct Lake dataset in Desktop in Live mode. Once you create the semantic model using web editing in the service, you can connect to the dataset in Live mode in Desktop to create reports.
@DineshPriyankara 1 year ago
Yes, unfortunately I had no way of showing that part; the workspace configured with a capacity is not allowed to be exposed :). The only limitation we see with the current version is creating datasets for Data Lake folders using the PBI service.
@tharindhuanuradha 1 year ago
Can I bring data from Infor Data Lake into a Microsoft Fabric lakehouse and perform some transformations?
@DineshPriyankara 1 year ago
I have not worked with Infor Data Lake; however, if it can be accessed using a generic protocol (HTTP, OData, REST API), then the data can be brought in using pipelines. Once it is in, any type of transformation is possible, using either Dataflows or notebooks.
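Once the data lands in the lakehouse, the notebook side of that can be as small as this PySpark sketch (the spark session is preloaded in Fabric notebooks; the paths and column names are placeholders):

    # Read raw files previously ingested into the lakehouse Files area.
    df = spark.read.json("Files/landing/infor_export.json")

    # Example transformation: project and filter.
    clean = df.select("id", "amount", "created_at").where(df.amount > 0)

    # Persist as a managed Delta table, queryable from SQL and Power BI.
    clean.write.format("delta").mode("overwrite").saveAsTable("infor_clean")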
@sangeetadevnath7409 1 year ago
Nicely explained. Thank you!
@Gayashan4lk 1 year ago
Great content.
@SKGA 1 year ago
Good stuff! 👏
@matrixlnmi169 1 year ago
These are tightly coupled approaches. What if a solution needs to move to another cloud provider? In those cases, a fresh investment would be required.
@anoopv5790 1 year ago
Very clear explanation. Excellent!
@HEMANTHKUMAR-gu7fi 1 year ago
Thank you so much!
@motionblender27 2 years ago
Awesomely done, thank you. If someone wants to update a table column to record that the mail was sent, just add one more SQL action step to execute the query.
@dpatro7245 2 years ago
If we want to return multiple output values from a notebook, how do we get each individual value into the pipeline? Can you please help me?
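A common pattern (not necessarily the only one) is to serialize all the values into a single JSON string and return it as the notebook's exit value, then unpack it in the pipeline; the value names below are made up for illustration:

    import json
    from notebookutils import mssparkutils  # preloaded in Fabric/Synapse notebooks

    # Pack every output into one JSON string; the Notebook activity
    # surfaces it as a single exit value.
    mssparkutils.notebook.exit(json.dumps({
        "row_count": 1250,
        "status": "succeeded",
        "max_date": "2024-01-31",
    }))

In the pipeline, the individual values can then be read with an expression along the lines of @json(activity('Notebook1').output.result.exitValue).row_count, though the exact output path can vary by runtime version.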
@giladsefti301 2 years ago
Excellent explanation. Thanks!
@SAMSARAN2108 2 years ago
Can I pass the SSO credentials that I use to log in to Power BI to make the Databricks connection? Please confirm. Regards, Sam
@vamsi.reddy1100 2 years ago
Your voice and pronunciation are like Kudvenkat's...
@sudarshant2340 2 years ago
Data in the Excel file, 2 columns (Country-name, Flag):
India, 1
Netherlands, 1
Romania, 0
I have an Excel file with the data above, stored in Azure Data Lake Gen2. Requirement: I want only the Flag = 1 data from the Excel file, using Azure Data Factory, without using SQL Server or a data flow. I'm trying to implement the requirement using the Lookup, Set Variable, and ForEach activities in ADF, but I'm unable to find a solution. Can you please give me your suggestions or ideas on how to implement the pipeline in ADF?
@felixkongho6969 2 years ago
Hi Dinesh, when creating the connection in the Logic App, after entering the server name, database name, and table name, I got the error "Could not retrieve values. Login failed for user <token-identified principal>". Any suggestions on how to resolve the connection issue?
@priyadarshinis8008 2 years ago
Thanks for the informative video 👍
@mohammedsiraj8523 2 years ago
Thanks, very useful!
@mohammedshabaaz9625 2 years ago
Are you looking for a thumbnail designer for your channel, to increase the click-through rate of your videos? If so, please let me know. Happy to help, Dinesh, and by the way, your work is amazing.
@reshmakonde597 2 years ago
What if new files are added daily? How do we control access on them?
@ash3rr 2 years ago
Why do I get a network timeout when I try to list my clusters?
@kamlakarpawar6671 2 years ago
How do I set up a recurring migration, through queries/scripts, from SQL Server (on-premises) to an Azure SQL database? I need help syncing the data to Azure SQL from SQL Server (on-premises). Available resources: two SQL Server databases on-premises, available on different servers; an Azure SQL database in the cloud; migration scripts/queries ready to fetch data from the on-premises SQL Server. Requirement: set up a scheduler that runs every 12 hours, i.e. twice a day, against Azure SQL. Using the migration scripts, the scheduler will fetch data from the on-premises SQL Server and insert it into the Azure SQL database.
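Outside ADF (which would normally do this with a schedule trigger and a self-hosted integration runtime), one simple approach is a script run every 12 hours by cron or Windows Task Scheduler. A rough pyodbc sketch, with connection strings, table, and query as placeholders, and a naive full reload instead of a proper watermark-based merge:

    import pyodbc  # pip install pyodbc

    # Placeholders: adjust drivers, servers, and credentials to your setup.
    SRC = ("DRIVER={ODBC Driver 18 for SQL Server};SERVER=onprem-sql;"
           "DATABASE=SourceDb;Trusted_Connection=yes;")
    DST = ("DRIVER={ODBC Driver 18 for SQL Server};"
           "SERVER=myserver.database.windows.net;DATABASE=AzureDb;"
           "UID=<user>;PWD=<password>;Encrypt=yes;")

    def sync_once():
        with pyodbc.connect(SRC) as src, pyodbc.connect(DST) as dst:
            rows = src.execute("SELECT Id, Name, UpdatedAt FROM dbo.Customers").fetchall()
            cur = dst.cursor()
            cur.fast_executemany = True
            # Naive full reload; a production job would merge on a watermark column.
            cur.execute("TRUNCATE TABLE dbo.Customers")
            cur.executemany(
                "INSERT INTO dbo.Customers (Id, Name, UpdatedAt) VALUES (?, ?, ?)",
                [tuple(r) for r in rows],
            )
            dst.commit()

    if __name__ == "__main__":
        sync_once()  # schedule this script to run every 12 hours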
@pragnyapuranik5355 2 years ago
Thank you … this helped me a lot :)
@chinmaygokhale4260 2 years ago
Does this encryption type support the LIKE operator and wildcards for searching?
@krishnachaitanyareddy2781 2 years ago
Can you explain how storage is mapped when creating Apache Spark? Nowhere did you mention using a particular storage account.
@manniedigicast 2 years ago
Hi Dinesh. How can I copy the output of a Lookup activity to Blob Storage or ADLS?
@fumini99 2 years ago
Really awesome presentation - thanks a lot!!
@swathibandarupalli803 2 years ago
Hi Dinesh, could you please create a video on different file storage formats like Parquet, Avro, ORC, etc., with some practical use cases?
@sravankumar6180 2 years ago
Could you also share a video on how the data is stored when we use a dedicated SQL pool, and a cost comparison between Synapse serverless, dedicated SQL pools, and a Databricks lakehouse?
@You77alesi 2 years ago
Hi Sravan, did you find anything useful in the meantime?
@sravankumar6180 2 years ago
Thanks for sharing this, Dinesh; this is very useful. I would like to understand more about using a serverless SQL pool as a data warehouse and for Power BI consumption. If we use Parquet, it doesn't support schema enforcement, and we run into read-write conflicts. Could you provide a video with your analysis on using a serverless SQL pool for BI consumption?