Simple but effective process and explanation as well. Great job!
@rajasdataengineering7585 2 years ago
Glad you liked it!
@omprakashreddy4230 3 years ago
Simply Awesome !! We are learning a lot from your videos.
@rajasdataengineering7585 3 years ago
Thanks Omprakash
@avinashpasupuleti07 3 years ago
Simply super, bro. Awaiting more videos from you on DB-related activities in Databricks.
@rajasdataengineering7585 3 years ago
Hi Avinash, thank you. Sure, will post more videos on DB-related activities.
@Khm3rick 2 years ago
Great video! Just one question: I saw you defined jdbcDriver, but I didn't see it used afterwards in jdbcUrl. What is it for?
@ShivaKumar-dj8bj 1 year ago
Yes, I have also observed it. Later, in other videos, I found out that we have to add it as an option when reading data into a DataFrame, like .option("driver", jdbcDriver)
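A minimal sketch of what this reply describes, with the driver class passed as a read option. The server, database, and table names below are placeholders, not values from the video:

```python
# Hypothetical connection details -- replace with your own server/database/credentials.
jdbcHostname = "myserver.database.windows.net"
jdbcDatabase = "mydb"
jdbcUrl = f"jdbc:sqlserver://{jdbcHostname}:1433;database={jdbcDatabase}"
jdbcDriver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"

def read_table(spark, table, user, password):
    # The "driver" option is where jdbcDriver is actually consumed;
    # without it the read can fail with "No suitable driver".
    return (spark.read.format("jdbc")
            .option("url", jdbcUrl)
            .option("dbtable", table)
            .option("user", user)
            .option("password", password)
            .option("driver", jdbcDriver)
            .load())
```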
@sushilkushwaha260 2 years ago
Awesome, very nice explanation.
@rajasdataengineering7585 2 years ago
Thank you
@sravankumar1767 3 years ago
Nice explanation bro 👍
@prathapganesh7021 1 year ago
I really appreciate this video, thank you 🙏
@rajasdataengineering7585 1 year ago
Thanks Prathap! Glad you found it useful.
@prathapganesh7021 1 year ago
Can I join for a paid project? Or else, how can I contact you?
@kylebrogan6416 2 years ago
What about just being able to access those same tables via an existing Databricks catalog in the Hive metastore structure, for example? Is there a way to do that?
@dataisfun4964 2 years ago
Thanks, worked perfectly.
@rajasdataengineering7585 2 years ago
Great
@VirajithaPanguluri 2 years ago
Can we mention the type of authentication while connecting? What if we have only an Azure Active Directory password? How do we mention that?
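One way to do what this question asks: the Microsoft SQL Server JDBC driver accepts an `authentication` connection property, including `ActiveDirectoryPassword`. A sketch with placeholder server and account names:

```python
# Sketch: Azure Active Directory password authentication with the
# Microsoft SQL Server JDBC driver. All names below are placeholders.
jdbcUrl = (
    "jdbc:sqlserver://myserver.database.windows.net:1433;"
    "database=mydb;"
    "authentication=ActiveDirectoryPassword"  # use an AAD user/password instead of SQL auth
)

def read_with_aad(spark, table, aad_user, aad_password):
    return (spark.read.format("jdbc")
            .option("url", jdbcUrl)
            .option("dbtable", table)
            .option("user", aad_user)         # e.g. user@yourtenant.onmicrosoft.com
            .option("password", aad_password)
            .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
            .load())
```

Note the cluster also needs the driver's AAD dependencies (e.g. the MSAL library) on the classpath for this mode to work.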
@a2zhi976 1 year ago
Like in Unix, can I save all these details in one file and call it at the beginning of the scripts?
@rajasdataengineering7585 1 year ago
Yes, we can use a YAML or JSON configuration file to save the details; at run time, Spark can read the configuration file and process accordingly.
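A small sketch of the JSON variant of this idea. The file layout and key names here are hypothetical, just to show the pattern of building the JDBC URL from a config file:

```python
import json
import os
import tempfile

# Hypothetical config file -- adjust keys to your own convention.
cfg_text = """{
  "hostname": "myserver.database.windows.net",
  "port": 1433,
  "database": "mydb"
}"""
cfg_path = os.path.join(tempfile.gettempdir(), "jdbc_config.json")
with open(cfg_path, "w") as f:
    f.write(cfg_text)

# At the start of the notebook/script, load the config once...
with open(cfg_path) as f:
    cfg = json.load(f)

# ...and derive the connection URL from it instead of hard-coding values.
jdbcUrl = f"jdbc:sqlserver://{cfg['hostname']}:{cfg['port']};database={cfg['database']}"
```

Passwords should still come from a secret store rather than the config file itself.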
@alwalravi 1 year ago
How do we write this product table data into blob storage in Parquet format in a Databricks notebook? Please help.
@rajasdataengineering7585 1 year ago
We can use the DataFrame writer: df.write.format("parquet").save(location)
@rajasekharmedia8987 11 months ago
I want to run a query against SQL and load the result into one variable. Can we do that? Like SELECT MAX(id) FROM the SQL table; I am using this id for comparison in the next steps.
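Yes, this is possible by pushing the query down through the JDBC reader and collecting the single row into Python. A sketch, assuming a hypothetical dbo.product table:

```python
# Sketch: push a scalar query down to SQL Server and collect the result
# into a plain Python variable. Table name and connection values are placeholders.
# The JDBC "dbtable" option accepts a parenthesised subquery with an alias.
max_id_query = "(SELECT MAX(id) AS max_id FROM dbo.product) AS q"

def get_max_id(spark, jdbcUrl, user, password):
    df = (spark.read.format("jdbc")
          .option("url", jdbcUrl)
          .option("dbtable", max_id_query)
          .option("user", user)
          .option("password", password)
          .load())
    return df.collect()[0]["max_id"]   # ordinary Python value, usable in later steps
```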
@veerag9426 2 years ago
Super nice video
@rajasdataengineering7585 2 years ago
Thank you
@arnabsontu6578 2 years ago
Sir, can we also create, view, alter, and run stored procedures from Databricks?
@rajasdataengineering7585 2 years ago
Hi Arnab, stored procedures can't be created in Databricks. Views can be created and altered as well.
@betterahyosi 1 year ago
You didn't use the jdbcDriver. What is the purpose of having jdbcDriver?
@rajasdataengineering7585 1 year ago
Why do you say I didn't use the JDBC driver? Look at 7:30 in the video.
@betterahyosi 1 year ago
@rajasdataengineering7585 I meant that you didn't pass the jdbcDriver value into the JDBC URL.
@shashikantchaturvedi1559 1 year ago
Hi Raja, I am following this tutorial step by step, but I got an error while running the second cell that gets the product table. The error is "java.sql.SQLException: No suitable driver". Can you please help in this case?
@shashikantchaturvedi1559 1 year ago
Now I've got it; something was wrong in preparing the connection. I can connect and get the data from Azure SQL Server. Thanks Raja.
@rajasdataengineering7585 1 year ago
Glad to hear you fixed the issue 👍🏻
@vaddenata6735 1 year ago
Thank you so much, Sir ❤
@rajasdataengineering7585 1 year ago
Most welcome! Hope you find it useful
@vaddenata6735 1 year ago
@rajasdataengineering7585 Yes 💯
@bhumikalalchandani321 1 year ago
@rajasdataengineering7585 Sir, getting "No suitable driver" on running the df.
@AjithKumar-cj7hh 1 year ago
What if the data is huge, like 100 GB? Is it still recommended?
@rajasdataengineering7585 1 year ago
A JDBC connection has performance issues when handling huge amounts of data, but there are options to improve performance which can be applied depending on the use case.
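The main such options are Spark's JDBC partitioning settings, which split one large read into parallel range queries. A sketch with assumed bounds and table names:

```python
# Sketch: range-partition a large JDBC read so Spark opens several
# parallel connections instead of one. Bounds/table names are placeholders.
partition_opts = {
    "partitionColumn": "id",      # must be a numeric, date, or timestamp column
    "lowerBound": "1",            # min expected value of the partition column
    "upperBound": "10000000",     # max expected value (rows outside still load)
    "numPartitions": "16",        # number of parallel JDBC connections
    "fetchsize": "10000",         # rows fetched per round trip
}

def read_large_table(spark, jdbcUrl, user, password):
    return (spark.read.format("jdbc")
            .option("url", jdbcUrl)
            .option("dbtable", "dbo.big_table")
            .option("user", user)
            .option("password", password)
            .options(**partition_opts)
            .load())
```

Too many partitions can overload the source database, so numPartitions is usually tuned against what the server can tolerate.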
@suresh.suthar.24 1 year ago
Great video!
@rajasdataengineering7585 1 year ago
Glad you enjoyed it
@shivangishingatwar1356 3 years ago
Could you help me establish the connection string using Azure Active Directory authentication mode?
@apoorvsrivastava7121 2 years ago
Sir, how can we connect using secrets from Key Vault?
@rajasdataengineering7585 2 years ago
We need to create a secret scope in Databricks first to set up the integration between Key Vault and Databricks.
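Once the Key Vault-backed secret scope exists, the credentials are fetched with dbutils.secrets.get. A sketch with assumed scope and secret names (dbutils exists only inside a Databricks notebook):

```python
# Sketch: read SQL credentials from an Azure Key Vault-backed secret scope.
# "kv-scope", "sql-user", and "sql-password" are placeholder names.
scope = "kv-scope"

def read_with_secrets(spark, dbutils, jdbcUrl, table):
    user = dbutils.secrets.get(scope=scope, key="sql-user")
    password = dbutils.secrets.get(scope=scope, key="sql-password")
    # Secret values are redacted in notebook output, so they never
    # appear in the code or the cell results.
    return (spark.read.format("jdbc")
            .option("url", jdbcUrl)
            .option("dbtable", table)
            .option("user", user)
            .option("password", password)
            .load())
```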
@apoorvsrivastava7121 2 years ago
@rajasdataengineering7585 Thank you, will check and do 💪
@melvin9993 2 years ago
Simple and effective
@suman_vedakshaB 1 year ago
Hello sir, while importing data, how do we know which rows are modified and which is the latest data? I mean, how do we handle updated data? Please reply.
@Jayalakshmi-r9t 7 months ago
How do I get that IP address? I did not find it while logging in. Please can you say?
@rajasdataengineering7585 7 months ago
You can get it from the command prompt using the ipconfig command.
@natarajbeelagi569 3 months ago
I want to connect to my localhost, but the connection is getting refused. Can you please make a video on it?
@arnabsontu6578 2 years ago
Sir, is there any way to hide the password so it isn't exposed in the code?
@rajasdataengineering7585 2 years ago
Yes Arnab, we can use Azure Key Vault.
@akashsharma4769 2 years ago
We can also use a Databricks secret scope.
@rajasdataengineering7585 2 years ago
Yes, we can use a Databricks secret scope.
@Jayalakshmi-r9t 7 months ago
How do I get that IP address? For me it was not visible.
@SantoshKumar-yr2md 10 months ago
How do we get multiple tables from Azure SQL into a Databricks notebook?
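There is no reply in the thread, but one common pattern is simply looping over table names with the same JDBC options. A sketch with a hypothetical table list:

```python
# Sketch: load several Azure SQL tables into separate DataFrames.
# The table list and connection values are placeholders.
tables = ["dbo.product", "dbo.customer", "dbo.orders"]

def load_all(spark, jdbcUrl, user, password):
    frames = {}
    for t in tables:
        frames[t] = (spark.read.format("jdbc")
                     .option("url", jdbcUrl)
                     .option("dbtable", t)
                     .option("user", user)
                     .option("password", password)
                     .load())
    return frames   # dict mapping table name -> DataFrame
```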
@govardhanbola1195 2 years ago
Hard-coding the password in the code is not recommended. Can we get the password from Azure Key Vault? Can you please let us know the steps for that?
@rajasdataengineering7585 2 years ago
We need to integrate Azure Key Vault with Databricks by creating a secret scope.
@MrTejasreddy 2 years ago
Simply superb. Can you make a video on how to create an account in Databricks Community Edition for free?