Enjoying the PySpark tutorials! Can you make a video on setting up Azure and navigating the portal? It would be super helpful. Thanks for the great content!
@a2zhi976 a year ago
You are my guru from now onwards.
@rajasdataengineering7585 a year ago
Thank you
@shivanisaini2076 2 years ago
This video is worth watching. My concepts about accessing files in Databricks are clear now. Thank you, sir!
@rajasdataengineering7585 2 years ago
Thanks Shivani!
@ndbweurt34485 2 years ago
Very clear explanation. God bless you.
@rajasdataengineering7585 2 years ago
Thank you
@HariprasanthSenthilkumar a month ago
Can you please make a video on connecting to ADLS with a service principal?
@rajasdataengineering7585 a month ago
Sure, I will do that.
@dhivakarb-ds9mi 4 months ago
I am getting this error: Operation failed: "This request is not authorized to perform this operation using this permission."
@natarajbeelagi569 2 months ago
How can we hide access keys?
@rajasdataengineering7585 2 months ago
We can use Databricks secret scopes so the access keys are never exposed in the notebook.
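For example, a minimal sketch of reading the key from a secret scope instead of hard-coding it (the scope name, key name, and storage account below are placeholders, not from the video):

```python
# Hypothetical scope/key/account names; dbutils and spark are available in Databricks notebooks.
storage_account = "mystorageaccount"

# Pull the access key from a Databricks secret scope so it never appears in the notebook.
access_key = dbutils.secrets.get(scope="adls-scope", key="storage-account-key")

spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    access_key
)
```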
@alexfernandodossantossilva4785 2 years ago
If we have a VNet on the storage account, how can we access it?
@naveenkumarsingh3829 6 months ago
Hey, you are using the location wasbs://, which is an Azure Blob Storage path, and sometimes you use abfss://, which is an Azure Data Lake Gen2 path. Since I am still learning, I am getting really confused. Your video is about the ADLS connection with Databricks, so shouldn't the file path be abfss://?
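For reference, a rough illustration of the two path styles under assumed account/container names: wasbs:// goes through the Blob endpoint, while abfss:// goes through the ADLS Gen2 (dfs) endpoint, so abfss:// is the one normally used for Data Lake Gen2.

```python
# Hypothetical container and storage account names, shown only to contrast the two schemes.
blob_path = "wasbs://mycontainer@mystorageaccount.blob.core.windows.net/people.csv"   # Blob Storage endpoint
adls_path = "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/people.csv"    # ADLS Gen2 endpoint

df = spark.read.format("csv").option("header", "true").load(adls_path)
```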
@rambevara5702 a year ago
Don't we need an app registration for the data lake?
@rajasdataengineering7585 a year ago
That is another way of integrating, through a service principal.
@rambevara5702 a year ago
@@rajasdataengineering7585 Whatever it is, that's fine. Brother, where can I get this Databricks notebook? Do you have any GitHub repo?
@felipedonosotapia a year ago
Thanks so much!!! Nice tutorial.
@rajasdataengineering7585 a year ago
Glad it was helpful!
@lucaslira5 2 years ago
What would I do if the container had multiple files instead of just one?
@rajasdataengineering7585 2 years ago
We can use a wildcard to select multiple files.
@lucaslira5 2 years ago
@@rajasdataengineering7585 What would this wildcard look like? I have two files in the container (city.csv and people.csv), but it's only loading people.csv.
@rajasdataengineering7585 2 years ago
You can give *.csv so that it picks up all the CSV files.
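As a rough sketch, assuming the same CSV options as in the video and a hypothetical container path:

```python
# "*.csv" is a glob pattern that matches every CSV file in the folder;
# the container and account names are placeholders.
file_location = "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/*.csv"

df = (spark.read.format("csv")
      .option("header", "true")
      .option("inferSchema", "true")
      .load(file_location))
```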
@lucaslira5 2 years ago
@@rajasdataengineering7585 But I would like to load a specific file. For example, my blob has 50 .csv files, but I only want to load people.csv to perform an ETL.
@lucaslira5 2 years ago
Would it be here, for example, to put .option("name","people.csv")? df = spark.read.format("csv").option("inferSchema","true").option("header", "true").option("delimiter",";").option("encoding","UTF-8").load(file_location)
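As far as I know, the CSV reader has no "name" option; the usual way to pick a single file is to point load() (or the file_location variable) at that file directly. A sketch with a placeholder path:

```python
# Load only people.csv by naming it in the path; account/container names are hypothetical.
file_location = "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/people.csv"

df = (spark.read.format("csv")
      .option("inferSchema", "true")
      .option("header", "true")
      .option("delimiter", ";")
      .option("encoding", "UTF-8")
      .load(file_location))
```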
@Ramakrishna410 2 years ago
Great knowledge. How can we apply access policies on mounted containers? For example, 50 users have access to Databricks, so all 50 users can see all the files under the mounted container, but I want to give read access to only a few users. How can we do that?
@rajasdataengineering7585 2 years ago
Hi Alavala, good question. Mount points can be accessed from Databricks through a service principal or Azure Active Directory. If we use a service principal (SP) to create a mount point, all users/groups in the Databricks workspace can access all files/folders under the mount point. So if you want to restrict access to a set of people, there are many ways. One common approach is to use AAD to create the mount point so that user access can be controlled through IAM in the Azure portal. Another approach could be to create two different Databricks workspaces and access the mount point through two different service principals, one with read access and another with write access. Hope it helps.
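For illustration, a rough sketch of the service-principal style mount (the client ID, tenant ID, secret scope, and paths below are all placeholders):

```python
# Sketch: mounting an ADLS Gen2 container with a service principal via OAuth.
# All IDs, scope/key names, and paths are hypothetical.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-client-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="adls-scope", key="sp-client-secret"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://mycontainer@mystorageaccount.dfs.core.windows.net/",
    mount_point="/mnt/mydata",
    extra_configs=configs,
)
```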
@sujitunim 2 years ago
Really very helpful... Could you please create a video on on-premises Kafka integration with Databricks?
@rajasdataengineering7585 2 years ago
Sure Sujit, I will do a video on this requirement.
@rajivkashyap2816 a year ago
Hi sir, is there any Git link so that we can copy and paste the code?
@jagadeeswaran330 8 months ago
Nice explanation!
@rajasdataengineering7585 8 months ago
Glad it was helpful! Thanks
@lucaslira5 2 years ago
With this option, is it possible to write to the data lake, or only read?
@rajasdataengineering7585 2 years ago
We can write as well
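A minimal sketch of a write over the same connection, assuming df is the DataFrame from the read step and the output path is a placeholder:

```python
# Write the DataFrame back to ADLS Gen2; format, mode, and path are illustrative choices.
output_path = "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/output/people"

df.write.format("parquet").mode("overwrite").save(output_path)
```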
@dataengineerazure2983 2 years ago
@@rajasdataengineering7585 How do I get the source dataset (CSV files)? Thanks.
@AkashVerma-o7o 9 months ago
Is it free to use Azure Data Lake?
@rajasdataengineering7585 9 months ago
No, it's not free
@kartechindustries3069 2 years ago
Sir, does Azure Data Lake come under the community or free services?
@rajasdataengineering7585 2 years ago
No, Azure Data Lake is a paid service, but Microsoft provides a one-month free subscription with some free credit. You can take advantage of it for learning purposes.
@DivyenduJ a year ago
Hello all, I am new to this and getting the error below in step 1; many thanks if anyone could help: Invalid configuration value detected for fs.azure.account.key
@rajasdataengineering7585 a year ago
Hi, it seems the access key is invalid. Could you check it once again in the storage account?
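If the first step is the usual account-key configuration, it looks roughly like the sketch below (the storage account name is a placeholder); the error typically means the value passed as the key is not a valid current access key.

```python
# Register the storage account access key for direct abfss:// access.
spark.conf.set(
    "fs.azure.account.key.mystorageaccount.dfs.core.windows.net",
    "<current-access-key-from-the-storage-account>"
)
```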
@DivyenduJ a year ago
@@rajasdataengineering7585 Thanks a lot, sir, for the guidance, it worked. I had mistakenly set it to a rotated key; maybe that's the reason.
@rajasdataengineering7585 a year ago
Glad to know it worked!
@subbareddybhavanam5829 a year ago
Hi Raj, can you please add the data files too, like the CSV and JSON files?