Very cool. I'd used Kanban boards in Azure organizations and Azure DevOps. ADF is so similar, yet I'd had no idea about it.
@NripaEmpowerthroughknowledge 1 year ago
Thank you for sharing. Insightful.
@SoftWizCircle 1 year ago
Glad it was helpful!
@apurvgolatgaonkar-6765 1 year ago
Thanks, sir. Your video is very helpful for me. 🙂
@SoftWizCircle 1 year ago
Thanks
@pigrebanto 7 months ago
Thanks. Does this work with Azure Blob Storage or Azure Data Lake?
@SoftWizCircle 7 months ago
Yes, it does.
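For example, reading the same file from either service differs mainly in the URI scheme. A minimal sketch, assuming a Databricks notebook where `spark` is predefined; the account and container names are placeholders:

```python
# Rough sketch (not from the video): the same Spark read works against both
# services; only the URI scheme and endpoint change. Names in angle
# brackets are placeholders for your own account/container.

# Azure Blob Storage uses the wasbs:// scheme:
blob_df = spark.read.csv(
    "wasbs://<container>@<account>.blob.core.windows.net/input/data.csv",
    header=True,
)

# Azure Data Lake Storage Gen2 uses the abfss:// scheme:
adls_df = spark.read.csv(
    "abfss://<container>@<account>.dfs.core.windows.net/input/data.csv",
    header=True,
)
```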
@lorenzosvezia5384 9 months ago
Thanks for sharing this useful info. I actually have a similar problem: I'm trying to create a service principal on Databricks, but I don't understand how the token works. How does it work in that case?
@SoftWizCircle 8 months ago
Thank you for watching and sharing your question! Service principals and tokens can indeed be a bit tricky. In Databricks, a service principal is used to authenticate and authorize an application to access Databricks APIs without requiring personal user credentials. The token you mentioned acts as a key that the service principal uses to authenticate its requests. To create a service principal and use a token with it in Databricks, you'll need to:
1. Register an application with Azure AD to obtain a service principal.
2. Assign the necessary permissions to your service principal, depending on what tasks it needs to perform.
3. Generate a token for the service principal in Azure, which will be used in your Databricks configuration.
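To make step 3 concrete, here is a minimal Python sketch using the `azure-identity` package; everything in angle brackets is a placeholder, and the workspace URL and IDs are assumptions, not values from the video:

```python
# Acquire an Azure AD token for a service principal and call the
# Databricks REST API with it. Requires: pip install azure-identity requests
import requests
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",          # Azure AD tenant
    client_id="<app-client-id>",      # the registered application (step 1)
    client_secret="<client-secret>",  # secret created for the app
)

# 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the well-known Azure AD
# resource ID for Azure Databricks.
token = credential.get_token(
    "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default"
).token

# Use the token as a Bearer token on any Databricks REST call,
# e.g. listing clusters in the workspace.
resp = requests.get(
    "https://<workspace>.azuredatabricks.net/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()
print(resp.json())
```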
@beaufonville1807 11 months ago
How do you know where and what to mount?
@SoftWizCircle 11 months ago
It depends on what data you have, where it is stored, and the path you want to consume it from.
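For illustration, mounting an ADLS Gen2 container with a service principal typically looks like the sketch below; it assumes a Databricks notebook (where `dbutils` is predefined) and an existing secret scope, and every name in angle brackets is a placeholder:

```python
# Sketch of a typical ADLS Gen2 mount via OAuth / service principal.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<app-client-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="<scope>", key="<secret-key>"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# "What to mount" is the container/folder where the data lives (source);
# "where" is the /mnt path your jobs will read from (mount_point).
dbutils.fs.mount(
    source="abfss://<container>@<account>.dfs.core.windows.net/",
    mount_point="/mnt/<name>",
    extra_configs=configs,
)
```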
@rakeshreddy1822 1 year ago
Hello... I found your video to be very helpful. Let me know if you can provide Azure Data Factory and Azure Databricks training.
@SoftWizCircle 1 year ago
Right now I am not able to find the time, so apologies, I cannot help you at the moment.
@Gowtham-hm3fr 6 months ago
Task: Set up a basic data pipeline in Azure.

Step 1: Data Ingestion
Azure service: Azure Event Hubs or Azure Blob Storage
1. Create an Azure Event Hubs namespace or Blob Storage account.
2. Set up an Event Hub or blob container to receive incoming data.
3. Configure access policies and keys for ingestion.

Step 2: Data Transformation
Azure service: Azure Databricks or Azure HDInsight (Spark)
1. Provision an Azure Databricks workspace or HDInsight cluster.
2. Develop a PySpark or Spark job to process and transform data.
3. Schedule or manually run the Spark job to process incoming data.

Step 3: Data Storage
Azure service: Azure Data Lake Storage Gen2 (ADLS Gen2) or Azure SQL Database
1. Create an ADLS Gen2 storage account or Azure SQL Database.
2. Define folders or tables to store processed data.
3. Ensure proper access control and data retention policies.

Step 4: Orchestration and Monitoring
Azure service: Azure Data Factory
1. Set up an Azure Data Factory (ADF) instance.
2. Create pipelines to orchestrate data movement from Event Hubs/Blob to Databricks/HDInsight to ADLS Gen2/SQL Database.
3. Configure triggers for pipeline execution and monitoring through ADF.

This is my task. How do I do it? If you have a specific video covering this task, please share it with me.
@SoftWizCircle 6 months ago
If you look through the other Azure videos on this channel, you will find videos covering all the resources you have mentioned. You just need to stitch them together in the proper way to achieve your task.
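As a rough illustration of how the pieces stitch together, Step 2 of the task above could be a small PySpark job that reads raw events landed in Blob Storage and writes the cleaned result to ADLS Gen2. This is a sketch, not code from any specific video; the column name and everything in angle brackets are hypothetical:

```python
# Assumes a Databricks notebook where `spark` is predefined and storage
# access is already configured (e.g. via a mount or service principal).
from pyspark.sql import functions as F

# Step 1 output: raw JSON events landed in Blob Storage.
raw = spark.read.json(
    "wasbs://<ingest-container>@<account>.blob.core.windows.net/events/"
)

# Step 2: example transformation: drop rows missing the (hypothetical)
# "id" column and stamp each row with its processing date.
clean = (
    raw.dropna(subset=["id"])
       .withColumn("processed_date", F.current_date())
)

# Step 3: persist to ADLS Gen2 as Parquet, partitioned by date, ready
# for an ADF pipeline (Step 4) to orchestrate and monitor.
(clean.write
      .mode("append")
      .partitionBy("processed_date")
      .parquet("abfss://<curated-container>@<account>.dfs.core.windows.net/events/"))
```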