Thanks so much sir, very understandable way of explaining.
@rajasdataengineering7585 11 months ago
You are most welcome
@ashswinsubbiah3752 a year ago
You are a gem sir.
@rajasdataengineering7585 a year ago
Thank you, Ashswin!
@kanikevenki9988 a year ago
Thanks sir for sharing this
@rajasdataengineering7585 a year ago
You are welcome!
@kanikevenki9988 a year ago
@@rajasdataengineering7585 Sir, I'm preparing for an interview for Spark with Azure. Can you make videos about Azure also?
@rajasdataengineering7585 a year ago
Sure Venki, will create videos on other Azure services like ADF, Synapse, ADLS, etc.
@kanikevenki9988 a year ago
@@rajasdataengineering7585 Thanks sir for your valuable time and for replying.. I hope the classes will start soon
@rajasdataengineering7585 a year ago
Sure!
@Learn2Share786 a year ago
Thanks, can you please make a separate video on question no.21, i.e. troubleshooting OOM issues in real time?
@rajasdataengineering7585 a year ago
Sure Fahad, will make a separate video on this requirement
@tallaravikumar4560 a year ago
How to tune a Spark job which is already running in Databricks?
@rajasdataengineering7585 a year ago
You can enable Adaptive Query Execution to handle a few performance bottlenecks automatically. You can also use the Spark UI to get metrics about your job and tune based on that information.
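As a minimal PySpark sketch of that suggestion (illustrative only, not from the video): the settings below are standard Spark 3.x configuration keys for Adaptive Query Execution, and on Databricks the `spark` session already exists, so `getOrCreate()` simply reuses it.

```python
from pyspark.sql import SparkSession

# On Databricks a `spark` session already exists; getOrCreate() reuses it.
spark = SparkSession.builder.appName("aqe-tuning-sketch").getOrCreate()

# Enable Adaptive Query Execution so Spark can re-optimize shuffle plans at runtime.
spark.conf.set("spark.sql.adaptive.enabled", "true")

# Optional AQE features: coalesce small shuffle partitions and mitigate skewed joins.
spark.conf.set("spark.sql.adaptive.coalescePartitions.enabled", "true")
spark.conf.set("spark.sql.adaptive.skewJoin.enabled", "true")

# The Spark UI (SQL and Stages tabs) then shows the adaptively chosen plans and metrics.
```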
@dvsrikanth22 a year ago
Nice video, thanks for sharing 👍👍. Sir, can you please do a video on incremental load from ADLS to a SQL table using CSV files, and also maintain an audit log table recording when each file was loaded, the load time, and the number of records in that particular file, so that on the next load the notebook picks up only new files from the ADLS path?
@rajasdataengineering7585 a year ago
Sure Srikanth, will create a video on this requirement
@sabesanj5509 a year ago
Thanks for the part 2 video, Raja bro 👍. Those topics have minimal points for all the questions, but some unique and important points to be told in an interview..
@rajasdataengineering7585 a year ago
Glad you liked it!
@sabesanj5509 a year ago
@@rajasdataengineering7585 yes bro very informative🙌🏽