You are amazing; you made my day by helping me complete the entire Spark setup and learn how to run a master with worker nodes.
@anikethdeshpande8336 6 months ago
this is the best video tutorial on this concept!!! it worked well! thank you
@thomasbates9189 4 months ago
You are so helpful! Thank you sir!
@berginv6827 1 year ago
Good Work !!!
@revathi6035 6 months ago
Nice explanation. Thanks.
@jasonrayen6285 4 years ago
Awesome Video! Hats off!
@shabbirgovernor 4 years ago
Thanks 🙂
@mrkash8143 3 years ago
Hi, good one. I see an error in the last attempt: "remote host forcibly closed the connection". Can you explain it and how to fix it? The GUI will show you that the job got killed, if I remember correctly.
@bisiflora 3 years ago
Thank you very much, Sir
@ammar287 10 months ago
Any idea how to check parameters like time, cores, and memory used? I can't see them on the 8080 and 8081 UIs.
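One hedged suggestion: besides the HTML pages, the standalone master's web UI also serves a machine-readable summary at the `/json` path, which includes each worker's cores/memory (used and total) and each application's duration. This assumes the master UI is on `localhost:8080`; adjust the host and port to your setup.

```shell
# Query the standalone master's JSON status endpoint (assumes master UI on localhost:8080).
# The response lists workers (coresused, memoryused) and active/completed apps with durations.
curl -s http://localhost:8080/json
```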
@trungthanh4134 1 year ago
How can we set up a Spark master/worker in a Docker container on a VM, and then connect to it from a local machine?
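A minimal sketch of one possible approach, with assumptions clearly flagged: the image name `apache/spark` and its `/opt/spark` install path are assumptions (use whatever image you actually run), and `<vm-ip>` stays a placeholder for your VM's address. The key idea is publishing the master's RPC port (7077) and web UI port (8080) so a local client can reach them.

```shell
# Hypothetical sketch: run a standalone master inside Docker on the VM,
# publishing the ports a remote client needs.
docker run -d --name spark-master \
  -p 7077:7077 -p 8080:8080 \
  apache/spark \
  /opt/spark/bin/spark-class org.apache.spark.deploy.master.Master --host 0.0.0.0
# From the local machine, point workers or spark-submit at spark://<vm-ip>:7077
# and open http://<vm-ip>:8080 for the UI. Ensure the VM's firewall allows both ports.
```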
@Kyrios_X 3 years ago
Hi, I have a problem importing findspark. I get the message "Import findspark could not be resolved". Do I have to install it in a specific folder (e.g. the Spark folder)?
@pratheekp8966 3 years ago
Yeah, same problem. What to do?
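A hedged note on this error: "could not be resolved" usually comes from the editor (e.g. VS Code/Pylance) using a different Python interpreter than the one where `pip install findspark` ran, not from installing into the wrong folder. A quick stdlib-only way to see which interpreter a script runs under and whether a module is visible to it (findspark is just the example name here):

```python
import importlib.util
import sys

def module_available(name: str) -> bool:
    """Return True if `name` is importable by the interpreter running this script."""
    return importlib.util.find_spec(name) is not None

print("Interpreter:", sys.executable)  # compare with the interpreter selected in your editor
print("findspark visible:", module_available("findspark"))
# If it prints False, install into *this* interpreter:  python -m pip install findspark
```

If the editor still flags the import after installing, switching the editor's selected interpreter to the one printed above is usually the fix.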
@prabhaskrishnan 3 years ago
Thank you. It's working exactly as per your guidance. Could you use the spark-submit command to run the same? I'm getting an error on my machine when I use spark-submit: "Python was not found".
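A hedged pointer on "Python was not found": spark-submit launches a bare `python` command, and on Windows that can resolve to the Microsoft Store app-execution alias instead of a real interpreter. Spark honors the `PYSPARK_PYTHON` environment variable, so pointing it at a concrete interpreter often helps. The path below is an assumption; use the output of `where python` on your machine.

```shell
:: Windows cmd sketch -- point Spark at an explicit Python interpreter.
:: C:\Python310\python.exe is a placeholder path; substitute your own.
set PYSPARK_PYTHON=C:\Python310\python.exe
set PYSPARK_DRIVER_PYTHON=C:\Python310\python.exe
spark-submit your_script.py
```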
@francescofurini18313 жыл бұрын
thanks! you saved me!
@shabbirgovernor 3 years ago
Felt like batman!😅
@fozantalat4509 1 year ago
Hi Shabbir, can you please show us how to stop the Spark master and the workers, and also show an example of spark-submit?
@shabbirgovernor 1 year ago
Sure, will try to make a video on it someday.
@mahak8914 1 year ago
Can we create two workers and a driver on a single personal laptop?
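Yes, this should work: a standalone master and multiple workers can all run on one machine, each in its own terminal. A hedged sketch, reusing the video's `spark-class` commands (on Windows the script is `spark-class2.cmd`); the second worker needs a different web UI port, and 8082 is an arbitrary choice.

```shell
# Terminal 1: start the master (UI on localhost:8080 by default)
spark-class org.apache.spark.deploy.master.Master
# Terminal 2: first worker, registering with the local master
spark-class org.apache.spark.deploy.worker.Worker spark://localhost:7077
# Terminal 3: second worker, on a different UI port to avoid clashing with 8081
spark-class org.apache.spark.deploy.worker.Worker spark://localhost:7077 --webui-port 8082
```

The driver is then just whatever script or shell you point at `spark://localhost:7077`.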
@mareenafrancis3793 3 years ago
Hi sir, I am working on an 18-core machine, so how can I add all the cores to the cluster as a worker?
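A hedged note: by default a standalone worker offers all cores on its machine, so an 18-core box should already advertise 18 cores. To set it explicitly (or cap it), the worker accepts `--cores` and `--memory` flags; the `14g` figure below is just an illustrative value, not a recommendation.

```shell
# Start a worker that explicitly offers 18 cores and 14 GB to the cluster.
spark-class org.apache.spark.deploy.worker.Worker spark://localhost:7077 --cores 18 --memory 14g
```

The same can be done persistently via `SPARK_WORKER_CORES` and `SPARK_WORKER_MEMORY` in `conf/spark-env.sh` (or `spark-env.cmd` on Windows).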
@anandlamani2911 3 months ago
Hi, does the same procedure work for two laptops, i.e., a cluster setup across two computers?
@PragadeeshSP-i9h 18 days ago
Were you able to do it?
@s.m.shreyas918 1 year ago
Thank you, sir, no other video taught this. But I also wanted to try this out at a production level, so I have two laptops: a Sony with 4 GB RAM and a Lenovo with 8 GB RAM. I planned to make the Sony the master and the Lenovo the worker node, but I was not able to access the master from the worker terminal; although a "connected" message was displayed, retrying messages then followed. Kindly create content on that too.
@shabbirgovernor 1 year ago
Sure👍🏻
@anandlamani2911 3 months ago
@@shabbirgovernor Please, can you help us with these types of problems?
@PragadeeshSP-i9h 18 days ago
@@shabbirgovernor Thank you, it's a really good explanation, but I am trying to do it with two laptops, one as the master and the other as the worker, and the worker isn't connecting to the master. Can I get help from you?
@Karansingh-xw2ss 10 months ago
Hi, how can we connect it across multiple systems?
@akshayjoshi4961 9 months ago
Yes, I want to know this too!
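A hedged sketch for the two-machine questions in this thread: the procedure is the same as on one laptop, except the worker registers with the master machine's IP instead of `localhost`. The address `192.168.1.10` below is a placeholder for the master's LAN IP, and both machines should run the same Spark (and compatible Java) version.

```shell
# On machine A (master) -- bind to all interfaces so other machines can reach it:
spark-class org.apache.spark.deploy.master.Master --host 0.0.0.0
# On machine B (worker) -- replace 192.168.1.10 with machine A's actual IP:
spark-class org.apache.spark.deploy.worker.Worker spark://192.168.1.10:7077
```

If the worker connects and then keeps retrying (as described above), a common culprit is machine A's firewall blocking port 7077, or the master having bound to a hostname the worker cannot resolve; checking the master UI's `spark://...` URL on port 8080 and using exactly that address for the worker often helps.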
@narendramarreddy700 4 years ago
Hi Shabbir, I get the below error when I try to run this from the command prompt. However, I was able to run it from PyCharm. Can you please advise? Thank you.
C:\Users\***>python N:\Spark\Scripts\Python\Projects\MasterSlave\MasterSlaveTester.py
Traceback (most recent call last):
  File "N:\Spark\Scripts\Python\Projects\MasterSlave\MasterSlaveTester.py", line 1, in <module>
    import numpy
ModuleNotFoundError: No module named 'numpy'
@shabbirgovernor 4 years ago
pip install numpy
@narendramarreddy700 4 years ago
@@shabbirgovernor It worked. Thank You. Your videos on Spark are very thorough. Excellent job!!
@shabbirgovernor 4 years ago
Happy to help, sirji!😊
@TheDollyjolly 2 years ago
I got no output after the spark-class2.cmd org.apache.spark.deploy.master.Master command. As a result, I can't move on to the next step. But localhost:8080/ is working. Any idea why?