Jetbot Neural Network Based Collision Avoidance

  74,276 views

Zack's Lab

Days ago

Comments: 174
@JuanPerez-jg1qk 4 years ago
You could train it to search for items as well, like missing keys: show it the keys, then hide them and let the bot go into search mode. It would notify you by phone or an alarm device.
@Bippy55 5 years ago
You've highlighted the system very well. It's encouraging especially to an older engineer that just used to program in HEX. Thanks very much!
@ZacksLab 5 years ago
Dave, I'm really glad that it was able to help you, I hope you enjoy the adventure! :)
@steveconnor8341 5 years ago
I love the incoming light bulb threats. Great video
@ZacksLab 5 years ago
haha, thank you! :D
@这啥呀这是 4 years ago
I was going to say "ha, that's overtraining right there", but it turns out to work pretty well!! Very nice work!
@quinncasey120 5 years ago
"I can use this as data" hilarious!
@lilasarkany3381 5 years ago
It's interesting how good the quality of your videos is and how well you explain stuff, but you haven't even reached 1K. I think you deserve more.
@ZacksLab 5 years ago
thank you Lila, I'll keep the videos coming regardless, I hope to hit 1k soon!
@jackflynn3097 5 years ago
Awesome, I'm working on my TurtleBot project, using ROS's Gazebo for simulation and A3C (a reinforcement learning algorithm) to train the bot. It saves tons of time by avoiding gathering and labeling data.
@ZacksLab 5 years ago
jack flynn, interesting. How is the simulation environment generated? It would be cool to see a side by side comparison of both methods. I would imagine that the ideal training set is a combination of both sims and real data.
@jackflynn3097 5 years ago
@@ZacksLab Well, the problem is not the simulation, it's the deep RL algorithms. DeepMind's research, from DQN to DDPG and A3C (deep RL methods), takes raw pixels as input and learns how to avoid obstacles and even navigate through mazes.
@jackflynn3097 5 years ago
@@ZacksLab This is a video by DeepMind, showing results on playing TORCS using A3C methods: kzbin.info/www/bejne/ZqnSYn-arZh_a7M
@ZacksLab 5 years ago
jack flynn So does the AI in the video you shared always stay inside the simulated environment? What happens when you put AI that was trained strictly on simulated data into a physical device in the real world and it encounters scenery, lighting conditions, objects, and scenarios that the simulator wasn’t able to provide? The issue with training on data generated in simulators is that the real world throws scenarios at the AI that the simulations just can’t account for. Are you saying that the A3C method solves this issue?
@jackflynn3097 5 years ago
@@ZacksLab Okay, I got it. RL's main idea is learning by interaction: it lets the agent (the Jetson bot in this case) try moves and gain rewards. If it hits an obstacle, the episode ends with a negative reward. The agent's goal is to maximize total reward, so over many episodes of learning it adjusts the parameters of its neural network. This can be done in a simulated environment or in the real world. The agent is a policy network, which in A3C is also called an Actor: you input a state (an image from the camera) and it outputs an action (left, right, forward, or stop). A3C is an RL method, and RL differs from supervised learning in that you don't need to give your learner labeled data. Back to the simulated environment: if training the bot is difficult in the real world, it can be done in a simulated environment, and ROS (Robot Operating System, based on Ubuntu) provides tools for that. When your agent/actor performs well in the simulation, putting it into the real world is easy (ROS makes sure of it). I did some experiments; after the robot was trained in the simulated environment, it worked on every kind of ground surface. Maybe that's because the weights on pixels related to the ground are small (still working on proving that).
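The reward loop described above can be sketched in a few lines. This is a toy illustration only, not JetBot or A3C code: the one-dimensional "corridor" environment, the reward values, and the hand-written policies are all made up for the example.

```python
import random  # stands in for the stochastic exploration a real RL agent would do

# Toy stand-in for the simulated environment: the agent moves along a line
# from position 0 toward a wall at position 5. Hitting the wall ends the
# episode with a negative reward; every safe step earns a small reward.
def run_episode(policy, wall=5, max_steps=20):
    pos, total = 0, 0.0
    for _ in range(max_steps):
        action = policy(pos)          # 'forward' or 'stop'
        if action == 'stop':
            break
        pos += 1
        if pos >= wall:
            total -= 1.0              # collision: negative reward, episode over
            break
        total += 0.1                  # safe step: small positive reward
    return total

# Two hand-written policies (a trained agent would learn this mapping):
cautious = lambda pos: 'stop' if pos >= 4 else 'forward'
reckless = lambda pos: 'forward'

print(run_episode(cautious))   # small positive total: stopped before the wall
print(run_episode(reckless))   # negative total: drove into the wall
```

A real A3C agent replaces the lambda with a neural network and adjusts its weights to push the total reward up, but the episode/reward bookkeeping is the same shape as this loop.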
@hfe1833 5 years ago
Of all the Jetson Nano videos I've seen, all I can say is that this is like a practical demonstration, almost a real-world scenario. Congrats bro! By the way, I hope you can add reading numbers like speed limits, simulating a car's speed limit.
@ZacksLab 5 years ago
Thank you! Yes adding road sign detection is actually what I want to work on next, I was thinking about putting a nano and camera on my car dashboard to collect data and start working on sign and streetlight detection and interpretation.
@hfe1833 5 years ago
@@ZacksLab This will be great and awesome. You deserve one more subscriber, in 3..2..1, bell button clicked, done.
@gusbakker 5 years ago
"breakdancing? oops.. I'll consider this a feature"
@RAP4EVERMRC96 4 years ago
Went to the comments to see if somebody already commented :D
@toranarod 5 years ago
Thank you. Best demo I have seen, with information that really helps me move forward.
@ZacksLab 5 years ago
you're welcome!
@tiamariejohnson6898 5 years ago
Wow, I want to invest in this product you engineered from Nvidia. That is complex code and you did an amazing job.
@ZacksLab 5 years ago
Ariel, thank you! The jetbot is an open source project built by the Nvidia community, I didn't personally design the Jetbot or the code used in this video, it's all available on the Jetbot github for anyone to use/experiment with!
@kestergascoyne6924 5 years ago
Thank you very much. I might build this as my first robot!
@ZacksLab 5 years ago
You’re welcome, let me know how it goes!
@markmilliren1453 5 years ago
Awesome stuff Zack. This one I understood a little more than the subscribe counter. Love the video structure and creativity!
@John-nr1ez 5 years ago
This video is so well done, awesome job! I like the JetBot color scheme and Jupyter theme :)
@ZacksLab 5 years ago
Thanks so much! I ordered 3d printing filament just to get those colors for the chassis :)
@diggleboy 4 years ago
Great video Zack! I'm looking to get into the NVIDIA Jetson Nano for signal processing. Nice to see how easy it is to use PyTorch to train the classifier, download it to the Jetson board, and run it. This example you gave is really cool. Liked. Subbed. Smashed the bell.
@MakeCode 5 years ago
I became a big fan of this channel! This is what I wanted to do.
@flaviuspopan8024 5 years ago
So damn glad you started making videos, they’re really entertaining and inspiring.
@ZacksLab 5 years ago
Thank you Flavius, that means a lot to me!!
@gusbakker 5 years ago
Would be great to see a drone project made with the Jetson Nano.
@ZacksLab 5 years ago
I'd love to do something with the Nano and a drone platform, it's definitely on my project list. I was working for a startup using the Jetson TX2 (the big brother of the Nano) for vision based collision avoidance for industrial drones... I wrote a medium blog post about the hardware development for it if you're interested! medium.com/iris-automation/the-journey-to-casia-part-one-faea27491f02
@sohaibarif2835 5 years ago
Important point, it DOES NOT support Wifi and Bluetooth out of the box. You need to purchase and install a module. Also, I just learned the hard way that power is an issue too. On mine, after installing the module, it will not turn on with the USB power.
@ZacksLab 5 years ago
Sohaib Arif, good point, I should have explicitly stated that. Are you using the m.2 key or a USB dongle? I have not had any power issues with the Intel WiFi/BT card. If you're using the dongle and the issue is due to power draw on VBUS, you could try setting the jumper to use power coming from the barrel jack which allows for up to 4A I believe. You'd have to adapt the output of your battery to this connector though.
@sohaibarif2835 5 years ago
@@ZacksLab I am using the M.2 key. Interestingly, I tried powering it via a portable USB phone charger that I know sends 5V/2A and it worked but it does seem to be slower now. You are right about the 4 A barrel jack, I will add that soon. Do you have any suggestions for a portable version of that config? I am mostly a software guy so I don't have much experience with the electrical stuff.
@ZacksLab 5 years ago
I would look for a battery that can source 5V up to 4A from a single port (I think the battery on the bill of materials for the jetbot can do 3A per port, which is likely more than enough although I haven't done a power study on the jetbot). Then, use a USB type A to barrel jack adapter like this one: www.bhphotovideo.com/c/product/1368294-REG/startech_usb2typem_3_usb_to_type.html/?ap=y&gclid=CjwKCAjwq-TmBRBdEiwAaO1enw753uFBGzvPy3oIlOcMy3uRFGAFWwvLlx5PHGL2FudDY-Jb9OE1qhoCOvAQAvD_BwE&lsft=BI%3A514&smp=Y Make sure you connect the J48 Power Select Header pins to disable power supply via Micro-USB and enable 5V @ 4A via the J25 power jack.
@airinggemi 4 years ago
Great video. I don't know much about AI but your video makes me excited. Unfortunately, I can't buy the motor driver and plastic ball caster in my country; what should I use to replace them? And after I build the car, how do I use your data? Should I buy the JetBot's notebook?
@ryanc5195 5 years ago
Hi, it is a great video. I just got a Nano and am wondering how to set up training to save my own data. Will you make a step-by-step video? Thanks
@ZacksLab 5 years ago
Hi Ryan, thank you! If you follow the collision avoidance example that is on the jetbot's github repo under NVIDIA-AI-IOT/jetbot/notebooks/collision_avoidance you will find a jupyter notebook called data_collection.ipynb. Launch this notebook on the jetbot and run through the code, if your jetbot hardware is set up correctly everything should go smoothly. I can definitely do a step by step video on this but it will take me a bit to get it posted!
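For anyone who wants the gist without opening the notebook: data collection boils down to snapping a frame and writing it into a folder whose name is the label. This is an illustrative sketch, not the notebook's actual code; the `save_frame` helper is hypothetical, though the `free`/`blocked` folder-per-class layout matches how the JetBot example organizes its dataset.

```python
import os
import uuid

# Each snapshot is written into a directory named after its label; the
# training step later infers the class from the folder name.
def save_frame(jpeg_bytes, label, root='dataset'):
    folder = os.path.join(root, label)        # e.g. dataset/free, dataset/blocked
    os.makedirs(folder, exist_ok=True)
    path = os.path.join(folder, f'{uuid.uuid1()}.jpg')  # unique filename per frame
    with open(path, 'wb') as f:
        f.write(jpeg_bytes)
    return path

# In the real notebook the bytes come from the JetBot's camera widget;
# here a placeholder stands in for one captured frame.
fake_jpeg = b'\xff\xd8\xff\xe0 fake frame'
saved = save_frame(fake_jpeg, 'blocked')
print(saved)
```

The point is that labeling costs nothing extra: pressing the "blocked" button in the notebook just routes the frame to the `blocked` folder.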
@Osmanity 5 years ago
@@ZacksLab Thank you. If it takes a while, that's OK, but a step-by-step video would be very, very helpful. Thanks again, dude.
@miguelangelpicoleal3234 4 years ago
​@@ZacksLab I would also like to see that in one of your upcoming videos. Hopefully it's still in your plans.
@mianwaqasarshad9611 3 years ago
I have JetPack 4.6 installed on my 2 GB Jetson Nano and I've interfaced a Raspberry Pi V2 CSI camera. The issue I am facing right now is the live execution of the Thumbs task in the free DLI course (sample programs). The Nano works fine while taking samples of thumbs up and down; in fact it trains the neural network perfectly. But during live execution for prediction it is unable to determine whether I am holding a thumbs up or down. I've been stuck on this for months; I've even run the same sample on my friend's Nano but couldn't find a remedy. Will be waiting for a helpful response.
@TheRealMoviePastor 4 years ago
Awesome. Love the breakdancing. LOL
@tornShoes010 5 years ago
I would like to know more about how you configured a custom theme for the jupyter notebook.
@ZacksLab 5 years ago
I believe you can only set the theme with JupyterLab, not Jupyter Notebook. In JupyterLab, go to Settings -> JupyterLab Theme.
@erikschiegg68 5 years ago
Talk on. I'm looking for a thing-recognition system for blind people, so they can point with the head or hand and the Nano speaks what it sees. This should work almost out of the box with a speaker and camera connected, I hope. There are also those nice 3D mapping cameras, which would help map the blind person's environment. An idea for you.
@ZacksLab 5 years ago
that's an interesting idea. having it speak what it sees would be relatively easy, the hard part would be accurately determining what the person is pointing at reliably from different camera angles and such. do you imagine that this device would just get placed somewhere in the room and as the person moves around and points to things it would respond (assuming the person and object are within its field of view)? or would the person hold the device and use it to point? the latter would be much easier, but the former could be solved too.
@erikschiegg68 5 years ago
@@ZacksLab I imagine a wrist band with the camera and lateral blinding to get a narrow, defined angle of view. So it should be a portable system with a battery pack, maybe in a rucksack. You definitely do not want a fisheye camera for this task.
@mattizzle81 4 years ago
There are Android smartwatches now that have two cameras on them, one facing up, and one looking out away from the hand. Tensorflow lite runs on Android. Personally I don't know why people bother with things like Jetson Nano, when modern smartphones and smart watches have so much capability now. Unless you are doing robotics, in which case these embedded devices have all the IO ports, etc.
@robotoid-human 5 years ago
I'm loving the shirt man, best game ever!
@ZacksLab 5 years ago
omg I know, UO ruined every other game for me, nothing will ever compare. what server did you play on? I was on Atlantic from 2002-2004... I was on yamato before that because I had no idea what servers were when I was first starting and I just randomly chose one.
@grahamhilton2397 4 years ago
Great video. I am currently doing a similar project using the Waveshare JetRacer. This is a simple question, but how do you save the images for training? Also, I am doing supervised learning first as I have an oval track to use!
@ZacksLab 4 years ago
hey graham, thanks! I obtained/saved the images using this jupyter notebook: github.com/NVIDIA-AI-IOT/jetbot/blob/master/notebooks/collision_avoidance/data_collection.ipynb You could use this as a starting point for data gathering and tagging for your oval track.
@knarftrakiul3881 4 years ago
Wow... I bet this could be used to spot potholes on road.
@shikharsharma1495 4 years ago
How much time does it take to train the data set? The Jupyter bar shows "Busy" for the last 45 mins.
@ZacksLab 4 years ago
With a 1070 ti gpu it took a few mins
@GWebcob 3 years ago
Underrated video
@stevejeske2266 4 years ago
Nice video Zack! I genuinely fear the day when I see driverless cars everywhere, but I think AI is fascinating. I hope you make more videos on this subject. BTW, I know Tesla is all-in with electric cars, but I am not convinced that our existing electrical infrastructure can safely and efficiently supply such an increase in demand if electric cars become popular. The brownouts in CA associated with the PG&E grid and the wildfires are just one example. Ohio just passed a law to subsidize the Perry and Davis-Besse nuclear power plants $150 million per year for 6 years because they cannot compete with gas-fired turbines (gas is abundant and cheap). These nuclear power plants are 40 years old and should be retired. And even though there is new technology for higher-efficiency nuclear power, I know of only one nuke plant (in SC) that has been significantly upgraded in the past 10 years due to environmental concerns. I am not an advocate of nukes, but I seriously question whether this nation's electrical grid can handle such an increased demand. So, please convey my message to Elon the next time you see him! Take care. sj
@ZacksLab 4 years ago
Thanks Steve! AI is a tricky subject that carries a lot of social and ethical concerns. It also has a lot of promising benefits that are currently in use and improving quality of life for many people today. But it is a double-edged sword. I do want to do more projects with AI and hardware. I've been getting crushed at work so my time for YouTube has diminished... but I look forward to jumping back into it when I free up!
@AngryRamboShow 5 years ago
Cool channel Zack. I have a 1080 Ti that will be used for training, but I'm still waiting on the Nano to be delivered :S
@eliotacura9080 4 years ago
This might be a simple question, but how do you transfer your dataset to the desktop PC, and transfer the trained model back to the Nano for demos? I know you mentioned via WiFi, but I'm kind of curious about a more in-depth explanation. Thanks.
@ZacksLab 4 years ago
i use WinSCP to do secure file transfer. you can open a SFTP connection to the IP address and transfer files to and from your PC and the remote connection (in this case, the Jetbot)
@tmerkury2813 2 years ago
Any tips on how you collect data such as image, throttle, steering angle into a dataframe for machine learning?
@ZacksLab 2 years ago
hey! yes, you generally need to work with an image sensor that has a sync or trigger pin that allows you to synchronize a frame capture with data from other sensors.
@tmerkury2813 2 years ago
​@@ZacksLab Ohhh boy, I've got some learning to do. Any tips on how to get started on that? My plan is to use the following:
- An NVIDIA Jetson Nano on the RC car to run the machine learning model
- A basic computer camera to capture images
- A built RC car with an ESC, motor, and controller
Is there any specific way to connect these tools to collect the data, or will I need something special? Sorry for the complex questions here, haha, but any helpful directions would be appreciated! Or if you have videos on this I would love to watch. Thank you!
@ZacksLab
@ZacksLab 2 жыл бұрын
have you chosen an image sensor? i would start with the datasheet to learn its different capture modes from there, define your sensors for steering position, throttle, etc... and figure out their interfaces. it's likely you can use the Jetson's GPIO or SPI/I2C (what ever the interface is) to orchestrate all the triggering of data. you'll then need to define some sort of data structure for storing the image data + sensor data. i doubt something like this exists exactly for your use case, so you'll have to write your own drivers and software for handling all of the above. depending on the image sensor and other sensors you chose, the drivers may actually already exist in the linux kernel, but you'll have to enable them. i don't have any youtube videos on how to do this, but basically you have to reinstall Jetpack and recompile the kernel, device tree, and modules. there really is no easy shortcut for doing this, you will have to go down the rabbit hole of linux. alternatively, you can add a microcontroller that orchestrates the frame sync with other data and pass the data over the jetson side of things and then handle it in software at that point, it won't be as high of performance given the latency through the micro, but if your frame rate is low, it probably won't matter.
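The "data structure for storing the image data plus the sensor data" that Zack mentions could be as simple as one CSV row per captured frame. A minimal sketch; the field names, helper name, and file layout here are my own choices for illustration, not from any existing JetBot tool:

```python
import csv
import time

# One record per captured frame: the image path plus whatever sensor
# readings were latched at the same trigger.
FIELDS = ['timestamp', 'image_path', 'steering', 'throttle']

def append_record(csv_path, image_path, steering, throttle):
    with open(csv_path, 'a', newline='') as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:            # brand-new file: write the header row first
            writer.writeheader()
        writer.writerow({'timestamp': time.time(),
                         'image_path': image_path,
                         'steering': steering,
                         'throttle': throttle})

# Log two hypothetical synchronized captures:
append_record('drive_log.csv', 'frames/0001.jpg', steering=-0.2, throttle=0.5)
append_record('drive_log.csv', 'frames/0002.jpg', steering=0.0, throttle=0.5)
```

A training script can then join each row back to its image file, which is exactly the image-plus-labels pairing a supervised driving model needs.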
@tmerkury2813 2 years ago
@@ZacksLab Thank you so much for your response, I will keep all of these notes in mind going forward. It seems like I have a lot of work ahead of me, and nope, I haven't picked an image sensor yet, but I certainly will soon to get started. If all goes well, in about 8 months I'll have it done and I shall show you. Thanks again!
@isbestlizard 4 years ago
YES, this is awesome! I'm going to stick my Nano on a quad drone and make it learn how to FLY ITSELF :D
@isbestlizard 4 years ago
@no one expected the spanish inquisition I'm doing an MSc in AI which has a project; I picked reinforcement learning and am going to get it to learn in a simulator, then transfer it to hardware!
@softwarelabstelecast 4 years ago
Where did you get the wheels from? Or did you print them yourself?
@ZacksLab 4 years ago
i believe these are the ones i bought: www.pololu.com/product/185 the BOM on github has adafruit listed but they are always sold out, alternatively there are STLs available that you could print. hope that helps!
@kamalkumarmukiri4267 5 years ago
Wonderful video.... Thanks for sharing. I am eagerly waiting for my nano kit ordered from amazon :)
@boogerrs1031 3 years ago
Hi Zack! I got myself a JetBot but I'm having trouble with the training of the last layer of the AlexNet model. I moved the dataset over to my laptop and ran the code using my GPU, but it gave me this error: CUDA error: CUBLAS_STATUS_ALLOC_FAILED when calling `cublasCreate(handle)`. I tried running it on the CPU
@ZacksLab 3 years ago
hey! without seeing your code it will be hard to help you... do you have it in a github repo? i could take a look if so
@boogerrs1031 3 years ago
@@ZacksLab Sorry, I should've edited the comment because I submitted it by accident without finishing, and then I forgot to do it. X.X My bad. Anyway, the code is the exact same one that you showed in the video, the same one on the JetBot GitHub. I copied and pasted it from GitHub to a Jupyter notebook on my laptop, but it doesn't run the last cell where you create the new model based on the AlexNet model and the dataset. If I run it on the GPU I get the error message I wrote in the previous comment. If I run it on the CPU I get another error message pointing to the line "loss = F.cross_entropy(outputs, labels)" in the last cell, saying that target 2 is out of bounds. The code is the exact same as on the JetBot GitHub, which is kind of weird because everywhere I look on YouTube everyone seems to have no issues with this collision avoidance program, while I'm having trouble running code that is supposed to be good as it is. By the way, thank you for replying!!!
@yezhang2947 5 years ago
Cool! Thanks for sharing!
@ZacksLab 5 years ago
You’re welcome, glad you liked it!
@a36538 5 years ago
Really cool! Could you mate this with an RC car chassis? Could one make an autopilot RC car?
@ZacksLab 5 years ago
a36538 Yes, absolutely! This could control anything: an RC car, your car, a drone, heavy machinery, you name it. It's just a matter of interfacing it properly to all of the sensors and actuators!
@fayobam_mech_tronics 5 years ago
I love this. Do you think a mobile 2070mq is a good GPU for learning deep learning and other artificial intelligence things?
@ZacksLab 5 years ago
Thank you Ayobami! Yes, the 2070mq is certainly powerful enough to get started with machine learning and training neural networks! Especially for the Jetbot or other implementations for the Jetson Nano.
@FunkMasterF 5 years ago
Great video. Thank you. +1 for breakdancing.
@rksb93 5 years ago
Hey, as a beginner I had a question regarding your training data images: did you use augmentation in any form to increase the number of images that you could have trained your NN on?
@ZacksLab 5 years ago
hi Surya, no I did not use any sort of augmentation (I believe you're referring to translations, rotations, etc...). I would be interested in seeing how this affects performance if there were a tool that would automatically grow a dataset using this technique. thanks for the question!
@John-nr1ez 5 years ago
Hi Surya, the data augmentations that you can apply depends on the task. For the JetBot collision avoidance, the data augmentation only includes pixel-wise color distortion (brightness, hue, saturation, etc.). Horizontal flipping might also be appropriate for this task, since it doesn't matter whether we're blocked on the left or right. However, cropping, translations, and scaling change the perspective of the camera relative to objects, which would change the objective. For example, if we 'zoom' the image as a form of data augmentation, we would end up seeing some 'far away' objects as near by, which we would want to label as 'blocked', but it would falsely label with the original tag 'free'.
@rksb93 5 years ago
John, makes sense. So in essence he can get twice the amount of data by flipping images on the vertical axis, but any other form of augmentation is not worthwhile. Did I get that right?
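To make the flip idea from this exchange concrete, here is a dependency-free sketch of horizontal flipping as augmentation: each pixel row is reversed and the label is kept, doubling the dataset. In a real pipeline you would apply this (plus the color jitter John mentions) with an image library; the tiny nested-list "image" below is made up purely for illustration.

```python
# Horizontal flip: reverse each row of pixels. For JetBot collision
# avoidance the label survives the flip, since "blocked on the left" and
# "blocked on the right" are both just "blocked".
def hflip(image):
    return [row[::-1] for row in image]

def augment(dataset):
    # dataset: list of (image, label) pairs; returns originals + flipped copies
    return dataset + [(hflip(img), label) for img, label in dataset]

tiny = [([[1, 2, 3],
          [4, 5, 6]], 'blocked')]      # one fake 2x3 "image"
augmented = augment(tiny)
print(len(augmented))       # 2 -- dataset size doubles
print(augmented[1][0])      # [[3, 2, 1], [6, 5, 4]] -- flipped copy
```

Note the asymmetry John describes: flipping preserves the free/blocked label, but zooming or cropping can change how far away an obstacle appears, so those transforms could silently make the label wrong.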
@SandeepGhoshaction 5 years ago
Awesome project! I was looking for something exactly like this. Can the same concept be applied using a Raspberry Pi 3 B+? Please keep posting related stuff, because YouTube has tons of electronics videos as well as tons of DL/NN videos... but electronics along with AI really isn't covered much. Subscribed!!
@angelleal517 4 years ago
Do you know the actual range of the camera? How far away can it detect objects? Great video by the way!
@ZacksLab 4 years ago
i do not know the max detection range of this camera (its also a function of the size of the object). i have worked with high megapixel cameras capable of doing object classification out to 1km. of course this depends on your computer vision and post processing algorithms as well.
@evansyomu2879 5 years ago
Does the Jetson Nano take any camera that has a CSI(MIPI) interface?
@ZacksLab 5 years ago
Yes, it supports MIPI-CSI2 cameras, here's an example for getting the raspberry pi v2 camera working with the nano: bit.ly/2oCborL
@MixedBag562 5 years ago
Wow. Before I saw this video, I was like "man, I don't need a $250 wall-avoiding robot. I have an ArcBotics Sparki (a programmable robot with its own C++ IDE)." Seriously, big difference. You might be like, "oh, it just avoids walls, so what", but really, this is just something else. Anybody who is just scrolling through the comments, this is a MUST SEE. I hate the natural human inclination toward clickbait instead of valuable and worthwhile content like this. I wish more people would seek out what actually is fulfilling and benefits their career long-term. Instead, they look for trash like "Spongebob licking the marble for 10 hours". But people are people, so here's a suggestion: make your titles more concise, and the thumbnail self-explaining (that is, not including terms a lot of people don't get, like 'jet bot'). Also, presentation is BIG. And I don't just mean presentation in the video, but also the thumbnail AND the title. TKOR (Grant Thompson) is really a pro at this, and that's how he gets so many views and followers. His content isn't inherently interesting; it's just the title and his thumbnail. If you could make a video with 1. good content, 2. a short, concise, self-explanatory thumbnail AND title that draws interest from someone new to your channel, you'd be unstoppable. Even novice [engineers, technicians, chemists etc.] like TKOR, NurdRage (I'll admit he's fairly advanced) and The Slow Mo Guys (really, I think those people hardly even know the chemistry of the explosives they use) make good channels with basic information and TONS of followers. Dude, if you want to get even more money for better projects via YouTube (ads, maybe sponsors), you've got to get more relatable.
No, I'm not saying you need to get super basic like the above YouTubers; I mean you have to explain and draw people in such a way that you can attract impatient newbies looking for clickbait, then when they least expect it, shove something rare and valuable down their throats (knowledge, skill, circuit science, computer programming). Then they realize that YouTube isn't just instant gratification and click-happy pleasure; there's MUCH, MUCH more to it than that. Awesome work Zack! Your videos and those like them make YouTube worth it. God bless!
@ZacksLab 5 years ago
Thank you FriedPickle, your comment and feedback means a lot to me. :)
@MixedBag562 5 years ago
;)
@angelleal3005 4 years ago
This is amazing! I wonder: are you moving the JetBot back and forth while it avoids obstacles, or do you command it to go to a desired destination (obviously avoiding obstacles by itself in the process)?
@ZacksLab 4 years ago
Thanks! no there is no input from me, it just attempts to navigate any environment you put it in while avoiding any collisions. I could modify the program to give it more of a “purpose” rather than just moving around and avoiding things.
@angelleal3005 4 years ago
@@ZacksLab Oh OK, I see. Great work man! New sub; I would absolutely love to see more stuff of this sort. Thinking of doing a school project on this matter.
@woolfel 5 years ago
I just bought a Jetson Nano too. Have you tried running Faster R-CNN on the Nano? I'm still waiting until I get a power brick and battery.
@Flix-f6q 3 years ago
Why is the camera movement so jittery? How did you do it?
@ZacksLab 3 years ago
which camera? the one I'm filming with or the jetbot's?
@makersgeneration3739 5 years ago
Thanks Zack! 😎
@ZacksLab 5 years ago
welcome :D
@sgodsellify 3 years ago
How long does your JetBot last with a fully charged battery? You said you are using a 10 amp-hour battery.
@ZacksLab 3 years ago
average current draw is around 1A, so with a 10Ah battery you get close to 10 hours of run time. under full load the nano can draw 10W, so run time will be closer to 5 hours if you're doing a lot of compute.
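The runtime estimate above is just capacity divided by average draw. A one-line sketch with Zack's numbers plugged in (the helper name is illustrative):

```python
def runtime_hours(capacity_ah, avg_draw_a):
    # battery capacity (amp-hours) divided by average current draw (amps)
    return capacity_ah / avg_draw_a

print(runtime_hours(10, 1.0))   # 10.0 -> ~10 h at the ~1 A average draw
print(runtime_hours(10, 2.0))   # 5.0  -> ~5 h when the Nano pulls 10 W at 5 V (~2 A)
```

Real runtime will come in a bit under these figures since motors, WiFi, and converter losses add to the draw.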
@lanliu9263 5 years ago
Hi Zack, it is an amazing JetBot. Could you share the brand and model of your motor driver and DC motors?
@ZacksLab 5 years ago
Hi lan liu, here are the links for the motor driver and motors: amzn.to/2FbcLmE (driver) amzn.to/2Ri28mY (motors)
@lanliu9263 5 years ago
@@ZacksLab thanks
@binwangcu 5 years ago
2:04 ~ 3:40 is the part of a data scientist's life that no one wants to take on :)
@Blobcraft13 5 years ago
I'm loving these videos
@dgb5820 4 years ago
Really appreciate this video
@ZacksLab 4 years ago
thanks! appreciate the comment :)
@watcher9412 5 years ago
Can you use the Jetson Nano to build a drone using the battery that you have?
@ZacksLab 5 years ago
The Jetson Nano could be used onboard a drone for many different functions, however you’d still want to use LiPo batteries intended for use with motors, as the BLDC motors commonly found on drones can pull a lot more current than this battery is capable of providing safely.
@LakerTriangle 5 years ago
What language would I have to learn to use the Nano? I always wanted to do something with face recognition.
@ZacksLab 5 years ago
You can do quite a bit with just Python and a familiarity with Linux. C++ is useful for OpenCV, but there is a python library that wraps the original opencv libraries into a library called opencv-python. You will sacrifice some run-time performance using this python wrapper instead of developing in cpp, but development in python is generally considered easier.
@ThomasGodart 5 years ago
Great video! Thanks for sharing it 👍
@ZacksLab 5 years ago
thank you! and you're welcome :)
@seadark2074 3 years ago
I'm a beginner. How do I use this model to recognize multiclass objects? Especially the code.
@ZacksLab 3 years ago
you're trying to use a NN to identify objects? if so, start here maybe: kzbin.info/www/bejne/aImwnIONlNh8fck
@ZacksLab 3 years ago
this is also good: kzbin.info/www/bejne/oWbTiYujidCDhK8
@knarftrakiul3881 4 years ago
Just drive up and down local roads several times... maybe throw in some deer models.
@WildEngineering 5 years ago
Nice work man! RIP beer.
@ZacksLab 5 years ago
TheWildJarvi thanks! Haha, yeah. Looking back at the video I came close to knocking it over a few other times :P
@victorclaros8967 5 years ago
Amazing job !!!!
@ZacksLab 5 years ago
thank you!
@naimuddinshimul2770 5 years ago
Great video !
@Dataanalyticspro
@Dataanalyticspro 5 years ago
What program did you use to collect and tag data so quickly with the onboard camera?
@ZacksLab
@ZacksLab 5 years ago
Hey Jared! It was written in python and can be found here: github.com/NVIDIA-AI-IOT/jetbot/blob/master/notebooks/collision_avoidance/data_collection.ipynb
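For anyone curious what that notebook boils down to: the core of it is saving the current camera frame into a per-class folder ("free" or "blocked") under a unique filename. A minimal sketch of that idea, using stand-in image bytes instead of a real camera feed (the function name and folder layout here are illustrative, not the notebook's exact code):

```python
import os
import uuid

def save_snapshot(image_bytes, label, root="dataset"):
    """Write one camera frame into a class-named folder, e.g. dataset/free."""
    directory = os.path.join(root, label)
    os.makedirs(directory, exist_ok=True)
    # uuid1 filenames keep rapid back-to-back captures from colliding
    path = os.path.join(directory, str(uuid.uuid1()) + ".jpg")
    with open(path, "wb") as f:
        f.write(image_bytes)
    return path

# Stand-in bytes; on the Jetbot these would come from the camera widget
free_path = save_snapshot(b"\xff\xd8-fake-jpeg", "free")
blocked_path = save_snapshot(b"\xff\xd8-fake-jpeg", "blocked")
```

The actual notebook wires functions like this up to ipywidgets buttons, which is why labeling goes as fast as you can click.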
@erwinn58
@erwinn58 4 years ago
How long did it take for you to transfer the data to your PC?
@ZacksLab
@ZacksLab 4 years ago
I log into the Jetbot using WinSCP and transfer the files over SFTP. Took less than a minute... both my computer and Jetbot were near my router.
@erwinn58
@erwinn58 4 years ago
@@ZacksLab aha! Today I tried transferring around 60 pictures. But when I tried to download the 'dataset.zip' file in the collision avoidance demo, it said that I had no permission! I have no permission to download that zip file... Do you have any idea why? Thanks in advance
@ZacksLab
@ZacksLab 4 years ago
What software are you using to transfer the file?
@erwinn58
@erwinn58 4 years ago
@@ZacksLab well, all of this happens in a Jupyter notebook. I always open my zip files with WinRAR. I heard it took a while for the Jetbot to process all the pictures, but 3 hours later it still didn't work...
@ZacksLab
@ZacksLab 4 years ago
Hmm, if you're trying to transfer to a Windows PC, download WinSCP and you can use SFTP to transfer files to and from your Jetbot (use the Jetbot username and pw to login). If you're having issues locally on the Jetbot, it could be a Linux permissions issue, which you can adjust for that file with the terminal command: sudo chmod 777 /path/to/file.zip
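If the terminal feels unfamiliar, the same permission fix can also be done from a notebook cell with Python's os.chmod. A small self-contained demo (the dataset.zip here is a stand-in file created just for illustration, not your actual archive):

```python
import os
import stat
import tempfile

# Create a stand-in dataset.zip that starts out owner-read-only,
# mimicking a file Jupyter refuses to let you download
work_dir = tempfile.mkdtemp()
zip_path = os.path.join(work_dir, "dataset.zip")
with open(zip_path, "wb") as f:
    f.write(b"placeholder zip contents")
os.chmod(zip_path, 0o400)

# Equivalent of `chmod 777 /path/to/file.zip` (sudo is only needed
# when the file belongs to another user)
os.chmod(zip_path, 0o777)
mode = stat.S_IMODE(os.stat(zip_path).st_mode)
```

After this, the file is readable, writable, and executable by everyone, which is what lets the notebook's download link work.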
@LouieKotler
@LouieKotler 5 years ago
Amazing content! I'm an aspiring electronics engineer and hope to be like you one day. Do you do this work professionally? Any tips for someone like me who wants to start working with software like PyTorch but only has a general understanding of statistics and calculus? Thanks.
@ZacksLab
@ZacksLab 5 years ago
Hey Louie, thanks for checking out my channel! Yes, I'm an electrical engineer working on collision avoidance systems for autonomous drones. At work my focus is mostly in hardware design but at home I like to explore other topics (like this). I'd recommend checking out some courses online, there's a course called "Practical Deep Learning with PyTorch" on Udemy that covers all the fundamentals (I'm not affiliated with the course author or Udemy in any way). Udemy usually has 90% off on their courses so look around for coupon codes -- don't ever pay the full price.
@SandeepGhoshaction
@SandeepGhoshaction 5 years ago
Awesome sir! How can I get in contact with you? I am a beginner in electronics and computer vision.
@liamdiaz7767
@liamdiaz7767 4 years ago
Do you think it's possible to do the same with a drone?
@ZacksLab
@ZacksLab 4 years ago
absolutely. just need access to the autopilot. pixhawk (and similar drone APs) can take external commands that could be coming from an AI system such as this.
@liamdiaz7767
@liamdiaz7767 4 years ago
@@ZacksLab Is there any blog or site where I can read more about it ? I'm working on a project and would love to implement some of this.
@ZacksLab
@ZacksLab 4 years ago
I'm not sure if there is one specifically for what you're talking about doing, but there is plenty of documentation on pixhawk autopilots for drones if you search for it. what drone platform do you intend to work with?
@liamdiaz7767
@liamdiaz7767 4 years ago
@@ZacksLab I have an intel aero ready to fly drone, it comes with a pixhawk as the flight controller. I have seen some works with the same configuration but a raspberry pi is used as the companion computer. I would like to use the Jetson nano as the companion computer instead for the purposes of data collection & collision avoidance as you have shown here with the jetbot, but obviously in my drone.
@ZacksLab
@ZacksLab 4 years ago
ah, got it. you'll have to look into the documentation for that platform; however, since you said it uses Pixhawk, you can likely use the Jetson Nano to send maneuver commands to it via serial or CAN.
@saidbouftane5253
@saidbouftane5253 5 years ago
great video
@harrywillisdick8660
@harrywillisdick8660 5 years ago
Can you do this on an OpenMV H7?
@ZacksLab
@ZacksLab 5 years ago
It looks like the OpenMV H7 is an ARM-based computer vision platform. I have seen people implement neural networks on microcontrollers, but I would imagine you will quickly reach its limitations. Also, you cannot take advantage of CUDA or of libraries and frameworks like TensorFlow, TensorRT, PyTorch, Keras, etc. unless you're developing for a system that can run Linux and has an NVIDIA GPU available to it (as in the case of the Jetson Nano).
@adamjohnson9846
@adamjohnson9846 4 years ago
incoming jetbot
@adamjohnson9846
@adamjohnson9846 4 years ago
I came to it as a how-to video and found it very entertaining
@adamjohnson9846
@adamjohnson9846 4 years ago
and probably cuz there's not very many people doing raspberry or Alex Webb repos
@ZacksLab
@ZacksLab 4 years ago
awesome, glad you liked it :)
@B0XMATTER
@B0XMATTER 4 years ago
NOT THE BEER
@syed5126
@syed5126 5 years ago
Can you list your PC's specs?
@ZacksLab
@ZacksLab 5 years ago
Sure, my PC is: Intel i7-4790K, NVIDIA 1070 Ti, 32GB DDR3, 500GB SSD
@JuanPerez-jg1qk
@JuanPerez-jg1qk 4 years ago
wow... it can play fetch the stick... just throw the stick, it will chase it and run over it... waiting for the next command, master
@ZacksLab
@ZacksLab 4 years ago
we have to be nice to our robots so they are nice to us once they become sentient ;)
@dubber889
@dubber889 3 years ago
"I'll consider this a feature" typical programmer LOL consider a bug as feature
@DOSputin
@DOSputin 5 years ago
Cat gives 0 fucks.