NVIDIA JetBot: Jetson Nano Vision-Controlled AI Robot

192,061 views

ExplainingComputers

A day ago

Jetson Nano “JetBot” machine learning robot review and demo. Includes hardware, software, Jupyter Lab notebooks for executing Python code, collision detection examples, and an introduction to training a neural network model.
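As a flavour of the Python that runs in those Jupyter Lab notebooks, here is a minimal sketch of a collision-avoidance loop. This is not NVIDIA's exact notebook code: the jetbot Robot/Camera calls and the two-class PyTorch model follow the standard JetBot examples, while the file name, threshold and speeds are assumptions.

```python
# Minimal sketch of a JetBot-style collision-avoidance loop.
# Assumes the standard `jetbot` package and a two-class (blocked/free)
# AlexNet fine-tuned elsewhere and saved as 'best_model.pth' (assumed name).
import time

import torch
import torch.nn.functional as F
import torchvision
import torchvision.transforms as transforms
from PIL import Image
from jetbot import Camera, Robot

device = torch.device('cuda')

# Rebuild the network and load the trained weights (state dict).
model = torchvision.models.alexnet(pretrained=False)
model.classifier[6] = torch.nn.Linear(model.classifier[6].in_features, 2)
model.load_state_dict(torch.load('best_model.pth'))
model = model.to(device).eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

robot = Robot()
camera = Camera.instance(width=224, height=224)

try:
    while True:
        # camera.value holds the latest frame as a numpy array; colour order
        # may need converting depending on the camera pipeline.
        frame = Image.fromarray(camera.value)
        x = preprocess(frame).unsqueeze(0).to(device)
        with torch.no_grad():
            # Assuming class index 0 is 'blocked' (alphabetical folder order).
            prob_blocked = F.softmax(model(x), dim=1)[0, 0].item()

        if prob_blocked > 0.5:   # path looks blocked: turn left on the spot
            robot.left(0.3)
        else:                    # path looks free: drive forward
            robot.forward(0.3)
        time.sleep(0.1)
finally:
    robot.stop()
```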
The JetBot shown was supplied for review by NVIDIA, but this is not a sponsored video. If you are interested in getting a JetBot, I would strongly recommend starting on its NVIDIA web pages here: developer.nvid...
The JetBot Wiki, which contains all of the information required to build, setup and run vision recognition and machine learning on a JetBot, is here: github.com/NVI...
There is also info on full JetBot kits here: www.nvidia.com...
My previous review of the NVIDIA Jetson Nano SBC is here:
• NVIDIA Jetson Nano
And my “Jetson Nano: Vision Recognition Neural Network Demo” video is here:
• Jetson Nano: Vision Re...
I also have an introduction to AI video here:
• Explaining AI
More videos on single board computers and broader computing topics can be found on the ExplainingComputers channel: / explainingcomputers
You may also like my other channel, ExplainingTheFuture, at: / explainingthefuture
#JetBot #JetsonNano #NVIDIA #ExplainingComputers

Comments: 395
@johncnorris
@johncnorris 5 жыл бұрын
nVidia: "It just works!" Explaining Computers: "After extensive assembly and configuration it does."
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
So true. NVIDIA were very helpful during production of this video, but production lasted longer than on many other videos. :)
@jasontiscione1741
@jasontiscione1741 5 жыл бұрын
​@@ExplainingComputers Probably starting when you first plug it in and Ubuntu thinks you have a black and white TV
@bosstroll9019
@bosstroll9019 5 жыл бұрын
If Mr. Scissors found a way to mate with it and breed, they would rule Earth by Thursday
@VeryUsMumblings
@VeryUsMumblings 5 жыл бұрын
1st choice: rule the world. 2nd choice: make lots of unboxing videos!
@brianm6337
@brianm6337 5 жыл бұрын
Meh- they can have the earth. Kids will be a bunch of little cut-ups, though.
@brianm6337
@brianm6337 5 жыл бұрын
@b gg Explaining Computers does. ;D
@cinnabarsonar2072
@cinnabarsonar2072 5 жыл бұрын
Gives a whole new meaning to the term "hardware porn" I'll grab my coat.
@digitalghosts4599
@digitalghosts4599 5 жыл бұрын
Jetson nano is a ridiculously powerful platform for the price. I'm using it for high speed machine vision to monitor manufacturing process in real time and this beauty can record 200fps in 720p and it can easily process 100 frames per second with a simple detection algorithm. We are living in the future. 10 years ago this would be a huge challenge even for a desktop PC not to mention that there was nothing in place to capture such high frame rates in real time and process them simultaneously.
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
Great feedback, thanks for sharing. I share your appreciation of the Jetson Nano -- it is a really great board.
@travelstories7530
@travelstories7530 4 жыл бұрын
How do you make that simple detection algorithm? Is it based on those green squares you defined in the algorithm?
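For illustration, a "simple detection algorithm" of the kind described in the parent comment is often just frame differencing inside a region of interest; a rough OpenCV sketch follows (the camera index, ROI coordinates and thresholds are assumptions, not the commenter's actual code).

```python
# Rough frame-differencing detector: flags a frame as an "event" when enough
# pixels change inside a region of interest. All numbers are illustrative.
import cv2

cap = cv2.VideoCapture(0)                 # camera index is an assumption
x, y, w, h = 100, 100, 200, 200           # hypothetical region of interest

ok, prev = cap.read()
prev_roi = cv2.cvtColor(prev[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    roi = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(roi, prev_roi)                      # per-pixel change
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    changed = cv2.countNonZero(mask)
    if changed > 0.05 * w * h:                             # >5% of ROI changed
        print("event detected in ROI")
    prev_roi = roi

cap.release()
```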
@hussainbharmal5998
@hussainbharmal5998 Жыл бұрын
How is it going, after 3 years? Have you been using it to date in other applications too?
@JohnK68
@JohnK68 Жыл бұрын
Actually it's not powerful at all. An NVIDIA Tegra T210 with a GM20B GPU is a toy. Also, the OS on the board is way too heavy for the system.
@NataLia-yb6vm
@NataLia-yb6vm 4 жыл бұрын
Now I no longer need a Tesla for self driving car. I can make my own haha
@chriholt
@chriholt 5 жыл бұрын
All I can say is "Wow!" Very impressive hardware and software package!
@michelfilion5482
@michelfilion5482 5 жыл бұрын
Amazing...If anything, AI shows us how we take for granted our own complex cognitive abilities.
@gpalmerify
@gpalmerify 5 жыл бұрын
This video helped me appreciate my Subaru's "Eyesight" system even more. Thank you Chris.
@willyarma_uk
@willyarma_uk 5 жыл бұрын
This is very cool! Now can you program it to search for a rabbit?
@iluvrgb
@iluvrgb 5 жыл бұрын
Back again with another interesting video. The Nvidia Jetson Nano is interesting
@AnthonyCook78
@AnthonyCook78 5 жыл бұрын
What a waste of time!
@chadwick2629
@chadwick2629 5 жыл бұрын
@@AnthonyCook78 How?
@cinnabarsonar2072
@cinnabarsonar2072 5 жыл бұрын
For some reason I want to try and turn this into a glorified cat toy.
@BlenderRookie
@BlenderRookie 5 жыл бұрын
Wow, I wanna play with that. Plus the commands seem rather intuitive. The commands kinda remind me of that old graphics program called Logo Writer.
@johnsweda2999
@johnsweda2999 5 жыл бұрын
Instead of having a camera to measure edges, wouldn't it be better to have an optical measurement, like the sensor you could take from a mouse, so it can count how many steps/rotations it's doing, work out the surface area and produce a map? This could be used in conjunction with the camera. I mean, a camera doesn't know the area it's patrolling; with a counting system it would be easier for it to work that out, I would have thought.
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
I am sure that you are right that using/adding sensors other than vision would improve matters. :)
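The wheel-counting idea above is classic dead-reckoning odometry. As a hedged illustration (the basic JetBot has no wheel encoders as standard, so the wheel radius, track width and tick counts below are assumed values), the differential-drive pose update looks like this:

```python
# Dead-reckoning pose update for a differential-drive robot from wheel
# encoder ticks. Parameters below are illustrative, not JetBot values.
import math

WHEEL_RADIUS = 0.03      # metres (assumed)
TRACK_WIDTH = 0.12       # distance between wheels, metres (assumed)
TICKS_PER_REV = 360      # encoder resolution (assumed)

x, y, theta = 0.0, 0.0, 0.0   # robot pose in the map frame

def update_pose(left_ticks, right_ticks):
    """Advance (x, y, theta) given encoder ticks since the last update."""
    global x, y, theta
    d_left = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    d_centre = (d_left + d_right) / 2           # distance travelled by centre
    d_theta = (d_right - d_left) / TRACK_WIDTH  # change in heading (radians)
    x += d_centre * math.cos(theta + d_theta / 2)
    y += d_centre * math.sin(theta + d_theta / 2)
    theta = (theta + d_theta) % (2 * math.pi)

# Example: both wheels turn one full revolution -> straight line forward.
update_pose(360, 360)
print(x, y, theta)
```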
@BharatMohanty
@BharatMohanty 5 жыл бұрын
I like this video sir, very informative. I made a terminal-controlled rover, but without a microcontroller or SBC.
@qzorn4440
@qzorn4440 Жыл бұрын
wow very nice project. 🥳 I sure miss the old Heath-Kit days. When it was a turn-key project with everything in a box and a great manual. Need a Teach-Kit company to replace Heath-Kit?
@ma-burke
@ma-burke 5 жыл бұрын
(Blib #6336 in the EC digiverse.) "...to see what it can do. So let's go and take..." _time passes_ _much time passes_ _years in fact_ "... a closer look."
@stuartg40
@stuartg40 5 жыл бұрын
Why does it only turn left?
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
Only because that's what the sample program that runs this neural net is programmed to do (the neural net reports free or blocked, and the code takes this input and applies the "turn left if blocked" rule). It can turn the other way! :)
@judgeguilty
@judgeguilty 5 жыл бұрын
It's due to the Zoolander Paradox kzbin.info/www/bejne/e161k6d5jbmgmZY
@codycast
@codycast 5 жыл бұрын
b gg na. It would have come with a speaker to call everyone racist, complain the whole time and demand $20/hr to move in circles.
@marcombo01
@marcombo01 4 жыл бұрын
@@ExplainingComputers And could you program it to take the optimal decision, to give it more natural behaviour?
@NataLia-yb6vm
@NataLia-yb6vm 4 жыл бұрын
It could randomize left or right if both are within an "acceptable range", or take the highest value if they are not that close or equal.
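For illustration, that tweak is a small change to the decision step of the collision-avoidance loop; a hedged sketch follows (the robot motor calls mirror the standard JetBot examples, while the threshold and speed are assumptions).

```python
# Hedged sketch of the turn decision with a random left/right choice when
# the path is blocked, instead of the fixed "always turn left" rule.
import random

TURN_SPEED = 0.3      # assumed
THRESHOLD = 0.5       # assumed

def decide(prob_blocked, robot):
    """prob_blocked: output of the collision model for the current frame."""
    if prob_blocked > THRESHOLD:
        # Path blocked: pick a direction at random rather than always left.
        if random.random() < 0.5:
            robot.left(TURN_SPEED)
        else:
            robot.right(TURN_SPEED)
    else:
        robot.forward(TURN_SPEED)
```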
@MegaZiggo
@MegaZiggo 5 жыл бұрын
This is quite interesting. I worked as a Field Engineer for a company called Intelligent Reasoning System Inc. back in 2001/2002. They made capital equipment for installation on an electronics manufacturing line, for use in Automated Optical Inspection of PCBs. We used an early form of image training (the founders of the company also wrote the software for the terrain tracking algorithms for the Tomahawk cruise missile and adapted those to this application) with statistical analysis within a set standard deviation.

The system worked well and it is very similar to what you demonstrated here with the JetBot. We would train examples of good images and bad images of electronic components on a PCB, and for each false call we would train that as a bad actor. Over the course of the manufacturing run, the system would become more and more accurate. Good stuff indeed. Keep the good material coming! I am considering getting one of these to teach my daughters not only Python, but machine learning in general...
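That "train each false call as a bad actor" loop maps neatly onto how the JetBot's dataset is organised on disk; a hedged sketch of filing a misclassified frame back into the training folders follows (the blocked/free folder names follow the common JetBot convention, but the paths and helper are illustrative).

```python
# Hedged sketch: whenever the model gets a frame wrong, save that frame into
# the training set under its *correct* label, then retrain later.
# The folder layout ('dataset/blocked', 'dataset/free') is an assumption.
import os
import uuid

import cv2

def save_hard_example(frame, correct_label, root="dataset"):
    """Store a misclassified camera frame under its true class folder."""
    folder = os.path.join(root, correct_label)       # e.g. dataset/blocked
    os.makedirs(folder, exist_ok=True)
    path = os.path.join(folder, f"{uuid.uuid4()}.jpg")
    cv2.imwrite(path, frame)
    return path

# Usage: the operator spots a false "free" call near a table edge and files
# the offending frame as a new "blocked" training example.
# save_hard_example(camera_frame, "blocked")
```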
@CodyBanks10
@CodyBanks10 4 жыл бұрын
"Something like that" is a buzz phrase I will be using from now on.
@technicalcreativityandtric687
@technicalcreativityandtric687 5 жыл бұрын
Hello Uncle, how are you? This video of yours is amazing. A very, very good project, thank you.
@0dyss3us51
@0dyss3us51 4 жыл бұрын
Would love to see you expand on this robot's capabilities! :D How about a hand that can grab drinks from the fridge? Okay, a little ambitious, buuut possible I guess :D
@Tr3xShad
@Tr3xShad 4 жыл бұрын
Impressive. I was wondering if the Nano module can be used on the Xavier NX developer kit, as the NVIDIA documentation states they share the same pin-out. I already have the Nano module but am eyeing the NX developer board for its support for M.2 drives.
@hangaming1978
@hangaming1978 4 жыл бұрын
Sorry sir, please add Indonesian subtitles. Thank you.
@Osmanity
@Osmanity 5 жыл бұрын
Totally mind-blowing video! I just wish that NVIDIA sold pre-assembled JN-AI robot versions, so that we could just focus on building the software and learning more about training a neural network, which for me is the main goal. Thanks as always for an interesting video!
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
Since I made this video there are some suppliers selling kits, which makes things somewhat easier. I've added the link to the video description.
@Osmanity
@Osmanity 5 жыл бұрын
@@ExplainingComputers thanks
@peterjansen4826
@peterjansen4826 5 жыл бұрын
I boycott Nvidia because of their shenanigans (physX via X87, Gameworks, GPP...), I prefer to use AMD-GPU's for something like this.
@elmong9124
@elmong9124 5 жыл бұрын
For the AI ones, the Google Coral and Jetson Nano offer similar performance. GPU and TPU are perfect for AI workloads. It's not running from any cloud; it is inferencing (running) the AI itself.
@elmong9124
@elmong9124 5 жыл бұрын
I would also like some AMD GPU AI stuff, but unfortunately they do not offer any development kit for this kind of workload.
@peterjansen4826
@peterjansen4826 5 жыл бұрын
@@elmong9124 I don't doubt that AMD is working on it with their open software stack. These are very interesting applications.
@dancingCamels
@dancingCamels 5 жыл бұрын
Would be interesting to see the least number of training data images it takes to work effectively.
@stanislavkotzev4157
@stanislavkotzev4157 5 жыл бұрын
7 years later he hasn't changed at all :D just watched the big data video
@williama29
@williama29 5 жыл бұрын
I like robots and AI. I wouldn't mind having Mr. Scissors as an AI robot.
@Techn0man1ac
@Techn0man1ac 2 жыл бұрын
It's alive, it's aliiiive
@retrorobodog
@retrorobodog 5 жыл бұрын
cool :):):)
@didiyontingwi
@didiyontingwi 5 жыл бұрын
Interesting.. I like this product.. Thanks.. Greetings from indonesia..
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
Greetings back from the UK. :)
@CRTECHPE1
@CRTECHPE1 4 жыл бұрын
Nice one. After 5 years, these things have become much easier.
@SlowPCGaming1
@SlowPCGaming1 4 жыл бұрын
It could make for a neat cat toy. Teach it to perform different stunts or tasks based on which animal it sees under a variety of conditions. Or toddlers, puppies, any small silly creature with curiosity or skittish behavior. Is there a full cover for that bot? I wonder how well it would do on detecting stairs that are carpeted or feature a confusing geometric pattern.
@ExplainingComputers
@ExplainingComputers 4 жыл бұрын
There are lots of JetBot designs now, all for makers to put together and modify. I love the idea of a cat toy! :)
@statorworksrobotics9838
@statorworksrobotics9838 5 жыл бұрын
Another fantastic video, but perhaps it helps to highlight the problems of current AI approaches: bandwidth- and data-hungry, with non-transferable knowledge, whereas a simple symbol-based heuristic solution would probably suffice.
@smartassist9700
@smartassist9700 5 жыл бұрын
I am finding sponsors! The used mower I purchased. The brand Company has not produced in 7+ years. However, without asking they shipped me many new parts they had stored for that model mower. I will purchase the remaining 3-4 small items from them. Mower will be better than NEW.
@bradscott3165
@bradscott3165 4 жыл бұрын
Jetbot should be named Dale because he only turns left.
@Bippy55
@Bippy55 5 жыл бұрын
This is one of your BEST videos. I'd favor the company creating a kit of parts. Like Heathkit used to. Then you'd know the parts should work after assembly. I'd also like a BEGINNER's PRIMER on programming the neural net robot. But overall, "Bravo!"
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
There are now kits -- see links in video description! :) Glad you like this video. It is not one of my most popular. :(
@smartassist9700
@smartassist9700 5 жыл бұрын
Thank you yet again! THAT IS PART OF MY "KING PROJECT". I want to automate a lawn tractor mower to mow the lawn using vision and a distance sensor for obstacles, but programmed to mow in a pattern, learning properly to go around objects (eventually on both sides) to keep the mow pattern. One thing I have not figured out is how to read where grass has been cut versus the next cutting row of taller grass, unless it uses "mapping" and location to determine the next row. That would make sense. The servo motors for steering and such I have to figure out with experts in that area. There is one company that has pattern mowing already. I am discussing selling "kits" to install on older mowers already purchased; they seem interested, but that will have to wait as I have too many automation things to complete first.

Also, I need to paint a workshop barn/office: insulate, run wiring and interior walls, set up a work table, and organise hand tool storage on the wall. (I am building a skeletal arm that requires servo motors, a camera and the NVIDIA board to run it: a helper arm to pull tools and put them up when finished.) I have 4 months to also set up a garden, water retention, sensors, pumps and water lines. This year, with no garden, I grew the most delicious pineapples, lemongrass, tomatoes, and seasoning herbs... playing around.

I need some automation-interested people to assist with this project. It's amazing how the blueprint for automation has grown substantially. I guess it will take 2-3 years to implement fully. I would prefer that 24 hours a day over any other option to spend time on.
@Peter_Enis
@Peter_Enis 4 жыл бұрын
Use bluetooth adapters in the corners of the garden as "gps-beacons". Now you can use the mower to print text or pictures in your lawn like a big plotter.....?
@62shalaka
@62shalaka 5 жыл бұрын
Sundays are exciting; I never know what Chris will present to us next. Great as usual!
@adaemus333
@adaemus333 5 жыл бұрын
Good robot. The robot only moves straight ahead and turns left, but for this hardware that's not poor. Again, nice video.
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
The robot can move/turn as required, the code here just says "straight forward or turn left if the neural net reports blocked".
@MichelMorinMontreal
@MichelMorinMontreal 5 жыл бұрын
I forgot.... I guess you've already had a chance to take a look at the "new kid" from Intel.... up-board.org/upsquared/specifications/
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
Does look good. :)
@nustada
@nustada 5 жыл бұрын
Nature always has at least two cameras ("eyes"); is there a way to add another camera? Stereoscopic AI would be exponentially more powerful than not. Also feedback sensors for the wheels, and preferably shocks, to implement AI on terrain?
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
A second USB camera could potentially be added.
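If a second camera were added, a rough sense of depth could come from a disparity map; a minimal OpenCV sketch follows (camera indices and block-matcher settings are assumptions, and real distances would need proper stereo calibration).

```python
# Rough stereo-disparity sketch with two webcams and OpenCV block matching.
# Without calibration/rectification this only gives a qualitative depth map.
import cv2

left_cam = cv2.VideoCapture(0)    # camera indices are assumptions
right_cam = cv2.VideoCapture(1)

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)

ok_l, left = left_cam.read()
ok_r, right = right_cam.read()
if ok_l and ok_r:
    gray_l = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)
    disparity = stereo.compute(gray_l, gray_r)   # larger values = closer
    vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX)
    cv2.imwrite("disparity.png", vis.astype("uint8"))

left_cam.release()
right_cam.release()
```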
@irenew6520
@irenew6520 5 жыл бұрын
Please can you show a demo where the robot also turns clockwise? Is it possible for you to duplicate the code, adding a flag for switching randomly left or right, or better, with AI? Otherwise this official demo looks merely sufficient rather than great. THANKS
@nocturnalnights27
@nocturnalnights27 5 жыл бұрын
Metal... ...Gear?!
@bosstroll9019
@bosstroll9019 5 жыл бұрын
A weapon to surpass Metal Gear
@八音-m9d
@八音-m9d 4 жыл бұрын
NVIDIA stuff really is good, it's just a bit expensive. But is something better just because it is more expensive?
@TheTwick
@TheTwick 5 жыл бұрын
Thank you for this video. The definitive test would be to train it to follow the voice command : “Robot, get me a beer!”
@phildodd9942
@phildodd9942 5 жыл бұрын
A useful presentation you have given us, pointing out that AI can be other things apart from analysing numbers or text ! In fact you've made us sit up and take notice ! So THAT'S how a self-driving car parks itself ! Thank you for this enlightening demo !
@CTCTraining1
@CTCTraining1 5 жыл бұрын
I feel happier knowing my Tesla is unlikely to fall off the table.
@fecklarjenkins2549
@fecklarjenkins2549 4 жыл бұрын
Should name it Ricky Bobby: it thinks it's in NASCAR and can only turn left LOL
@nystudiolofiinstrumentalre726
@nystudiolofiinstrumentalre726 5 жыл бұрын
Hi. Please help me to code 14 servos and 2 DC motors in Python on a Raspberry Pi 4. I bought it recently and tried to do it but was unsuccessful; I am losing my interest and I don't want to.
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
I have a whole series here on "Raspberry Pi Robotics", and another on my "Pi Devastator Robot".
@GregSilverado
@GregSilverado 4 жыл бұрын
years from now our robot overlords are going to remember your kindness to one of their little ones.... they will be merciful to you
@ExplainingComputers
@ExplainingComputers 4 жыл бұрын
:)
@waynerobarge8543
@waynerobarge8543 5 жыл бұрын
Thank you for doing this video. I have played with the nano but have not looked at the robot feature. Question: the robot seemed to favor turning in one direction for avoidance most of the time. Is that a random variable - like flipping a coin - or is the robot taking additional information before making the decision which way to turn?
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
The robot is simply following the rule "if neural net reports blocked, turn left".
@donaldduck5731
@donaldduck5731 4 жыл бұрын
How, with one camera, can it ascertain depth? This NVIDIA stuff is so frikin' clever. I need one of these robots, if only to realise how ignorant and dumb I am.
@ExplainingComputers
@ExplainingComputers 4 жыл бұрын
It cannot measure depth with one camera. It is using a neural net to pattern-match what it can see against what it has learnt a blocked pathway looks like. Which is rather clever! :)
@Iamnotheretocompete
@Iamnotheretocompete 3 ай бұрын
i saw a floppy drive!
@vermontmario
@vermontmario 3 жыл бұрын
i like your videos , thank you
@twmbarlwmstar
@twmbarlwmstar 5 жыл бұрын
Impressive. I managed to write a simple AI script this week, as in copy it and get it to run, and that was a real first. It didn't really do anything as such, but I could see the code working and that helped me get my head around things. Really, I struggle with all this and always will, but just making some sense of it helped. My 10-year-old son will hopefully do better, and having something like a robot will be more meaningful than my naming game.

A £100 price point seems about right for this, especially as it should be extensible, and it is the code that it's all about; you need affordable so there's access. Hopefully you can have, say, a team of 4 kids with one Jetson to build a project. £25 each isn't a complete killer over a year, say £1 a week; even those on very low incomes should be able to support that. To be honest though, I'll always be a hardware man at heart, so the Jetson is a bit lost on me; I'll only turn it into a Kodi box or games emulator.

EDIT: I do wonder what Raspberry Pi will come up with; clearly it won't be to Jetson standard, as NVIDIA are miles ahead on this stuff and have massive budgets. The same for AMD; I just wonder there. They have supposedly invested some money in IoT/embedded, but I think they will go in a very different direction. Could be wrong, but more x86-64 and industrial utility stuff, certainly not self-driving cars.
@TheDavidPoole
@TheDavidPoole 5 жыл бұрын
Hi Chris, bit late to this party - sorry. Do you think a lidar module would integrate well with this project, a combination of visual and 3d mapping would be quite cool. Plus, on a more robust chassis like the Devastator (?) robot chassis you used on the pi controlled series could be a useful tool for hazardous area searches etc. It would be interesting to see you do a video or two on such a project. And quite educational. Now it's time for me to catch up with the rest of your videos that I've missed. Cheers!
@Tom_P_242
@Tom_P_242 5 жыл бұрын
I think the Jetson RaceCar is better... but it's only my own opinion.
@PeteVanDemark
@PeteVanDemark 5 жыл бұрын
Fascinating! Nice little robot. Never would have guessed it uses an image library to navigate. Like the table too!
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
Every ExplainingComputers video where I look at any hardware is shot on that table. :)
@arthurmoore1379
@arthurmoore1379 5 жыл бұрын
@@ExplainingComputers My good man this is way to complicated for mm. Arthur here:
@ryancoke777
@ryancoke777 5 жыл бұрын
That roller ball on the bottom reminds me of an old mouse ball. That leads me to a great idea: a Jetson nano controlled mouse using a Bluetooth gamepad. Useless but why not?
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
I'm sure your idea could be done. :)
@daviddavidsonn3578
@daviddavidsonn3578 5 жыл бұрын
could you have a sort of robot detecting heat source ? like a heat seeking missile ?
@egg11kompaniehuhn
@egg11kompaniehuhn 5 жыл бұрын
With an IR camera. But please go easy on Germany, we didn't do anything wrong this time.
@bowenyang2808
@bowenyang2808 4 жыл бұрын
Am I the only person who thinks it's risky to put an autonomous car running an untested model on a TABLE?
@ExplainingComputers
@ExplainingComputers 4 жыл бұрын
It was done with care and caution! :)
@garyoa1
@garyoa1 5 жыл бұрын
Not sure what they are using, but the Roomba has been working like that for 20 years or more.
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
Using neural nets to learn an area is clearly not new. But doing so with vision recognition at this price point is I think pretty novel.
@saturno_tv
@saturno_tv 5 жыл бұрын
Greetings Chris!. Expecting some review about the up-board.
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
If they send me one . . . :) Their new stuff does look very good: up-board.org/
@bijanshadnia3620
@bijanshadnia3620 4 жыл бұрын
I thoroughly enjoy your videos. This one in particular is one of the coolest I've seen on your channel. Thank you for the hard work you do.
@rolfsinkgraven
@rolfsinkgraven 5 жыл бұрын
A very interesting video.
@suvetar
@suvetar 5 жыл бұрын
You could use this to emulate the old Big Trak! (www.wikiwand.com/en/Big_Trak) :D
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
Nice idea.
@Sam_Saraguy
@Sam_Saraguy 5 жыл бұрын
Move aside laser pointer, the cat has a new toy.
@ronjenkins4257
@ronjenkins4257 5 жыл бұрын
I can imagine networked robots of this sort, each perhaps in a different school, racing around a common virtual racetrack or battling on a common virtual battlefield. In actuality, each school's robot would be operating in an agreed-upon real physical space (a school gymnasium), but playing in a common virtual space populated by virtual robots generated by the physical robots in each location. The winning robots would be the ones with the best-programmed algorithms supplied by the students.
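Most of such a shared virtual arena is just each robot reporting its pose to a common server; a minimal sketch of the client side follows (the server address, port, robot ID and message format are all made up for illustration).

```python
# Minimal sketch: each physical robot periodically sends its estimated pose
# to a shared game server as JSON over UDP. Address/port/format are assumed.
import json
import socket
import time

SERVER = ("game-server.example.org", 9999)   # hypothetical server
ROBOT_ID = "school-7"                        # hypothetical identifier

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def report_pose(x, y, theta):
    """Broadcast this robot's current pose to the shared virtual arena."""
    msg = json.dumps({"id": ROBOT_ID, "x": x, "y": y, "theta": theta,
                      "t": time.time()}).encode("utf-8")
    sock.sendto(msg, SERVER)

# e.g. call report_pose() ~10 times a second from the robot's control loop.
```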
@lollerich
@lollerich 5 жыл бұрын
Very, very interesting. I finally got at least a general idea of how neural networks are trained. Thank you!
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
Great to hear this -- that is what I was trying to convey. There is a fair bit of process between gathering the sample data and having a final trained model, but the essence is what you see here.
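For readers wanting a feel for that process, a compressed sketch of the usual transfer-learning recipe follows. The JetBot notebooks do something similar with PyTorch, but the folder layout, AlexNet backbone, epoch count and learning rate here are assumptions rather than the exact notebook code.

```python
# Compressed transfer-learning sketch: fine-tune a pretrained CNN on two
# classes (blocked / free) gathered with the robot's camera.
# A 'dataset/' folder with one sub-folder per class is an assumed layout.
import torch
import torch.nn as nn
import torch.optim as optim
import torchvision
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
dataset = datasets.ImageFolder("dataset", transform=transform)
loader = torch.utils.data.DataLoader(dataset, batch_size=16, shuffle=True)

model = torchvision.models.alexnet(pretrained=True)
model.classifier[6] = nn.Linear(model.classifier[6].in_features, 2)  # 2 classes
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(model.parameters(), lr=0.001, momentum=0.9)

for epoch in range(10):                      # epoch count is illustrative
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

torch.save(model.state_dict(), "best_model.pth")   # later loaded on the robot
```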
@lollerich
@lollerich 5 жыл бұрын
@@ExplainingComputers Thank you. While I did in general understand the purpose of neural networks I was always very hazy on how you would actually train one.
@iKostanCom
@iKostanCom 5 жыл бұрын
For some reason your robot never goes to the bottom-right corner... looks like a bug.
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
A flaw in its training, certainly.
@spidermcgavenport8767
@spidermcgavenport8767 5 жыл бұрын
Thank you, Mr. Barnatt, Explaining Computers. My JetBot would be using vintage Fisher-Price Construx.
@spidermcgavenport8767
@spidermcgavenport8767 5 жыл бұрын
Short Circuit S.a.i.n.t robot Johnny 5!
@RealRobotZer0
@RealRobotZer0 5 жыл бұрын
How big was the model that you had to copy to the robot in MB?
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
As I recall, a few hundred megabytes.
@thomasottvideos
@thomasottvideos 5 жыл бұрын
Tesla Model 3? We don't need no stinkin' Tesla Model 3. We've got JetBot!!
@lowiehojas5525
@lowiehojas5525 3 жыл бұрын
I was just leisurely watching, but instead I learnt something that I need for my school assignment :D I just learned from your video how to train a data set for visual detection! Thank you so much!!
@Croaker369
@Croaker369 5 жыл бұрын
Shouldn’t it be able to “self learn”, instead of being taught by the user?
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
Some neural nets self learn. But the vast majority require training with sample data.
@mihailvormittag6211
@mihailvormittag6211 8 ай бұрын
👍
@gplayer01
@gplayer01 5 жыл бұрын
Excellent review & demo of the Jetson, Chris. Interesting to know the number of pictures you took to train it. Well done.
@maybehappen4138
@maybehappen4138 5 жыл бұрын
Did you train it with just one edge as an example, or did you need to repeat for each edge?
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
I trained on all edges, too far away from the edge nearest to the camera!
@MacPhantom
@MacPhantom 5 жыл бұрын
Great to see that NVIDIA uses PyTorch!
@SniperUSMC
@SniperUSMC 2 жыл бұрын
This would be a good project to use to train an autonomous lawn mower. The direction changes need to be in a bit smaller increments, so it doesn't turn too much at once but just a little, and then retests for obstacles. If all the information can be stored on a SanDisk card and saved, then the trained brains/program can be moved to a bigger robot, have the "Blades of Death" (lawn mower blades) installed, and be turned loose safely to mow the lawn.
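The "turn a little and re-test" behaviour described above is easy to express; a hedged sketch follows (the motor calls mirror the JetBot examples, while path_blocked(), the speeds and the pulse length are placeholders).

```python
# Hedged sketch of "turn in small increments and re-check" obstacle handling.
# `robot` follows the JetBot-style motor API; `path_blocked()` stands in for
# whatever sensor or model check a larger machine would use.
import time

def avoid_in_small_steps(robot, path_blocked, turn_speed=0.2, pulse=0.2):
    """Nudge left in short pulses until the path ahead reads clear."""
    while path_blocked():
        robot.left(turn_speed)   # brief turn...
        time.sleep(pulse)
        robot.stop()             # ...then stop and re-test before turning more
        time.sleep(0.1)
    robot.forward(turn_speed)
```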
@krishnar754
@krishnar754 4 жыл бұрын
Collision avoidance is not working on my JetBot. "Could not read image from camera" is the runtime error appearing. Please help me with this, sir. I'm using a Raspberry Pi camera btw.
@ExplainingComputers
@ExplainingComputers 4 жыл бұрын
This sounds like the camera is not working, rather than an issue with collision avoidance with the ML model. Can you see an image from the camera?
@krishnar754
@krishnar754 4 жыл бұрын
@@ExplainingComputers Yes sir! I tried running a command from the terminal and it worked perfectly without any issues. But when I run the collision avoidance notebook, the line camera = Camera.instance() leads to a runtime error: could not load image from the camera. Edit: Sir, it would be really helpful if you could help me with this issue. Please provide any solution, sir.
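A common first check for this kind of error (offered as a hedged suggestion, not a confirmed fix) is to confirm that the CSI camera delivers frames to OpenCV outside the notebook. The GStreamer string below is the widely used nvarguscamerasrc pipeline for Jetson CSI cameras, with an assumed resolution and frame rate.

```python
# Quick standalone check that the Raspberry Pi (CSI) camera delivers frames
# on a Jetson via GStreamer, independent of the JetBot notebooks.
import cv2

pipeline = (
    "nvarguscamerasrc ! "
    "video/x-raw(memory:NVMM), width=1280, height=720, framerate=30/1 ! "
    "nvvidconv ! video/x-raw, format=BGRx ! "
    "videoconvert ! video/x-raw, format=BGR ! appsink"
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
ok, frame = cap.read()
print("camera opened:", cap.isOpened(), "- got frame:", ok)
if ok:
    cv2.imwrite("test_frame.jpg", frame)   # inspect this file to confirm
cap.release()
```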
@BlensonPaul
@BlensonPaul 2 жыл бұрын
Lovely. Awesome..
@pranavraja10
@pranavraja10 5 жыл бұрын
Awesome video, could you consider making a video on a Raspberry Pi based self driving car, there's a project called Donkey Car
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
I will take a look.
@World_Theory
@World_Theory 5 жыл бұрын
A while back, I got an idea... This is how it goes. You have an Artificial Neural Network (ANN) that is having trouble with a task that requires a perception of time. What is a simple way to give it a perception of time? Answer: give it memory. What is a simple way to give it memory? Answer: use data from its sensors, taken at consistent intervals, and give the samples "memory slots" in a cache. Use a small number of memory slots for progressively older sensory data samples. Whenever a new sample is added, delete the oldest sample and shift the contents of all the other slots to the slots representing samples one interval older, until the slot representing the newest data sample is empty, so that the most recent sample can occupy it. Feed the ANN from whatever is in the memory slots, as a separate sensory input. It will then be able to compare old versus new, and perceive changes from as far back as the memory slots go.

This will need careful optimization, to avoid overwhelming the memory capacity of the hardware it's all being run on, and to not require a ridiculous number of "neurons" in the ANN to be dedicated to input. One method of optimization may be to also perform a resampling operation on saved sensory data, stepping it down in size every time the data is shifted to a new slot (whether immediately after it first enters memory, or once it is past a certain age), so that it loses fidelity but still exists for reference for much longer within a given memory size constraint. Some care should be given to choosing a good resampling filter; it should probably balance accuracy against lightweight processing.

And accuracy brings up another subject: resampling images while they are stored in the sRGB color space is mathematically proven to distort the overall energy of an image. All camera input should be converted to the linear RGB colorspace if it's going to be subjected to resampling. sRGB is meant to make efficient use of digital memory for cases where an image is meant to be shown to humans, by assigning more importance to ranges of the color spectrum that the human eye is most sensitive to perceiving differences in, so sRGB is completely unnecessary for computer vision. Another optimization method is to simply start with a smaller resolution for sensory input, or to use a color model other than RGB, such as Lab, where there is one channel for lightness and two channels for chromatic data; then you can aggressively downsample the chroma channels by a positive whole number.

I've mentioned resampling a lot, but I should say that I'm not 100% sure how useful it would be for saving processing power overall. I'm quite certain that it will save memory, though.
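The memory-slot scheme above is essentially a rolling buffer of downsampled frames stacked into the network's input; a hedged sketch of that bookkeeping follows (the slot count, slot resolution and use of a deque are illustrative choices, not the commenter's exact design).

```python
# Hedged sketch of the "memory slots" idea: keep the last N downsampled
# frames in a deque and stack them as extra input channels for a network.
from collections import deque

import cv2
import numpy as np

NUM_SLOTS = 4            # how far back the "memory" reaches (assumed)
SLOT_SIZE = (64, 64)     # older frames could be shrunk further to save memory

slots = deque(maxlen=NUM_SLOTS)   # the oldest frame drops out automatically

def add_frame(frame_bgr):
    """Downsample the newest camera frame and push it into the memory slots."""
    small = cv2.resize(frame_bgr, SLOT_SIZE, interpolation=cv2.INTER_AREA)
    gray = cv2.cvtColor(small, cv2.COLOR_BGR2GRAY)   # one channel per slot
    slots.append(gray)

def network_input():
    """Stack the remembered frames into one (NUM_SLOTS, H, W) array."""
    if len(slots) < NUM_SLOTS:
        return None                      # not enough history yet
    return np.stack(slots, axis=0).astype(np.float32) / 255.0
```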
@orleydoss3171
@orleydoss3171 5 жыл бұрын
👍
@enriqueatentar8876
@enriqueatentar8876 4 жыл бұрын
The robot needs more senses to completely unlock its self-consciousness.
@ExplainingComputers
@ExplainingComputers 4 жыл бұрын
:)
@MrVein5.0
@MrVein5.0 5 жыл бұрын
wait a second, i got a 1hour ad before this video? wow. interesting stuff but wow.
@jasongooden917
@jasongooden917 5 жыл бұрын
Or buy a Vector
@srtcsb
@srtcsb 5 жыл бұрын
Very cool Chris. Of course it's 'simple', stuff like this starts out simple (you Arduino & Raspberry Pi blinky lights folks will know what I'm talking about... You can learn a lot from blinking those lights! ). Does Nvidia show you (or give access to) the code for this? Like another commenter said, possibly you could modify or extend the code for more functionality. I'd love to build one of these and examine the possibilities. Thanks for another great video Chris.
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
Hi Steve. And yes, all the code is available to tinker with. And there are various other examples available, like self-driving down a road based on its markings, and following a person/object.
@arthurdent8091
@arthurdent8091 5 жыл бұрын
Hi Chris. Nice video. I caught the pun EYE instead of I, Robot, nice one. Robot vision will interest me more when I can be inside a self-driven car. Cheers.
@a.i.3025
@a.i.3025 4 жыл бұрын
Good ai robo
@F15HHOOKS
@F15HHOOKS 5 жыл бұрын
On a similar note Chris, have you seen the Tesla Autonomy Day presentation on KZbin? The first part details the bespoke chip build used for full self driving that I think you may find interesting.
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
Again, I will investigate! :)
@Ruhgtfo
@Ruhgtfo 5 жыл бұрын
Pass the butter
@PsiQ
@PsiQ 5 жыл бұрын
11 minutes into the video: I think it would be cool if you could print arrows and stick them on boxes and walls, and it follows the direction :-) ... left, right, forward, back and turn-around would all be possible, even "turn 30° left", without any programming. Edit: hahaaa! And if @13:45 you put in a picture of yourself labelled "free", it's going to run you over if it gets to you ;-)
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
This should be possible. This short video shows a JetBot navigating a road based on the road markings -- and not running over Lego people: kzbin.info/www/bejne/hGipp52Bi9GFqpI
@michelj.gaudet5048
@michelj.gaudet5048 5 жыл бұрын
This device must've been conceived by either a current or retired NASCAR driver... left turn, go straight, left turn, go straight, left turn, go straight, left turn, left turn, go straight, left turn...... lol😉
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
It manages some nice, gentle corners in this video: kzbin.info/www/bejne/hGipp52Bi9GFqpI
@tamaseduard5145
@tamaseduard5145 5 жыл бұрын
Thank you sir 🙏 🙏 🙏
@gregadams558
@gregadams558 5 жыл бұрын
robot vacuum
@NomadicSage
@NomadicSage 5 жыл бұрын
This was a great video, please do more videos on AI and programming
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
Thanks. I hope to return to AI, but so far -- and after many attempts -- it has never really taken off on this channel.
@NomadicSage
@NomadicSage 5 жыл бұрын
@@ExplainingComputers it's quite different and interesting to see it in action, the experiments that you do in your style of breaking them down into easy to understand components is what always draws me to watch your videos. As well as replicate them with my own configuration.
@twmbarlwmstar
@twmbarlwmstar 5 жыл бұрын
@@ExplainingComputers I want EDGE! No, seriously, after you teased edge computing I've been reading up on it and looking forward to your planned video; I can see a direct use for edge computing in my life, more so than AI, despite things like the Amazon Dot or what have you. I think AI is a hard sell full stop, because of people like me: I get the concept but I don't get the access, as in I'm a bit thick and just don't get it and don't have the coding skills to get it. It won't go down well, but I'm finding some of the MS stuff a bit easier than the Linux stuff, with the proviso that I am talking about stuff targeted at complete novices. And I'm not making any claims that MS W10 embedded is in any way better; I just found some of it more accessible to me, probably because you get W10 as a base to build on. And all it is, is a doorbell that can tell you whether it is a bear at the door or it isn't a bear at the door, and that's it. There aren't many bears at my door (a few wolves maybe).
@yunlongsong7618
@yunlongsong7618 5 жыл бұрын
fascinating
@sethrd999
@sethrd999 5 жыл бұрын
I have only one question for the platform: what is the runtime using only one power source? Good video and overview. I would suggest that anyone looking at doing this learn the math required first (if you're not up to speed), then possibly look at OpenCV amongst other tools. I say this because all the groundwork is done for you with this platform, so what are you really learning technically?
@ExplainingComputers
@ExplainingComputers 5 жыл бұрын
Runtime depends on the level of motor use, but is certainly many hours. The Jetson Nano itself runs a good 6 to 8 hours (again, depending on what it is doing) on one charge of the pack. I expected to be recharging a lot, and did not need to.