Check out the corresponding blog and other resources for this video at: deeplizard.com/learn/video/6stDhEA0wFQ
@kudoamv · 5 years ago
So I can't use computer vision programs that require a GPU because I'm an AMD card user?
@mae1000 · 4 years ago
@@kudoamv Actually, you can, I think. AMD has invested in that field too; google "gpuopen".
@debajyotisg · 5 years ago
Great job speeding Jensen Huang up. xD
@sudeepkadyan5480 · 4 years ago
Can I use a 1050 Ti 4GB for data science?
@faroukmokhtar6900 · 4 years ago
Of course
@atisbasak · 3 years ago
But it will be quite slow.
@Xiler6969 · 5 years ago
Congratulations, you've impressed me. Very professional series: right to the good stuff, clear and sharp voice, broad yet specific explanations.
@larryteslaspacexboringlawr739 · 5 years ago
5:13 Ballmer ambush, panic clicking to skip (thank you for the awesome video)
@deeplizard · 5 years ago
lol. You are welcome!
@majeedhussain3276 · 6 years ago
This channel seriously deserves a million subs. I've been watching many series from this channel. Great work!!!!! Keep going; I'm sure this channel is going to flow with lots of subscribers someday.
@deeplizard · 6 years ago
Thank you, Majeed! We're glad to hear you've been enjoying multiple series here, and we're happy to have you as an engaged member of the community! Always appreciate seeing your comments :)
@estebansevilla2229 · a year ago
I ended up here because my daughter is learning "AI" at high school, and now I need to understand how this all works to build her a PC.
@deeplizard · a year ago
😅😅
@SDRIFTERAbdlmounaim · a month ago
Did you build that PC just fine, or did you need further help?
@gabriellugmayr2871 · 4 years ago
Wow, I saw the first 4 videos of the PyTorch series and am impressed by how much time and effort you put into these tutorials. Thanks a lot. Also, you have developed enormously (although the older tutorials were already very good).
@ajwadakil6020 · 4 years ago
This channel should have more subscribers, seriously
@Sikuq · 4 years ago
Beautifully done, Chris. Wow. Thanks. I learned a lot.
@MarcelloNesca · 3 years ago
Thank you very much. I'm currently learning deep learning, and this was perfect for explaining why I need a good GPU.
@chinonsoalumona6734 · 2 years ago
Hi Marcello, how's your deep learning experience going?
@IgorAherne · 5 years ago
Good overview. Also, having 8 cores won't necessarily speed up computation by exactly 8x; perhaps 7x in practice. I just wish you would mention that processors support SSE2, AVX2, and similar extensions that allow each core to do 8 summations/multiplications/shifts/etc. at a time, rather than one by one. This allows the CPU's registers to process arrays in chunks of 8. Many C/C++ programmers don't know about those and build programs that are by default doomed to underperform. So I feel everyone is always unfair toward the CPU. Everybody points at the cores, but each core can (and should) use intrinsics to do parallel work. This matters especially with various RNNs, where we only win if we move the entire algorithm to the GPU to avoid data-transfer bottlenecks, and when the RNN is decently wide in each layer for the GPU. Also, the CPU is really flexible when it comes to if/else or while loops, reacting quickly when a branch occurs.
@deeplizard · 5 years ago
Hey Igor - Thanks for adding these details. Great stuff. Much appreciated! 🙏
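[Editor's note] Igor's point about per-core data parallelism can be sketched in Python: NumPy's whole-array operations run in compiled loops that compilers can auto-vectorize with SSE2/AVX2-style instructions, whereas a pure-Python loop touches one element at a time. The array size here is arbitrary, chosen only for illustration.

```python
# Illustrative sketch of SIMD-style data parallelism.
# NumPy's vectorized ops execute in compiled C loops, which compilers can
# auto-vectorize; the Python loop below processes one element per iteration.
import numpy as np

a = np.arange(10_000, dtype=np.float32)
b = np.arange(10_000, dtype=np.float32)

# One element at a time (scalar-style):
slow = np.empty_like(a)
for i in range(len(a)):
    slow[i] = a[i] + b[i]

# Whole-array operation (SIMD-friendly under the hood):
fast = a + b

assert np.array_equal(slow, fast)
```

Both produce identical results; the difference is purely in how many elements are processed per instruction.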
@AdrianDucao · 5 years ago
My girlfriend and I have been doing a lot of deep learning lately
@osumanaaa9982 · 4 years ago
You sure it's not just shallow computations? :p
@q3d385 · 4 years ago
@@osumanaaa9982 🤣
@Aditya_Kumar_12_pass · 4 years ago
I hope you are not hoping for any output.
@anamitrasingha6362 · 4 years ago
@@Aditya_Kumar_12_pass YouTube should have a Haha react xD
@vibesnovibes6320 · 3 years ago
How many layers for protection? Are you clear on how backpropagation is supposed to work? 😂😂😂
@engkamyabi · 2 years ago
Amazing video, and loved the short clips! Thank you!
@shivangitomar5557 · 2 years ago
The BEST VIDEO on this topic!
@Aweheid · 3 years ago
Rich, informative video!! No explanation is better than yours!!
@benjaminhansen4808 · 19 days ago
You explained this whole AI trend 5 YEARS AGO
@jamesferry8484 · 6 years ago
Thank you for sharing. Very helpful.
@deeplizard · 6 years ago
Hey James - You are welcome!
@bazzinga204 · 3 years ago
Beautifully explained.
@mayur9876 · 6 years ago
Thanks for putting in all the effort.
@deeplizard · 6 years ago
You are welcome!
@Henry..s · 3 years ago
What I learned from this video is that Nvidia GPUs got their speed from the CEO
@Ammothief41 · 5 years ago
Any idea how the RTX graphics cards and their tensor cores compare to the standard GTX GPUs? Is that something that TensorFlow or PyTorch take advantage of?
@deeplizard · 5 years ago
Haven't seen the comparisons. And yes: www.geforce.com/hardware/technology/cuda/supported-gpus
@taj-ulislam6902 · 5 months ago
Very professional video. Good information.
@Not0rious7 · 4 years ago
Nice video, I like all the graphics you used. Where do you find them?
@ajaykrishnan4277 · 6 years ago
You just nailed it
@anujsaboo7081 · 4 years ago
You have a mistake in the quiz section: Q. Different PyTorch components are written in different programming languages. PyTorch is written in all of the following programming languages except? Ans. Java (on the blog the correct answer is showing as Python)
@deeplizard · 4 years ago
Hey Anuj - Thank you so much for pointing this out! I've fixed it. You may need to clear your cache to see the change. Chris
@vky771 · 3 months ago
Everyone wishes they had seen your video 5 years back 😅
@robertatvitalitystar2444 · 6 years ago
ty! Peeling the plastic off a brand-new GPU is a good day, lol.
@krishna_o15 · 4 years ago
I'm agreeing, since Soumith said it.
@jordan5253 · 5 years ago
Jesus, @2:41 I spit out my water lol
@durafshanjawad5250 · 4 years ago
Very helpful series
@lancelotxavier9084 · 5 years ago
Nvidia is holding back processing power in order to make selling their products sustainable.
@deeplizard · 5 years ago
This is important. A company's incentive to make a profit can be a double-edged sword. Consider the same problem in healthcare or biotech.
@henson2k · 4 years ago
@@deeplizard Conflict of interests indeed; every software developer knows that LOL
@TheTariqibnziyad · 5 years ago
Finally these nerds got the guts to call something a funny name: "embarrassingly parallel" xD
@deeplizard · 5 years ago
🤓
@zackrider3708 · 3 years ago
@@deeplizard Can the new Xe server GPU from Intel handle AI or deep learning workloads like Nvidia GPUs?
@e4r281 · 6 years ago
Maybe if we start telling people the brain is an app, they will start using it.
@clbl8706 · 4 years ago
That's some cringeworthy joke my grandma would share on FB.
@e4r281 · 4 years ago
@@clbl8706 Actually, it's not a joke.
@nikhilmathur3351 · 4 years ago
@@e4r281 Unlike you
@hailongvan8285 · 3 years ago
ok commie
@nozaxi0327 · 6 years ago
Thank you. This series is so helpful for me
@deeplizard · 6 years ago
Glad to hear that, jesse! You're welcome!
@qusayhamad7243 · 3 years ago
thank you
@mexzarkashiev2435 · 4 years ago
So which graphics card should I buy for deep learning?
@kudoamv · 5 years ago
PLEASE HELP. Is it something like "you have to download PyTorch with CUDA if you want to use the GPU, or else you will only be able to use the CPU"? I am an AMD user.
@JasperHatilima · 5 years ago
CUDA is an NVIDIA platform and so only supports NVIDIA cards like the GTX GPUs. For AMD, the framework for parallel programming is OpenCL... which unfortunately does not have a development community as big as the CUDA community.
@ankitaharwal5886 · 5 years ago
I can feel your pain 😔 😔
@RohanSawant-s3q · 5 months ago
😄 Very nice video
@sajolsajol8393 · a year ago
wow!
@A-Predator · 3 years ago
I just learned the name of deep learning 😵
@sqliu9489 · 3 years ago
Brilliant explanation
@vibesnovibes6320 · 3 years ago
Great video
@_SupremeKing · a year ago
Although still confused, I just picked up some new knowledge for a layman like me
@faizahmed8034 · 4 years ago
Aren't GPUs used for image processing (i.e., conversion of binary code to analog graphic pixels)? If so, how can we use them for mathematical computations?
@jackt9535 · 2 years ago
I'd like to know whether you could use a dedicated graphics card for deep learning when your CPU has no iGPU. This will help me a lot with my screen cable management issue (I'm new to this)! Thanks!
@henson2k · 4 years ago
At the research stage I can see how Python is an acceptable choice. However, for production systems Python is too large in size and not fast enough!
@SW-ud1wt · 5 months ago
I have an HP Envy Ci7 laptop with a GeForce RTX 2050 card; can it be used for machine learning tasks?
@henson2k · 4 years ago
When multiple apps are using CUDA, how is it managed by the GPU? Can the GPU execute different kernels at the same time?
@isbestlizard · 4 years ago
OH MY GOD WOW are you a lizard too? I love AI and stuff as well :D
@deeplizard · 4 years ago
🤣
@magelauditore333 · 4 years ago
5:25 I was like wtf. Can anyone share the link to the whole video? 😂😂 Man, I got excited and started shouting at home
@deeplizard · 4 years ago
It's a popular one. Google and you will find 😂
@Priya_dancelover · a year ago
excellent
@MMphego · 4 years ago
Subscribed.............
@richarda1630 · 3 years ago
Exciting, and if you read the WBW blog, thrilling times
@SuperMIKevin · a year ago
Lmao, I can't believe they actually named it "embarrassingly parallel".
@hedonismbot1508 · a year ago
Never in my lifetime would I have imagined "embarrassingly [something]" would be an actual technical term.
@deeplizard · a year ago
😅😅
@rifatmasud · 5 years ago
Anyone have that video link regarding "Python is slow"?
@deeplizard · 5 years ago
Added it to the description. Here you go: kzbin.info/www/bejne/enO5fZadppd4nZI
@arkamitra4345 · 5 years ago
The GIL is evil 😓
@louerleseigneur4532 · 4 years ago
Thanks, thanks
@True38 · 5 years ago
Did they remove all those stats/functions in a newer version of Cudo? Because I just recently downloaded it, and the only thing I can see on the screen is CPU, XMRig, and h/s on the left and Payout Coin on the right. That's it! I'm using the CPU but want to use the GPU, and can't see any option. Please help if you can.
@alkeryn1700 · 4 years ago
Guys, don't use CUDA; use HIP so it runs everywhere, or use OpenCL or SYCL, but don't let your software get stuck on proprietary, platform-specific hardware and software
@beomheelee1249 · 3 years ago
You're such a genius!! Thanks
@ezevictor4448 · 4 years ago
Is a GTX 960M or 1050M worth using?
@zeppybrawlstars3906 · 5 years ago
So when I play games with advanced AI, I should make sure my GPU is ready
@deeplizard · 5 years ago
Exactly. 🤖
@adithi729 · 4 years ago
Sir, is there any way to do CUDA programming online? I mean, is there an online compiler available now? My system does not support CUDA... please help
@dalirkosimov4623 · 3 years ago
You can do it on Jupyter
@DinoFancellu · 3 years ago
It's a pity that AMD doesn't seem to support CUDA. Their new Big Navi cards look really nice apart from that
@kanui3618 · 5 years ago
Can I combine an Nvidia GTX 1070 or higher with an AMD Ryzen 5?
@deeplizard · 5 years ago
No AMD at the moment.
@pscheuerling · 4 years ago
I came here to learn how to utilize deep learning cores for training my own AI... I still don't know why I should buy cores I can't use.
@A-Predator · 3 years ago
😵
@lakeguy65616 · 2 years ago
The forward pass can rely on matrix math, which can be run through CUDA (a software layer) and done on an NVIDIA GPU. The more GPU cores, the faster the process. A GPU with 100 cores will perform this step roughly 10x faster than a GPU with 10 cores (in general...).
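[Editor's note] The matrix math in that comment can be sketched with NumPy (the layer sizes and random values here are made up for illustration): a dense layer's forward pass is essentially one matrix multiply plus a bias, and that multiply is exactly the kind of work a GPU spreads across its cores.

```python
# A dense layer's forward pass is mostly one big matrix multiply --
# the operation a GPU parallelizes across its many cores.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 784))   # batch of 32 flattened inputs
w = rng.standard_normal((784, 128))  # layer weights
b = np.zeros(128)                    # layer biases

z = x @ w + b            # forward pass: matrix multiply + bias
a = np.maximum(z, 0.0)   # ReLU activation

print(a.shape)  # (32, 128)
```

Every one of the 32 × 128 output entries is an independent dot product, which is why this step parallelizes so well.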
@invest8198 · 5 days ago
Anyone who bought $NVDA in 2018?
@alebecker12 · 6 months ago
I have a doubt: does Nvidia have a monopoly on such hardware? If not, does CUDA only work on Nvidia hardware?
@deeplizard · 6 months ago
Yes. Nvidia built CUDA. It only works with their hardware.
@Infinityxx3 · 5 years ago
You need some music in the vid... attracts views... cool vid
@deeplizard · 5 years ago
What kind of music do you like?
@Infinityxx3 · 5 years ago
@@deeplizard PEWDIEPIE bitch lasagna? jk... any relaxing music while you talk... when you're changing camera shots etc... ;)
@deeplizard · 5 years ago
haha. That sent us on a tangent. Hadn't seen that before. 🤣
@avdhutjadhav5657 · 2 years ago
Does that mean a GPU with more CUDA cores is better for deep learning?
@lakeguy65616 · 2 years ago
Yes (and more GPU RAM...)
@MeowsyDancer · 3 years ago
I will now send this to anyone who asks why I bought a 3090! RIP wallet tho
@deeplizard · 3 years ago
🤣💰
@yourgflikesit · 4 years ago
Steve Ballmer off his face again lol
@robertsmith512 · 5 years ago
SUBBED! EVERYBODY SUB THIS CHAN! THIS ONE KNOWS HIS STUFF! GO LOOK AT THE PLAYLIST LIB!
@deeplizard · 5 years ago
Thanks Robert! Note that there are two of us here. 🦎🦎
@MilesBellas · a year ago
Jensen is better at 2x
@cubul32 · 4 years ago
Former CEO - and we can see why.
@deeplizard · 4 years ago
Although, he became a billionaire from his tenure at Microsoft 😄
@Joco5012 · 3 years ago
Wow, was it really OK to do so much coke back in the day? 5:12 Damn dude, take it down a notch
@iLevelTechnology · 3 months ago
He doesn't explain tensors very well, but overall good job
@ErrorRaffyline0 · 4 years ago
I really dislike CUDA because it's not open source and AMD is not able to use it; it makes development for both AMD and NVIDIA much harder.
@liangwei4869 · 4 years ago
5:57 Tokyo Tech showing up is hilarious lol
@jacobcorr337 · 2 years ago
CUDA not explained at all
@lakeguy65616 · 2 years ago
CUDA is a software layer that interfaces with Nvidia GPUs, allowing some problems (think forward pass) to be ported to the GPU and run in parallel. (Your PC has an Nvidia GPU; with software like PyTorch, you tell the PC that CUDA is available and to send certain processes to the GPU for parallel processing.) Vastly oversimplified.
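[Editor's note] In PyTorch that hand-off is explicit. A minimal sketch, assuming PyTorch is installed; it falls back to the CPU when no CUDA-capable device is present, so the same code runs either way:

```python
# Minimal sketch of sending work to the GPU in PyTorch.
# Falls back to the CPU when no CUDA-capable device is available.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

x = torch.randn(32, 784)   # batch of inputs
w = torch.randn(784, 128)  # layer weights

# Move the tensors to the chosen device; the matrix multiply then runs there.
y = x.to(device) @ w.to(device)

print(y.device, tuple(y.shape))
```

The `.to(device)` calls are where CUDA comes in: on a machine with an Nvidia GPU, the data is copied to GPU memory and the multiply executes as a CUDA kernel.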
@Drtsaga · 3 years ago
"Deep learning" should not be in the title, in my opinion.
@escapefelicity2913 · 3 years ago
I'm glad I don't need to listen to Jensen Huang.
@mmm-ie5ws · 4 months ago
You did a really bad job of explaining why GPUs are better at parallel computing than CPUs.