How Nvidia Won AI

418,335 views

Asianometry

1 day ago

When we last left Nvidia, the company had emerged victorious in the brutal graphics card Battle Royale throughout the 1990s.
Very impressive. But as the company entered the 2000s, they embarked on a journey to do more. Moving towards an entirely new kind of microprocessor - and the multi-billion dollar market it would unlock.
In this video, we are going to look at how Nvidia turned the humble graphics card into a platform that dominates one of tech’s most important fields: Artificial Intelligence.
Links:
- The Asianometry Newsletter: asianometry.com
- Patreon: / asianometry
- The Podcast: anchor.fm/asianometry
- Twitter: / asianometry

601 Comments
@Asianometry 2 years ago
What would you like to see on the channel?
@bjliuyunli 2 years ago
Thanks a lot for the video! Would be great to see a video about power semis like IGBTs and silicon carbide.
@2drealms196 2 years ago
You've covered Nvidia; could you cover Nuvia and Nivea?
@masternobody1896 2 years ago
Yes, more gaming videos are what I like.
@mikoyangurevichfulcrum 2 years ago
Can't wait to see. You should do a video on the companies behind modern-day tanks, no matter the country. The history of many companies (General Motors, etc.) taking the time to design the tanks and make them functional is very interesting. I'd love to see a video on that.
@screwcollege8474 2 years ago
Marvell Technology pls
@okemeko 2 years ago
From what a professor at my university told me, they didn't only "work closely" with researchers. They straight up gifted cards to research centers in some cases. This way, not only did Nvidia provide a good platform, but all the software written was naturally made for CUDA.
@eumim8020 2 years ago
My master's thesis supervisor has 5 professors each submitting a request for a GPU; NVIDIA covers its monopolistic, anticompetitive core with a whole system for helping public university systems. If I'm lucky, my final DL model will be trained on his little office server with those GPUs.
@slopedarmor 2 years ago
I think I remember that Nvidia gifted a GTX 980 Ti to the developers of Kingdom Come: Deliverance (a Kickstarter computer game), supposedly to help them with development? haha
@monad_tcp 2 years ago
Ah, that old trick from Microsoft of gifting goodies. Like giving away Office licenses, or the entire Internet Explorer for free if you bought Windows.
@steveunderwood3683 2 years ago
If you don't provide some help to early adopters, how are you ever going to build a thriving environment? Providing cards, software, training and support to academics was a good thing. The sleazy stuff they did was to cook studies to make the benefits of a GPU look much greater than they really were, in applications where the benefits of a GPU were marginal at best. GPGPU is great for some things, and weak for others. The early nVidia-sponsored papers were so heavily rigged, it took some serious analysis to figure out where GPGPU was a real boon, and how big that boon might be.
@monad_tcp 2 years ago
@@steveunderwood3683 Yeah, it's the environment. The benefits stated in those papers were rigged in nVidia's favor, but they would be computationally feasible with an open environment for study. But the industry is too locked into CUDA/Intel x86. At least now things are going to change a bit, as if we could say ARM is different...
@e2rqey 2 years ago
Nvidia does a really good job of identifying burgeoning new industries where their products could be leveraged, then integrating themselves into the industry so early on that, as the industry matures, Nvidia's products become essential to its functioning. I remember visiting a certain self-driving-car company in California about 4 years ago and seeing a literal wall of Nvidia 1080 Ti GPUs. They had at least a couple hundred of them. Apparently they had all been gifted to them by Nvidia. I've heard Nvidia will also send their engineers out to work with companies and help them optimize their software or whatever they are doing, to get the maximum performance out of the GPU for whatever purpose they are using it for.
@zerbah 2 years ago
Nvidia has great support for AI and game development. When I was talking with a small indie game studio about their game, they confirmed that Nvidia sent them two top-of-the-line Founders Edition cards for development free of charge and offered to optimize drivers for their game when the final build is ready. Meanwhile, the AMD cards were crashing and black-screening because of buggy drivers, making it a complete pain to test the development version of the game on them...
@aamirsiddiqui9957 2 years ago
@@zerbah How long will AMD take to be as good as Nvidia?
@cyranova9627 2 years ago
I remember one case where a game developer actually got invited to dinner with Nvidia people to talk about developing their game on Nvidia GPUs, not AMD ones. All they do is sweet-talk game developers.
@tweedy4sg 1 year ago
True, they do... but it's not exactly successful every time. Remember how they joined the mobile AP (application processor) market with the Tegra series, which now seems to have fizzled out into oblivion.
@graphicsRat 1 year ago
@@tweedy4sg Yes, not every bet will win. In fact most bets will fail. But the 1 out of 5 that succeeds will more than pay for the failures, and much more. That's how investments work. Venture capitalists, for example, know this all too well. Not all their investments will pay off. But every now and then they invest in tomorrow's Google-scale company, and that's where they make their money.
@0MoTheG 2 years ago
CUDA was originally not targeted at machine learning or deep neural networks, but at molecular dynamics, fluid dynamics, financial Monte Carlo, financial pattern search, MRI reconstruction, deconvolution, and very large systems of linear equations in general. A.I. is a recent addition.
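One way to see why workloads like those map so well onto a GPU is that they are embarrassingly parallel: the same instruction runs over many independent samples. A minimal pure-Python sketch (illustrative only, not CUDA code; the function name is invented) of the Monte Carlo style of computation the comment mentions:

```python
import random

def monte_carlo_pi(n_samples, seed=0):
    """Estimate pi by sampling points in the unit square.
    Each sample is independent, so on a GPU every sample could run
    in its own thread; here a sequential loop stands in for that."""
    rng = random.Random(seed)  # seeded for reproducibility
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:  # point falls inside the quarter circle
            inside += 1
    return 4.0 * inside / n_samples

estimate = monte_carlo_pi(100_000)
```

The error of the estimate shrinks as 1/sqrt(n_samples), which is exactly why throwing thousands of GPU threads at the problem pays off.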
@TheDarkToes 1 year ago
Back in the day, we would have 64 CUDA cores and we thought we were hot shit hitting 800 MHz. Look how far it's come.
@christopherpearson8637 1 year ago
You stumble into the right choices sometimes.
@zombielinkinpark 2 years ago
Despite both Google and Alibaba Cloud developing their own NPUs for AI acceleration, they are still buying large quantities of Nvidia Delta HGX GPUs as their AI development platform. Programming with CUDA is far easier than with their own proprietary hardware and SDKs. Nvidia really put a lot of effort into the CUDA SDK and made it the industry standard.
@mimimimeow 2 years ago
I think it's worth mentioning that a lot of recent advances in GPU computing (Turing, Ampere, RDNA, mesh shaders, DX12U) can be traced to the PlayStation 2's programmable VU0+VU1 architecture and the PlayStation 3's Cell SPUs. Researchers did crazy stuff with these, like real-time ray tracing, distributed supercomputing for disease-mechanism research, and the USAF's space monitoring. The PS3 F@H program reached 8 petaflops at one point! Sony and Toshiba would've been like Nvidia today if they had provided proper dev support to make use of these chips' capabilities and continued developing, rather than just throwing the chips to game devs and saying "deal with it". I feel like Sony concentrated too much on selling gaming systems and didn't realize what monsters they had actually created. Nvidia won by actually providing a good dev ecosystem with CUDA.
@dhargarten 1 year ago
Didn't Sony at one point encourage and support using PlayStations for science computing, only to later block it completely? With the PS4, if I recall correctly?
@FloStyle_ 1 year ago
@@dhargarten It was the PS3, and running Linux natively on the console. Later that led to exploits and hacks of the hardware, and Sony closed the ecosystem really fast. That caused a lawsuit that took until years into the PS4's lifespan to conclude.
@Special1122 8 months ago
@@FloStyle_ geohot?
@Quxxy 2 years ago
I don't think you're right about what "clipping" means at 2:56. Occlusion (hiding things behind other things) is done with a Z-buffer*. As far as I recall, clipping refers to clipping triangles to the edge of the screen to avoid rasterising triangles that fall outside of the visible area, either partially or fully. As far as I'm aware, no one ever did occlusion geometrically on a per-triangle basis. The closest would be in some engines that will rasterise a simplified version of a scene to generate an occlusion buffer, but that's not handled by the geometry engine; it's just regular rasterisation. *Except on tile-based rasterisers like the PowerVR lineage used in the Dreamcast and some smartphones, notably the iPhone. (Not a graphics programmer or expert, just an interested gamer.) *Edit*: Also, for 7:46 about the fixed-function pipeline being totally gone: from what I remember this is not entirely true. GPUs still contain dedicated units for some of the fixed functionality; from memory, that includes texture lookups and blending. Reminds me of an old story from someone who worked on the Larrabee project, who mentioned that one of the reasons it failed to produce a usable GPU was that they tried to do all the texturing work in software, and it just couldn't compete with dedicated hardware.
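For readers unfamiliar with the Z-buffer this comment refers to: occlusion is resolved per pixel by keeping only the nearest fragment seen so far. A minimal Python sketch (the names and data layout are invented for illustration):

```python
def rasterize(fragments, width, height):
    """Tiny per-pixel depth test (Z-buffer).
    fragments: list of (x, y, depth, color); smaller depth = closer.
    Returns the final framebuffer after all depth tests."""
    far = float("inf")
    zbuf = [[far] * width for _ in range(height)]    # depth buffer
    frame = [[None] * width for _ in range(height)]  # color buffer
    for x, y, depth, color in fragments:
        if depth < zbuf[y][x]:   # nearer fragment wins the pixel
            zbuf[y][x] = depth
            frame[y][x] = color
    return frame

# Two fragments land on pixel (0, 0); the nearer blue one occludes red.
frame = rasterize([(0, 0, 5.0, "red"), (0, 0, 2.0, "blue")], 2, 2)
```

Note that clipping, by contrast, happens earlier in the pipeline and discards or trims triangles against the view volume before any per-pixel work.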
@Asianometry 2 years ago
Thx. I'll look into this and see if a clarification is needed.
@Quxxy 2 years ago
@@Asianometry I doubt it. It's an inconsequential detail that doesn't change anything about the substance of the video. I mean, I doubt anyone is watching a video about nVidia's AI dominance looking for an in-depth technical description of the now long-obsolete fixed-function pipeline. :)
@musaran2 2 years ago
Clipping is the general removal of whatever does not need rendering: view volume, backfaces, occlusion…
@tma2001 2 years ago
Yeah, I was about to post the same nitpick - also, the setup and window-clipping part of the fixed-function pipeline is still there in hardware; it's just not programmable (nor should it be). The raster-ops backend is not programmable either - just configurable. The painter's algorithm is an object-based visibility test that clips overlapping triangles against each other, whereas the Z-buffer is an image-based, per-pixel visibility test.
@vintyprod 1 year ago
@@Quxxy I am
@PhilJohn1980 1 year ago
Ah, geometry stages with matrices - I remember my comp-sci computer graphics class in the 90s, where our final assignment was to do all the maths by hand and plot out a simple 3D model on paper. Each student had the same 3D model defined, but different viewport definitions. Fun times.
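The by-hand maths in that kind of assignment boils down to multiplying homogeneous coordinates by 4x4 matrices - the same transform the geometry stage of a GPU applies to every vertex. A small self-contained Python sketch (illustrative, not from the video):

```python
def mat_vec(m, v):
    """Multiply a 4x4 matrix by a 4-component homogeneous vector."""
    return [sum(m[i][j] * v[j] for j in range(4)) for i in range(4)]

# A translation by (1, 2, 3), expressed as a homogeneous 4x4 matrix.
translate = [
    [1, 0, 0, 1],
    [0, 1, 0, 2],
    [0, 0, 1, 3],
    [0, 0, 0, 1],
]

p = [0, 0, 0, 1]            # a point at the origin, w = 1
p2 = mat_vec(translate, p)  # -> [1, 2, 3, 1]
```

Model, view, and projection transforms all compose the same way, which is why the fixed-function "geometry engine" could be built as dedicated matrix hardware.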
@CarthagoMike 2 years ago
Oh nice, a new Asianometry video! Time to get a cup of tea, sit back, and watch.
@deusexaethera 2 years ago
Ahh, the time-honored winning formula: 1) Make a good product. 2) Get it to market quickly. 3) Don't crush people who tinker with it and find new uses for it.
@heyhoe168 1 year ago
Nvidia doesn't really follow (3), but it has a very strong (2).
@cubertmiso4140 1 year ago
@@heyhoe168 Agree on that comment. 3) Corner the market. 4) Raise prices.
@peterweller8583 1 year ago
@@heyhoe168 That's too bad, because (3) is where the most honey comes from.
@shmehfleh3115 1 year ago
@@heyhoe168 Neither does Apple, unfortunately.
@locinolacolino1302 1 year ago
3*) Create an accessible proprietary toolkit (CUDA) that's become mainstream in legacy content, and crush anyone who tries to leave the Nvidia ecosystem.
@scottfranco1962 2 years ago
Nvidia is a real success story. The only blemish (as illustrated by Linus Torvalds famously giving them the middle finger) is their completely proprietary stance on development. Imagine if Microsoft had arranged it so that only their C/C# compilers could be used to develop programs for Windows. CUDA is a closed shop, as are the graphics drivers for Nvidia's cards.
@janlanik2660 2 years ago
But MSVC can be used only on Windows.
@theairaccumulator7144 2 years ago
@@janlanik2660 Imagine using Windows, much less MSVC.
@scottfranco1962 2 years ago
@@janlanik2660 I think you misread what I said. Microsoft (or any OS maker, Apple included) could have easily made it so that only their compilers could be used on their systems - no GCC, no independent developers. That is what Nvidia has done.
@janlanik2660 2 years ago
@@scottfranco1962 OK, sorry for the misinterpretation. But even so, I have only used CUDA, which is indeed Nvidia-only. I believe there are some cross-platform solutions, e.g. OpenCL, so you don't have to use proprietary tools to run something on Nvidia - or am I wrong?
@Ethan_Simon 2 years ago
@New Moon You don't need something to be proprietary to pay your engineers to work on it.
@BaldyMacbeard 1 year ago
The secret of their success for many years was working closely with developers/customers to gain an advantage over their competitors. For instance, Nvidia would give free cards to game developers and send out evangelists to help optimize the game engines, obviously resulting in a strong developer bias towards Nvidia cards. Which is how and why they were outperforming AMD for many years. In the machine learning space, they are extremely generous in their public relations with academia, once again giving away tons of free GPUs and helping developers out. It's a fairly common tactic to try to bring students on board, so that once they graduate and go on to work at tech companies, they bring a strong bias towards software & hardware they're familiar with. In the server market, Nvidia has been collaborating closely with most manufacturers while offering their DGX systems in parallel. They also have a collaboration with IBM that solders Nvidia GPUs onto their Power8 machines, giving a ginormous boost to bandwidth between GPU and CPU, and also PS5-like storage access. And don't forget about the Jetson boards. Those things are pretty amazing for edge-computing use cases like object recognition in video and such. They dominate like they do by not trying to sell a single product, but offering tons of solutions for every single market out there.
@409raul 1 year ago
Genius move by Nvidia. Jensen Huang is the reason why Nvidia is where it is today. One of the best CEOs in the world (despite the greed LOL).
@TherconJair 1 year ago
It's quite easy when your main competitor was nearly extinguished by the anti-competitive measures of its much larger rival, Intel, and had to stay afloat somehow while bleeding money. Nvidia made so much money with gaming cards while AMD couldn't compete, due to lack of funds for R&D, that they had an extremely calm "blue ocean" to work in and could comparatively cheaply build up their de facto monopoly in the space. AMD will need to invest a lot of money to somehow break into the now very "red ocean" of the Nvidia CUDA monopoly. I don't see them able to survive in the long term against two much larger rivals, and we'll all be losers for it.
@Magnulus76 9 months ago
Yeah, Nvidia offered a lot of support. I know there are a lot of fanboys who think Nvidia must have some kind of secret sauce, but the truth is that CUDA's performance isn't necessarily any better than OpenCL's. And I say that as somebody who owns an Nvidia card. Nvidia just spent a lot on support and generated a lot of influence/hype.
@Doomlaser 2 years ago
As a game developer, I've been waiting for a video like this. Good work.
@conradwiebe7919 2 years ago
Long-time viewer and newsletter reader; love your videos. I just wanted to mention that the truncated graph @ 15:28 is a mistake, especially when you then put it next to a non-truncated graph a little later. The difference between datacenter and gaming revenue is greatly exaggerated due to this choice of graph. I feel it actually diminished your point that datacenter is rapidly catching up to gaming.
@hgbugalou 2 years ago
I would buy a shirt that says "but first, let me talk about the Asianometry newsletter".
@yadavdhakal2044 1 year ago
Nvidia didn't invent the graphics pipeline. It was invented by Silicon Graphics, or SGI. SGI developed the OpenGL language as far back as 1992. They mainly targeted the cinema and scientific visualization markets. They used to manufacture entire workstations with their own OS (IRIX) and other specialized servers. What Nvidia did was target the personal entertainment market. This made Nvidia competitive because of the decreased overall unit cost. Later, OSes such as Linux were able to run these GPUs in clusters, and here too SGI lost. SGI could easily have been like Nvidia if they had been on the right track. SGI is now reduced to a conference known as SIGGRAPH, which is mainly a research-based peer program and still contributes to computer graphics, especially through the OpenGL and Vulkan API specifications!
@lookoutforchris 8 months ago
The original GeForce card was so groundbreaking they were sued by Silicon Graphics for copying their technology. SGI won, and nVidia paid royalties to them. Everything nVidia had came from SGI 😂
@drewwollin3462 2 years ago
Very good as always. A good explanation of how graphics cards work and how they have evolved.
@Sagittarius-A-Star 2 years ago
I don't want to know how much effort it took to put all this information together. Thanks and thumbs up. P.S.: At Nvidia they are insane. Just try to find out which GPU you have and how it compares to others, or whether it's CUDA-capable... You will end up digging through lists of hundreds or thousands of cards.
@killerhurtalot 2 years ago
That's the thing though. Nvidia usually only has 6-7 actual chips that they manufacture. They don't manufacture tens or hundreds of GPUs each generation... The main difference is that, due to manufacturing defects, the GPUs are binned and have different sectors enabled. The 3090 and 3080 are actually the same chip. The 3080 just has around 15% fewer pipelines/CUs and fewer tensor cores enabled...
@Baulder13 2 years ago
This man has no quit! The amount of research he puts in and how much content has been coming out is ridiculous.
@Hobbes4ever 2 years ago
@@killerhurtalot kind of like what Intel does with their Celerons
@marksminis 2 years ago
@@killerhurtalot Yes, that is correct. A large silicon wafer is a huge investment. By testing each core, defective cores can be coded out, so you still have a working chip to sell. Throwing out a large, expensive chip just for having a few bad cores would be insane. Only a small percentage of chips coming off the huge wafer are totally perfect, and those are mostly near the center of the wafer.
@TickerSymbolYOU 1 year ago
This is literally the best breakdown on YouTube when it comes to Nvidia's dominance of the AI space. Love your work!
@409raul 1 year ago
Nice to see you here, Alex! Nvidia for the win!
@prashantmishra9985 1 year ago
@@409raul Being a fanboy of a corporation won't benefit us.
@havkacik 1 year ago
Totally agree 👍 :)
@ted_1638 2 years ago
Fantastic video! Thank you for the hard work.
@BenLJackson 2 years ago
I felt some nostalgia; good vid 👍. Deciphering all this back in the day was so much fun. Also, I love your explanation of AI and what it really is.
@DavidSoto90 2 years ago
Such a valuable video; great work as usual!
@Matlockization 1 year ago
Thank you for explaining some of the details in the beginning.
@ministryofyahushua3065 2 years ago
Love your channel, very well presented.
@NeilStainton 2 years ago
Thank you for your excellent work in condensing and analysing NVIDIA's progress.
@Meta5917 1 year ago
Great video. Keep it up, proud of you.
@BartKus 2 years ago
You do really good work, sir. Much appreciated.
@punditgi 2 years ago
First-rate information, well presented! 👍
@dipankarchatterjee8809 2 years ago
A very well researched presentation. Thank you, bro.
@harrykekgmail 2 years ago
A classic in your stream of videos!
@screwcollege8474 2 years ago
How did you post 2 months ago?
@2drealms196 2 years ago
@@screwcollege8474 Patreon members get access to his videos first. Later on he makes the videos public. Another way is through his college partnership program.
@christakimoto8425 4 months ago
This is an outstanding and informative video. Thank you so much!
@ttcc5273 1 year ago
Thank you for this video; it was informative, digestible, and I learned more than I expected to. 👍
@RandomlyDrumming 2 years ago
A small mistake, right at the beginning - the GeForce 256 hit the market in 1999, not 1996. In the mid-90s, Nvidia was, more or less, just another contender, chipping away at the market dominance of the legendary 3dfx. :)
@shoam2103 2 years ago
So theirs wasn't the first GPU? I think the PlayStation had one, albeit a very basic one...
@shoam2103 2 years ago
Okay, 5:55 clears it up a bit...
@RandomlyDrumming 2 years ago
@@shoam2103 Well, technically, it was, as it handled the entire pipeline. Interestingly, the first *programmable* graphics chip for PC was the Rendition Verite V1000 (RISC-based), released back in 1995, if I'm not mistaken. :)
@DM0407 1 year ago
Yep, I bought a RIVA TNT2 to play Asheron's Call in 1999. I guess the 256 was out at the time, but I couldn't afford it, and the TNT2 was still a massive jump in performance. Going from a choppy software renderer to "hardware accelerated" graphics was amazing at the time... The paths had textures! Who knew? I don't remember the original GeForce being that big of a deal, but I remember lusting after the GeForce 2 AGP.
@jmk1727 2 years ago
Man, your videos are always amazing... PERIOD.
@Magnulus76 9 months ago
They had neural networks being used in computer games even back in the early 90s, to a limited extent (mostly a few strategy games). The reason there's hype about neural nets now is that the raw computing power of a GPU allows companies to develop neural networks that can mimic human visual perception and pattern recognition.
@MohammadSadiqurRahman 2 years ago
Insightful. Loved the content.
@rzmonk76 2 years ago
Subscribed, really nice presentation!
@Zloi_oi 1 year ago
This is really interesting! Thanks for your work, sir!
@Socrates21stCentury 1 year ago
Nice job, very informative!!!
@richardm9934 3 months ago
Fantastic video!
@jdevoz 1 month ago
Amazing video!
@helmutzollner5496 1 year ago
Excellent overview. Thank you.
@skipsteel 2 years ago
Thanks, really well done; you made the complex simple.
@GBlunted 2 years ago
This is cool content; I liked this video! I like the explanation of low-level processes as well as the history lesson of how it all evolved to where it is today...
@LimabeanStudios 1 year ago
Just found this channel the other day and it's amazing. One thing I don't see mentioned in the comments is that Nvidia is often rated one of the best companies in the world to work at. It's a lot easier to do big things with happy employees lol
@nailsonlandim 2 years ago
Excellent video. Funny fact: I spent the day dealing with CUDA and a CV application I'm working on.
@AaronSchwarz42 2 years ago
Excellent analytics on the market diffusion of COTS //
@kokop1107 7 months ago
This is a very good and accurate explanation.
@ChristianKurzke 1 year ago
I love this; very well researched, and the correct level of technology for the average executive... who isn't a math genius. ;)
@jem4444 1 year ago
Extremely well done!
@Bianchi77 2 years ago
Nice video, thank you for sharing it :)
@shmehfleh3115 1 year ago
This video filled in a lot of gaps for me. I work with the things, and I wasn't sure how GPUs evolved into general computing devices.
@hc3d 2 years ago
Wow, amazing analysis.
@FrancisdeBriey 2 years ago
Subscribed!
@valenganev5774 1 year ago
What do you think about the Fujitsu Celsius PC? Where do you place it among other PCs? What is the future of Fujitsu?
@Palmit_ 2 years ago
Thank you, John. :)
@swlak516 2 years ago
These videos make me feel smarter than I really am. And I feel like you're one of the few YouTube content creators in the space who can do that. Thank you.
@Speed001 2 years ago
This is definitely a bit above me, with tech terms I don't care to learn.
@PlanetFrosty 2 years ago
Good presentation.
@supabass4003 2 years ago
I have spent more money on Nvidia GPUs in the last 20 years than I have on cars lol.
@mspy2989 2 years ago
Goals
@heyhoe168 1 year ago
Same. Btw, I don't have a car.
@wazaagbreak-head6039 1 year ago
I have no reason to update my ancient Corolla; it's a piece of crap, but it gets me to work each day.
@LimabeanStudios 1 year ago
I have only purchased one of each, and same lmao
@prateekpanwar646 1 year ago
@@wazaagbreak-head6039 Is it a 750 Ti / 760?
@birseyleryap 1 year ago
That popping sound @8:53 from the lips
@michaelhulcy6680 2 years ago
"Triangles. Triangles all the way down, baby." Dat was a good one. Making Duke Nukem in '96 jealous, man.
@Jensth 9 months ago
You were spot on with this one. After this came out, everyone bought NVIDIA stock like crazy.
@green5270 2 years ago
Excellent video.
@Campaigner82 2 years ago
You make such good videos! I'm intrigued by the pictures. You're doing a good job!
@nabeelhasan6593 2 years ago
I always wish there were a unified framework like CUDA for all platforms; Nvidia's absolute monopoly on deep learning really makes things hard.
@Corei14 2 years ago
OpenCL. Now, making it work as well as the others is a different question.
@joshuagoldshteyn8651 1 year ago
How does it make things really hard? Simply use an Nvidia GPU with high batch sizes, or any CPU with low batch sizes?
@emulegs5 2 years ago
Please remember to leave links to the previous video and to the first in the series; I would have clicked through to them based on your intro alone.
@markhahn0 1 year ago
Important to point out that no one really uses CUDA for AI directly - they use PyTorch or TensorFlow. That means Nvidia doesn't have any real lock on the market - alternatives are highly competitive.
@Stef3m 1 year ago
That is an important point that is too rarely brought up.
@kotokotfgcscrub 1 year ago
ML frameworks came into existence later and were built upon CUDA and cuDNN, and they are way more optimized for Nvidia even after starting to support other hardware.
@igorwilliams7469 2 years ago
Thinking about that elevator analogy a bit too much... Are there ANY elevators with level tiers midway (like bottom and top) for riders to decamp? While obviously adding complexity to a system that is almost plug-and-play, it could certainly be interesting!
@timswartz4520 1 year ago
That GeForce 256 made me very happy for a long time.
@bhavintoliabg4946 2 years ago
This one video made me respect NVIDIA's work more than any advert ever would.
@ravindertalwar553 1 year ago
Congratulations 👏 and lots of love and blessings ❤️
@cfehunter 1 year ago
"Early graphics processing broke scenes up into triangles"... they still do.
@MikkoRantalainen 1 year ago
Great documentary as usual, but the bar graphs not starting from zero around 15:30 weren't very cool. The illustration makes it appear that Gaming is more than double the Data Center revenue, but when you compare the actual numbers, 3.22 vs 2.94, you'll quickly see that the difference is actually about 9%!
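The arithmetic behind that complaint is easy to check, using the figures quoted in the comment above (3.22 vs 2.94, presumably billions of dollars per segment):

```python
# Revenue figures as quoted in the comment above.
gaming, datacenter = 3.22, 2.94

# Relative gap: how much larger gaming revenue is than data-center revenue.
gap_pct = (gaming - datacenter) / datacenter * 100
print(f"{gap_pct:.1f}%")  # prints "9.5%"
```

A truncated axis can make a ~9.5% gap look like a 2x gap, which is exactly the distortion being pointed out.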
@gregsutton2400 1 year ago
Great info.
@GhostZodick 2 years ago
Your videos always have a low-frequency pounding sound in the background. Would you mind looking into that and trying to fix it in future videos? At first I thought something was pounding in my house, but I later realized it was in your video, because I only hear it during certain parts.
@allcouto 1 year ago
You guys completely forgot DOJO!
@johnaugsburger6192 2 years ago
Thanks so much.
@zodiacfml 2 years ago
You beat me to this critique, which is the most important part of Nvidia's luck/success. I recall it took years before Nvidia finally got to CUDA support/programming. Researchers using GPUs is also the reason why AMD bought ATI. There was a whitepaper from AMD saying that computing would move its focus to graphics from then on; they were just more than a decade too early with that prediction. Another thing to note: it is the gamers/consumers that made all this possible, paying for the R&D of the graphics cards that would be used to sell products for the datacenter. Ray-tracing hardware, for example, is currently a poor feature for gaming, but it is excellent for industrial use.
@markhahn0 1 year ago
In some ways, it's remarkable how poorly AMD has done. They've never delivered anything like a sleek CPU-GPU-unified infrastructure, even though they have all the pieces in hand (and talked about things like HSA). It'll be ironic if Intel manages it with oneAPI, since for so long they were defending the CPU like a castle...
@zodiacfml 1 year ago
Agreed. Though the hardware in the latest gaming consoles was impressive when they were announced, it was just OK by the time the consoles became available. AMD also doesn't have a foothold in Arm, where Nvidia has one with the Nintendo Switch and Apple with the M1. My last two PCs are an Intel i3-8100 and, recently, an i3-12100, since I have some use for the iGPUs.
@rjl7655 2 months ago
The newsletter link doesn't work...
@IntangirVoluntaryist 2 years ago
I still have several old-gen cards: TNT cards, a Banshee, a Voodoo, a first-gen GeForce, and some early-gen ATI cards too. I also have some old Sound Blaster cards :)
@estebancastellino3284 1 year ago
I remember when the Nvidia software graphics accelerator card was the cheap option for those of us who couldn't afford a Voodoo card, the one that did come with a hardware accelerator. Voodoo was on about its fifth version by the time Nvidia put out the GeForce chip.
@etherjoe505 2 years ago
Single Instruction, Multiple Data 👍👍👍
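SIMD - the execution model the comment names - means one instruction applied across many data lanes at once, which is roughly how a GPU warp applies the same operation to many threads. A tiny pure-Python analogy (sequential under the hood, for illustration only; the function name is invented):

```python
def simd_add(lane_a, lane_b):
    """One 'instruction' (add) applied across multiple data lanes.
    A real SIMD unit would perform all lane additions in a single
    hardware step; this comprehension just models the semantics."""
    return [a + b for a, b in zip(lane_a, lane_b)]

result = simd_add([1, 2, 3, 4], [10, 20, 30, 40])  # -> [11, 22, 33, 44]
```

The key property is that every lane runs the same operation with no per-lane branching, which is what makes the hardware so cheap to scale to thousands of lanes.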
@doug184 7 months ago
AMD?
@royfpga 1 year ago
Thanks!
@19smkl91 1 year ago
6:24 I've seen people rubber-banding when stepping on, and even bugging off halfway up, usually getting hurt.
@johndvoracek1000
@johndvoracek1000 Жыл бұрын
I was wondering if you would mention Apple; then you did at the end but not in the way I anticipated. Isn't Apple's M chip a move in the same chip capabilities and architecture as Nvidia, etc.?
@davidbooth8422
@davidbooth8422 2 жыл бұрын
Hi John. I love your videos! Do you have any possible connections that might want to manufacture a much better and cheaper smoke detector than is available today? I would love to explain how easy that would be to any technical person who would listen. I am not trying to make money either, just save lives.
@minecraftdonebig
@minecraftdonebig 2 years ago
If I was in charge, all chip-process engineers and associated people would be required to wear wizard hats, because this shit is insane magic
@tonyduncan9852
@tonyduncan9852 2 years ago
Thanks for that. :)
@hunter8980
@hunter8980 2 years ago
How many polygons does an RTX 3080 process per second?
@Tigerbalm338
@Tigerbalm338 a year ago
To paraphrase a popular SNL skit: "Triangles baby! MORE TRIANGLES!"
@JohnKobylarz
@JohnKobylarz 2 years ago
Excellent video. As someone who remembers when the GeForce 256 launched, it's amazing to reflect on how far they've come and how influential their tech has been on the world. Before GPUs, PC gaming was a much different affair. Even looking at JPEGs was a somewhat intense system task before GPUs became the norm. I learned a lot from this video and enjoyed it. It helps me connect the dots regarding how AI learning works.
@villageidiot8194
@villageidiot8194 2 years ago
Go do an article on Innosilicon, a mainland Chinese GPU maker. How far behind are they? Is there any hope for a third or fourth GPU player in the market space? Will Intel Arc be the third player?
@justice929
@justice929 2 years ago
Lisa Su and Nvidia's CEO are Taiwanese. Same with TSMC.
@villageidiot8194
@villageidiot8194 2 years ago
@justice929 Don't know how Nvidia & TSMC entered the chat. I was asking about Innosilicon; they have the Fantasy One GPU graphics card. From what I can gather, their offices are in Zhuhai, Wuhan, Suzhou, Xi'an, Chengdu, Dalian, Beijing, Shanghai, Shenzhen, London, Silicon Valley, and Toronto. Note that only 3 offices are outside of China (UK, US, Canada) and 9 are in China. Their headquarters are in Wuhan, Hubei province, China.
@perforongo9078
@perforongo9078 a year ago
I think a company like Innosilicon would do well in China itself because China is so protective of homegrown companies. But if I were to bet on a third player in the GPU market I'd bet on Intel.
@final0915
@final0915 a year ago
12:35 haha I wonder what images they collected for non-hotdogs
@ylstorage7085
@ylstorage7085 a year ago
No offense... but the Y-axis @15:24... dude... that's what I call "how to make a 3 look 3 times bigger than another 3"
@adman719
@adman719 2 years ago
I think you misunderstood, or at least misrepresented, parallel computing when you compared it to bees not needing to worry about what the other bees are doing. That isn't really how parallelism works in computing. When a problem is worked through in parallel, each core or unit the work is divided among has to be synchronized and is entirely reliant on what the cores around it are doing. Time steps are carefully plotted out, with each cycle being a crucial block within which every core must produce something. In a sense there is freedom, but within the very narrow band of one clock cycle. The bees would be more akin to asynchronous computing (asynchrony): en.m.wikipedia.org/wiki/Asynchrony_(computer_programming)
@jayeshmahapatra7085
@jayeshmahapatra7085 2 years ago
This is a video made for the general public, hence the simplistic explanation. This helps a general person visualize why parallelism is faster. I think you are being overly pedantic.
@psd993
@psd993 2 years ago
The workloads that are offloaded to these GPUs, or even to their predecessors, vector processors, have a lot of parallelism at the data level. Data parallelism requires much less time-step management hassle compared to a regular workload that has been split into multiple threads on a CPU. Broadly, we can think of this as data-level vs. instruction-level parallelism; one is more easily parallelized than the other.
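The data-level parallelism described in this thread can be sketched in a few lines of Python (a toy illustration of my own, not GPU code): because the same operation is applied independently to every element, the chunks need no per-step coordination, and the result is identical no matter how the work is divided among workers.

```python
from concurrent.futures import ThreadPoolExecutor

# Data-level parallelism sketch: one operation applied independently to
# every element, loosely mirroring what a GPU does across many lanes.
def scale_and_shift(x, scale=2.0, shift=1.0):
    return x * scale + shift

data = list(range(8))

# Sequential reference result.
sequential = [scale_and_shift(x) for x in data]

# The same work split across workers. Each element is independent, so no
# synchronization between elements is needed and ordering is preserved
# by pool.map regardless of which worker handled which chunk.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(scale_and_shift, data))

assert parallel == sequential
```

Contrast this with a multi-threaded workload that shares mutable state: there, each step would need locks or barriers, which is the "time steps carefully plotted out" coordination described above.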
@taimalik1110
@taimalik1110 2 years ago
This is essentially my stock market news channel! Thank you sir, I shall invest even more in NVDA :P
@DK-ox7ze
@DK-ox7ze a year ago
Nvidia stock has taken a beating lately, just like many other tech stocks. I bought Nvidia a few months back for exactly the reasons mentioned in this video, but it's down 42% since then. Hope it recovers soon.
@HA-cy4vx
@HA-cy4vx 2 years ago
good GPU history class
@Alphahydro
@Alphahydro a year ago
That's pretty interesting, and only the tip of what we'll be able to accomplish with GPU horsepower.
@isaacamante4633
@isaacamante4633 2 years ago
At 15:31 the graphic on the left is not anchored at zero.
@Cythil
@Cythil 2 years ago
And it's not clear that it's not. Generally it's good form to indicate this.
@AO-rb9yh
@AO-rb9yh a year ago
Hmm. Pretty solid understanding of neural networks. PhD level.