Please ignore all scam comments! You can follow me on Twitter: twitter.com/johncoogan or join my discord: discord.gg/e9nKhPCNkq
@jenovaizquierdo2 жыл бұрын
I just reported a scam just now.
@Marius-vw9hp2 жыл бұрын
Any chance we might get a video on OpenGL? That would be awesome! :)
@charlesficherwcoffeebreaks75212 жыл бұрын
Shabbat Shalom. Asé.
@lb59282 жыл бұрын
Congratulations, I think you might be the biggest Nvidia fanboy on KZbin. Nice job leaving out all of Nvidia's IP theft and scandals. A GPU is something that existed years before Nvidia existed. Nvidia has a fitting name, as their destiny is to enviously strive to be exceptional but never achieve it. By your bad definition, CPUs were the first GPUs because they were the first to accelerate 3D graphics via massive parallel processing using MMX in 1996. Also, AMD CPUs/GPUs power the top 10 supercomputers in the world. How is that possible if Nvidia is as good as you say? Nice clickbait thumbnail; it seems you know Nvidia's demise is more appealing than some fanboy worshipping Nvidia in an amateur documentary.
@Soromeister2 жыл бұрын
You failed to mention 3dfx.
@vesperiadragon32212 жыл бұрын
I would say Nvidia hasn't always continued to innovate. Their history shows a pattern of anticompetitive behavior mixed with many periods of stagnation where the only thing done to improve their cards was to add power draw and other minor things. They stifled innovation for a few years because they had no competition. Not quite at the level that Intel did, but sadly a good amount. They are the kings of graphics, for sure. Jensen has done incredible things, but I can't in good faith applaud the company too much in its current state.
@ajcano16272 жыл бұрын
Right. It's been so long and they're still stagnant at single-digit gigahertz speeds. Japan just set the fastest internet speed record, with 1.02 petabits travelling over 51.499 kilometres each second. Soon, 127,500 GB of data could be sent every second. The best part about this achievement is that the technology can be used immediately. A petabit internet speed needs an equivalent chip for fully optimal usage and faster technological advancement. With that, realistic virtual reality games will probably be possible: a game world that looks like The Matrix, or like the VRMMORPGs of anime, where a game feels real.
@ThePulsarRED2 жыл бұрын
Go read about DLSS, CUDA, Omniverse, and all the research papers published by Nvidia. Compare that with Intel and AMD and see which company innovates more. The truth is that the Intel and AMD duopoly has stifled innovation in the CPU space.
@PorkeyPineOnline2 жыл бұрын
@@ajcano1627 This comment reads like it was written by a bot. Why are you accusing them of stagnation because their chips run at "single-digit gigahertz speeds"? CPUs learned this lesson the hard way almost 20 years ago, and everyone knows GPU performance typically scales with execution units, not clock speed. I thought everyone was mostly on the same page for the surface-level stuff?
@ajcano16272 жыл бұрын
@@PorkeyPineOnline Everything needs to be processed fast, just like how our brains process data, especially now that they're in a race to build a perfect virtual reality world. I googled it yesterday: Intel this year created the fastest chip, at more than 5 GHz.
@adi62932 жыл бұрын
@@ThePulsarRED Go and read how Intel tried to bankrupt AMD and you will see why AMD wasn't in the best shape to compete with either company for a long time. Even now AMD is still much smaller than Nvidia, never mind Intel 😜
@Maceman19902 жыл бұрын
What I miss in these videos is a deeper look into the company. Getting a good team together is half the battle. Reducing choices made to 'the founder had this great idea' misses out on the power of good advice and high quality management.
@ken-adams2 жыл бұрын
It's always a matter of risk vs. reward. Great leaders get things done! When he could have kept a good, steady income as an employee at AMD or LSI Logic, he chose to give up stability and put all his money, experience, and reputation into developing a new industry altogether. That gave rise to many more brilliant things, like the rise of machine learning and how it helped the whole world make a COVID vaccine so impossibly quickly with so few downsides. Being able to impact the world on that level is beyond comprehension. Every visionary innovation, from electricity, the battery, the transistor, the microprocessor, the GPU and CUDA cores, to neural networks, deserves full credit, and the visionary leaders who set out to turn these fantasies into reality deserve more than an employee can fathom!
@JohnCooganPlus2 жыл бұрын
100% agree with you. It's hard to highlight every different angle in every video, so I like to hone in on one key aspect. If you want to see a video about the power of high quality management, you need to watch this: kzbin.info/www/bejne/gWLbiJSrlpKhgqs
@privateerburrows2 жыл бұрын
@@JohnCooganPlus What I would have liked a bit more of is technical comparison between what they were doing versus what everybody else was doing. I have some inklings as to what was going on. There was, of course, the invincible juggernaut of the time: 3DFX. We (consumers) all thought the Rivas had no chance in hell against the 3DFX Voodoo. But there was also another company whose hardware didn't use polygons, only planes, whose intersections were deduced on the fly. They lost the race by cheating; I remember their driver detected a benchmark running and made special adjustments. It came out, and overnight they went bankrupt. There was also the Pyramid chip project, which was going to be optics/physics based, but no prototypes. And a whole bunch of 3D accelerator cards for those who could not afford a Voodoo Monster3D. On the software side there was the competition between Direct3D and OpenGL, but there were other odd players, like the voxel engines. Then again, the limit back then was the CPU being able to transform to the view frustum; but then AMD came out with 3DNow! (a floating point extension to MMX), and Microsoft supported it, and suddenly everybody was buying AMDs again, and the ball was back in the graphics court. When the Riva 128 came out nobody cared initially; then I read that it had 128 full floating point math engines running in parallel, and began to take some interest... And then today we have AMD/ATI, which it seems to me are consistently beating Nvidia at their own innovation game. But that's just my superficial overview; I don't really know what exactly made the Nvidia Riva 128 triumph over 3DFX, for example. Perhaps 3DFX was more "monolithic", executing a fixed pipeline, less flexible? Just guessing. I remember at the time Microsoft had this evangelist guy, forgot his name, who went around the world getting companies on board with the Direct3D dream. The guy was truly passionate about it, and he was looking for configurability and frank dialogue with hardware above all else. He hated the fact that some of the hardware of his day would tell the OS it could do A, B and C, but then was so slow at doing C that software would have done it faster. (EDIT: Remembered the name: Alex St. John.)
@fearisthemind-killer2 жыл бұрын
@@privateerburrows I remember when 3dfx announced bankruptcy, and soon after it was announced that Nvidia was buying up some of 3dfx. I think I read that in MaximumPC at the time. My crew couldn't believe it. We knew Nvidia was coming on the scene, but we all had 3dfx cards and believed that 3dfx still dominated. Bankruptcy seemed to come out of left field.
@privateerburrows2 жыл бұрын
@@fearisthemind-killer LOL, you're so right; it was totally overnight, boom!, gone! And I remember a revolutionary audio chip at the time, a 3D positional audio chip that was giving the Sound Blaster type products a hell of a run. It also had some hundred or so convolution engines working in parallel to model sound reflections based on game geometry. They had been adopted by the Monster brand, the Monster 3D Sound cards. They lost the race by trying to take the Monster profit and sell cards directly. Monster cut them off, and their cards went nowhere. Pity; the technology was very promising. Another thing back in those days was a company that tried to come up with an odor producer for more immersive games; I think they were planning some 256 basic odorants mixable under software control. Quite a decade, those '90s. Yeah, MaximumPC was my favorite mag, on a level with Scientific American on the computers side of things. :-)
@masteringcode94922 жыл бұрын
This is another level of storytelling, thank you very much. I just found your channel today and I've watched so many videos and acquired tons of knowledge.
@chinothechigga27352 жыл бұрын
Same! Been on quite the little binge on this dude's videos tonight, although I should be going to sleep lolol I'm stuck in the comments.. I need help 😅
@pubdoart80527 ай бұрын
I agree. The storytelling on this channel is very good.
@johaansmagic17462 жыл бұрын
Dude, the second I receive a notification that he posted a video, I don’t hesitate
@leeargent58 Жыл бұрын
You know, we'd do just fine, great in fact, if the rest of humanity was this quick and eager to learn and become educated. To be educated is to be human.
@DeveloperTrance2 жыл бұрын
So much research this dude did. So much respect!
@kairosmoments82842 жыл бұрын
Hats off honestly
@bharold2 жыл бұрын
A small correction and clarification regarding "Microsoft called it Direct3D, we know it as DirectX today." It's always been DirectX (ignoring the short-lived Windows Games SDK name), and it's always been Direct3D. DirectX is the umbrella name for the whole collection of Direct* APIs/components (Direct2D, DirectDraw, etc.). Direct3D is specifically the 3D component, and the one most recognized by the general public.
@robbdudeson3469 ай бұрын
It was DirectDraw first... I'm probably older tho; most people don't even understand the evolution of display technology AT ALL... Once upon a time, gaming in 3 colors was hella cool... They weren't "good" colors either.
@xorenpetrosyan28792 жыл бұрын
this video has so much research behind it and John really knows what he's talking about. Thanks for the bomb content, John!
@johnvu26902 жыл бұрын
Love the in-depth research you obviously put into all your videos, John!!! Awesome editing, and you have the perfect voice for narrating videos. You're easy to listen to and watch, with different angles of information every few seconds. I've never gotten bored watching one of your videos from beginning to end. I hope your channel blows up. Keep up the great work!!! Happy Thanksgiving!!!
@matthewsheeran9 ай бұрын
There was a time when their chips' substrate/packaging was cracking and faulty, at least in laptops and for at least a generation: this should have been mentioned. I stuck with Intel/AMD on-CPU GPUs thereafter.
@CattleRustlerOCN2 жыл бұрын
I'm surprised there was no mention of 3dfx's discrete 3D GPU technologies, including SLI, in this video. It was a big step for Nvidia when they absorbed 3dfx. I'm pretty sure 3dfx was the first to manufacture discrete 3D GPU cards for the mass market and were absorbed by Nvidia just prior to the GeForce 256 coming out.
@hakanyucel16392 жыл бұрын
Yeah like I wrote 5 days ago and no answer on it 🤔
@mVic82 жыл бұрын
It seems like the video is focused on just Nvidia. He only references other companies, like AMD, because Jensen worked there. Although I think it would've been good for him to mention 3DFX, I feel like it's a slippery slope into potential tangents. In short, it seems like he stays 100% on course to keep the video length down.
@hakanyucel16392 жыл бұрын
@@mVic8 Yeah, but 3dfx was the most important thing that happened in gaming.
@Modenut2 жыл бұрын
Was wondering about that as well
@gotfan77432 жыл бұрын
It would have been nice if you had included why Apple severed its relationship with Nvidia in 2009. MacBooks and iMacs don't come with Nvidia GPUs. They all have AMD graphics.
@cubancigarman26872 жыл бұрын
Jensen is quite an aggressive businessman. Apple also believes that, as an entity, it is larger than Nvidia. I am an AMD guy, but you have to give Jensen credit for being David fighting all the Goliaths. These guys are all giants in their own right, and it's hard to know where to put our money down. I suppose the best-case scenario for us users would be competition from as many manufacturers as possible.
@djsaekrakem36082 жыл бұрын
:) Of course, I've got AMD and Intel CPUs, same with GPUs.
@MrHeHim Жыл бұрын
Nvidia is very keen to keep their tech proprietary, going out of their way to engineer it so that it runs much worse on competitors' hardware, or not at all. E.g. tessellation: it was redesigned before release to run worse on ATI hardware even though it also made it run worse on Nvidia's, so long as it ran even worse on ATI. Then it was over-implemented in games (Far Cry) to make ATI cards run even worse. They even covertly bought benchmarking companies to rig benchmarks. Now we see the same with ray tracing and DLSS. Older versions of DLSS have been shown to run much more efficiently when "unlocked", meaning if they didn't go through the trouble of locking it down to Nvidia cards it would run even faster. And ray tracing, well, you can make of that what you will. Intel is no better, and AMD has sued both and won billions. That was the actual seed money they used to kick-start Ryzen.
@jasonmorgan410811 ай бұрын
Thanks for the informative video. I appreciate that you didn't include any in-video ads!
@Glowbox3D2 жыл бұрын
Positives: this video was really well produced. John, you talk great! Very excited about the future of games, 3D and computation. Negatives: Ooooo the clickbait is real. What's the secret? The number of 'professional' YT channels that straight up lie to get clicks is astronomical. Also, YT ads are getting bad.
@bugejo0502 жыл бұрын
This video is incredible! Your storytelling and insights are phenomenal. Thank you!
@RemiStardust2 жыл бұрын
The production quality is next level! The music choice and all that video footage - excellent!
@juliuseller24862 жыл бұрын
It has such a great vibe also. After watching, I feel really motivated.
@andymetzen2 жыл бұрын
1:06 It's of Taiwanese origin, not Chinese origin. As a Taiwanese person, I hate being called Chinese; be respectful. You wouldn't use "Chinese origin" to describe a Japanese or Korean person even though their ancestors also came from China.
@durexuncensored2 жыл бұрын
Poor guy, can't correctly identify yourself.
@AftermathXJ2202 жыл бұрын
That's like calling a Ukrainian "Russian".
@jimmylee22112 жыл бұрын
Man, this video deserves 10 likes... very well researched and presented... Keep up the good work, top
@Daniel_WR_Hart2 жыл бұрын
only 10?
@flogjam2 жыл бұрын
The board-partner ODMs don't get enough of a cut. EVGA has just left the partnership... keep an eye out for details on the aggressive practices behind this. Choosing Samsung to fab the RTX 30 series chips was purely based on price. If TSMC had manufactured those chips on their process, they would have been more efficient, cooler, and potentially faster (the Ampere GA100 was made by TSMC, as an example). They could also use HBM2 on the chip to further increase performance, but they don't for the consumer cards. Another point: the GTX 970 RAM sham (I sent my pair back for a refund), and also selling to crypto miners directly... Yet I still buy their products. Still, it's best to know who you are dealing with; the sun does not shine out of their backside.
@Mark-kt5mh11 ай бұрын
The cost-benefit tradeoff of HBM doesn't make it favorable for consumer applications.
@johnvannewhouse2 жыл бұрын
Subscribed a while back....but just now rediscovering the brilliance of your analysis. Damn.... you are good!! And your passion + logic speaks VOLUMES about the quality of your perspective.....keep it up, brotha!
@JohnCooganPlus2 жыл бұрын
thanks so much! i love hearing that. hopefully just getting a little bit better each week!
@jdobdob894711 ай бұрын
The need for high computing power is hard to imagine; however, when someone shows what can be done with it, others will want to do the same. That is a powerful demand-driving effect.
@henrryfermin50332 жыл бұрын
"if your not looking for ways to innovate and expand your mission, your falling behind" I am applying this to my business today!
@Brakiri2 жыл бұрын
It is interesting how Jensen is glorified in this video, while ignoring the takeover of 3dfx that boosted Nvidia's knowledge quite a bit, ignoring the awful way Jensen has treated AIBs, the misleading of customers with the naming scheme of the 10 and 40 series, and the price gouging Nvidia has been doing and is continuing with the 40 series. A 4070-level card for 900 bucks. But I think this video is more for investors than gamers / tech-interested people, just showing how Nvidia made them so much money. A more differentiated view would have been more honest.
@hkgamma2 жыл бұрын
Great video, but I'm a little sad that you never mentioned 3dfx. It was huge back then and it was Nvidia's biggest rival.
@MannibalLector2 жыл бұрын
Those old Voodoo cards were top dog back in the day
@aninditabasak76942 жыл бұрын
And then 3dfx was acquired by Nvidia.
@rcavicchijr Жыл бұрын
Actually, universities like Stanford started their GPU computing projects with ATI because they were going for open source. Unfortunately, it was right around the time that AMD bought them, and AMD was in trouble, so it was one of the programs AMD cut, leaving the door wide open for CUDA to take off with AI.
@kevinkellogg6874 Жыл бұрын
I find your content to be captivating and highly informative. Thank you for the quality content.
@googlyeyedmoose64352 жыл бұрын
Hey John, I really enjoy your videos and appreciate you making them. Question: I'm curious what your thoughts are on metamaterials enabling faster chips without the use of rare earth materials? It would be really interesting to see what the future holds with it. Will metamaterials take over the current way we use chips?
@dantousuo13022 жыл бұрын
This is over most people's heads.
@scroopynooperz90512 жыл бұрын
A 9-year-old Asian immigrant kid helping a 17-year-old with math... yeah, it tracks 😂
@slipknotbalushi71352 жыл бұрын
😏
@baked9212 жыл бұрын
John, I stumbled upon this and I have to say your way of telling this story is amazing and had me interested from start to finish. You're truly a very talented, clear and articulate storyteller. Keep up the amazing work! New sub here...
@narcisjr Жыл бұрын
love your videos man..keep up the good work
@robbdudeson3469 ай бұрын
The first really great 3D card was the 3dfx Voodoo 2... If you don't know then you don't know; the absolutely exponential boost in graphics was AMAZING, and when 3dfx kinda started to fall behind, Nvidia bought it, and that is when they started to really make some great cards. AMD bought ATI not long after that...
@raijin77072 жыл бұрын
I like how he's showing games that were actually running on AMD chipsets exclusively 😂
@gavdev122 жыл бұрын
Another incredible video John, thank you
@CSW762 жыл бұрын
What a great, well-researched video, really well done. I know nothing about this topic but was so captivated I had to watch to the end.
@edmundkudzayi75712 жыл бұрын
Thoroughly enjoyed the fast and engaging delivery. Well done.
@clavo335211 ай бұрын
Congrats on a great, informative video! Many prior compliments on your work here. Well deserved and well done!
@jeffcarbello9115 Жыл бұрын
This was so fun to watch. You are an exciting, skilled narrator and IMO as good as, if not better than, some of the biggest names.
@Line49Design Жыл бұрын
He IS one of the biggest names, so it's all good
@JoaoLopes-ps9zk2 жыл бұрын
What an amazing youtube channel! Incredible work! Thanks from Portugal
@redringer3142 жыл бұрын
When computer stuff is being discussed it seems like the writers have no idea what they are talking about...
@SuperSkandale2 жыл бұрын
Brings back the memories. Being a kid and getting your hands on a GeForce card was a feeling that is hard to replicate. Technology was fascinating and interesting back then. I had a GeForce 1 and it turned out it had problems rendering 8-bit textures. Had to return it and got back a GeForce 2 because the original GeForce series was discontinued. Happy happy kid :)
@berniel17992 жыл бұрын
Truly fascinating, but I was not an adopter. ATI was more my speed, as competition is good and Nvidia does not believe in competition.
@shreechatane92152 жыл бұрын
This channel is a gold mine, very underrated content.
@AftermathXJ2202 жыл бұрын
This video paints Nvidia as a hero, but at one point mentions Nvidia & Apple were/are similar, and I couldn't agree more. Both are greedy and charge you as a consumer as much as they can get away with. Nothing was mentioned about GPP or how they made those billions in 2021 and early 2022.
@mramd90622 жыл бұрын
What I miss in this video is how Huang made all the new tech proprietary, and that without that proprietary software NVIDIA would have been pushed out of gaming years ago. Let's be very clear: Huang will not rest before his products have 100% margins, and gaming cards will NEVER get that for him. That is why he is putting more effort into AI and server GPUs, where he can ask whatever he wants... until Radeon gets really strong. Huang will leave gaming behind in 2 to 3 years; margins aren't going up as fast as he likes.
@QuaaludeCharlie2 жыл бұрын
Hey John, I subbed, liked and shared... Thank you... That was a great presentation of a really great story in computer history :) QC
@andrewwickham46428 ай бұрын
Thanks, John, for the most informative video. I went from knowing next to nothing of NVIDIA's history to your fantastic deep dive. Congrats!! Cheers, Andrew
@MerStudiosYT2 жыл бұрын
Bruh... Your videos are 🔥. Keep it up!
@marqusee2 жыл бұрын
Huge fan! love your work!
@fredrickdenga75522 жыл бұрын
This has been the most comprehensive deep dive Coogan💥💥🚀🚀🚀
@braamies53392 жыл бұрын
One cannot turn risks into opportunities. Risks persist in every endeavor. One can only mitigate and reduce risks. Of course, not knowing anything, one could increase the risk in anything.
@helmutkrahn93372 жыл бұрын
Great "story telling" - love your videos; glad I stumbled across you.
@Oceaneoc10255 ай бұрын
Who needs school now? Your videos are top tier. Thank you!
@matthewlasalvia70262 жыл бұрын
Always hated how Nvidia overprices their products. They're the reason PC gaming is such an expensive hobby. Thanks for this video.
@woolfel Жыл бұрын
One of the first applications to use CUDA was Folding@home. Sadly, Nvidia is too greedy today and is abusing its position.
@Marius-vw9hp2 жыл бұрын
Amazing video! I would love to see a video on the history of OpenGL and one on DirectX - there are none and it would be awesome since these are so important. Thanks! :)
@ilovelimpfries2 жыл бұрын
I guess we're just gonna gloss over 3dfx huh?
@olamidetboy18652 жыл бұрын
I’m always very impressed, educated and entertained by your videos
@bldos53622 жыл бұрын
I'm watching this 2 months after release and Nvidia's stock price is now taking a brutal beating, as are all tech stocks. Maybe it's time to buy?
@geekinasuit83332 жыл бұрын
There should have been a mention of ATI Technologies, which was founded 7 years before Nvidia and for a while had the best GPUs for the PC market. AMD purchased ATI in 2006 but fumbled the ball for almost a decade, and at one point the company looked like it could go bankrupt. For a brief time 3DFX was king of the hill with their GPUs, until the company failed and closed up shop. A lot could have gone wrong for Nvidia; sometimes it's pure luck that determines which company succeeds and which one fails. AMD may finally begin outperforming Nvidia in the GPU sector given how bad the RTX 4000 series looks and the oversupply problem due to overselling into the crypto-mining market, which has collapsed.
@mycelia_ow Жыл бұрын
I'm actually saving up for a high-end RDNA3 card now. If I were to get the same performance with Nvidia, I'd be spending 50% more. Though Nvidia still leads in AI and its integration into their cards, they also have better engineers and are still the bigger company. Makes it tough to choose between the two.
@artificial_digression2 жыл бұрын
Bro, this video is so well developed, thanks for telling us the story.
@ADEL-fz9qm2 жыл бұрын
Finally, a new video from one of my favorite channels.
@fallencheeto47622 жыл бұрын
This video was amazing as always. Learned a ton about Nvidia and the GPU.
@sam_hookjoy Жыл бұрын
You have an awesome channel, I like the way you deliver such useful information. Thanks
@paullangton-rogers23902 жыл бұрын
@John Coogan: I love your documentaries about key industry individuals and startups that became mega successful. One story long since forgotten in the history of computing involves a little-known British company called Apricot Computers. There was a time in the early 80's, the dawn of the personal computer, when Apricot was bigger and arguably better than Apple for PCs. Apricot was basically the British Apple. They manufactured computers for businesses, not consumers. However, by 1983-84 they had produced the world's first portable computer aimed at home-office users, which had radical innovations for the time such as the world's first hard disk drive in a PC (10MB!), a wireless keyboard (long before Bluetooth existed) and even voice recognition technology for word processing. Apricot was also the world's first computer manufacturer to adopt not just 3.5" disks but double-density 3.5" floppy disk drives, doubling the storage capacity from 720KB to 1.4MB. They had more RAM and CPU processing power than anything else on the market too. Apricot was for a time a computer manufacturing pioneer and market leader in the UK, and dominated the marketplace for a short while with its own proprietary PC platform, which even had a custom version of MS-DOS produced by Microsoft just for Apricots, with a semi-GUI and a mouse pointer cursor. The Apricot Xi was one of the first early PCs, launched around 1981/82... it was a direct rival to both the Apple I and Apple Mac, but was aimed at a different market, the business sector. It was technologically superior in some ways, but lacked a colour screen. What Apricots lacked in terms of colour functionality they certainly made up for in software. Apricot was in the hardware and software business. They made software bundle add-ons like spreadsheets (even before Lotus), word processing (long before MS Word), and so on. Although proprietary hardware, Apricots didn't use a closed architecture like Apple, making it easy to connect a mouse, modem and early dot-matrix printers to them, so they were highly versatile PCs. Once IBM-compatible PCs began flooding the market by the mid to late 1980's, Apricot began to rapidly lose market share and its PCs started to look overpriced... it was unable to scale up the business and eventually folded. I remember at the time, before Apricot's decline, it was common in the UK to walk into any bank or large organisation (public or private sector) and see Apricot PCs everywhere on desks. I have a collection of early computers and amongst them are two ground-breaking Apricots: the second-generation Apricot Xi PC and the world's first portable PC, the Apricot Portable. They are still lovely machines to use even today and very aesthetically pleasing to look at, with incredibly high quality finishes. I don't know much about the history of Apricot and how a British company in the early 1980's came to dominate the PC market for a short while. I'd love to know the full history of who started the company etc.
@killerb4569 ай бұрын
This was one of the most interesting videos I have ever seen on your channel. ❤🔥
@عبدالقادرعبدالرحمنعبدالله Жыл бұрын
A powerful, inspiring and informative way of storytelling.
@rrr00bb12 жыл бұрын
It drives me completely insane that there isn't an NVIDIA Linux setup that just works out of the box with their cards for machine learning. The card setup keeps accelerated training from being reproducible.
@huonglarne2 жыл бұрын
How about using Docker? There are many Linux images that come preinstalled with CUDA.
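(Editor's aside: for readers following this thread, here is a minimal sketch of how one might verify that a CUDA-enabled container or install actually exposes the GPU to PyTorch. It assumes PyTorch is installed; it is only an illustrative sanity check, not something from the video or the comments.)

```python
import torch

# True only if PyTorch can see both the CUDA runtime and a compatible driver.
print("CUDA available:", torch.cuda.is_available())

if torch.cuda.is_available():
    # Report which GPU will actually run the work.
    print("Device:", torch.cuda.get_device_name(0))

    # Tiny smoke test: run an op on the GPU and bring the result back to the CPU.
    x = torch.ones(3, device="cuda")
    print((x * 2).cpu())
```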
@rrr00bb12 жыл бұрын
@@huonglarne This is a good illustration of the problem: you want something to just work, from PyTorch, from TensorFlow in Python, and from Julia. We had interns at our office training algorithms, but without acceleration they could not afford to do more than one shot per 8-hour day. I managed to take their code and get it accelerated to 10x faster. It was easier to just bring in my personal laptop and give it to them for the summer. I asked for a System76 laptop for this reason, and even then, it's not all set up correctly out of the box. NVIDIA and Linux are such a total shitshow, even in Python. Moving to Windows isn't really a solution, because the whole dev environment around the model is best done in Linux, and the deployment environment surely is Linux.
@Slavolko2 жыл бұрын
Can you give a basic rundown of what the problem is? I know one DL developer and he only uses Nvidia cards with Linux. I never asked if it was difficult to set up, but it's clearly the only option for him, considering AMD doesn't support CUDA natively and doesn't have tensor cores.
@rrr00bb12 жыл бұрын
@@Slavolko The reasons vary wildly between Linux versions, even just minor versions of Ubuntu. NVIDIA has a small list of exact versions of Linux that work. And it's never the version that YOU need to get actual work done, because other stuff on your machine needs specific versions of libraries too. I had two totally different hardware machines that I had set up to accelerate PyTorch. On one machine, I could not boot into Ubuntu to even INSTALL it without a firmware upgrade, and then kernel parameters, and it took a week to get accelerated PyTorch (about a 10x speed-up on ML tasks!). I watched interns wasting all day running unaccelerated models, and just gave them my machine for the summer; so they spent only 1 hour waiting for jobs rather than all day waiting for one. Based on this, my boss bought 3 very high-end machines with 2 graphics cards each ($30k total, I think). The setup was STILL different from what I got working. They could not upgrade anything for fear of breaking the acceleration. Much of the problem is that even with a virgin system running the EXACT hardware, the instructions are ridiculously long and hard to follow, even when they happen to be correct. They are usually no longer correct by the time you are reading them. If you build Linux containers against latest, you will notice that every few months the image is broken for a day or so... usually over a weekend. So you end up having to pin to specific versions of everything, and you cannot satisfy everyone; and somebody gets a version that's not what the doc literally says. I was at version X of Ubuntu, did a distro upgrade, and found myself booting into a totally black screen. I managed to get in with a rescue distro to save my home dir. I just installed the newer Ubuntu version Y over it and moved on. A month later, I also had to move up a version on a totally different machine with a different GPU, etc. It TOO booted into a black screen. I rescued it the same way. A co-worker told me to use PopOS. More of it worked out of the box. But I need to get work done, so other things needed to be upgraded as normal. Eventually, acceleration stops working. I have a third machine now, running a PopOS that is unsupported because I never updated, initially so as not to break the acceleration. I am moving back to regular Ubuntu, because too much stuff gets confused by it being not-really-Ubuntu but claims-to-be-Ubuntu. I got a System76 Gazelle for my laptop upgrade at work (PopOS), and all the acceleration wasn't even set up by default, even with the expensive GPU I had in it. So I followed instructions to get it accelerated. I think I got it working with PyTorch but not TensorFlow. On another machine, I got Julia accelerated with some library whose name I can't remember. Eventually, I reach an unsupported state where I can't install something I need to get work done, e.g. the Go compiler griping about a lib version, Docker wants something, OpenSCAD... so I upgrade. And the fucking acceleration for PyTorch/TensorFlow was eventually broken, and I don't know why. The specifics don't even matter, because your experience will be EXACTLY like this, but completely different in all the details. Fortunately, the ML stuff was more of a hobby. I eventually stopped trying to work with accelerated code. You kind of need a machine that has a hardware setup that is tested by NVIDIA, with a distro that is ALWAYS tested so that it doesn't eventually break.
And it needs to be Ubuntu, because every time you need to install something, you are running some bash script that assumes Ubuntu. Python setups are a madhouse that is much of the problem. This isn't like normal libraries where you just write code assuming the drivers are there, because they're always there... reliably. ML progress will really only take off when some random kid can put code on GitHub, and you can run it... and you KNOW that you both ran accelerated code. I think it's not going to be NVIDIA when this happens. And probably not Python either. Maybe a language that takes data parallelism really seriously, first-class... rather than crazy strap-ons like NumPy in wild environments, where the whole environment can be made exact on dependencies and trivial to reproduce on random computers. Imagine if your screen just went black with your code on different computers, or the keyboard didn't work. It's that kind of shitshow.
@Slavolko2 жыл бұрын
@@rrr00bb1 Thanks for that explanation. Sounds like a nightmare. I wouldn't have predicted it to be this bad at all. I don't run Linux personally, so the specifics of why that stuff is happening are a little out of my realm, but I understand enough to know that it's a PIA. One of the last things you mentioned was ML not taking off until a random kid can run code taken from GitHub. The thing is that, at least on Windows, we've been there for a while. My problem is that I have an AMD card that runs compute decently, but most of this ML software is designed by devs with Nvidia cards. Really annoying. Random kids with Nvidia cards can use GitHub instructions to get Stable Diffusion running locally, but I have to deal with an ONNX interpreter layer, which isn't straightforward for me. So I have to grab someone's model that's already set up with ONNX and DirectML, or I can follow the GitHub Gist instructions from an actual AMD software engineer, who is probably doing it in his spare time. So from my perspective, Nvidia will remain at the helm of wherever ML goes, but the language can change, sure. But also... based on what you wrote, it sounds like AMD cards would probably be more consistent across installations, so perhaps that's worth it to those that can get their code working with OpenCL or ROCm/HIP.
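(Editor's aside: as a rough illustration of the ONNX/DirectML route this commenter describes, here is a minimal sketch assuming the onnxruntime-directml package is installed; "model.onnx" is a placeholder for whatever exported model is being run, not anything from the original comment.)

```python
import numpy as np
import onnxruntime as ort

# "model.onnx" is a placeholder path for an exported ONNX model.
session = ort.InferenceSession(
    "model.onnx",
    # Prefer DirectML (usable on AMD GPUs under Windows), fall back to CPU.
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)

# Build a dummy input that matches the model's first declared input.
inp = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]  # replace symbolic dims with 1
dummy = np.zeros(shape, dtype=np.float32)

outputs = session.run(None, {inp.name: dummy})
print([o.shape for o in outputs])
```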
@andyc99022 жыл бұрын
Jason needs to put his bet on Assembly language!
@donrosalioroasters2 жыл бұрын
Great research; however, reviewing the 10-Ks for the past 6 years, a huge portion of the revenues generated were from mining. Their hope is that AI and data will take them to the next level. Much of those revenues will be lost over time, as well as the losses written off for posting falsely over-inflated gains.
@MarkZX14R2 жыл бұрын
Good video - the old Commodore Amiga used specialist chips for graphics, sound, maths co-processing, etc.
@Hito3432 жыл бұрын
That's why they doubled GPU prices, from "wanting to tell a story" to pure greed.
@cmw37372 жыл бұрын
Another great, informative video, if not comprehensive. George Hotz had a lot to say about Nvidia's self-driving-car ambitions, and other than the brief cameo of Bitcoin appearing in one of your stock footage shots, there's no mention of crypto mining's role in Nvidia's success of being there when new applications came along.
@timeTegus2 жыл бұрын
A proprietary system in the software stack is nothing to be proud of.
@lilyflowerangel2 жыл бұрын
How do you make money then? It's a business. Not a charity.
@timeTegus2 жыл бұрын
@@lilyflowerangel Nvidia uses Linux, so they use open source software. And they are not paying for it.
@dukepham41912 жыл бұрын
this was a really good video, very inspiring!
@LionTree2 жыл бұрын
So good I watched it twice!! Thanks for the high-quality content John.
@seanc6754 Жыл бұрын
Unbridled greed is Nvidia's #1 business practice now..
@segercliffhanger Жыл бұрын
John Coogan, you're good. I'm hooked. Thanks. Keep it coming.
@thepropheticpromise2 жыл бұрын
Wow man, great video. I'm glad to be a subscriber.
@frvo2 жыл бұрын
Amazing documentary! 👏🏼
@timeflex2 жыл бұрын
Talking about the dawn of 3D graphics without saying anything about 3dfx? Btw, Nvidia is not just about GPUs, but CPUs as well. Remember the Tegra ARM CPU and its market failure? P.S. "Crypto miners buying GeForce cards like crazy"? Nope, never heard of them.
@nogrammer2 жыл бұрын
They are an AI company now
@ken-adams2 жыл бұрын
Thank you for making this video! The world should recognise and reward the efforts and contributions of such people more! The world desperately needs more brilliant visionaries who can outperform the people of the past and lead the present into an incredible future!
@pirateradioFPV2 жыл бұрын
To understand Nvidia, you have to understand their desire to party like it's still crypto.
@jadtamimi3954 Жыл бұрын
Your videos are really addictive, and since it's 3:30 am right now, I'm going to say they are addictive in a bad way.
@cryptotok422 жыл бұрын
Keep going, bro ❤️ from Somalia 🇸🇴
@BrentLeBlancCG2 жыл бұрын
John, I can hear your kid crying in the background at 08:26. I understand the struggle when recording videos! Hahaha
@asr842 жыл бұрын
Indeed, I thought it was my own 1 y/o daughter for 1 second lol
@illectricsheep2 жыл бұрын
Yeah! Incredibly engaging and insightful video. Respect!
@passionfly12 жыл бұрын
I have so much respect for a person who does so much research and then presents the information in such an elegantly simple and easy-to-understand way like you did. CUDAs to you! (See what I did there?) 😁😁😁
@zeyad452 жыл бұрын
Phenomenal analysis! Huge bets on NVIDIA since 2019.
@lil----lil2 жыл бұрын
Amazing work. Much, much appreciated. Maybe one day the AI will be able to do the storytelling for you by looking at your own videos 1000x and just reading off the new script. NOW wouldn't that be something!?
@27forlife2 жыл бұрын
I love your entrepreneurial videos, but please add some listenable soundtracks; they make your videos more immersive and cool.
@chilam12 жыл бұрын
Great video! Keep it up 👍🏽
@brothatwasepic2 жыл бұрын
That's it, all my meetings will be at Denny's going forward.
@RagdollRocket2 жыл бұрын
great video John, thank you!
@DavidHoskins Жыл бұрын
Nice to see Tom Holland is keeping himself busy between Spider-Man movies.
@FaizalKuntz2 жыл бұрын
Nvidia's revenue is thanks to miners, not from selling their GPUs to gamers... I'm glad that since the crypto market is going down, graphics card prices are also going down, but over-relying on one company is really bad for consumers, and I hope Nvidia gets some competition in the GPU market.
@danzwku2 жыл бұрын
Did you know he's related to the CEO of AMD, Lisa Su? Lisa Su's own grandfather is actually Jen-Hsun Huang's uncle.
@slimal12 жыл бұрын
Halfway in... I'm still waiting for this secret
@QwetzxlV22 жыл бұрын
Your videos are so good; it's a pity YouTube doesn't promote them more ):
@nts00117 ай бұрын
Very impressive, thanks mate !!!
@alhdlakhfdqw2 жыл бұрын
Really interesting and amazing videos, thank you very much! :)
@genshikenguy2 жыл бұрын
Nvidia never invented a single thing, ever. They literally took tech that was years to decades old, changed the name, and literally hacked corners off, like turning rectangular quad graphics into triangles. The triangle is a single line called a hypotenuse; by doing this they saved coordinates and memory by never using any other shape. Then they used a simple point system for a line for all their shapes, which, being two rows of data, let them use a cheap type of compression on their memory called memory clamping. So they used cheap calculators with cash-register database software and banking commerce software repurposed to run on the cheapest calculator chips, because tech companies advertised that their calculators could run commerce software, being as good at maths as the maths units in the cheapest of computers, which back then were kind of expensive. So Nvidia went all in on the cheapest, fakest everything. So when it said 6GB RAM on the box, you were probably getting 1-4GB RAM. If it said RTX 3090, it is trillions and trillions of times worse than a 5700 XT, just based on bit depth alone. Yes, fake bit depth is a thing; you can use some software tricks. CUDA is software invented to pretend they still have the cheapest of 80's computer hardware. They no longer make computer stuff and fake it all with software with the cheapest, fakest stuff. They literally exist to cut costs and corners and wait up to 50 years to just sell ancient AMD or other companies' tech products with a new name. 60's ray tracing in the 3dfx Voodoo and in the PS2 and everything AMD ever made is now called RTX, which, yes, is a software shortcut that can allow hardware to perform about 10% faster in professional graphics applications that work with light rays.