GPU Computing Explained | How A GPU Works

91,781 views

Futurology — An Optimistic Future

1 day ago

Visit Our Parent Company EarthOne ➤
earthone.io/
This video is the sixth in a multi-part series on computing. In it, we discuss the rise of GPU computing and the role it will play in AI computational tasks.
00:00 Intro
[0:22-5:26] Origins of GPU Computing - Starting off, we'll look at the origins of GPU computing and the reason(s) for their mass adoption today.
[5:26-8:13] GPU Computing for AI - Following that, we'll discuss the use of GPU computing for artificial intelligence (AI) computing.
[8:13-9:38] General Purpose GPU Computing - To conclude, we'll discuss general-purpose GPU (GPGPU) computing.
Thank You To The Members Who Supported This Video ➤
Wyldn Pearson
Become A Member & Help Us Grow ➤
subscribe.futurology.earthone...
Learn More About Us Here ➤
futurology.earthone.io
Join Our Discord ➤
subscribe.futurology.earthone...
Soundtrack ➤
♫ 00:00 "Dawn" by Sappheiros
♫ 00:22 "Sun" by HOME
♫ 05:24 "If I'm Wrong" by HOME
♫ 07:50 "The Getaway" by Miami Nights 1984
♫ 09:39 "June" by Aire Atlantica
Producer ➤ Ankur Bargotra
Follow The Producers Social Media Accounts ➤
/ enchorb

Comments: 153
@OptimisticFuturology · 6 years ago
Want to learn more about the Technological Revolution? Watch our playlist here: kzbin.info/www/bejne/e3-6pKKNn999irM - ALSO - Become a KZbin member for many exclusive perks, from exclusive posts to bonus content, shoutouts and more! subscribe.futurology.earthone.io/member - AND - Join our Discord server for much better community discussions! subscribe.futurology.earthone.io/discord
@holdthetruthhostage · 5 years ago
I don't think it will take until 2040; most likely a startup company with 3D printing will accomplish this sooner.
@bhuvaneshs.k638 · 5 years ago
Your channel is awesome... Please do a video on TPUs.
@safaulansari7782 · 3 years ago
P
@darrenc2370 · 6 months ago
The music from 7:46 onwards - anyone else getting Serial Experiments Lain vibes there? Edit: link for the uninitiated: kzbin.info/www/bejne/n4C5natqhberkKcsi=optCloI6NAEgjj9w&t=248
@PauLWaFFleZ · 6 years ago
Please PLEASE do NOT stop this series on computing.
@kayrosis5523 · 6 years ago
Graphics cards 100,000x as good as now? That simulation hypothesis might be onto something.
@gs-nq6mw · 4 years ago
Nope, Moore's law is expected to end by 2020 by most specialists, including Moore himself.
@projectjt3149 · 4 years ago
g s how about Tensor Cores?
@thewalnutwoodworker6136 · 2 years ago
We are down to 2nm as of 2021; 1nm will be the end of silicon. We might be able to push it farther with other atoms to go sub-nm. We will probably make major advancements in architecture before we go quantum. For example, x86 is bloated af; RISC is what we need past 1nm. AMD is making major pushes in architecture such as chiplets/MCM. As of now the RDNA3 leaks are showing over 2x performance on rasterization.
@spacevspitch4028 · 2 years ago
@gs-nq6mw Yowza. 2 years ago and still going.
@player111q7 · 10 months ago
No it's not, it's 2023 and it's still valid.
@PauLWaFFleZ · 6 years ago
Bro, your videos are simply AMAZING. Your presentation completely pulls you in and makes you wish the video never stops. Makes me more excited to be going to school for computer engineering. Keep at it, bro...
@OptimisticFuturology · 6 years ago
Thank you for watching :)!
@projectjt3149 · 5 years ago
Oh, it didn't stop for me! I am about to write a research abstract in college discussing GPU computing, and this video was a HUGE help!
@L2Xenta · 6 years ago
+1 sub for showing off the ambition of Star Citizen, a game project that takes a long time but challenges multiple barriers of a limited industry.
@googledev566 · 4 years ago
*_Keep creating such crucial and informative videos_*
@annoloki · 6 years ago
Do check out IBM's TrueNorth architecture, which'll put an end to all this GPU-for-AI stuff, as TrueNorth doesn't run code to simulate neurons - it provides silicon-based neurons on a chip... They are clockless, parallel, programmable, and run faster than instruction-based neural nets, but use an absolutely minute amount of power to do the same thing... Neurons in silicon rather than in software is what will really revolutionise AI.
@OptimisticFuturology · 6 years ago
Thanks for watching! I have a video on neuromorphic computing coming soon!
@walter0bz · 6 years ago
I hope that AI solutions keep being programmable. The same thing happened with graphics: we went from pure software, to fixed-function hardware accelerators, to programmable hardware that eventually generalised into a fully programmable parallel processor. The reason is the variety in algorithms (e.g. convolutions, capsules, various compression schemes...); I would prefer to see this variety increase; we'd just see chips that are more dataflow/low-precision oriented for AI (rather than GPUs, whose datapaths and precision grew around the demands of texturing and geometry).
@annoloki · 6 years ago
Fantastic. If I can recommend something well worth looking at in that realm, which you may wish to incorporate bits of into your video (if you've not come across it already), it's an explanation of the general-purpose cortical column, which is slightly different from deep learning neural nets, using SDR (sparse distributed representation) to act as memory, and pattern prediction that branches out to predict many different possible futures, allowing pruning of branches that didn't occur once data comes in. It's what powers mammalian (including our) brains, and is being put to use for things from recognising when faults will occur on computer systems to helping predict power generation requirements on a grid. Numenta seem to be a/the leader in this work... It might be a bit out of scope if you're more on the [near-]consumer tech, but it's super interesting and will play a big role in our futures, so I'll just give you this one video link as it contains some good visual explanations, and will spare you further recommendations unless they're helpful/welcome rather than redundant. Cheers for replying :-) kzbin.info/www/bejne/n3-wk6asgJ2Ebrs
@annoloki · 6 years ago
walter0bz - TrueNorth is programmable, but not in the same way as a general-purpose processor, because it's clockless, asynchronous, very highly parallel, etc... It's more like electrical wiring - like having a load of gates, capacitors, and transistors, all on silicon - and "programming" it uses an FPGA to wire them up to make your own circuits... It's more like a physical brain where you choose how neurons are connected to each other and how they behave than like a CPU... But it can do the job of a rack of servers in something that will fit in a phone and barely touch your battery... I'll leave it there as SP's doing a video on the subject and he'll no doubt do a better job covering it. Oh, just to say, I wouldn't worry about limitations of hardware when it comes to AI - it's US who are going to be falling behind, especially now that Google are already using AIs to make AIs better than people can make them.
@walter0bz · 6 years ago
Programmability is a sliding scale. "Fixed-function GPUs" are configurable and can do different effects with layering and ordering according to a "program", but fine-grained programmability in shaders was superior. The device I hope "wins" would be the grid-of-RISC-cores with inter-core messaging (this can be clockless, highly parallel...) - just with the right custom instructions (e.g. low-precision dot products) to handle AI workloads.
@tomojeetchakraborty5459 · 6 years ago
I am your greatest fan, sir. I am very obliged by the knowledge you provide us, and for making us aware of modern trends. 💐💐💐💐💐💐
@OptimisticFuturology · 6 years ago
Thanks for watching :)!
@djsvideodiarys · 6 years ago
Can't wait for QPUs.
@OptimisticFuturology · 6 years ago
But could it run Crysis ;)
@djsvideodiarys · 6 years ago
Hahahaha
@djsvideodiarys · 6 years ago
Haha, hopefully it will solve Crisis.
@ActualGenius · 6 years ago
Feels like I was processing in CPU terms my whole life and instantaneously entered the QPU age when I started watching this channel.
@TheDanm22 · 5 years ago
Nothing will run Crysis. @OptimisticFuturology
@karehaqt · 5 years ago
Just discovered your channel via this series and you've gained my sub; great videos so far.
@Slayer3915 · 5 years ago
Apparently it's hard for some people, but I for one appreciate your spoken words per minute.
@thejointcoach · 6 years ago
Please do a video on quantum computing!! I love your channel and I think you deserve way more subscribers; I'll do my best to spread your name.
@rahmanash9856 · 6 years ago
Awesome as always... Waiting for quantum and other types of computing, graphene applications, AI and so much else.
@Jixejo · 6 years ago
However, you need to understand that 5000 cores here is not the same as 5000 CPU cores. These are CUDA cores... they are not the same as "cores" that can read, write, execute, transfer, etc. 5000 cores sounds too good to be true? It is too good to be true. CUDA cores can't perform as many of the same actions as CPU cores; they are more like relay units. 5000 CUDA cores is good, but it's called CUDA for a reason - it's not the same thing as a "core" as we know it. This is marketing jargon in action more than anything; it's not what a core is usually described as in the world of computer science.
@Jixejo · 6 years ago
But they wanted to call it a core because it would sell. Actually it's not even a core; it's closer to a floating-point unit.
@ccmkoho · 5 years ago
Or you can analogize them to hyper-threads.
@BattousaiHBr · 5 years ago
Also, the Nvidia graphs showing Moore's law were dishonest, as they were comparing single-threaded CPU performance against their GPUs with thousands of CUDA cores as if they were a single core.
@evilbred974 · 5 years ago
In certain workloads, however, a GPU of the same price will blow a CPU out of the water. You cannot compare an 8150 to a V100, nor can you compare a 9980XE to a 2080 Ti in highly parallel operations. In fact, there is a company working on servers that completely cut classical x86-instruction-set CPUs out, by using ARM CPUs controlling multiple GPUs in GPGPU servers.
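The CUDA-core distinction in this thread can be sketched in a few lines: GPU "cores" behave more like SIMD lanes executing one shared instruction in lockstep over many data elements (the SIMT model), unlike independent CPU cores. A minimal toy model in plain Python - all names here are illustrative, not real CUDA APIs:

```python
# Toy model of the point above: a GPU "warp" applies the SAME
# instruction to many data lanes at once, while a CPU core runs
# an independent instruction stream per core.

def warp_execute(instruction, lane_ids):
    """Apply one instruction to every lane in lockstep (SIMT-style)."""
    return [instruction(i) for i in lane_ids]

def saxpy_lane(a, x, y):
    """One lane's share of SAXPY: a*x + y (a classic GPU workload)."""
    return a * x + y

# 8 "lanes" of data processed by one simulated warp.
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
ys = [10.0] * 8
a = 2.0

result = warp_execute(lambda i: saxpy_lane(a, xs[i], ys[i]), range(8))
print(result)  # every lane performed the same multiply-add on different data
```

A real GPU runs thousands of such lanes in hardware; the point of the sketch is only that each "core" contributes one lane of a shared operation, not an independent program.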
@rawding1976 · 6 years ago
One of my favorite channels!!! This channel is gonna explode with subscribers very soon! Watch and see! Great job as always!
@OptimisticFuturology · 6 years ago
Thank you for watching!
@The_Masked_Frenchman · 5 years ago
Watching these is reinvigorating my love for technology and makes me want to go back to school for computer engineering.
@NietJeffrey · 5 years ago
I wish you every bit of YouTube fame coming to you. You have some great content!
@wolfisraging · 6 years ago
Great job, make more.
@edwardbrownstien8741 · 5 years ago
Great channel. Love the content.
@meowmeowmoogabenrules4854 · 5 years ago
Why is YouTube barely showing me this? Amazing work, man.
@marymcreynolds8355 · 6 years ago
Star Citizen... quite a peek at the latest peak.
@PabloGonzalezVargas · 5 years ago
Wow!!! I'm a big fan of your channel; impressive, eloquent work *
@ELECTR0HERMIT · 4 years ago
Excellent job.
@bommaritohawaii · 6 years ago
Great job!
@srungarapusaikrishna5583 · 3 years ago
That, my friend, felt like a rocket science class!!! 😵🤒
@anshulsharma9424 · 5 years ago
Keep up the good work.
@madsgrand · 6 years ago
Please change the ColdFusion-inspired intro; it's just too close for comfort. On the content side, your videos are so much better!
@pegasusted2504 · 5 years ago
Good stuff all round :~)
@dangdiggity9916 · 5 years ago
One thing I'm wondering about for "AI" in gaming is whether they can optimize ray tracing for certain games, so that when you open the game it has already figured out where on the map it's used - basically learning something before the user does it for the first time (but of course probably still improving).
@winkipinky · 5 years ago
Fantastic .... 😁
@james_gemma · 6 years ago
Your channel should have way more views and subscribers. I guess it's only a matter of time before everyone discovers your excellent, informative tech videos.
@cdreid99999 · 5 years ago
Funny, the number of comments mirroring this. In a row. But I'm sure you're not sockpuppets or bots...
@originproductions6120 · 5 years ago
He's ripping off ColdFusion's intro completely.
@govinds3951 · 6 years ago
Jheeez, good work.
@peefwellington8794 · 4 years ago
Boy, the new gen of GPUs in 2020 is gonna be incredible.
@barney9008 · 5 years ago
Nice delivery; reminds me of ColdFusion.
@benyeo7930 · 5 years ago
I love all your videos - great content that is educational and serves as a good primer for the uninitiated. Two issues with all your videos: the voice of the commentator is way too low and boring (no fluctuation in tone at all; it drones on and on), and the speed of narration is way too rapid. Good thing there are subtitles to follow, which solves the issue for the super-interested audience. However, even scanning the subtitles requires full and complete attention, which reinforces the comment that the speed and tone issues are real!
@usertogo · 5 years ago
Nice. Now, if somebody has a graphics accelerator, how does one enable the cores to be easily used by the operating system and applications?
@Mordred478 · 6 years ago
Great video, very informative. Is the implication here that Dell and other companies will soon start offering PCs with GPUs instead of CPUs?
@jwadaow · 5 years ago
No
@mike288190 · 5 years ago
I believe you would still need a CPU.
@hemendrapratapsingh4156 · 2 years ago
So I decided to watch the complete ad today. But it skipped automatically. 🤐
@daniel_960_ · 5 years ago
But what do the cores in Apple's mobile graphics mean? The A11 or A12 have only 3 or 4 graphics cores but are still really powerful. Previous generations had more cores, as far as I know.
@borisgotov9838 · 6 years ago
Give a simple piece of code a little bit more complex than a hello-world program. Something like a rolling ball or rotating square...
@chuckbuckets1 · 5 years ago
AI and protein folding will be one of the most profound paradigm shifts of humanity.
@zahanjavaid · 5 years ago
I seriously liked your videos, but I'm not in a position to understand most of them LOL 😂😂😂
@HeadStronger-HS · 5 years ago
This blew my mind... major advances in GPUs.
@supremepartydude · 6 years ago
As a computer enthusiast for 30 years, I think you did a great job.
@dinozaurpickupline4221 · 3 years ago
An AI could be used to map tonal changes of different structures and things, and the outcome could be used to decrease load times in games. How cool would it be if a computer or software already knew the reflection details of every object, texture, color variation, size and scaling, using separate sets of data, and created further smaller tests? This would increase performance drastically.
@TheDanm22 · 5 years ago
In the first minute... that's called Moore's law. It deserves the reference.
@sarmadnajim4839 · 6 years ago
Wonderful documentary - direct and clear, smartly done 👍🏻
@frozencode5238 · 6 years ago
i love you man...
@wandrinsheep · 6 years ago
Oho, a Star Citizen fan, I see. Awesome.
@Army2willis · 5 years ago
I see you popped in some SCU to show just how crazy graphics are today. You know you like SC.
@system2072 · 6 years ago
Great videos man... but can you please slow down while explaining? You speak fast, which is very hard to understand sometimes...
@infinitworld7106 · 6 years ago
MORE CONTENT!!!
@DANTHETUBEMAN · 6 years ago
As soon as computers get consciousness, they will no longer serve humanity.
@flynnkay · 5 years ago
Ok then, I'll unplug that bitch haha
@cdreid99999 · 5 years ago
You don't understand how computers work, then. And we are nowhere close to AI. What the hype wagon is calling AI now is what we used to call expert systems... i.e. standard programming/algorithms. We simply don't have the processing power yet. We might be able to build a simulated human brain, but it would be built on FPGA-like neural network boards, be the size of a Walmart and probably require a power plant to run.
@TheDanm22 · 5 years ago
@cdreid99999 AI is going under a new dynamic now. One AI is programmed to evaluate and judge; another AI is programmed to design and invent. This is the judge and the inventor AI - two AIs working together to improve each other's processes. It won't be the size of a Walmart; it's going to evolve exponentially.
@viniciusbueno2160 · 5 years ago
And now, 1 or 2 months ago, IBM released the first quantum computer outside the lab!!!! Now things can move faster.
@meatofpeach · 6 years ago
Incredible KZbin channel. Wow. Keep it up.
@Keiktu · 5 years ago
Insta-subscribed.
@snoogboonin · 5 years ago
Your vids are fucking unreal, dude. Subbed.
@DarthRaver-og7qq · 5 years ago
Damn, think about what a laptop or desktop gaming rig will look like in, say, 50 years. Could you imagine having something portable the size of a Nintendo Switch, yet as powerful as a full desktop gaming rig today with the best of everything? That's crazy lol. I hope I'm still alive by then. Everything in the world looks like it's actually heading toward a "Blade Runner" type civilization lol.
@PauLWaFFleZ · 6 years ago
When can we plan on seeing some of the videos in the AI series?
@OptimisticFuturology · 6 years ago
Starting May!
@PauLWaFFleZ · 6 years ago
Ah, come on man, I can't wait that long... You gotta give me the lineup for what else is going to be coming out until then...
@brushhog7089 · 6 years ago
Nice horn in the background, but really distracting. I guess I'll go somewhere else, as I was here for information.
@tonytony7225 · 5 years ago
You gotta talk about AMD's Threadripper and its AI technology.
@aoeu256 · 5 years ago
GPU voice recognition? Never heard of it.
@MegaFlemo · 5 years ago
WOW
@MyWatchIsEnded · 5 years ago
But can the GPU from 2040 run Crysis?
@ahuttee · 5 years ago
Might be possible.
@bassbs · 6 years ago
Did you, SP, do SU?
@Chrisimplayer · 5 years ago
To me it's highly debatable whether Star Citizen is still in development.
@m_sedziwoj · 6 years ago
From last week: look at Google TPU 3 - 100 petaflops for DL. That's 1000x more than an Nvidia Titan V.
@fuzzylumpkin8030 · 5 years ago
Yeah, that's cool, but at what cost to gaming?
@kokomanation · 6 years ago
I feel that CPUs and GPUs are getting merged together.
@rpzcsonli · 5 years ago
A CPU can calculate everything; a GPU needs special cores to be better than a CPU, so if they add cores to calculate everything, the GPU will become a CPU. It won't become one, because it would be too big, expensive, and power-hungry. They could make an AIPU (AI processing unit) and put it in a card, an RTPU (ray tracing processing unit) and put it in a card, and so on, and make it so you can add 20 cards to your computer - then we would have true "performance" in everything. But it won't happen, because it's stupid. The CPU will be the "CENTRAL Processing Unit" until humanity ends, and the GPU will just be its slave doing all the work; then it will put everything together and make it pleasant for you to interact with.
@nad1901 · 4 years ago
I still don't know how people get the idea of putting annoying background music on informative videos. Since when did we have music playing while we were learning in school? :/
@vyor8837 · 5 years ago
Volta isn't on a true 12nm node.
@BakiWho · 6 years ago
You sound like the Hardy Boys in South Park :) I have a raging clue.
@platin2148 · 5 years ago
The stacking will also make Nvidia obsolete, as both Intel and AMD can make pretty capable GPUs, so there's no need to work with Nvidia. So I suspect they either go server or bet on ARM - if they could make an Arm1000 chip that is tightly integrated with their GPUs, they would basically have won. And him saying that he made a special chip for AI is completely wrong - he made matrix calculations faster, not AI - and it could be even faster with FPGAs.
@strangevideos3048 · 5 years ago
We live in the Matrix!
@mohamedsalahoshi1486 · 6 years ago
*J* *U* *S* *T* *A* *M* *A* *Z* *I* *N* *G*
@Sean_Lightning_OBrien · 6 years ago
At this point I'm still waiting for the GTX 1180/2080 😂
@jackharpe3rd233 · 4 years ago
I couldn't care less about AI or a real-life HAL or Skynet. Not because I'm scared, but because all I truly want is more pixels and more polygons being rendered for my video games. Unless Moore's law affects that, then okay, let's use ray tracing and other visual tricks to fool our eyes into a better graphical future. I also want great storytelling, which, thanks to the rise of SJWs, e-sports, Illumination Studios, and modern activism, the world of computer-generated imagery has been told doesn't matter anymore! Please don't let us down, Sony!
@ekaterinavalinakova2643 · 6 years ago
1.13 quintillion FLOPS gaming system by 2040: 100,000 × 11.3 teraflops.
@xsuploader · 5 years ago
Not quite - he said in the 2040s, not 2040. At the current rate of 1.5x per year it would take log(100000)/log(1.5) years, approximately 28, putting the year at 2046. Around the 2045 singularity proposed by Kurzweil.
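The arithmetic in the two comments above can be checked in a few lines, using the thread's own figures (today's ~11.3 TFLOPS card and ~1.5x yearly growth):

```python
import math

# Figures taken from the comments above, not independent data.
current_tflops = 11.3        # today's card, in teraflops
target_multiple = 100_000    # the "100,000x as good" claim
growth_per_year = 1.5        # assumed yearly improvement rate

# 11.3 TFLOPS * 100,000 = 1.13e18 FLOPS, i.e. 1.13 exaFLOPS
# ("quintillion" in short scale).
target_flops = current_tflops * 1e12 * target_multiple

# Years to grow 100,000x at 1.5x per year:
# growth_per_year ** years = target_multiple
# => years = log(100000) / log(1.5) ≈ 28.4
years = math.log(target_multiple) / math.log(growth_per_year)

print(f"{target_flops:.3g} FLOPS, ~{years:.1f} years")
```

Starting from 2018, ~28 years lands in 2046, matching the reply's estimate.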
@tomislavnikolic5778 · 5 years ago
Holy shit
@szirbektamas2571 · 5 years ago
After this video I feel so stupid.
@CCRob720 · 3 years ago
What could we do with a billion GPUs' power.....
@harrym8556 · 5 years ago
"1^14 FLOPS in performance..." Dude, what are you talking about?? You know that 1^14 = 1, right? Did you mean to say 10^14 FLOPS?
@kapilbsingh · 6 years ago
Whatever they develop, it will find a place in landfills.
@vladimirtchuiev2218 · 6 years ago
And now people are starting to use GPUs for cryptocurrency mining, driving GPU prices up...
@SumWanYo · 6 years ago
Why is the Nvidia CEO so nervous?
@rpzcsonli · 5 years ago
He doesn't know how to burn AMD to the ground so he can have all the money, and maybe has some monopoly issues and anti-competitive practices that he pays governments not to dismantle Nvidia to pieces over. Nvidia single-handedly slowed progress for GPUs by making all this bullshit technology and buying competitors and other technologies to use in their cards only, so they'd be "better", then stalling for 2-5 years until AMD catches up, then getting another "revolutionary" Nvidia-only technology and "helping" developers by giving them the tech and money while destroying AMD performance, then waiting again until AMD catches up, and repeat. I'm not an AMD fanboy, but I hate Nvidia with everything I have because they did and do everything I said. Google a little and you'll be enlightened by what Nvidia has done in the past 20 years.
@dr.zoidberg8666 · 6 years ago
We're creeping closer and closer every day to machines with the processing power and storage capacity to simulate human minds. Once that's achieved, all we need to do is figure out how to transfer someone over without breaking their stream of consciousness in the process, and we'll have a reliable path forward to radical life extension.
@raunak1147 · 6 years ago
Dr. Zoidberg - by 2021, or before that if something revolutionary like graphene/3D processors happens.
@rpzcsonli · 5 years ago
@raunak1147 You are dreaming too big, just like the people in 1990 who thought we would have flying cars... Maybe another 30 years until then.
@Julia-hk9jp · 6 years ago
To sum it up, this video is just an Nvidia commercial...
@cameronh3260 · 6 years ago
But can it run Minecraft?
@rpzcsonli · 5 years ago
With 400 mods, yeah, but 600 mods... I don't think so.
@Gollywog · 6 years ago
You talk too fast. I love the info, but it needs my full attention (it's not something I can listen to in the background) because you say everything so fast.
@735Secure · 6 years ago
StiX, it's because he's just reading the stuff. If you have a technical background and are a scientist or an engineer, you don't just put on a show. He is all about the show. Decent information, but I don't trust the information he provides.
@curiosity1865 · 6 years ago
Turn down the speed of the video.
@originproductions6120 · 5 years ago
@polishpepe239 Stop trying to sound smart. I'm sure you wouldn't have a problem with 10x speed as well, right, because you're such an intellectual. It's annoying because my full attention has to be on the video, and if I'm playing Halo this guy talks too fast for that. Stop trying to show off and be honest with yourself. Can you understand him at 2x speed? Probably, but that's missing the whole point of the video. That point is to absorb the information he's giving you and think about it, and you can't do all of that at 2x speed even if you're Albert fucking Einstein.
@originproductions6120 · 5 years ago
@polishpepe239 Also, I just watched it at 2x speed and now know that you're just bullshitting if 2x speed isn't even an ideal speed. Stfu, no one cares about you trying to sound smart.
@zalanta7 · 5 years ago
This video is 4K.
@perspgold8945 · 2 years ago
Not sure if it was the speaking speed or the content, but this video was disjointed.
@Goldnr · 6 years ago
Nvidia's CUDA - showing an AMD card...
@albertgerard4639 · 6 years ago
Moore's law never took Bitcoin into account... ouch.
@gertjanvandermeij4265 · 5 years ago
Nvidia is just a big bully!
@madscientistshusta · 5 years ago
Excuse me, Star Citizen is a joke.
@Drixidamus · 1 year ago
Your channel is criminally under-subscribed.
@utubekullanicisi · 3 years ago
Too fast.