Want to learn more about the Technological Revolution? Watch our playlist here: kzbin.info/www/bejne/e3-6pKKNn999irM - ALSO - Become a YouTube member for many exclusive perks: exclusive posts, bonus content, shoutouts and more! subscribe.futurology.earthone.io/member - AND - Join our Discord server for much better community discussions! subscribe.futurology.earthone.io/discord
@holdthetruthhostage 5 years ago
I don't think it will take until 2040; most likely a startup with 3D printing will accomplish this sooner.
@bhuvaneshs.k638 5 years ago
Your channel is awesome... Please do a video on TPUs.
@darrenc2370 9 months ago
The music from 7:46 onwards - anyone else getting Serial Experiments Lain vibes there? Edit: link for the uninitiated kzbin.info/www/bejne/n4C5natqhberkKcsi=optCloI6NAEgjj9w&t=248
@PauLWaFFleZ 6 years ago
Please PLEASE do NOT stop this series on Computing.
@kayrosis5523 6 years ago
Graphics cards 100,000x as good as now? That simulation hypothesis might be onto something
@gs-nq6mw 4 years ago
Nope, Moore's law is expected to end by 2020 by most specialists, including Moore himself.
@projectjt3149 4 years ago
g s how about Tensor Cores?
@thewalnutwoodworker6136 3 years ago
We are down to 2nm as of 2021; 1nm will be the end of silicon. We might be able to push it farther with other atoms to go sub-nm. We will probably make major advances in architecture before we go quantum. For example, x86 is bloated af; RISC is what we need past 1nm. AMD is making major pushes in architecture such as chiplets/MCM. As of now the RDNA3 leaks are showing over 2x performance on rasterization.
@spacevspitch4028 2 years ago
@@gs-nq6mw Yowza. 2 years ago and still going.
@player111q7 1 year ago
No it's not; it's 2023 and it's still valid.
@PauLWaFFleZ 6 years ago
Bro your videos are simply AMAZING. Your presentation completely pulls you in and makes you wish the video never stops. Makes me more excited to be going to school for Computer Engineering. Keep at it bro...
@OptimisticFuturology 6 years ago
Thank you for watching :)!
@projectjt3149 5 years ago
Oh, it didn't stop for me! I am about to write a research abstract in college discussing GPU computing, and this video was a HUGE help!
@L2Xenta 6 years ago
+1 sub for showing off the ambition of Star Citizen, a game project that is taking a long time but challenges multiple barriers of a limited industry.
@annoloki 6 years ago
Do check out IBM's TrueNorth architecture, which'll put an end to all this GPU-for-AI stuff: TrueNorth doesn't run code to simulate neurons, it provides silicon-based neurons on a chip... they are clockless, parallel, programmable, run faster than instruction-based neural nets, yet use an absolutely minute amount of power to do the same thing... neurons in silicon rather than in software is what will really revolutionise AI.
@OptimisticFuturology 6 years ago
Thanks for watching! I have a video on neuromorphic computing coming soon!
@walter0bz 6 years ago
I hope that AI solutions keep being programmable. The same thing happened with graphics: we went from pure software, to fixed-function hardware accelerators, to programmable hardware that eventually generalised into a fully programmable parallel processor. The reason is the variety in algorithms (e.g. convolutions, capsules, various compression schemes...); I would prefer to see this variety increase. We'd just see chips that are more dataflow/low-precision oriented for AI (rather than GPUs, whose datapaths and precision grew around the demands of texturing and geometry).
@annoloki 6 years ago
Fantastic! If I can recommend something well worth looking at in that realm, bits of which you may wish to incorporate into your video (if you've not come across it already): an explanation of the general-purpose cortical column, which is slightly different from deep-learning neural nets. It uses SDR (sparse distributed representation) to act as memory, plus pattern prediction that branches out to predict many different possible futures, allowing pruning of the branches that didn't occur once data comes in. It's what powers mammalian (including our) brains, and is being put to use for everything from recognising when faults will occur on computer systems to helping predict power-generation requirements on a grid. Numenta seem to be a/the leader in this work... it might be a bit out of scope if you're more on the [near-]consumer tech side, but it's super interesting and will play a big role in our futures, so I'll just give you this one video link as it contains some good visual explanations, and will spare you further recommendations unless they're helpful/welcome rather than redundant. Cheers for replying :-) kzbin.info/www/bejne/n3-wk6asgJ2Ebrs
@annoloki 6 years ago
walter0bz - TrueNorth is programmable, but not in the same way as a general-purpose processor, because it's clockless, asynchronous, very highly parallel, etc. It's more like electrical wiring: like having a load of gates, capacitors and transistors all on silicon, where "programming" uses an FPGA to wire them up into your own circuits. It's more like a physical brain, where you choose how neurons are connected to each other and how they behave, than like a CPU... but it can do the job of a rack of servers in something that will fit in a phone and barely touch your battery. I'll leave it there, as SP's doing a video on the subject and he'll no doubt do a better job covering it. Oh, just to say: I wouldn't worry about limitations of hardware when it comes to AI; it's US who are going to be falling behind, especially now Google are already using AIs to make AIs better than people can make them.
@walter0bz 6 years ago
Programmability is a sliding scale. "Fixed-function" GPUs are configurable and can do different effects with layering and ordering according to a "program", but fine-grained programmability in shaders was superior. The device I hope "wins" would be a grid of RISC cores with inter-core messaging (this can be clockless, highly parallel...), just with the right custom instructions (e.g. low-precision dot products) to handle AI workloads.
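[Editor's note] The "low-precision dot product" walter0bz mentions can be sketched in a few lines of NumPy: 8-bit inputs multiplied and accumulated into a 32-bit result, the same pattern as DP4A-style GPU instructions used for inference. The function name and values below are purely illustrative.

```python
import numpy as np

def int8_dot(a, b):
    # 8-bit operands, as an AI-oriented instruction would take them.
    a = np.asarray(a, dtype=np.int8)
    b = np.asarray(b, dtype=np.int8)
    # Widen to int32 before multiplying so products don't overflow int8,
    # mirroring the wide accumulator in hardware dot-product units.
    return int(np.dot(a.astype(np.int32), b.astype(np.int32)))

print(int8_dot([127, -128, 3, 4], [1, 1, 1, 1]))  # 6
```

The point of the narrow inputs is memory bandwidth: four int8 values fit where one float32 would, which is why such instructions suit neural-net workloads.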
@googledev566 4 years ago
*_Keep creating such crucial and informative videos_*
@tomojeetchakraborty5459 6 years ago
I am your greatest fan, sir. I am very obliged by the knowledge you provide us, and how you make us aware of modern trends. 💐💐💐💐💐💐
@OptimisticFuturology 6 years ago
Thanks for watching :)!
@Slayer3915 5 years ago
Apparently it's hard for some people, but I for one appreciate your spoken words per minute.
@djsvideodiarys 6 years ago
Can't wait for QPU.
@OptimisticFuturology 6 years ago
But could it run Crisis ;)
@djsvideodiarys 6 years ago
Hahahaha
@djsvideodiarys 6 years ago
Haha Hopefully it will solve Crisis
@ActualGenius 6 years ago
Feel like I was processing in CPU terms my whole life and instantaneously entered the QPU age when I started watching this channel.
@TheDanm22 5 years ago
nothing will run crisis. @@OptimisticFuturology
@rawding1976 6 years ago
One of my favorite channels!!! This channel is gonna explode with subscribers very soon! Watch & see! Great job as always!
@OptimisticFuturology 6 years ago
Thank you for watching!
@thejointcoach 6 years ago
Please do a video on quantum computing!! I love your channel and I think you deserve way more subscribers; I'll do my best to spread your name.
@rahmanash9856 6 years ago
Awesome as always... waiting for quantum and other types of computing, graphene applications, AI and so much else.
@madsgrand 6 years ago
Please change the ColdFusion-inspired intro; it's just too close for comfort. On the content side your videos are so much better!
@dangdiggity9916 5 years ago
One thing I'm wondering about for "AI" in gaming: could they optimize ray tracing for certain games, so that when you open the game it has already figured out where on the map it's used? It would basically be learning something before the user does it for the first time (but of course probably still improving).
@The_Masked_Frenchman 5 years ago
Watching these is reinvigorating my love for technology and makes me want to go back to school for computer engineering
@borisgotov9838 6 years ago
Give a simple piece of code a little bit more complex than a hello-world program. Something like a rolling ball or a rotating square...
@james_gemma 6 years ago
Your channel should have way more views and subscribers. I guess it's only a matter of time before everyone discovers your excellent informative tech videos.
@cdreid99999 5 years ago
Funny the number of comments mirroring this. In a row. But I'm sure you're not sockpuppets or bots..
@originproductions6120 5 years ago
He's ripping off ColdFusion's intro completely
@meowmeowmoogabenrules4854 5 years ago
Why is YouTube barely showing me this? Amazing work man.
@marymcreynolds8355 6 years ago
Star Citizen... quite a peek at the latest peak.
@Mordred478 6 years ago
Great video, very informative. Is the implication here that Dell and other companies will soon start offering PCs with GPUs instead of CPUs?
@jwadaow 6 years ago
No
@mike288190 5 years ago
I believe you would still need a CPU.
@karehaqt 5 years ago
Just discovered your channel via this series and you gained my sub, great videos so far.
@system2072 6 years ago
Great videos man..... but can you please slow down while explaining? You speak fast, which is very hard to understand sometimes...
@NietJeffrey 5 years ago
I wish you every bit of YouTube fame coming to you. You have some great content!
@barney9008 6 years ago
Nice delivery; reminds me of ColdFusion.
@srungarapusaikrishna5583 4 years ago
That, my friend, felt like a rocket science class!!!😵🤒
@bommaritohawaii 6 years ago
Great job!
@dinozaurpickupline4221 3 years ago
An AI could be used to map tonal changes of different structures and things, and the outcome could be used to decrease load times in games. How cool would it be if a computer or software already knew the reflection details of every object, texture, color variation, size & scaling, using separate sets of data, and created further smaller tests? This would increase performance drastically.
@benyeo7930 5 years ago
I love all your videos - great content that is educational and serves as a good primer for the uninitiated. Two issues with all your videos: the voice of the commentator is way too low and monotone (no fluctuation in tone at all; it drones on and on), and the speed of narration is also way too rapid. Good thing there are subtitles to follow, which solves the issue for the super-interested audience. However, even scanning the subtitles requires full and complete attention, which reinforces the point that the speed and tone issues are real!
@chuckbuckets1 5 years ago
AI and protein folding will be one of the most profound paradigm shifts in human history.
@hemendrapratapsingh4156 3 years ago
So I thought I'd watch the complete ad today. But it skipped automatically. 🤐
@peefwellington8794 4 years ago
Boy, the new gen of GPUs in 2020 is gonna be incredible.
@PabloGonzalezVargas 5 years ago
Wow!!! I'm a big fan of your channel; impressive, eloquent work.
@daniel_960_ 5 years ago
But what do the core counts in Apple's mobile graphics mean? The A11 and A12 have only 3 or 4 graphics cores but are still really powerful. Previous generations had more cores, as far as I know.
@wolfisraging 6 years ago
Great job, make more
@usertogo 6 years ago
Nice. Now if somebody has a graphics accelerator, how does one enable the cores to be easily used by the operating system and applications?
@zahanjavaid 5 years ago
I seriously liked your videos, but I'm not in a position to understand most of them LOL 😂😂😂
@supremepartydude 6 years ago
As a computer enthusiast of 30 years, I can say you did a great job.
@edwardbrownstien8741 5 years ago
Great channel. Love the content.
@HeadStronger-HS 6 years ago
This blew my mind... major advances in GPUs.
@nad1901 5 years ago
I still don't know how people get the idea of putting annoying background music on informative videos. And since when did we have music playing while we were learning in school? :/
@fuzzylumpkin8030 5 years ago
Yeah that's cool, but at what cost to gaming?
@MyWatchIsEnded 5 years ago
But can the GPU from 2040 run Crysis?
@ahuttee 5 years ago
Might be possible
@ELECTR0HERMIT 4 years ago
excellent job.
@TheDanm22 5 years ago
In the first minute... that's called Moore's law. It deserves the reference.
@PauLWaFFleZ 6 years ago
When can we plan on seeing some of the videos on the AI series?
@OptimisticFuturology 6 years ago
Starting May!
@PauLWaFFleZ 6 years ago
Ah come on man, I can't wait that long... You gotta give me the lineup for what else is coming out until then...
@Army2willis 5 years ago
I see you popped in some SCU to show just how crazy graphics are today. You know you like SC.
@brushhog7089 6 years ago
Nice horn in the background, but really distracting. I guess I'll go somewhere else, as I was here for information.
@anshulsharma9424 6 years ago
Keep up the good work
@DarthRaver-og7qq 5 years ago
Damn, think about what a laptop or desktop gaming rig will look like in, say, 50 years. Could you imagine having something portable the size of a Nintendo Switch, yet as powerful as a full desktop gaming rig today with the best of everything? That's crazy lol. I hope I'm still alive by then. Everything in the world looks like it's actually heading toward a "Blade Runner" type civilization lol.
@pegasusted2504 6 years ago
Good stuff all round :~)
@infinitworld7106 6 years ago
MORE CONTENT!!!
@winkipinky 5 years ago
Fantastic .... 😁
@govinds3951 6 years ago
Jheeez good work
@meatofpeach 6 years ago
Incredible YouTube channel. Wow. Keep it up.
@tonytony7225 5 years ago
You gotta talk about AMD's Threadripper and its AI technology.
@DANTHETUBEMAN 6 years ago
As soon as computers get consciousness, they will no longer serve humanity.
@flynnkay 5 years ago
Ok then I'll unplug that bitch haha
@cdreid99999 5 years ago
You don't understand how computers work, then. And we are nowhere close to an AI. What the hype wagon is calling AI now is what we used to call expert systems, i.e. standard programming/algorithms. We simply don't have the processing power yet. We might be able to build a simulated human brain, but it would be built on FPGA-like neural network boards, be the size of a Walmart, and probably require a power plant to run.
@TheDanm22 5 years ago
@@cdreid99999 AI is going under a new dynamic now: one AI is programmed to evaluate and judge, another is programmed to design and invent. This is the judge and the inventor AI, two AIs working together to improve each other's processes. It won't be the size of a Walmart; it's going to evolve exponentially.
@viniciusbueno2160 5 years ago
And now, 1 or 2 months ago, IBM released the first quantum computer outside the lab!!!! Now things can move faster.
@snoogboonin 5 years ago
Your vids are fucking unreal dude. Subbed.
@vyor8837 5 years ago
Volta isn't on a true 12nm node.
@aoeu256 5 years ago
GPU voice recognition? Never heard of it.
@m_sedziwoj 6 years ago
From last week: look at Google's TPU 3 - 100 petaflops for DL; it's 1000x more than an Nvidia Titan V.
@sarmadnajim4839 6 years ago
Wonderful documentary - direct and clear, smartly done 👍🏻
@Chrisimplayer 5 years ago
To me it's highly debatable whether Star Citizen is still in development.
@bassbs 6 years ago
Did you, SP, do SU?
@jackharpe3rd233 4 years ago
I couldn't care less about AI or a real-life HAL or Skynet. Not because I'm scared, but because all I truly want is more pixels and more polygons being rendered for my video games. Unless Moore's law affects that, then okay, let's use ray tracing and other visual tricks to fool our eyes into a better graphical future. I also want great storytelling as well, which (thanks to the rise of SJWs, e-sports, Illumination Studios, and modern activism) the world of computer-generated imagery has been told doesn't matter anymore! Please don't let us down, Sony!
@wandrinsheep 6 years ago
Oho, a Star Citizen fan I see, awesome.
@kokomanation 6 years ago
I feel that CPUs and GPUs are getting merged together.
@rpzcsonli 5 years ago
A CPU can calculate everything; a GPU needs special cores to be better than a CPU, so if they add cores to calculate everything, the GPU will become a CPU. It won't, though, because it would be too big, expensive, and power-hungry. They could make an AIPU (AI processing unit) and put it on a card, an RTPU (ray-tracing processing unit) and put it on a card, and so on, and make it so you can add 20 cards to your computer; then we would have true "performance" in everything, but it won't happen because it's stupid. The CPU will be the "CENTRAL Processing Unit" until humanity ends, and the GPU will just be its slave doing all the work; then the CPU puts everything together and makes it pleasant for you to interact with.
@platin2148 5 years ago
Stacking will also make Nvidia obsolete, as both Intel and AMD can make pretty capable GPUs, so there's no need to work with Nvidia. I suspect they either go server or bet on Arm: if they could make an Arm chip tightly integrated with their GPUs, they would basically have won. And his saying that he made a special chip for AI is completely wrong; he made matrix calculations faster, not AI, and it could be even faster with FPGAs.
@Keiktu 5 years ago
Insta-subscribed
@ekaterinavalinakova2643 6 years ago
A 1.13-quintillion-FLOPS gaming system by 2040: 100,000 × 11.3 teraflops.
@xsuploader 6 years ago
Not quite: he said in the 2040s, not 2040. At the current rate of 1.5x per year it would take log(100000)/log(1.5) years, approximately 28, putting the year at around 2046, right around the 2045 singularity proposed by Kurzweil.
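[Editor's note] The arithmetic in the two comments above checks out and can be verified directly; the 2018 start year is assumed from the video's upload date.

```python
import math

# Years of 1.5x/year growth needed for a 100,000x improvement:
# solve 1.5**n = 100_000 for n, i.e. n = log(100000) / log(1.5).
years = math.log(100_000) / math.log(1.5)
print(round(years, 1))      # 28.4
print(2018 + round(years))  # 2046
```

The same identity also confirms the headline figure: 100,000 × 11.3 teraflops = 1.13 × 10^18 FLOPS, i.e. 1.13 exaFLOPS.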
@frozencode5238 6 years ago
i love you man...
@CCRob720 3 years ago
What could we do with a billion GPUs' power.....
@harrym8556 5 years ago
"1^14 FLOPS in performance..." Dude, what are you talking about?? You know that 1^14 = 1, right? Did you mean to say 10^14 FLOPS?
@kapilbsingh 6 years ago
Whatever they develop, it will find a place in landfills.
@szirbektamas2571 5 years ago
After this video I feel so stupid.
@Sean_Lightning_OBrien 6 years ago
At this point I'm still waiting for the GTX 1180/2080 😂
@mohamedsalahoshi1486 6 years ago
*J* *U* *S* *T* *A* *M* *A* *Z* *I* *N* *G*
@MegaFlemo 5 years ago
WOW
@SumWanYo 6 years ago
Why is the Nvidia CEO so nervous?
@rpzcsonli 5 years ago
He doesn't know how to burn AMD to the ground so he can have all the moneiz, and maybe some monopoly issues and anti-competitive practices that he pays governments not to dismantle Nvidia over. Nvidia single-handedly slowed GPU progress by making proprietary technologies and buying competitors and other technologies to use in their cards only, so theirs would be "better", then stalling for 2-5 years until AMD catches up; then they get another "revolutionary" Nvidia-only technology, "help" developers by giving them the tech and money, destroy AMD performance, wait again until AMD catches up, and repeat. I'm not an AMD fanboy, but I hate Nvidia with everything I have because they did and do everything I said. Google a little and you'll be enlightened by what Nvidia has done in the past 20 years.
@BakiWho 6 years ago
You sound like the Hardy Boys in South Park :) I have a raging clue.
@Julia-hk9jp 6 years ago
To sum it up, this video is just an Nvidia commercial...
@Gollywog 6 years ago
You talk too fast. I love the info, but it needs my full attention (it's not something I can listen to in the background) because you say everything so fast.
@735Secure 6 years ago
StiX, it's because he's just reading the stuff. If you have a technical background and are a scientist or an engineer, you don't just put on a show. He is all about the show. Decent information, but I don't trust the information he provides.
@curiosity1865 6 years ago
Turn down the speed of the video.
@originproductions6120 5 years ago
@@polishpepe239 Stop trying to sound smart. I'm sure you wouldn't have a problem with 10x speed as well, right, because you're such an intellectual? It's annoying because my full attention has to be on the video, and if I'm playing Halo this guy talks too fast for that. Stop trying to show off and be honest with yourself. Can you understand him at 2x speed? Probably, but it's missing the whole point of the video. That point is to absorb the information he's giving you and think about it, and you can't do all of that at 2x speed even if you're Albert fucking Einstein.
@originproductions6120 5 years ago
@@polishpepe239 Also, I just watched it at 2x speed, and now I know you're just bullshitting if 2x isn't even an ideal speed. Stfu, no one cares about you trying to sound smart.
@cameronh3260 6 years ago
But can it run Minecraft?
@rpzcsonli 5 years ago
With 400 mods yeah, but 600 mods... I don't think so.
@strangevideos3048 5 years ago
We live in the Matrix!
@vladimirtchuiev2218 6 years ago
And now people are starting to use GPUs for cryptocurrency mining, driving GPU prices up...
@perspgold8945 2 years ago
Not sure if it was the speaking speed or the content, but this video was disjointed.
@dr.zoidberg8666 6 years ago
We're creeping closer & closer every day to machines with the processing power & storage capacity to simulate human minds. Once that's achieved, all we need to do is figure out how to transfer someone over without breaking their stream of consciousness in the process, & we'll have a reliable path forward to radical life extension.
@raunak1147 6 years ago
Dr. Zoidberg: by 2021, or before that if something revolutionary like graphene/3D processors happens.
@rpzcsonli 5 years ago
@@raunak1147 You are dreaming too big, just like the people in 1990 who thought we would have flying cars by now... maybe another 30 years until then.
@zalanta7 5 years ago
this video is 4k
@ITSotechAI 10 days ago
❤❤❤ hi all very
@Goldnr 6 years ago
Nvidia's CUDA - showing an AMD card...
@tomislavnikolic5778 5 years ago
Holy shit
@albertgerard4639 6 years ago
Moore's law never took bitcoin into account... ouch.
@projectjt3149 2 months ago
Even after all the recent success of generative #AI and #NVIDIA, no one seems to be watching this video!