OUTLINE:
0:00 - Intro
5:10 - What does it mean to make hardware for AI?
8:20 - Why were GPUs so successful?
16:25 - What is "dark silicon"?
20:00 - Beyond GPUs: How can we get even faster AI compute?
28:00 - A look at today's accelerator landscape
30:00 - Systolic Arrays and VLIW
35:30 - Reconfigurable dataflow hardware
40:50 - The failure of Wave Computing
42:30 - What is near-memory compute?
46:50 - Optical and Neuromorphic Computing
49:50 - Hardware as enabler and limiter
55:20 - Everything old is new again
1:00:00 - Where to go to dive deeper?

Read the full blog series here:
Part I: medium.com/@adi.fu7/ai-accelerators-part-i-intro-822c2cdb4ca4
Part II: medium.com/@adi.fu7/ai-accelerators-part-ii-transistors-and-pizza-or-why-do-we-need-accelerators-75738642fdaa
Part III: medium.com/@adi.fu7/ai-accelerators-part-iii-architectural-foundations-3f1f73d61f1f
Part IV: medium.com/@adi.fu7/ai-accelerators-part-iv-the-very-rich-landscape-17481be80917
Part V: medium.com/@adi.fu7/ai-accelerators-part-v-final-thoughts-94eae9dbfafb
@Stopinvadingmyhardware2 жыл бұрын
You’re ignorant
@cedricvillani85022 жыл бұрын
Somewhere on an alternate timeline in an alternate universe is a butterfly watching Chaos TV. It's watching a comedy special about a weird alien species that does everything it can to destroy itself, first by exploding the many so a few can have better, more interesting lives, with the ability to take back choices because they're not held accountable, nor do they have to deal with the annoying feelings of guilt and empathy, lulz. Then when things are going really badly, they toss in a few more stones, and the unaware many are trying their hardest to find a way to replace themselves as fast as possible and call it progress 😂 The first season is over, and the butterfly realizes that it has just wasted away an entire day, starts to vibrate its wings AND THEN EXPLODES. 😢😮 The End…
@ramvirkumae9752 жыл бұрын
Ha
@Stwinky2 жыл бұрын
Banger video fellas. One time I told my mom via text that I purchased a GPU, and when I called her later she kept trying to pronounce "GPU", but not as an acronym. Her best attempt was "guppy-ooh"
@NoNameAtAll22 жыл бұрын
g'poo/g'pew?
@BoominGame11 ай бұрын
Well nowadays you can have the AI squeal with diesel.
@johnshaff Жыл бұрын
Yannic, thanks for this guest. Please continue identifying the core and leading-edge components of technology and finding guests to explain them. Much better than channels that focus on the surface-level things everyone else is talking about.
@vzxvzvcxasd71092 жыл бұрын
In truth, my profession has nothing to do with computers, but I learnt everything I know about ML from the sheer number of videos I've watched on this channel, to the point that I understand most of the videos that come out now. I started from "Attention Is All You Need". I like it whenever you draw annotations on flow charts, because it makes it so much easier to follow what a paper is trying to do. For your interviews with paper authors, I think it would be more insightful if you explained the paper first, and the interviewee got to see your explainer before being interviewed, almost like the peer review process. They would then be able to say whether they agree with your interpretation, or expand on things they felt had potential. This video was really nice; I got to understand the bigger picture of how the whole system fits together.
@sebastianreyes80252 жыл бұрын
What's your profession? How did you gain interest in this? Is there a connection between what you do and these topics?
@CyborgHGWF Жыл бұрын
11:22 I am very glad you guys got the history right, well done. I really appreciate hearing from someone who, like me, lived through those phases of technology. He is right!
@Coolguydudeness12342 жыл бұрын
This is awesome, thanks!
@coffeewmike26 күн бұрын
6:42 this is so true. I have been working on a system and I have found that this is everything.
@billykotsos46422 жыл бұрын
Oh come on! I'VE GOT so much stuff on my plate!! Oh dear! But I will watch it! For sure!
@parsabsh Жыл бұрын
Such a great talk! I think it's an amazing and helpful introduction to AI acceleration for anyone who is interested in the topic (as it was for me). Thanks for sharing your information!
@karolkornik2 жыл бұрын
Yannic! You are nailing it! I love it 😍
@catalinavram31872 жыл бұрын
This is such a great interview!
@TheEbbemonster2 жыл бұрын
Great video! I will read the blog for sure, this guy is a good and clear communicator ❤️
@100SmokingElephants2 жыл бұрын
Thank you for taking up this topic.
@asnaeb22 жыл бұрын
My experience with any ML accelerator other than GPUs is that my code won't run, because my model isn't 6 years old and the hardware doesn't support the new functions.
@flightrisk75662 жыл бұрын
do u mind if I ask which models at all? TPUs always seem to work for me no matter what kinds of workloads I’ve come up with, and I want to use them for a Kaggle competition, but I feel as though if there are going to be compatibility issues I should investigate whether they impact my use-case and make sure I strategize around that ahead of time
@asnaeb22 жыл бұрын
@@flightrisk7566 What does? Detectron2, Clova's OCR model and many many more. Nothing new has ever worked for me.
@chanpreetsingh0072 жыл бұрын
So true.
@ravindradas91352 жыл бұрын
Utasvkumar shigh rajàpu
@ravindradas91352 жыл бұрын
Utsav Kumar Singh Rajput
@silberlinie2 жыл бұрын
38:20; very good. You can compare this very nicely to what Stephen Wolfram is doing with his whole Mathematica project: he's shifting the focus away from the traditional teaching of mathematics through individual computational tasks and toward a functional description of the mathematical problem at hand.
@nicohambauer2 жыл бұрын
I love your content, but to this day I am still confused/wondering: why do you have a green screen when you generally don't color-key/remove it?
@siegfriedkettlitz65292 жыл бұрын
Because he does whatever pleases him.
@YannicKilcher2 жыл бұрын
I'm hipster like that 😅
@nicohambauer2 жыл бұрын
@@YannicKilcher 😁😁🙌🏼
@b0nce2 жыл бұрын
Maybe he actually has a blue screen, which is keyed to green ¯\_(ツ)_/¯
@spaghettihair2 жыл бұрын
51:57 check out Graphcore. They've made the bet that graph-nns are the future and are developing hardware to support them.
@jasdeepsingh97742 жыл бұрын
Thanks for the video, love the content. I would appreciate more future videos explaining content like this.
@alan2here2 жыл бұрын
perceptron, recurrence, and memory cells trained on temporal information is all you need
@CyborgHGWF Жыл бұрын
11:39 Pentium 4 wasn't out until the 2000s. You are right. I still have one.
@GeoffLadwig Жыл бұрын
Loved this. Thanks
@djfl58mdlwqlf2 жыл бұрын
As always, thanks for ur vid
@BlackHermit9 ай бұрын
He's such a cool guy, and he works at Speedata!
2 жыл бұрын
Amazing video. Thank you very much
@EmilMikulic2 жыл бұрын
Reciprocal sqrt is useful for normalizing vectors, because e.g. three multiplies (x,y,z) are much faster than three divides. :)
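As a minimal sketch of that point in plain C (the Vec3 type and function name here are made up for illustration): compute the reciprocal square root of the squared length once, then use three multiplies instead of three divides.

```c
#include <math.h>

/* Illustrative only: normalize a 3D vector with one reciprocal square root
 * and three multiplies, instead of one sqrt and three divides. */
typedef struct { float x, y, z; } Vec3;

Vec3 normalize(Vec3 v) {
    float len_sq  = v.x * v.x + v.y * v.y + v.z * v.z;
    float inv_len = 1.0f / sqrtf(len_sq);   /* the reciprocal sqrt the comment refers to */
    Vec3 out = { v.x * inv_len, v.y * inv_len, v.z * inv_len };
    return out;
}
```

GPUs and SIMD units typically expose a fast approximate rsqrt instruction, so in practice the `1.0f / sqrtf(len_sq)` line is the part that gets replaced.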
@silberlinie2 жыл бұрын
37:00; for example, the Netherlands universities are leading in photonic computing, especially the university of Ghent.
@lucasbeyer29852 жыл бұрын
Duuuude, Ghent is not in the Netherlands, but in the better version of the Netherlands.
@silberlinie2 жыл бұрын
@@lucasbeyer2985 I am soo sorry. How can I say such a thing? We are in Flanders, Belgium, of course.
@fabianaltendorfer11 Жыл бұрын
Very interesting! One question: are the architectures of AI accelerators like TPUs etc. still based on GPUs?
@souljaaking942 жыл бұрын
Thanks a lot man!
@sucim10 ай бұрын
Nice, he already talked about Groq 2 years ago!
@alexandrsoldiernetizen1622 жыл бұрын
49:00 Neuromorphic computing isn't 'similar in theory' to the spiking neural net model; it is in fact based on, and relies on, the spiking neural net model (a minimal sketch of that model follows below).
@judgeomega2 жыл бұрын
So it's NOT similar in theory, it's REALLY just similar in theory. Thanks for clearing that up
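For anyone unfamiliar with the spiking model being discussed, here is a minimal leaky integrate-and-fire neuron sketch in plain C. It is only illustrative; all the constants (leak factor, threshold, inputs) are made-up values.

```c
#include <stdio.h>

/* Minimal sketch of a leaky integrate-and-fire neuron, the kind of spiking
 * unit neuromorphic designs are built around. Constants are illustrative. */
int main(void) {
    float v = 0.0f;             /* membrane potential */
    const float leak = 0.9f;    /* decay factor per timestep */
    const float threshold = 1.0f;
    float input[10] = {0.3f, 0.3f, 0.3f, 0.0f, 0.5f, 0.5f, 0.5f, 0.0f, 0.0f, 0.0f};

    for (int t = 0; t < 10; t++) {
        v = v * leak + input[t];   /* leak, then integrate incoming current */
        if (v >= threshold) {      /* fire a spike and reset */
            printf("t=%d: spike\n", t);
            v = 0.0f;
        }
    }
    return 0;
}
```

The point of the comment above stands: neuromorphic hardware implements dynamics like these directly, with information carried by discrete spike events over time rather than by dense matrix multiplies.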
@alan2here2 жыл бұрын
What are the most ghosty instructions? Let's think up uses :) Q: So how long does all that take? A: 1 clock cycle
@Clancydaenlightened Жыл бұрын
You could make a chip that processes data like a GPU but has a more complex instruction set and opcodes like a CPU, and add pipelining.
@Jacob0112 жыл бұрын
very very useful content
@NoNameAtAll22 жыл бұрын
are you playing lichess in the background or smth?
@TheReferrer722 жыл бұрын
It seems to me that we are in the vacuum-tube stage of machine learning. My bet is that we'll end up using an analog machine, probably light-based, for machine learning.
@Boersenwunder- Жыл бұрын
Which stocks are benefiting? (except Nvidia)
@cmnhl13292 жыл бұрын
Brainchip Akida: “Am I dead to you?”
@varunsai97362 жыл бұрын
Nice one
@alexandrsoldiernetizen1622 жыл бұрын
Increasing core count means eventually bumping up against Amdahl's Law and data locality.
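A rough illustration of that ceiling, assuming (purely for the example) that 95% of the workload parallelizes: Amdahl's Law gives speedup = 1 / ((1 - p) + p/n), which saturates at 1 / (1 - p) no matter how many cores you add.

```c
#include <stdio.h>

/* Minimal sketch of Amdahl's Law: speedup = 1 / ((1 - p) + p / n), where p is
 * the parallelizable fraction of the work and n is the core count.
 * p = 0.95 is an assumed example, not a measured number. */
int main(void) {
    const double p = 0.95;
    const int cores[] = {1, 8, 64, 1024};

    for (int i = 0; i < 4; i++) {
        double speedup = 1.0 / ((1.0 - p) + p / cores[i]);
        printf("%4d cores -> %.1fx speedup\n", cores[i], speedup);
    }
    /* Even with unlimited cores the speedup is capped at 1 / (1 - p) = 20x here. */
    return 0;
}
```

Data locality adds a second wall on top of this: past a certain core count, cores spend their time waiting on memory rather than computing.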
@anotherplatypus Жыл бұрын
Where'd you find this genius? I kinda just wanna hear you bring up a topic, and then nudge him back on course when he absent-mindedly starts rambling... about anything... If you have any YouTuber friends of your channel, I'd love to hear him keep going on and on if he had a wrangler to keep him on topic...
@sandeepreddy85672 жыл бұрын
Hi Yannic, you could get Lightmatter CEO Nick Harris on your channel. He is a cool & intelligent guy.
@hengzhou4566 Жыл бұрын
DPU?
@BoominGame11 ай бұрын
Data Processing Unit, don't be a swine.
@randywelt82102 жыл бұрын
Man, I miss the explanation from the FPGA point of view: flip-flops for synchronization, yes or no, etc.
@martinlindsey96932 жыл бұрын
Take your sunglasses off.
@MASTER-qc3ei2 жыл бұрын
WOW
@tarcielamandac80922 жыл бұрын
Hello, good morning from the Philippines
@liweipeng91062 жыл бұрын
Self-learner here. But generally speaking, Asian students are more confused by those abstract concepts than their Western friends.
@lucyfrye53652 жыл бұрын
You use the reciprocal square root for normalizing vectors (basically dividing by the Pythagorean length). It is used so much in graphics that the makers of Quake 3 made a famous trick to speed it up a bit (sketched below). It's outdated now, but it's really not exotic in game development at all.
@Wobbothe3rd2 жыл бұрын
I love Quake 3 with all my heart, but the fast inverse square root algorithm is much, much older than Q3.
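For reference, here is a sketch of that trick, reconstructed from memory rather than copied from the Quake source: it uses memcpy instead of the original pointer cast to stay well-defined C, and the magic constant is the well-known 0x5f3759df.

```c
#include <stdint.h>
#include <string.h>

/* Fast inverse square root, as popularized by Quake III (the idea predates
 * the game). Sketch only: bit-level initial guess plus one Newton-Raphson step. */
float fast_rsqrt(float x) {
    float half = 0.5f * x;
    uint32_t bits;
    memcpy(&bits, &x, sizeof bits);     /* reinterpret the float's bit pattern */
    bits = 0x5f3759df - (bits >> 1);    /* magic-constant initial guess */
    float y;
    memcpy(&y, &bits, sizeof y);
    return y * (1.5f - half * y * y);   /* one Newton-Raphson refinement */
}
```

After the single refinement step the error is a fraction of a percent, which was good enough for lighting and normal calculations at the time; modern hardware rsqrt instructions have largely made it obsolete.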
@喵星人-f4m2 жыл бұрын
He is wrong on in-memory computing and near-memory computing...
@tonysu88602 жыл бұрын
No! AI accelerator nodes have nothing to do with running an AI application! AI accelerator nodes are used when creating the NN algorithm. Creating an NN algorithm from enormous amounts of data requires enormous amounts of computing power. When the Leela chess-playing AI (an open-source take on AlphaZero) was created, it took about a year of machine learning running 24/7/365 to produce an algorithm capable of challenging the existing world champion program, Stockfish, and actually beating it (Stockfish has since won back the world title and kept it through several matches).

But again, once the NN algorithm has been created, the program can run very well on relatively weak hardware. As an example, every high-end phone today runs AI programs in at least 2 different ways: one for voice recognition and the other for video enhancement and manipulation. Longtime users of voice recognition will remember that in the past, voice recognition programs had to be trained to recognize your voice. You would have to read prepared text into the computer, which would later be used to try to match those same snippets of tones for recognition. Today, because of machine learning, enormous amounts of voice data covering different words, dialects, and intonations are baked into an algorithm that can easily and quickly recognize your voice and others' accurately, without training.

So no, let's get this straight: enormous amounts of computing power and special machines like AI accelerators are not needed to run AI programs like the ones described in this video. All that heavy work has been done ahead of time on special computing devices, including AI accelerators, so that you can run the AI program on common and even very weak devices, like off-the-shelf mobile phones or your PC, and do wondrous things.
@wafflescripter90512 жыл бұрын
I hate hardware
@silberlinie2 жыл бұрын
An excellent man. I can judge that. However, he is out of place with Yannic, because Yannic has little insight into these things. That is bad, because he does not know how and where to start. And you can also see that he is not particularly interested, because his world is several floors above these elementary hardware levels. Yes, OK, he makes some effort, at least...