How Machine Learning Changed Computer Architecture Design (David Patterson) | AI Clips with Lex

23,065 views

Lex Clips


4 years ago

Full episode with David Patterson (Jun 2020): • David Patterson: Compu...
Clips channel (Lex Clips): / lexclips
Main channel (Lex Fridman): / lexfridman
(more links below)
Podcast full episodes playlist:
• Lex Fridman Podcast
Podcasts clips playlist:
• Lex Fridman Podcast Clips
Podcast website:
lexfridman.com/ai
Podcast on Apple Podcasts (iTunes):
apple.co/2lwqZIr
Podcast on Spotify:
spoti.fi/2nEwCF8
Podcast RSS:
lexfridman.com/category/ai/feed/
David Patterson is a Turing award winner and professor of computer science at Berkeley. He is known for pioneering contributions to RISC processor architecture used by 99% of new chips today and for co-creating RAID storage. The impact that these two lines of research and development have had on our world is immeasurable. He is also one of the great educators of computer science in the world. His book with John Hennessy "Computer Architecture: A Quantitative Approach" is how I first learned about and was humbled by the inner workings of machines at the lowest level.
Subscribe to this YouTube channel or connect on:
- Twitter: / lexfridman
- LinkedIn: / lexfridman
- Facebook: / lexfridman
- Instagram: / lexfridman
- Medium: / lexfridman
- Support on Patreon: / lexfridman

Comments: 22
@godspeed133 4 years ago
Seems like ML will only advance us so far. Ultimately we need a new architecture or a semiconductor tech breakthrough to revive Moore's law in some form. Otherwise we have plateaued, and that, while not the end of the world, is quite disappointing.
@kaseyboles30 4 years ago
There are several promising techs on the horizon. Gallium nitride, for example. It won't go much, if any, smaller than the current limit; it will, however, run at much higher frequencies with lower power. The trick is getting good crystals out of it. The current defect rate per mm² is still around 100 times worse than current silicon, but once they get that down enough and refine the techniques to near current process nodes it could become a viable replacement. It's already seen some use in lower-complexity devices due to its superior resilience to temperature. There has also been some progress in photonic computing, and other options involving graphene are being worked on. And quantum computing is of course being invested in. Though note quantum computing is more a complement than a replacement, as there are things conventional binary computing can do much better than quantum computing. Kinda like how GPUs and CPUs have different domains they excel at.
@maximilliansbabo2099 4 years ago
Go check out the company Groq... because you are exactly right
@kaseyboles30 4 years ago
@@maximilliansbabo2099 Unfortunately the site tries to trick you into loading a "flash update". Big red flag.
@povelvieregg165 4 years ago
Doesn't seem like that, though. As David Patterson says, these are actually interesting times we live in. We will simply start creating a lot more specialized hardware. It makes total sense to me. Today it's not just that hardware isn't improving as rapidly, but also that you don't need that much more power. I have been quite happy with the performance of my computers for at least a decade now. Almost everything you do today is fast enough except in really specialized areas. There are things like compiling large programs, processing video, rendering, realistic computer games, voice recognition and some other specialized areas where we really need or benefit from higher performance. But if you look at drawing applications, editors, 2D games, calendars, spreadsheets, presentation software and lots of other everyday stuff, we already have plenty of performance.
@blocksrey 2 years ago
Code is flawed; we already have the hardware. Once AI starts optimizing programs, that's when we'll see more results.
@ethiesm1 3 years ago
Machines that write our programs-- YES!
@ab8jeh 3 years ago
Seems like they should be talking about GPUs towards the end. Not sure why it didn't come up.
@d3ly51d 3 years ago
Why not simply standardize FPGAs and have the OS allocate areas of your chip to applications? Then you could have software that comes with its own hardware accelerators. Imagine an mp3 library coming with its own DSP chip design, or a crypto library with its own specialized hardware. And if you run out of FPGA floor space, your OS can simulate the rest of it on the CPU, kinda like the trade-off between physical RAM and a swap file. Sounds like a good idea, but it's probably a massive standardization effort across the entire industry, and the economics of it all probably don't make sense at the moment. My bet is that in the future we'll see more reconfigurable hardware integrated with the software. Then you really have a thing that you can go ahead and optimize, and it will benefit everyone.
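The allocate-or-emulate idea in the comment above can be sketched as a toy scheduler (purely hypothetical names and units; a real OS would deal with placement, routing, and bitstream loading, none of which is modeled here):

```python
class FPGAScheduler:
    """Toy model of an OS handing out FPGA floor space to applications,
    falling back to CPU emulation when space runs out -- analogous to
    the physical-RAM vs. swap-file trade-off mentioned above."""

    def __init__(self, total_area):
        self.total_area = total_area   # abstract "floor space" units
        self.used = 0
        self.placements = {}           # app name -> "fpga" or "cpu"

    def load_accelerator(self, app, area_needed):
        # Place on real FPGA fabric if it fits, otherwise emulate on the
        # CPU: slower, but functionally identical to the application.
        if self.used + area_needed <= self.total_area:
            self.used += area_needed
            self.placements[app] = "fpga"
        else:
            self.placements[app] = "cpu"
        return self.placements[app]


sched = FPGAScheduler(total_area=100)
sched.load_accelerator("mp3_dsp", 40)      # fits -> real fabric
sched.load_accelerator("crypto", 50)       # fits -> real fabric
sched.load_accelerator("video_codec", 30)  # out of floor space -> CPU emulation
```

The interesting design question, as the comment notes, is not the allocation policy (which is trivial) but standardizing the accelerator format so any vendor's "chip design" can be placed on any FPGA.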
@minhajsixbyte 2 years ago
@Robert w and the rest of it was bought by AMD
@W2wxftcxxtcrw 1 year ago
Idk, sounds non-trivial to implement. I say you develop the first prototype 😅
@Gooberpatrol66 3 months ago
I've wondered this as well
@GBlunted 9 months ago
TPUs before they were a household term! Very telling little chat you clipped here...😮🤔😊
@thisguy9279 4 years ago
TensorFlow and PyTorch aren't "languages"!!! They are frameworks or libraries.
@machinephile 4 years ago
Actually, what I think is that they are languages in the sense that they build abstractions and interfaces on machine learning methodologies.
@thisguy9279 4 years ago
@@machinephile That's what most libraries or frameworks do. They build abstractions and interfaces over complex concepts. By that definition MoviePy would be a language because it builds abstractions and interfaces on top of ffmpeg. Even Bootstrap would be one. By that definition, most of the classes ever written would be a language, because each implements something so it can be used more easily. If you don't believe me, just google the word TensorFlow. Google will say the same.
@machinephile 4 years ago
@@thisguy9279 I understand what you are trying to point out, but I would like you to think about what a programming language really is. Well, as we know, it consists of a set of rules by which you instruct the CPU and memory to perform operations that compute an algorithm. I know it's quite silly to call libraries "languages", and while they certainly are not programming languages, it is quite intriguing to think of them as semi-languages, since they implement their own set of rules.
@manhalrahman5785 3 years ago
fight
@borazan 1 year ago
Thank you for telling that to an AI researcher and a computer architecture pioneer with a Turing Award; bet they didn't know what "tenserflow" was called...
@jcb1orion 4 years ago
why does this guy swallow cud every 10 seconds?