Artificial Intelligence Isn't Ready for Mass Application || Peter Zeihan

207,046 views

Zeihan on Geopolitics

1 day ago

Comments: 1,000
@fritzbloedow29 1 month ago
The server farms aren't using consumer-grade GPUs; they're using A100s, H100s, etc. Those are made specifically for data crunching, they work very efficiently for AI, and would almost never be in a regular person's computer. We're still in the early days of AI development, with so many unanswered questions about it. Much of the time I think PZ gets things right; this time he is missing too many points on this complicated subject. Peter, please don't take this as an attack. I still get a lot from your videos; keep up the good work.
@jf9593 1 month ago
lmao "experts" amirite
@urlauburlaub2222 1 month ago
While this is true, more computing power doesn't improve AI results. Bear in mind that most of the data it uses for "better results" is also legally grey, if not outright stolen. So it's sneaky, but even if you have a computer hiding and spotting you everywhere, the AI doesn't know what's right and wrong. So technical advances are more about limiting power waste than actually improving things. GPU chips for computer graphics were not only pushed because of gaming in the past; gaming was pushed to have them focused on traditional human actions within the military or in civil security like aircraft, shipping, etc. Cheaper chips were also pushed 20 years ago for education purposes in Africa or Asia, where AI was also used to teach basic things, to avoid pollution and to fight illiteracy. Laws will soon be made more drastic, if there is a surviving population and if that population understands more of the legality and legitimacy of the data used when they encounter it. Having an old-world population, I don't see those elderly adapting for the sake of selling chips everywhere and selling AI. Also, the individualism and personal achievements of artists and designers of all kinds are undercut or destroyed if everyone gets the same thing, even if partly individualized through AI. This is totally against the whole philosophy of the Western/European-centered world, which made this technology possible and constructed it.
@djphat1736 1 month ago
These systems use power supplies in the 3+ kW range, and several of them, for redundancy and, well, for the need for power. To say nothing of the heat they produce, which makes cooling get out of control. They are not efficient, or even price-effective, let alone quickly available without waiting a while. Efficient would be the ARM chips, specifically Apple M chips, if they could scale them up quickly enough or make a discrete AI/GPU card. They could eat Nvidia's breakfast, lunch, and dinner. That is the revolution we need in this space.
@thomastaylor648 1 month ago
Garbage in, garbage out. Not a new concept, but folks don't understand it, so this is the shit we get: fake experts... The reality is that we have to provide nearly perfect data to train these models to get acceptable performance from them. This is always going to be a problem. It really doesn't have anything to do with chips, or supply chains, or energy; fixing those issues doesn't change that. That taste of AI you speak of is our cherry-picking, and those are typically academic examples based on hand-picked, highly processed training sets.
@TakeshiYoung 1 month ago
Yeah, Peter misses the mark here. Training new AI models uses astronomical resources, but once they are trained they are a lot less resource-intensive. And LLMs do far more than data processing.
@williamsteveling8321 1 month ago
A few things that are off...
First, as noted elsewhere, laptop GPUs are made for laptops. The desktop market uses more robust devices, and commercial applications are similarly beefier.
Second, newer parallel processors (of which GPUs are a subset) are coming out. Power is still an issue, but processing per watt is better.
Third, they're not using targeted randomness. They're applying logic chains to input using weighted values at each stage, and they're pretty good at following instructions.
Fourth, there are programmable analog processors being worked on that might radically improve things (by a factor of 10^6 possible, 10^3 likely), and they're likely to be available a lot sooner than 2040. When setting weights for processing, they'd be a game changer, depending on plausible applications.
Likely, Mr. Zeihan is correct that it's further out for practical purposes than some think, but it's also closer than he thinks. The truth is likely somewhere in the middle. My job is very AI-adjacent, so I see some things up close. Long-term, a very big worry; short-term, a low-to-mid-level worry.
NOTE: All of this ignores black swan developments, which by definition are unknown unknowns.
@thailandmalcolm 1 month ago
Very informative. Thank you.
@smcg2490 1 month ago
I agree. I enjoy watching Peter for his broad-brush analysis, but I wouldn’t stake too much on his predictions due to his lack of specialised expertise. His take on the AI space exemplifies this. AI isn’t just about chips or raw power; he overlooked the groundbreaking work being done in information processing that directly improves AI’s accuracy, and thus its usefulness. For instance, innovations like Liquid AI’s architecture and the steady stream of research papers on more efficient computing are reshaping the field on an almost daily basis. I believe AI has been in its early developmental stages, and only now are we approaching its true emergence. When AI can reliably evaluate itself and autonomously refine its own code, we’ll witness its full arrival. Personally, I think the most important and interesting area that true AI capabilities will have an impact on is human longevity. When we can effectively research ageing as a disease and get good treatments, what does that mean for the future?
@chaselevinson7950 1 month ago
As I understand it, LLMs still have a level of stochastic variability. The “temperature” of a GPT model determines whether the next word in an answer is chosen entirely deterministically (always taking the highest-probability entry) or with some level of randomness (sampling from the list). That’s why you’ll see the same prompt give different answers from the same LLM.
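The temperature mechanism this comment describes can be sketched in a few lines of Python. This is an illustrative sketch, not any vendor's actual API; the function name and logit values are made up for the example:

```python
import math
import random

def sample_next_token(logits, temperature=1.0, rng=random):
    """Pick the next token index from raw model scores (logits).

    temperature ~ 0: always take the highest-scoring token (deterministic).
    temperature = 1: sample from the model's own softmax distribution.
    """
    if temperature <= 1e-6:  # treat ~0 as greedy decoding
        return max(range(len(logits)), key=lambda i: logits[i])
    # Scale logits by temperature, then softmax into probabilities.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one token according to those probabilities.
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r <= acc:
            return i
    return len(probs) - 1

logits = [2.0, 1.0, 0.1]
print(sample_next_token(logits, temperature=0.0))  # always 0 (greedy)
```

At temperature 0 the same prompt always yields the same continuation; at temperature 1 repeated calls can return any of the tokens, weighted by probability, which is the run-to-run variability the comment points at.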
@TheSacredOrderOfKnightlyValor 1 month ago
This guy needs to get out of the house more often.
@Rob_F8F 1 month ago
😂😂😂
1 month ago
You mean a little less often. 😂 He needs more data. 😅
@macbaryum 1 month ago
It really sucks to be chained to a desk all the time.
@bonanzatime 1 month ago
Actually, he spends most of his time 'Getting Out Of Town' before it's too late. 😂 Selling snake oil is not as easy as you might think, especially with Trump in office. 🤫
@davidsingh6944 1 month ago
😂😂😂
@alexbuccheri5635 1 month ago
As a physicist turned scientific software engineer, I'd say Peter's largely on the money with these points, although his timescales for new hardware are a little pessimistic. Couple of amendments: If one increases an algorithm's performance by >= 10x when porting from CPU to GPU, GPUs can actually be more energy-efficient. At least this is the figure we throw around in computational physics. Large, modern HPCs aren't cooled with fans, they use liquid cooling. Still, (if I recall correctly) the heat given off by Finland's LUMI machine actually powers the neighboring town. Cooling these things is no joke.
@dixonhill1108 28 days ago
Personally I think supply chains are gonna break down a lot faster than you think. Covid prevented a lot of supply chain interruptions, while the majority of people think it created the interruptions. It's just my opinion, but I think if you look at IQ scores things make a lot more sense quickly.
@josephtaylor6285 26 days ago
@@alexbuccheri5635 Wow! Powering a whole town with the heat of the servers is a mind blower.
@BenGrimm977 1 month ago
Listening to Peter talk about AI makes it clear that not every topic needs his take. He’s great with geopolitics, but AI clearly isn’t his lane. Being an expert in one area doesn’t mean you can speak with authority on everything, and sometimes it’s better to just stick to what you know.
@derekhall4017 1 month ago
Thought you were going to say something.
@maartenkranendonk8954 1 month ago
@@derekhall4017 Be quiet little boy.
@zee9709 1 month ago
What does he know? Everything is surface-level; we're just here to hear his opinion, true or not.
@gladius1275 1 month ago
Pointless comment since you make a statement with no elaboration or supporting data to refute.
@maartenkranendonk8954 1 month ago
@@gladius1275 Your comment is pointless. I just looked it up on ChatGPT
@mav45678 1 month ago
The GPU is the size of a postage stamp not because "it was designed to be put into a laptop" (there is a big market for desktop GPUs, which have fewer space limitations), but because the speed of light limits intra-chip communication. If you want parts of the chip to communicate with each other a couple billion times per second, they can only be so far away from each other for the electrical impulse (travelling at the speed of light) to reach them within the allotted time.
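A back-of-the-envelope check of this comment's point; the 3 GHz figure is just an illustrative clock speed, and this uses the vacuum speed of light as an upper bound:

```python
# How far does a signal get in one clock tick?
c = 299_792_458          # speed of light in a vacuum, m/s
clock_hz = 3e9           # illustrative 3 GHz clock
per_tick_mm = c / clock_hz * 1000
print(f"~{per_tick_mm:.0f} mm per tick in a vacuum")
# Signals in on-chip wiring propagate well below c, and a signal may need
# to cross the die and back within a fraction of a cycle -- which pushes
# the usable die dimensions down further still.
```

Roughly 10 cm per tick in a vacuum, before accounting for slower on-chip propagation, is why high-clock dies stay small.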
@24zer0nd 1 month ago
A big reason Nvidia chips don't have as much RAM as gamers want is that they're sticking with low memory bandwidth so the chips can be used in laptops. They're also waiting for the new memory that's around the corner to fit higher amounts of RAM on that same smaller memory bus.
@oregonhighway 1 month ago
I was a little surprised by his prediction of chips the size of a dinner plate. I don't think that's possible; it sounds like you agree.
@TeodorSpiridon 1 month ago
@@oregonhighway You could if you go with AMD's approach of multiple chiplets on a larger substrate. Monolithic chips of that size are impractical due to the yields being absurdly low as the size grows.
@alex_madeira 1 month ago
@@oregonhighway Google Wafer-Scale Processors
@alex_madeira 1 month ago
@@TeodorSpiridon Lookup Cerebras
@H.G.Wells-ishWells-ish 1 month ago
One thing I've found interesting has been the shift in the drivers of technological innovation. From the 1950s through the 1990s, the military was the main driver of innovation. But in recent years, the gaming and entertainment industries have closed the gap, particularly in high tech. Even small drones seem to have been present for entertainment purposes before they were viewed as an applicable asset on the battlefield.
@lord0Fwar93 1 month ago
Mankind can advance without war, but some of us still choose war.
@AtheismScientism 1 month ago
False. Drone tech dates back to WW2 and the US military was using small drones in the ‘90s. Civilian innovation has been the reason for certain applications, but small drone tech wasn’t a recent innovation of the video game industry…
@thedj3319 1 month ago
Yes and no. Military innovation makes it happen, but market innovation makes it affordable. Drones are a perfect example of this. The US has been using drones for decades now; that's what Predator drones are, and they are far superior to market drones: staying longer in the air, deadlier ammo, better cameras, etc. But market drones will do in a pinch if you don't have the option of widespread Predators. They are cheap, affordable, and easy to manufacture.
@H.G.Wells-ishWells-ish 1 month ago
@@AtheismScientism 1) I never said small drone warfare was a component of the video game industry. They were a part of leisure entertainment industry. 2) Nor did I ever say the military didn't have some type of drone component. I said it SEEMS like small drone warfare didn't pick up until after civilian drones had become a significant part of the leisure industry. It was an observation, not a definitive 'true or false' assertion.
@RuffinItAB 1 month ago
The cold war was a serious driver of government R&D.
@tankomanful 1 month ago
😅 It's becoming very clear that Peter always has an opinion, even when he lacks the insight. Instead of explaining how AI can transform the workforce landscape, he jumps into his favorite topic of chip scarcity. Anyone watching this should take this commentary with a grain of salt.
@cushmfg 1 month ago
But the thing is, computing happens on chips. There is only one TSMC.
@Stevenpwalsh 1 month ago
@@cushmfg China just trained a SOTA model for $5M, chips might not be everything.
@cushmfg 1 month ago
@Stevenpwalsh 1. China. 2. SOTA is a made-up, arbitrary acronym with no basis in reality. 3. So what? There are a lot of existing useless models. It still needs to run on chips, and still probably doesn't do anything more useful than any of the existing models.
@bonanzatime 1 month ago
@@tankomanful a grain of salt. And A Roll Of Toilet Paper!
@segalliongaming8925 1 month ago
@Stevenpwalsh What’s SOTA? The equivalent of ChatGPT3?
@CasualRelief 1 month ago
I think you might be wrong on this one. The LLM companies have only just run up against that power/compute wall; they haven't even tried to optimize the algorithms much yet. I would bet that we'll see a crazy improvement in model efficiency in the next 2 years. The physical limitations of power and chips will slow things down for sure, but programmatic improvements to the algorithms, with aid from the models themselves using long-runtime compute, will lead to more efficient models. We will also see a shift away from the "general" model, and we'll see many very specialized models that are super efficient for their task.
@diviningrod2671 1 month ago
More importantly why couldn't AI tell me how many blueberries in a pie?? Is AI a nothing burger? Iirc it's 475 berries
@hi117117 1 month ago
The thing is that there really isn't a path forward for optimizing the software side of this. For many, many years now, even before large language models, the solution has always been to just throw more data at a bigger model, with significant advances on the software side coming only once every 5 years, if that. The only ones I can really think of are things like recurrent neural networks, convolutional neural networks, the stuff behind large language models, and switching the activation function to sigmoids. This represents something like 20 years of advancement in the field of AI. The point is basically that while it's not impossible to optimize the software side, it is much harder, and if we're at that point now it basically puts us in an 'AI never' future, much like how there's potential for fusion power but realistically we will never get it.
@Stevenpwalsh 1 month ago
@hi117117 There is still a lot of low-hanging fruit. We're finding all kinds of ways: some are data techniques, i.e. distillation of larger-model output into smaller models. There are gains being made in parallelization and caching with inference. Some people are playing with fundamental architecture (being more sophisticated than a simple softmax), and there are gains from having better data, etc.
@chevgr 1 month ago
He's wrong on many things, including his specialty subject (he said it was impossible for Trump to win in '24).
@crackyflipside 1 month ago
The last sentence you have is huge. If you use software like Palantir Foundry, you'll see that the low-cost models are waaaaaaay more efficient for the majority of intermediate LLM functions in the data pipeline.
@richdurbin6146 1 month ago
I think letting the news marinate for a week before seeing his takes works pretty well for getting better context.
@thecustodian1023 1 month ago
Or a month. Way too much of what he says is nonsense that turns out to have been built on propaganda lies that just haven't been publicly exposed all the way yet.
@JinKee 1 month ago
To be fair, my comments aren’t much better than targeted randomness.
@MrNicoJac 1 month ago
You still greatly outperform a cat walking across a keyboard, and that's not even fully random! :)
@TimAZ-ih7yb 1 month ago
Yes, but your brain consumes 10W of power whereas the GPT needs 100 kW to fling its foolishness. 😊
@urlauburlaub2222 1 month ago
@TimAZ-ih7yb AI suggests using people with low brains as energy. Or was it Agent Smith serving the Matrix? I don't know, what do you suggest?
@thecustodian1023 1 month ago
How many comments do we all see every day now that are grammatically and spelling-wise perfect but absolute word salads with no relevance to the conversations they are in? That's real-world AI.
@josephtaylor6285 26 days ago
@@JinKee Don’t be so hard on yourself. Kudos for your self deprecation and self reflection.
@psych0r0gue1 1 month ago
One of the things I always wonder when I hear about the waste heat problem with AI is why we can't recapture some of this waste heat to use for secondary power production.
@MattsAwesomeStuff 1 month ago
Good question. It's because despite the volume of heat being large, the intensity of the heat is low. Intensity (large difference in temperature) is what you need to use it to accomplish something. For example, if I took a whole swimming pool of cold water, and heated it up 2 degrees, that would take a massive amount of energy. But you still couldn't cook a potato with it, it's only 2 degrees warmer than the cold water. If you wanted to use that warmer water to preheat your hot water tank to take a bath, you barely accomplish anything, you only gained 2 degrees. Entropy only goes in one direction. If you boil a pot of water and dump it into the pool, you can't get boiling water back out of it, even though energy was conserved.
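The "intensity, not volume" point in this reply is just heat-engine thermodynamics; the maximum useful work you can extract scales with the temperature gap. A quick sketch with illustrative temperatures (45 °C exhaust water and a 1400 °C flame are assumptions for the example, not measured figures):

```python
# Maximum (Carnot) efficiency of a heat engine between two reservoirs:
# eta = 1 - T_cold / T_hot, with temperatures in kelvin.
def carnot_efficiency(t_hot_c, t_cold_c):
    t_hot = t_hot_c + 273.15   # convert Celsius to kelvin
    t_cold = t_cold_c + 273.15
    return 1 - t_cold / t_hot

# Low-grade heat: data-center exhaust water at ~45 C vs a ~20 C environment.
low_grade = carnot_efficiency(45, 20)
# High-grade heat: a gas-turbine flame at ~1400 C vs the same environment.
high_grade = carnot_efficiency(1400, 20)
print(f"low-grade: {low_grade:.1%}, high-grade: {high_grade:.1%}")
```

Even the theoretical ceiling for lukewarm exhaust heat is under 10%, and real machinery captures a fraction of that, which is why district heating (as with LUMI, mentioned above) is usually the only practical reuse.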
@beans100 1 month ago
@MattsAwesomeStuff Thank you, good explanation!
@denisblack9897 1 month ago
Cause its fucking radioactive?
@psych0r0gue1 1 month ago
@denisblack9897 Well, it's not, but also we use radioactive stuff to generate power. Thanks for trolling with us today. Feel free to climb back into your hole.
@snarky_user 1 month ago
Sixty years ago, computer centers were very large rooms "with a massive heat problem" because of vacuum tubes.
@Ayvengo21 1 month ago
They're still large rooms with a massive heat problem, but for other reasons.
@snoomtreb 1 month ago
Indeed this is why bitcoin farms and data centers are often in Iceland. Cooling is way cheaper there ;)
@Brent-z2s 1 month ago
My dad saw one at a large insurance company in the 60s and I told him recently his phone has more computing power.
@joey199412 1 month ago
@snoomtreb No, it's because of the almost limitless geothermal power generation.
@snoomtreb 1 month ago
@@joey199412 it definitely helps. Same reason why aluminum smelters are there.
@4mb127 1 month ago
A good example of how Peter's knowledge is as vast as an ocean and as deep as a puddle.
@SA2004YG 1 month ago
😂
@immortaljanus 1 month ago
He's a generalist; he's said so many times. I imagine he has other people who are more specialized that do deeper, narrower analyses, and he incorporates their outcomes into his generalist approach afterwards.
@tkzsfen 1 month ago
I miss the "deep" part. What would you add to his comments that he misses?
@nicholaidajuan865 1 month ago
As a gamer, I wish Nvidia would refocus on the graphics market instead of building their H100 AI cores, which are literally the size of dinner plates (an entire silicon wafer), are liquid-cooled, and are made to be rack-mounted in AI server farms. The world you describe is here, and in such a world, chips worth several hundred thousand dollars each are flown, so they are not subject to shipping restrictions.
@woznotwoz-s8j 1 month ago
No. Because money.
@3rdHalf1 1 month ago
Gamers complaining that Nvidia doesn't cater to them is like suburban stay-at-home mothers complaining about the lack of products from John Deere. Gamers are not the reason Nvidia is the biggest company on earth. Kinda sucks, I know.
@AtheismScientism 1 month ago
I have yet to meet a group of people who complain more than Gamers…
@Stevenpwalsh 1 month ago
AI is how we build a world with a diminishing worker population. It's the most important thing we are building today.
@JakeWestington 1 month ago
Lol why would they do that? Look at their earnings on AI vs gaming. It's not even close anymore.
@tankvibe 1 month ago
As someone actively working on agentic frameworks, this feels about 6 months outdated. Low-end functional chips that can run cars are available for under $400 and are the size of a wallet. You have the sizes of GPUs and TPUs backwards. Long-time viewer, lots of respect.
@okalov 1 month ago
My favourite thing about AI is all the software developers raving about how it will replace all nuanced professions that require 'soft skills' within the decade, and yet the first thing we're seeing it really replace is junior software developers and programmers...
@mav45678 1 month ago
That's fake news. I'm in the industry and I haven't seen or even heard of anyone's job being replaced by AI yet.
@talideon 1 month ago
​@@mav45678 The problem isn't replacement, but hiring freezes on people going into low-level positions. This is deeply misguided, but it's happening.
@belava82 1 month ago
@mav45678 They're already replacing all sorts of "designers" and "tech writers" whose job was to slightly modify some templates, etc. They are giving those types of work to interns now.
@tklarp4735 1 month ago
Wishful thinking. I know people have a hate-boner for programmers because they make a lot of money, so they want to see them lose their jobs, but this isn't happening. Layoffs and slowed hiring are happening because of interest rates, not AI.
@tarazieminek1947 1 month ago
Yeah, it might replace some of the weaker junior devs, but that just means the senior devs become more productive and fewer people will be able to get entry-level jobs in the software field. So senior devs actually become more valuable.
@frankbieser 1 month ago
The advantage of a GPU for AI is the same reason they are good for graphics; they are optimized to perform specific hashing functions (particular types of math if you will). CPUs can do a lot of different tasks at the same time too. But they are generalized processors for supporting all kinds of math operations. GPUs are specialized processors for a specific set of math operations which makes them more efficient and therefore useful for AI and graphics rendering.
@briancase6180 1 month ago
No, incorrect. They do not "perform specific hashing functions" unless you consider multiplies and additions to be hashing functions (which is one way to look at them, but nobody does).
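As this correction notes, the GPU workload behind neural networks is bulk multiplies and additions, not hashing. A toy sketch of what a single network layer actually computes (the matrix values are arbitrary example numbers):

```python
# A tiny neural-network layer is just multiply-accumulate arithmetic:
# y = W @ x, i.e. one dot product per output element.
def matvec(W, x):
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) for row in W]

W = [[1.0, 2.0],
     [3.0, 4.0]]
x = [10.0, 100.0]
print(matvec(W, x))  # [210.0, 430.0]

# Each output element costs n multiplies and n-1 adds, so an m x n layer
# is roughly 2*m*n floating-point ops. That uniform, mutually independent
# math is what a GPU's thousands of multiply-add units are built to run
# in parallel, for graphics and AI alike.
```

Every dot product here is independent of the others, which is exactly the parallelism both rendering and inference exploit.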
@HomeSlize 1 month ago
Damn, Peter still around? He's one of the best examples of the Dunning-Kruger effect.
@OGPressident 1 month ago
100%. He speaks far too confidently on too much stuff and is wrong about so many details that it's actually dangerous to take his point of view too seriously.
@HomeSlize 1 month ago
@@OGPressident exactly.
@NickApex 1 month ago
If you take 5 minutes to look into Nvidia and its offerings you’ll realize how absolutely moronic this video is.
@raynash4748 1 month ago
Some of his videos are comical.
@mastervibes2296 1 month ago
Trying to pump your Nvidia stock?
@WhiskeyJ_TV 1 month ago
What about the software ? 😂
@michaelcallahan8412 1 month ago
I work in AI, and I use Nvidia products, and nothing Zeihan said was inaccurate. I think you might be confusing GPUs built for AI with chips built for AI? But I'm not really sure what your issue with the video was.
@RyanDorough 1 month ago
This Apple white paper might have some bearing on the topic. pbs.twimg.com/media/GeSeGlQWoAACgUr.jpg
@texasgermancowgirl 26 days ago
I work in this field, and the biggest issue is that we don't have the data center and energy infrastructure, or the data management, to run it on a mass industrial scale.
@BluegillGreg 1 month ago
To be fair, this begs the question: What the heck is a "postage stamp?"
@mcf8615 1 month ago
😂😅
@mindguru22 1 month ago
Same as what is “potato chip”?
@dirtydish6642 1 month ago
*_Raises_* the question. _Begs the Question_ is a phrase referring to a logical fallacy.
@sluggo206 1 month ago
It's those things your brother collects.
@luciusael 1 month ago
Are you Gen alpha or something?
@kev2582 1 month ago
Peter, there are basically two distinct computation scenarios: training and inference. A GPU is general-purpose, so it can do both. A custom inference ASIC is optimized for inference. Inference chips are manyfold more efficient than GPUs, but not by orders of magnitude. Chip supply and enabled scenarios are two different things.
@gregkelly2145 1 month ago
I'm not an expert, but I do know that Tesla IS using completely custom AI chips right now. The ones used in their cars are made in the US by Samsung. The D1 (Dojo) chips are made by TSMC in Taiwan. But, they are both completely custom AI chips.
@chevgr 1 month ago
Peter def isn't an expert either.
@nod5770 1 month ago
Peter's not an expert in anything. He's a front man for an intelligence war. His only skill is acting.
@segalliongaming8925 1 month ago
DOJO isn’t ready yet. That’s why xAI is still relying on nVidia chips.
@TrendyStone 1 month ago
If anyone even mentions Elon Musk or Tesla, Peter goes into a crazy cognitive dissonance loop. It’s odd to see. Smart guy…and I read all of Peter’s books…but he can’t handle certain topics objectively.
@SignalCorps1 1 month ago
So is AWS, and I suspect Azure and GCP.
@DemetriusTrumpClips 26 days ago
He's also wrong about the power needed to supply LLMs globally. LLMs such as Llama from Meta have trained smaller models to be just as efficient/accurate. Once you train a massive LLM, it can be optimized to run on less powerful machines by having big models train smaller models.
@DogmaticAtheist 1 month ago
Nanometer is no longer a technical term but a marketing term.
@Apjooz 1 month ago
Who's talking about nanometers in the year 2025?
@TrendyStone 1 month ago
@Apjooz Peter
@jaku5796 1 month ago
One of the problems is that we can improve productivity with AI to reduce the issue of a lower population, but AI would not solve the issue of shrinking markets due to that lower population.
@OS-xx6nq 8 days ago
AI will exacerbate the issue, as demand would shrink due to people not being able to pay for goods and services after losing jobs to AI and robotics.
@Stevenpwalsh 1 month ago
I use/build AI in the healthcare sector. My team of 2 guys is already finding millions in savings (stuff we used to have a hundred SMEs do); the stuff we're finding with it is crazy. Beyond proof of concept, we're using it in production to find real value for customers. Peter is pretty stuck on compute limitations; it's just not a problem for us. One of the biggest boosts is taking the output from a very large model and distilling it into a small model. We can do almost all the things (minus some of the clinical stuff, which is getting better... but not quite there yet) that we're doing today with a 14B-parameter model, which is super cheap to run. We can run that 14B-parameter model on a low-end GPU, no need for a super-high-end space-age machine. Though honestly, the foundational model companies are getting so cheap and fast, we don't even need to use the locally run distilled models either (we mostly keep using them for privacy reasons).
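The distillation idea this commenter leans on (a big model's softened output supervising a small one) can be sketched in a few lines. This is a minimal illustration with made-up toy logits, not the commenter's production system:

```python
import math

def softmax(logits, temperature=1.0):
    # Softened probabilities; higher temperature spreads the distribution.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy of the student against the teacher's softened
    distribution -- the 'soft targets' a student model is trained to match."""
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    return -sum(ti * math.log(si) for ti, si in zip(t, s))

teacher = [4.0, 1.0, 0.5]
aligned = [4.1, 0.9, 0.6]      # small model that mimics the teacher
misaligned = [0.5, 4.0, 1.0]   # small model that disagrees
print(distillation_loss(teacher, aligned),
      distillation_loss(teacher, misaligned))
```

Training pushes the student's logits toward the low-loss (aligned) case, which is how a 14B-parameter model can inherit much of a far larger model's behavior cheaply.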
@FloydThePink 1 month ago
So where in US is healthcare cost going down because these savings are being passed to the patient? Cynical me would think you are increasing CEO bonus and enabling even more stock buybacks. Nothing some Luigis can't fix.
@dalehill6127 1 month ago
​@@FloydThePinkI think you'll find that in the US cost savings, especially in the healthcare sector, are basically *never* passed on to the customer. After all, when you're ill you'll pay anything won't you, and that's all that US companies ever need to know.
@romik1231 1 month ago
I agree, but why don't we see AI flooding the market? I mean, it should replace all chat-based hotlines/support, maybe even the voice-based ones. It can also do most office work. Still, I don't see it implemented anywhere. Why is that?
@FloydThePink 1 month ago
@romik1231 AFAIK the cost of AI is prohibitive for small businesses. Nvidia stock has skyrocketed because of insane sales margins that impress even Apple. I know of no other chipmaker that is even kinda sorta close to Nvidia, so there's no competition to drive prices down anytime soon.
@Stevenpwalsh 1 month ago
@@FloydThePink In my eyes it's an upgrade to the bucket we've been using to scoop water out of the boat with a hole in it. AI won't fix this mess, the issue is multi-modal. Payers get a lot of blame because they're the ones saying no, but the issue is far wider than them.
@billkemp9315 1 month ago
Peter, you are missing some things in your assumptions. Check out photonics. Most people are unaware of the next level of computing, and it is easier to produce than electronics. They work at the speed of light and produce very little heat in the data center. Data center cooling is 40-50% of the electricity cost of a data center. The industry is finally shifting to liquid cooling instead of air cooling, which is 20 times more efficient. I have designed and built 34 tier 4 data centers, so I am not just some random guy with an opinion.
@bernl178 1 month ago
1,000,000%. I am as well, and following photonics.
@antonyphipps5671 29 days ago
Hooray, someone finally mentions optical solutions (besides me and the folks at Poet Technologies).
@woznotwoz-s8j 1 month ago
Will AI replace Peter Zeihan in geopolitical predictions ?
@mindguru22
@mindguru22 Ай бұрын
Both are the same. At least the Artificial part😂
@hardheadjarhead
@hardheadjarhead Ай бұрын
It is certain to be more accurate.
@thievingpanda
@thievingpanda 26 күн бұрын
P vs. NP. That is the real question here: will it be solved? Sadly, I think large portions of the workforce can still be automated by AI even if P vs. NP is not solved.
@colbysmith6201
@colbysmith6201 Ай бұрын
Happy New Year, Peter. I enjoy your point of view and videos. Thanks for giving all us viewers something to think about.
@GregBaker-ifost
@GregBaker-ifost Ай бұрын
The reason that Peter is wrong in this video is that he isn't considering that there's a difference between AI *inference* (using existing models) and *training* a new model. Training a new model requires enormous amounts of power and compute (and specialised GPU cards). The world could stop training tomorrow (they won't, but even if they did) and we could continue to use existing models. Inference, on the other hand, *doesn't* need massive amounts of power and compute. When you query ChatGPT, you are probably going to interact with gpt-4o, which uses about the same amount of hardware as a 20-person LAN party -- but of course it only uses that for a few seconds, before handling the next query. A gpt-4o-mini query uses less hardware than a high-end gaming PC. We already have more than enough hardware in the world to run inference. That's before we start talking about the leaps and bounds being made in knowledge-distilled small models: RWKV, Phi-4, the ultra tree models of my PhD, and so on. These are models that are so small that they run quickly and well on your laptop (or even on the laptop that you stopped using because it was too slow to run Office any more).
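The training-vs-inference gap described above can be put in rough numbers. A common back-of-envelope approximation (a sketch, not a measurement) is that training a dense transformer costs about 6·N·D floating-point operations (N parameters, D training tokens), while generating one token at inference costs about 2·N. The model size below is hypothetical, purely for illustration:

```python
# Rough FLOP estimates for training vs. inference of a dense transformer.
# Approximations: training ~ 6*N*D FLOPs, inference ~ 2*N FLOPs per token
# (N = parameter count, D = training tokens). Illustrative only.

def training_flops(params: float, tokens: float) -> float:
    return 6.0 * params * tokens

def inference_flops_per_token(params: float) -> float:
    return 2.0 * params

# Hypothetical 70B-parameter model trained on 15T tokens:
N, D = 70e9, 15e12
train = training_flops(N, D)                   # ~6.3e24 FLOPs, paid once
per_token = inference_flops_per_token(N)       # ~1.4e11 FLOPs, paid per token

# Tokens of inference that cost as much as the whole training run:
break_even_tokens = train / per_token          # 3*D = 4.5e13 tokens
print(f"training:   {train:.2e} FLOPs (one-off)")
print(f"inference:  {per_token:.2e} FLOPs/token")
print(f"break-even: {break_even_tokens:.2e} generated tokens")
```

The asymmetry is the commenter's point: the one-off training bill dwarfs any single query, so already-trained models stay cheap to serve even if no new ones are trained.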
@Art-is-craft
@Art-is-craft Ай бұрын
That’s the problem you are trying to predict he is just stating what is in existence now.
@JohnKerbaugh
@JohnKerbaugh Ай бұрын
Why does a president need to decide? Why not let the market decide?
@Apjooz
@Apjooz Ай бұрын
Yeah why not just lose. No biggie.
@Pdotta1
@Pdotta1 29 күн бұрын
Peter mentions Doom! Gave me a giggle.
@monkeywrench1951
@monkeywrench1951 Ай бұрын
How can [mainland] China disappear economically and at the same time take over Taiwan and deprive the US of AI chips ?
@judewarner1536
@judewarner1536 Ай бұрын
Because empires in meltdown often attack third parties either as a means of deflecting a disaffected population or as a scapegoating tactic.
@richardpavlov442
@richardpavlov442 Ай бұрын
Nukes
@sebastianfletcher-taylor1024
@sebastianfletcher-taylor1024 Ай бұрын
I completely agree. I use AI extensively in my work, and part of my job is to research what current models can and can't do effectively, and to engineer prompts to be as effective as possible while ensuring accuracy and correct results. I think Zeihan underestimates how much this field of technology is already being utilized by many people in a variety of fields.
@andrewblain3405
@andrewblain3405 Ай бұрын
The first thing we will use a good ai chip for will be to design a better faster cheaper more efficient ai chip. Obvs.
@JohnDoe-td3xx
@JohnDoe-td3xx Ай бұрын
Zeihan: These chips have the largest supply chains in history
Anno Players: 🤔
@michaelcallahan8412
@michaelcallahan8412 Ай бұрын
*Me, working in AI for genetics:* Zeihan talks about a million things confidently, now finally I can see if he actually knows that he's talking about. *Zeihan:* Doesn't know what he's talking about.
@TyrannicG
@TyrannicG Ай бұрын
Comments like this make you look very stupid. It's very easy to just say "Zeihan: Doesn't know what he's talking about." By quoting him, you'd let us know the exact part you're referring to.
@Chr1s-fm6bi
@Chr1s-fm6bi Ай бұрын
You can always rely on someone to point to mistakes without saying what is wrong or how to correct it.
@richkroberts
@richkroberts Ай бұрын
If you are going to make such a comment, consider providing some examples of where Peter is wrong. Seriously, it sounds as though you might have constructive points to make given you have practical experience with AI.
@Broc-e5n
@Broc-e5n Ай бұрын
For those replying to Michael asking why... well, here's an example... Google's own AI chip is in its 6TH GENERATION. Google has been designing its own AI chips for over ten years. It made its cloud TPU available in 2018 and used it internally as early as 2015.
@rockydopeydoge6730
@rockydopeydoge6730 29 күн бұрын
Spot on with the “where are we going to deploy” question and humanity hasn’t been great at that so far unfortunately…
@Thandar324
@Thandar324 Ай бұрын
I wonder how quantum computers might affect this in the future?
@Equalzer
@Equalzer Ай бұрын
Necessity is the mother of innovation and constraints are the father of creativity.
@tinytim71301
@tinytim71301 Ай бұрын
Peter has the coolest sunglasses.
@tinytim71301
@tinytim71301 Ай бұрын
@ and you should stop drinking your bong water.
@Eric-ue5ed
@Eric-ue5ed Ай бұрын
It's always hard to pick winners. There is photonic computing, neuromorphic, PAPs, quantum, graphene-based transistors.
@tommoody728
@tommoody728 Ай бұрын
I thought Nvidia had already produced multiple generations of chips designed specifically for AI.
@Withnail1969
@Withnail1969 Ай бұрын
They have, Peter hasn't bothered to do any research as usual.
@navcenter77
@navcenter77 Ай бұрын
@@Withnail1969 The "AI" chips are just gaming GPUs with a different label and a higher price tag. Nvidia has played this game before, advertising to Bitcoin miners during the last boom cycle. You can build your own LLM on a standard PC with a generic GPU and have a better experience than ChatGPT.
@joey199412
@joey199412 Ай бұрын
They are just adapted GPU designs meant for gaming. There currently are *no* actual AI accelerator chips specifically for training and inferencing transformer architecture AI. Not even Google's TPU fit that bill.
@SirSchmittyX
@SirSchmittyX Ай бұрын
@@Withnail1969 Which processors are designed specifically for AI? One generation back is Ampere, and those were designed for graphics processing. They did package Ampere chips specifically for data center use, but the chip itself isn't designed to handle just AI. The current-gen Hopper architecture is mostly an enhancement of that same technology, but organized as a multi-chip architecture and packaged for data centers. I do think it's a fine line he is treading with the language, but it is true that Nvidia hasn't rolled out a specifically designed, from-the-ground-up AI silicon chip.
@Withnail1969
@Withnail1969 Ай бұрын
@@SirSchmittyX They have been making dedicated AI stuff for a year or two now, no?
@gyoza6510
@gyoza6510 Ай бұрын
Love this episode!
@I_Lemaire
@I_Lemaire Ай бұрын
Happy New Year, Peter and thank you.
@JamesR-f9l
@JamesR-f9l 29 күн бұрын
You are right on the end piece in regards to the pace of technology, about every 15 years. You are also right in regards to increased power consumption. For the short term there will be a hierarchical tier model for AI needs. Keep in mind that massively parallel GPUs are preferred but not essential to run AI algorithms, as they can run on any CPU. In the future, data farms will have miniature nuclear reactors, which will allow for higher power loads. There is a lot of investment from big tech going into one particular green tech: nuclear.
@MilosKvakic
@MilosKvakic Ай бұрын
Yo, just went through the book called 'The Hidden Path to Manifesting Financial Power' - honestly, didn’t expect it to be this solid. Definitely worth a look
@hasanamin4668
@hasanamin4668 Ай бұрын
Thank you for creating this video; it provided valuable insights into your depth (not deep) of knowledge about everything you are talking about.
@MatthewSargeant
@MatthewSargeant Ай бұрын
Yeah guys, should have done some research. This video is incorrect on many levels. Not only are AI edge chips and large server chips already in mass production, but other AI models don't need massive, powerful chips to run; you can run them on your CPU and GPU on a normal PC, in the cloud, etc., wherever. Open-source AI is progressing incredibly fast too. To do a lot of tasks you don't even need frontier stuff like the latest o1/o3 models.
@dalehill6127
@dalehill6127 Ай бұрын
You know more about it than I do and I know more about it than Mr Zeihan.😊
@nicholasarthur
@nicholasarthur Ай бұрын
Quantum Computing is 200-300x’s more energy efficient than classical. Quantum is finally starting to be commercialized little by little and with partnerships between AI/Quantum companies including nvidia. That would be the rocket fuel that AI needs. Both are still in their infancy but things will take off a lot sooner than 2040. Exciting times. Love Peter, but early AI integration would complicate a lot of his stances or predictions. Which is ok, can’t predict the future on everything can we? 😂
@GregBaker-ifost
@GregBaker-ifost Ай бұрын
None of the serious ML algorithms in use today would be accelerated by quantum computation. Cryptography on the other hand...
@Voxta
@Voxta Ай бұрын
We’re a small team of developers working on some pretty amazing tech, and Peter really gets it. He speaks with a deep understanding of the field, and it’s inspiring to hear someone who truly knows what they’re talking about. Like he said, we’ve only had a taste of what this technology can do, and it’s already mind-blowing. Just a couple of years ago, getting a program to produce two coherent sentences felt like a win. Now, we’re working with compact systems that can recognize images, control software, animate 3D models, and even handle long-term memory. There’s no denying we’re still reliant on dedicated hardware, both on the development and consumer side, but the progress with local models has been incredible. The next few years promise to be a wild ride. Whether it turns out to be a good thing or not... well, we’ll find out soon enough. (This comment was rewritten by AI )🤣
@antonyphipps5671
@antonyphipps5671 29 күн бұрын
Peter, there are a number of companies working on internalizing optical systems (photons) within the GPUs to replace the "wires" that transmit/manipulate electrons (resistance causes the heat load). Optical computational and transmission systems will reduce enormously the energy demand for cooling systems in the server farms. Look at the Canadian firm Poet Technologies, for example.
@vgernyc
@vgernyc Ай бұрын
So, the moment to quote Consuela from Family Guy: "Noooo, noooo, nooo." The current stock market bubble needs to pop for the technology to develop properly. If Nvidia disappears or fades as a result, all the better.
@konrad7492
@konrad7492 Ай бұрын
Love to hear you talking about GPUs and IT in general. One thing I feel I need to say is: I don't agree that gaming was the main GPU market driver before machine learning and AI applications. Think of anything that is computer-aided design or CGI; that was the main market driver. Large businesses.
@KatySei
@KatySei Ай бұрын
Weren’t you in New Zealand like 5 minutes ago ago?
@SalarAzad
@SalarAzad Ай бұрын
😂
@wawaldekidsfun4850
@wawaldekidsfun4850 Ай бұрын
While I respect Zeihan's geopolitical insights, his take on AI hardware shows concerning gaps in technical understanding. The claim about AI-specific chips not existing until 2025-2030 ignores already-deployed solutions from Google, NVIDIA, and others, while his "dinner plate sized" chip prediction misses how modern processors actually work. On AI tech, we'd be better served listening to those who work directly in the field.
@comentedonakeyboard
@comentedonakeyboard Ай бұрын
Given how flawed AI still is, and what happened with "gain of function" in the genome of the coronavirus, I would prefer it if AI was NOT used for genetic research (or anything else that might lead to dangerous results).
@20thcenturyboy85
@20thcenturyboy85 Ай бұрын
Thank you for your videos.
@nicknach
@nicknach Ай бұрын
I'm in the tech industry, and the first time I EVER saw GPUs running non-gaming workloads was at an oil and gas company (back in 2010). They were using the GPUs to process data for geoscience (upstream oil exploration).
@jimmyolsenband
@jimmyolsenband Ай бұрын
Peter may not be right in the finer details of chips, but he is absolutely correct about the dynamic of scarcity and costs. We need to make choices and have conversations before being gaslit into over-investing in tech that doesn't benefit us. Plus, the chip isn't the problem; it's the gazillion mini nuclear plants they want to make to power the servers that power the chips.
@JerichoTheBeagle
@JerichoTheBeagle Ай бұрын
They're literally trying to solve their software limitations with raw hardware power and it's futile.
@Kim-e4g4w
@Kim-e4g4w Ай бұрын
If it were up to me to decide, I would prefer to accelerate research on longevity, but the question is: should we take a big gamble on building a research AI first, in order to afterwards boost our longevity research, or should we drop the golden egg on making automation (progress) go fully automated? Either way we get something very, very good out of it: either buy more time with longevity interventions, or free all people from labor forever, which frees up time for people to do other cool stuff. Either way, a huge boost to society.
@conjurermast
@conjurermast Ай бұрын
I don't think PZ is qualified to talk about AI being "targeted randomness" and such at all. It's shocking how good it is at certain things, like translation and coding.
@Stevenpwalsh
@Stevenpwalsh Ай бұрын
Really makes you wonder, if he's this far off with AI.... how far off is he when he talks about all the other stuff he is supposed to be an expert on?
@chevgr
@chevgr Ай бұрын
@@Stevenpwalsh like the US presidential election for example
@TravisBerthelot
@TravisBerthelot Ай бұрын
AI is still horrible at coding.
@Cream-i5u
@Cream-i5u Ай бұрын
@@TravisBerthelot AI is in its infancy; you won't be talking about mundane things like this in 3 years.
@TravisBerthelot
@TravisBerthelot Ай бұрын
​@@Cream-i5u AI is purely the result of the amount of compute that you can buy. I don't think you can afford infinite compute in 3 years anymore than I can. So the mundane is here to stay.
@pierredubois9366
@pierredubois9366 29 күн бұрын
Linear thinking in an exponential world
@marcbotnope1728
@marcbotnope1728 Ай бұрын
We will use it to create "content" and propaganda.
@Nicholas-t7w
@Nicholas-t7w Ай бұрын
Yo them sunglasses are Fire 🔥
@gordoncrespo2045
@gordoncrespo2045 Ай бұрын
How can a supposedly intelligent guy be so wrong on this topic?
@chevgr
@chevgr Ай бұрын
easily. I'm not an expert but I intuitively think he's wrong here. can you explain why?
@dalehill6127
@dalehill6127 Ай бұрын
Because it's a very specialised area and Mr Zeihan is a generalist so his skill set doesn't fit.
@naomieyles210
@naomieyles210 Ай бұрын
@@chevgr here goes:
- many useful AI models are small, and can run on a PC.
- training AI models is intense; running them is much less so.
- much of AI progress is about business fit, not raw technological power.
- AGI is just ridiculous hype, but AI already has many uses.
- we are not living in a command economy. We can do all these things with AI, but the organisations with the deepest pockets will get the best chips.
- AI neural networks are just multi-dimensional geometry, so a GPU was a very sensible starting place.
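On the last point above (neural networks as multi-dimensional geometry): a network layer really is just a matrix multiply plus a nonlinearity, i.e. the same dense linear algebra GPUs were already built for. A toy layer, with made-up weights:

```python
# A single neural-network layer, written out as plain matrix arithmetic.
# The weights and input here are arbitrary toy values for illustration.

def matmul(A, B):
    rows, inner, cols = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(inner)) for j in range(cols)]
            for i in range(rows)]

def relu(M):
    # Elementwise nonlinearity: clamp negatives to zero.
    return [[max(0.0, x) for x in row] for row in M]

x = [[1.0, -2.0]]                 # 1x2 input vector
W = [[0.5, -1.0, 2.0],            # 2x3 weight matrix (toy values)
     [1.5, 0.5, -0.5]]
h = relu(matmul(x, W))            # one hidden layer: [[0.0, 0.0, 3.0]]
```

Stacking many such multiplies is the whole workload, which is why hardware originally built to push 3D-graphics matrices around fit so naturally.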
@loadb5985
@loadb5985 26 күн бұрын
The biggest problem is the randomness, even in well-organized data sets. The results are non-deterministic. P vs NP has not been solved, and providing accurate responses even in "small" problem spaces fails.
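The non-determinism mentioned above is, at the decoding level, by design: an LLM's next token is *sampled* from a probability distribution. A minimal sketch (toy logits, not a real model) showing that greedy decoding (temperature 0) is repeatable while sampling at temperature 1 is not:

```python
# Toy next-token sampler: temperature 0 = greedy (deterministic argmax),
# temperature > 0 = stochastic sampling from the softmax distribution.
import math
import random

def sample_token(logits, temperature, rng):
    if temperature == 0:                         # greedy: always the argmax
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [l / temperature for l in logits]
    m = max(scaled)                              # subtract max for stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()                             # inverse-CDF sampling
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r <= acc:
            return i
    return len(probs) - 1

logits = [2.0, 1.9, 0.5]                         # toy next-token scores
rng = random.Random(0)
greedy = {sample_token(logits, 0, rng) for _ in range(100)}     # always {0}
sampled = {sample_token(logits, 1.0, rng) for _ in range(100)}  # several ids
```

Real deployments add further sources of variation (batching, floating-point reduction order), but the sampling step alone is enough to make identical prompts produce different outputs.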
@aum1040
@aum1040 Ай бұрын
Peter's take in this video is so monumentally ignorant, I don't even know where to start critiquing him.
@gladius1275
@gladius1275 Ай бұрын
Pointless comment, since you make a statement with no elaboration or supporting data to refute.
@scrout
@scrout Ай бұрын
Start with why he knows so little about Nvidia....or Gemini, or Grok....
@markoconnell804
@markoconnell804 Ай бұрын
Well done.
@jigglejaggle4732
@jigglejaggle4732 Ай бұрын
We already have custom AI-specialized chips; one example is the TPU by Google, which Google has been using for training for a while. Furthermore, Nvidia has been redesigning its chips to be more specialized for AI, to the point where they may not even be called GPUs anymore. You're way off; the singularity is nearer!
@audio9849
@audio9849 Ай бұрын
I was thinking the exact same thing. Why is Nvidia worth 3 trillion? Because of their AI "GPUs".
@bobobricklayer
@bobobricklayer Ай бұрын
Don’t confuse poor Peter with anything technical, especially semiconductor.
@avocade
@avocade Ай бұрын
Groq
@bukkiahgolden6043
@bukkiahgolden6043 Ай бұрын
Revere! Walk in my footsteps Peter. ❤
@porkyfedwell
@porkyfedwell Ай бұрын
Artificial Intelligence is Artificial, but it isn't Intelligent. Anyone who's been "assisted" by an AI assistant already knows this "secret." Your jobs are secure for now.
@markcalhoun8219
@markcalhoun8219 Ай бұрын
AI is a great and expensive way to steal IP and get worse outputs than algorithms we already had. I.e., it's a pump-and-dump scheme.
@Stevenpwalsh
@Stevenpwalsh Ай бұрын
Skill issue
@tomrutledge393
@tomrutledge393 Ай бұрын
There is one more angle you might be considering but didn't mention: the odds are pretty good that by the time we are ready for what is expected to be the next state of the art, that line will have moved. We're able to design / improve designs faster than we can get those improvements to market. So how do we jump ahead to be ready to build the newest and best while it is still... the newest and best?
@Utoko
@Utoko Ай бұрын
This is so wrong. The current NVIDIA GPUs for servers are highly optimized for AI. Just because you can design even better chips for AI training doesn't go against that. And all the issues he is talking about will be supported by AI, so there is no "do AI or improve the financial sector"; they go together, like all the others. Also, AI inference is becoming cheaper and cheaper, fast: old GPT-4 was $60 per million tokens, versus $0.014 for DeepSeek's model, 1/4000th of the cost, not even 2 years ago.
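Taking the prices in the comment above at face value (the figures are the commenter's, not verified): roughly $60 per million tokens for early GPT-4 versus $0.014 claimed for a DeepSeek model. The arithmetic on a hypothetical workload:

```python
# Price-drop arithmetic using the commenter's figures, taken at face value.
old_price = 60.0       # $ per million tokens (commenter's early-GPT-4 figure)
new_price = 0.014      # $ per million tokens (commenter's DeepSeek figure)

ratio = old_price / new_price                  # ~4286x cheaper, i.e. "1/4000"

monthly_tokens = 500e6                         # hypothetical: 500M tokens/month
old_bill = old_price * monthly_tokens / 1e6    # $30,000 per month
new_bill = new_price * monthly_tokens / 1e6    # $7 per month
print(f"price ratio: ~{ratio:.0f}x")
print(f"monthly bill: ${old_bill:,.0f} -> ${new_bill:,.2f}")
```

At that ratio, an inference workload that was a line item on a corporate budget becomes a rounding error, which is the commenter's point about cost not being the binding constraint.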
@goukux5908
@goukux5908 Ай бұрын
Yes, exactly. Sometimes Peter gets out of his swim lane and it goes horribly wrong.
@ericgregori
@ericgregori Ай бұрын
The Dunning-Kruger effect is when poor performers in many social and intellectual domains seem largely unaware of just how deficient their expertise is. Their deficits leave them with a double burden: not only does their incomplete and misguided knowledge lead them to make mistakes, but those exact same deficits also prevent them from recognizing when they are making mistakes and when other people are choosing more wisely.
@josephdouglas6260
@josephdouglas6260 Ай бұрын
Watching Peter consistently misunderstand technological progress always makes me smile. Companies did major AI-specific tape-outs the year before last, and next-gen ones last year. You're completely missing how AI is reorienting geopolitical priorities and incentives.
@nadavshemer
@nadavshemer Ай бұрын
He's correct about dinner-plate-sized chips not hitting the market this year. Or next year. Or ever :D Guy's hilarious
@alst4817
@alst4817 Ай бұрын
Dude, I hope Jensen Huang paid you, cos that is wildly optimistic. Lemme guess, you've never actually tried to use them for anything important?
@GMK189-f2k
@GMK189-f2k Ай бұрын
Yep I can’t even believe he does this as people who understand what is actually happening know he isn’t credible. It hurts him to make these videos as it casts doubts on other subjects he takes on. He couldn’t be more poorly informed. His belief that Blackwell and other AI chipsets are just juiced up gaming GPUs is embarrassing.
@JasonGriffing
@JasonGriffing Ай бұрын
Would love to hear your thoughts on the implications of workforce displacement by AI once the technology does reach scale.
@phlyte7
@phlyte7 Ай бұрын
"Gamers. Who play fortnite and doom" I'm crying
@thailandmalcolm
@thailandmalcolm Ай бұрын
We use the chip to make me Grande Master in Overwatch 2!!!
@ScentlessSun
@ScentlessSun Ай бұрын
AlphaFold is already revolutionizing medicine.
@dennisclapp7527
@dennisclapp7527 Ай бұрын
Thanks Peter
@giuseppegiannini3697
@giuseppegiannini3697 Ай бұрын
Happy New Year, Mr. Zeihan. 😊
@milescoleman910
@milescoleman910 Ай бұрын
Let’s go the the geopolitics expert for AI computer chip news
@Apjooz
@Apjooz Ай бұрын
But chips are geopolitics...
@spoddie
@spoddie Ай бұрын
This is bunk. TPUs are already mass produced
@TimAZ-ih7yb
@TimAZ-ih7yb Ай бұрын
And the result is still 90% hype and 10% useful work. The coming AI “letdown” will be a debacle for the ages. On the positive side we will see new CEOs at Google and Microsoft.
@marshallj2415
@marshallj2415 Ай бұрын
@@TimAZ-ih7yb WRONG
@philbiker3
@philbiker3 Ай бұрын
@@TimAZ-ih7yb more like 99% hype and 1% useful work.
@markcalhoun8219
@markcalhoun8219 Ай бұрын
TPUs are marketing, not technical results
@Stevenpwalsh
@Stevenpwalsh Ай бұрын
@@TimAZ-ih7yb There might be a "letdown", but it would be more like the 2000's internet bubble. ie: there was an insanely useful, and economically valuable thing, but we traded stocks like it was 2020, and not 2000. I'm using AI every day, the value is real....
@halbritt
@halbritt Ай бұрын
There are already specific processors being made for AI; that would be the TPUs that Google manufactures, though they aren't available on the retail market. In practice, they tend to offer less peak performance than GPUs from Nvidia, but are more power efficient. Other companies are starting to ship as of now. Tenstorrent, for example, is now shipping data-center-class "AI" chipsets, though I'm not sure at what volume. They're using Samsung Foundry and I believe are on 3nm, presently manufactured in Korea.
@Atiliusmagnus
@Atiliusmagnus Ай бұрын
There are two areas where you should not venture unless you lower your head enough to learn: AI, and China's capacity to show you how wrong you are when you think you understand it (your predictions of the imminent collapse of China go back to when you were a junior at Stratfor, over 20 years ago). As for the rest, and perhaps including your wrong takes in the mentioned subjects, I am a loyal follower because your analyses are helpful and interesting, whether right or wrong. For what it's worth, you have changed my mind on several subjects. Thank you!
@Zarrov
@Zarrov Ай бұрын
If you have followed this guy for 20 years, then you should know that he is not making "predictions". He is analysing. His analysis of China is spot on and has been fulfilled. Look into the method, not into what you want the conclusions to be.
@TomTomicMic
@TomTomicMic Ай бұрын
China's doing great, it's doing this, that, and the other, hooray 🎉!!! But it's "doing great" because it has the biggest debt-to-GDP ratio in the world, bigger than the wider West (including Japan!) combined, and its boomer generation is going "away" much faster, so China's access to money by way of its citizens' savings will reduce greatly over the next decade. They have been borrowing their way out of trouble since 2007/8 (presently borrowing 22% to achieve 5% growth!), and unfortunately into more trouble. There will be a big reset or terminal decline; that's the choice the CCP face. There is no magic money tree; communists always run out of other people's money!?!
@shinymike4301
@shinymike4301 Ай бұрын
@@Zarrov He predicted Trump would not win in 2024. LOL. Petey "predicts" all the time, and often badly. AI is here now and is growing exponentially. Petey is showing himself to be a Luddite.
@jackcahill2383
@jackcahill2383 Ай бұрын
Thank you
@spol
@spol Ай бұрын
"thank you for confirming my biases"
@BluegillGreg
@BluegillGreg Ай бұрын
This technological development is also used for Artificial Stupidity. Remember that.
@richiedubs1062
@richiedubs1062 Ай бұрын
I value Peter's insights into geopolitics, demographics, geography and even history, but he holds an Austrian view of economics and we live in an MMT world. I don't think we need to worry about capital disappearing with the boomers. We are WELL past the era of capital flows having anything to do with savers.
@millenniummastering
@millenniummastering Ай бұрын
Ironically, you should have run this past Gemini, Claude, or o1 first to check for accuracy!!! 😅
@k54dhKJFGiht
@k54dhKJFGiht Ай бұрын
Amen to that! We desperately need to hear politicians OPENLY and PUBLICLY DEBATE how AI should be used! American society deserves to have input on this. Social media strategic ambiguity was handled EXTREMELY POORLY!
@Analyst104
@Analyst104 Ай бұрын
Actually, the question is are Humans ready for AI? Considering Humans are going through a stupid phase, maybe they should hold off on AI until they get their shit together.
@romik1231
@romik1231 Ай бұрын
@@mountainmanmike1014 Yeah, you are correct. We were, however, it would seem, more competent when we were poorer. We had to try harder. Now, having everything available, we became lazy and somewhat less knowledgeable. I wonder if AI will make us smarter or dumber. If you can get any information with AI, why would you learn or memorize it?
@stopdropnroll
@stopdropnroll Ай бұрын
Humans are vile destructive creatures often, and always selfish. AI governance is part of levelling up
@archetypal.architect.whispers
@archetypal.architect.whispers Ай бұрын
I think these limits are great for driving way better efficiency models.
@crackyflipside
@crackyflipside Ай бұрын
Couldn't disagree with you more on this one. Being a user of Palantir Foundry, there is A TON that I could personally do for a business to augment their operations with AI within hours; and depending on how complicated their processes and supply chains are, they could see dramatic cost savings in a few months of work. Yes, we will need more power and chips, but there are low-cost (low power and compute demand) LLMs that are designed to be integrated into really narrow functions.
@mholsather
@mholsather Ай бұрын
It’s machine learning evolved (as it always happens) leaning on specific chips tailored to that type of processing. Then companies rebrand it to “AI”. There are very few companies making money off of “AI” that aren’t selling it as a service. It can do remarkable things no doubt. Show me a company that’s broken through with a new monetization scheme that isn’t just selling it as a service.