Does Computational Complexity Restrict Artificial Intelligence (AI) and Machine Learning?

7,685 views

Simons Institute

Comments: 8
@hnryjmes 3 years ago
Arora is a great speaker; the clarity of explanation in this talk is so underrated.
@brian.josephson 7 years ago
"About 10^40 atoms in the human head" is clearly an overestimate. As it happens, there's a web page where this question is addressed (education.jlab.org/qa/mathatom_03.html ) and this gives the answer as 4.56*1026.
@brian.josephson 7 years ago
That should be 4.56 x 10^26 of course (for the non-mathematicians, ^ is used to indicate a superscript or a power).
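For reference, a back-of-envelope check of that figure, under assumptions of my own (a head mass of roughly 5 kg, approximated as pure water; neither number comes from the comment or the linked page):

```python
# Back-of-envelope estimate of the number of atoms in a human head.
# Assumed values (illustrative only): head mass ~5 kg, composition ~pure water.
AVOGADRO = 6.022e23           # particles per mole
HEAD_MASS_G = 5_000           # assumed head mass in grams
WATER_MOLAR_MASS_G = 18.0     # grams per mole of H2O
ATOMS_PER_WATER_MOLECULE = 3  # 2 hydrogen + 1 oxygen

moles_of_water = HEAD_MASS_G / WATER_MOLAR_MASS_G
atoms = moles_of_water * ATOMS_PER_WATER_MOLECULE * AVOGADRO
print(f"{atoms:.2e}")  # ~5.0e+26: same order as the cited 4.56e26, far below 10^40
```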
@codegeek98 3 years ago
You may enjoy Eliezer Yudkowsky's writings on _why_ the Chinese Room is fallacious.
@rsinh3792 3 years ago
Sir, a reviewer of my manuscript asked me this question: "Please compare your proposed machine learning model with other methods in terms of time complexity. Calculate the time complexity of your algorithm." Can you please help me with how to address this question?
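One common way to address such a request, sketched below under assumptions of my own (the linear-model training step, the feature dimension, and the input sizes are purely illustrative, not from the video or the comment), is to state the asymptotic cost per training step and back it up with measured wall-clock times as the input size grows:

```python
# Illustrative scaling check: time one model's training loop at growing
# input sizes n and compare the growth against the claimed asymptotic bound.
import time
import numpy as np

def train_step(X, y, w, lr=0.1):
    """One gradient step of linear least squares: O(n * d) work."""
    grad = X.T @ (X @ w - y) / len(y)
    return w - lr * grad

d = 50                                 # assumed feature dimension
for n in (1_000, 10_000, 100_000):
    X, y, w = np.random.randn(n, d), np.random.randn(n), np.zeros(d)
    t0 = time.perf_counter()
    for _ in range(20):
        w = train_step(X, y, w)
    print(f"n={n:>7}: {time.perf_counter() - t0:.3f} s")
# Roughly linear growth in n is consistent with the O(n * d) bound; repeating
# the table for each competing method gives the comparison a reviewer asks for.
```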
@NeeleshSalpe 6 years ago
I think the current computation model, CPU/GPU + memory, which mimics the basic von Neumann computer, is the bottleneck in complex computation. All our algorithms and assumptions are based on this basic computation model; we are bounded by CPU and memory. If we are to solve NP-hard questions, we need a different computation model, which I don't know yet... And we want to mimic the brain, whose computation model is completely different from ours, so we are trying to fit a square peg into a round hole. That is why we will always be restricted. Once a new computation model arrives that goes beyond von Neumann's limitations, we could solve the problem. Even though we are trying to mimic neurons with deep learning models, unfortunately we cannot, because the underlying base computation unit is not the same!
@gJonii 5 years ago
You can simulate neural networks in polynomial time using our computers, so there's no particular point in changing the architecture.
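As a minimal illustration of that point (my own sketch, with an assumed layer width and depth, not from the comment or the talk): a forward pass through a dense network is just a fixed number of matrix-vector products, so its cost is roughly O(L·d²) for L layers of width d, i.e. polynomial.

```python
# Minimal sketch: a dense-network forward pass is L matrix-vector products
# plus nonlinearities, i.e. O(L * d^2) arithmetic -- polynomial time on a
# conventional (von Neumann) machine.
import numpy as np

def forward(x, weights):
    for W in weights:
        x = np.maximum(W @ x, 0.0)   # one ReLU layer
    return x

d, L = 256, 4                        # assumed layer width and depth
weights = [np.random.randn(d, d) / np.sqrt(d) for _ in range(L)]
out = forward(np.random.randn(d), weights)
print(out.shape)                     # (256,)
```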
@qwerdbeta 4 years ago
He shrugs off noncomputable problems as very rare. They are rare in applied computer science today, but formally they are far more common than computable problems (which have the smaller infinite cardinality). This guy is so loose he makes errors!
Algorithm Design in the Modern Era: Dealing with Uncertainty and Incentives
45:19
Thinking Algorithmically About Impossibility
55:22
Simons Institute
4.5K views
Yann LeCun - Power & Limits of Deep Learning
36:48
The Artificial Intelligence Channel
87K views
Beyond Computation: The P versus NP question (panel discussion)
42:33
Simons Institute
26K views
Some things you need to know about machine learning but didn't know... - Sanjeev Arora
1:05:48
Is Optimization the Right Language to Understand Deep Learning? - Sanjeev Arora
32:35
Institute for Advanced Study
12K views
Generative AI in a Nutshell - how to survive and thrive in the age of AI
17:57
Andrew Ng: Opportunities in AI - 2023
36:55
Stanford Online
1.8M views
Brief introduction to deep learning and the "Alchemy" controversy - Sanjeev Arora
39:54
Institute for Advanced Study
9K views