28:24 "probability is derived from energy" probably refers to statistical mechanics, where any energy function on the possible states of a system defines a probability distribution on these states (Boltzmann distribution).
@imrematajz1624 · 3 years ago
Try again at 0.75× normal speed... it makes a huge difference in comprehension! His mind is hyper fast. And I am not a robot :-)
@CristianGarcia · 5 years ago
I think the Contrastive Predictive Coding paper achieves results for images and audio similar to the ones presented here for text.
@WeidiXie · 5 years ago
And it actually also works on videos: arxiv.org/abs/1909.04656 kzbin.info/www/bejne/amSuenuLq62deJI
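For context, CPC trains its encoder with the InfoNCE loss: score the true future sample against negatives and classify which one is real. A minimal sketch of that loss, assuming dot-product similarities and illustrative shapes (not the paper's actual code):

import numpy as np

def info_nce(context, positive, negatives):
    # context: (d,) prediction; positive: (d,) true future sample;
    # negatives: (k, d) distractors drawn from other sequences or patches.
    candidates = np.vstack([positive, negatives])   # row 0 is the positive
    scores = candidates @ context                   # dot-product similarities
    scores -= scores.max()                          # numerical stability
    log_probs = scores - np.log(np.exp(scores).sum())
    return -log_probs[0]  # cross-entropy with the positive as the target

rng = np.random.default_rng(0)
c = rng.normal(size=8)
print(info_nce(c, c + 0.1 * rng.normal(size=8), rng.normal(size=(15, 8))))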
@snippletrap · 5 years ago
The ridge at 41:20, and the ambiguity it implies, calls to mind the gestalt idea of "multistability".
@whatsinthepapers6112 · 5 years ago
Sounds like we all need to put more energy into Energy-based models
@christianleininger2954 · 2 years ago
2:44 he says humans reach that level in 15 minutes of play, but only after at least 10 years of being alive and learning how the world works (physics, and predicting the future in the mind).
@minhvu8909 · 4 years ago
The slides: helper.ipam.ucla.edu/publications/mlpws4/mlpws4_15927.pdf
@robbiero368 · 4 years ago
So it actually takes us months, with millions of examples, to learn anything too, but what we learn first can be transferred to many other things later.
@robbiero368 · 4 years ago
For images, would it not make more sense to just predict the label for the missing "thing" rather than the actual pixels? How many humans could do that, after all?
@robbiero368 · 4 years ago
Actually, that's not true, is it? Our visual system is constantly replacing or imagining missing data.
@snippletrap · 5 years ago
The Chomskyans are right in part, for the same reason that LeCun mentions in the beginning of the lecture. What LeCun calls poor "sample efficiency" is what Chomsky calls "the poverty of the stimulus". Children require far less training data.
@visuality2541 · 5 years ago
this is gold
@_chip · 4 years ago
Why does he call his cost function an energy function? Isn’t that just a synonym?
@christoferberruzchungata2722 · 4 years ago
Because his loss IS BASED on the concept of how an energy function should behave. Not all loss functions are inspired by energy functions. I believe he emphasizes the "energy-based" idea to make a strong point that he is borrowing the concept from physics and natural systems.
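To make the distinction concrete: an energy-based loss shapes a scalar E(x, y) to be low on compatible pairs and high on incompatible ones, rather than directly measuring prediction error. A minimal sketch with a toy squared-error energy and a margin-based contrastive loss (all names and values here are illustrative, not LeCun's code):

import numpy as np

def energy(x, y, w):
    # Toy energy: squared error of a linear scorer; any scalar E(x, y) would do.
    return float((y - w @ x) ** 2)

def contrastive_loss(x, y_good, y_bad, w, margin=1.0):
    # Pull the energy of the observed pair down, and push the energy of an
    # incompatible pair up until the two are separated by at least `margin`.
    return energy(x, y_good, w) + max(0.0, margin - energy(x, y_bad, w))

w = np.array([0.5, -0.2])
x = np.array([1.0, 2.0])
print(contrastive_loss(x, y_good=0.1, y_bad=3.0, w=w))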
@ephi124 · 4 years ago
"Babies learn by observation with little interaction", yes and that's because they inherit such capability from their parents: their neurons are already fine-tuned to have those features and the question is how do we enforce these in our ML models?
@Rishabhshukla13 · 4 years ago
I guess pre-training is equivalent to that. So are genetic algorithms (in a different way, though).
@ephi124 · 4 years ago
@@Rishabhshukla13 Which tells me our approaches to mimicking biological neurons have been a fiasco. As he said, the way humans learn so quickly is neither supervised nor reinforced, but pre-training is. The only choice we have is to understand biological neurons (not just superficially) and how evolution works, and to see whether we have the resources to replicate them. And I'm not even sure it is necessary to mimic biology in order to build intelligent machines.
@vast634 · 4 years ago
@@ephi124 Neurons always work in groups within a cortical column. Artificial NNs treat them as singular logic elements, which is way too fine-grained and not their job in biology. The whole column is the logical element, not the single neuron.
@agiisahebbnnwithnoobjectiv228 · 3 years ago
The objective function of animal brains, and therefore of human-level AI, is impact maximization.
@agiisahebbnnwithnoobjectiv228 · 3 years ago
This is never gonna work
@johnjewell5008 · 3 years ago
I am all for asking questions, but when one of the premier AI researchers in the world is giving a talk, probably avoid asking about basic details of transformers, especially when they are not the main focus of the talk. Hahaha, this made me cringe a bit.