This presentation was a delight to watch! The ability of the NCP architecture to learn causal relationships and focus on what's important is impressive!
@prithvishah2618 1 year ago
As always, thank you very much :)
@ajaytaneja111 1 year ago
What a great lecture... Beautiful!
@antonkot6250 1 year ago
This approach to NNs looks very promising to me. Wish the best of luck to Dr. Hasani and his colleagues.
@Phliee 1 year ago
This is a truly great lecture!
@JoaoVitorBRgomes 1 year ago
I found the concepts of double descent, Kolmogorov functions, robustness, and effective dimensionality interesting (I think this is why dimensionality reduction techniques can be beneficial).
@aninvisibleneophyte 1 year ago
Great lecture, thank you for sharing! A gem on the horizon for all of us studying epistemology and ontology.
@mojganmadadi6242 1 year ago
Hi, thanks for the great videos. Quick question: how do you visualize the activated neurons, as at 41:09 in this video? Could you please share if there is any package or software to do so? Thanks a bunch! Really enjoyed your lectures.
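A minimal sketch of one way to get such a view, assuming you log per-neuron activations yourself into an array (the lecture's exact tool isn't named here, so this is just a generic matplotlib approach):

```python
# Hypothetical example: plot recorded neuron activations over time as a
# heatmap. `activations` stands in for values you logged from your model.
import numpy as np
import matplotlib.pyplot as plt

activations = np.random.rand(100, 19)  # placeholder: 100 time steps x 19 neurons

plt.imshow(activations.T, aspect="auto", cmap="viridis")
plt.xlabel("time step")
plt.ylabel("neuron index")
plt.colorbar(label="activation")
plt.title("Neuron activations over time")
plt.show()
```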
@superman39756 1 year ago
This course has helped me learn so much! Thank you! This lecture especially was amazing!
@liu973 1 year ago
The talk was fantastic, much like the other sessions in the series! I'd pinpoint this lecture as the point where I begin to sense a challenge in keeping pace. My takeaway was a general sense that LTC is leaning towards a more innovative strategy for enhancement rather than focusing solely on scaling and fine-tuning.
@tantzer6113 1 year ago
So, 19 neurons as opposed to a much larger number of neurons. But is the amount of computation genuinely reduced? These liquid neurons seem much more complex; so, are there really savings in complexity/computation?
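For intuition on that trade-off, here is a rough sketch of one liquid time-constant (LTC) style update step in the fused-solver form from Hasani et al.; all names, shapes, and the toy sizes are illustrative assumptions:

```python
# Rough sketch of one semi-implicit LTC update step. Each neuron does
# more work per step than a plain ReLU unit, but with only ~19 neurons
# the state and weight matrices stay tiny.
import numpy as np

def ltc_step(x, I, W, tau, A, dt=0.1):
    """x: (n,) state, I: (m,) input, W: (n, n+m) weights,
    tau, A: (n,) per-neuron parameters."""
    f = np.tanh(W @ np.concatenate([x, I]))  # dominant cost: O(n*(n+m))
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

n, m = 19, 4
x = ltc_step(np.zeros(n), np.random.rand(m),
             np.random.randn(n, n + m), tau=np.ones(n), A=np.ones(n))
```

So the per-neuron cost is indeed higher, but the overall parameter count is small, which is where the savings can come from.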
@khileshwarprasad9474 1 year ago
Yes, Quantisation is the term.
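For readers unfamiliar with the term, a minimal illustration of uniform int8 quantisation (a generic sketch, not tied to the lecture):

```python
# Map float weights to 8-bit integers plus one scale factor, then back.
import numpy as np

w = np.random.randn(19)
scale = np.abs(w).max() / 127.0
q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)  # quantise
w_hat = q.astype(np.float32) * scale                         # dequantise
print("max reconstruction error:", np.max(np.abs(w - w_hat)))
```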
@pavalep 1 year ago
Super Interesting Lecture 👍
@Adsgjdkcis 10 months ago
This guy starts by saying ML isn't an ad-hoc field, but halfway through he's introducing another ad-hoc architecture 🤣
@saharbehroozinia9700 1 year ago
Very beautiful!
@santiagoblas8214 1 year ago
Thank you!! This is gold
@abduljawad8689 1 year ago
Great lecture
@eddiejennings5262 1 year ago
Thank you! Great lecture. Very respectfully, I've thought about dynamical systems and neural networks, but not specifically with overparametrization, biologically inspired liquid networks, and dynamic causal models. Great examples of learning causal relationships.
@venkatasivagabbita788 1 year ago
It is not that n equations require n unknowns; it is that solving for n unknowns requires n equations. Why would equations require unknowns?
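A concrete instance: solving for two unknowns generally needs two independent equations, e.g. x + y = 3 and x - y = 1 determine x = 2, y = 1, while either equation alone leaves a whole line of solutions.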
@sheevys 8 months ago
Interesting view regarding the Kolmogorov-Arnold representation. His buddies at MIT just released the KAN paper; I wonder how this idea evolves.
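For reference, the Kolmogorov-Arnold representation theorem that KANs build on states that any continuous f on [0,1]^n can be written as

f(x_1, \dots, x_n) = \sum_{q=0}^{2n} \Phi_q\Big(\sum_{p=1}^{n} \phi_{q,p}(x_p)\Big)

for continuous one-dimensional functions \Phi_q and \phi_{q,p}; KANs replace these one-dimensional functions with learnable splines.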
@peki_ooooooo 1 year ago
Yeah, we need to explore new architectures for neural networks. Today's architectures mainly depend on backprop (one way to learn), and that method alone isn't right. Learning from the forward pass and from the result (the backward pass) should be the two main factors of the learning process.
@RajabNatshah 1 year ago
Thank you :)
@liujay1670 1 year ago
Hmm. I've thought about the reason for that for a while, and my conclusion is that this "double descent" occurs when applying CNNs to image data, where the convolution produces more samples than the original set (with some redundancy).
@umachandran1708 9 months ago
Excellent information on the mathematical structure of NNs. I appreciate his inspiring dedication 🙏
@daniel-mika 1 year ago
Can we get a source for the Reed et al. DeepMind paper that is mentioned around 10:00? I can't find the source.
@ELKADUSUNhalifesi 1 year ago
I think neural activation functions should also be space-dependent: both space- and time-dependent, I think.
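One hypothetical way to prototype that idea, purely as an illustration (nothing here is from the lecture): give the activation a per-neuron "spatial" slope that is also modulated by time.

```python
# Sketch of a space- and time-dependent activation: a leaky-ReLU whose
# negative slope varies per neuron (space) and oscillates with time.
import numpy as np

def spacetime_activation(z, alpha, t, omega=1.0):
    """z: (n,) pre-activations, alpha: (n,) per-neuron slopes, t: scalar time."""
    slope = alpha * (1.0 + 0.5 * np.sin(omega * t))
    return np.where(z > 0, z, slope * z)

z = np.random.randn(19)
out = spacetime_activation(z, alpha=np.linspace(0.01, 0.3, 19), t=2.5)
```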
@andrewferguson6901 1 year ago
Just wait till they figure out a way to run the neurons asynchronously :)
@Raymond_Cooper 1 year ago
I wonder how you got the idea to arrange the "Modern era of Statistics" 😊
@prod.kashkari3075 1 year ago
High-dimensional statistics is the real “modern” era of statistics.