The Hilbert Projection Theorem - Full Proof! (Theory behind Machine Learning)

4,722 views

ThatMathThing

1 day ago

Comments: 10
@pierattiliodigregorio3064 (1 month ago)
Finally, someone who takes a deep dive into the mathematical foundations.
@stretch8390 (6 months ago)
Man, what a channel. Awesome to find videos not shying away from technical topics.
@fanalysis6734 (8 months ago)
What about the role of Hilbert spaces in machine learning?
@JoelRosenfeld (8 months ago)
This is a buildup to answering that question. Essentially, projections are all over the place in machine learning, and that's where your approximations are ultimately coming from. Implementing a projection in a finite-dimensional space turns out to be equivalent to a gradient descent problem on a loss function, where finding that minimum guarantees the best approximation in the Hilbert space. Moreover, the representer theorem, which is at the core of things like support vector machines, is a direct consequence of the Hilbert Projection Theorem. We will talk about that in the next several videos. Sorry for the wait :)
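A minimal numerical sketch of the equivalence mentioned here (not from the video; the matrix and step size are illustrative assumptions): projecting a vector onto a finite-dimensional subspace via the normal equations and minimizing the squared loss with gradient descent land on the same point.

```python
import numpy as np

# Sketch: the orthogonal projection of y onto the column space of A is the
# minimizer of the loss L(c) = ||A c - y||^2 / 2, so closed-form projection
# and gradient descent on L should agree.

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 2))   # columns span a 2-D subspace of R^5
y = rng.standard_normal(5)        # vector to approximate

# Best approximation via the normal equations (orthogonal projection)
c_star = np.linalg.solve(A.T @ A, A.T @ y)
proj = A @ c_star

# Gradient descent on L(c); step size chosen small enough to converge here
c = np.zeros(2)
for _ in range(5000):
    c -= 0.05 * A.T @ (A @ c - y)  # gradient of L at c is A^T (A c - y)

print(np.allclose(A @ c, proj, atol=1e-6))
```

The Hilbert Projection Theorem is what guarantees this minimizer exists, is unique, and is characterized by the residual `y - proj` being orthogonal to the subspace.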
@davidg6324 (8 months ago)
That proof was very nice; I didn't know something as seemingly simple as the parallelogram law could be used this way!
@JoelRosenfeld (8 months ago)
It really is an unsung hero of functional analysis
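For readers following along, the identity being discussed is ||x + y||² + ||x - y||² = 2||x||² + 2||y||², which holds in any inner product space. A quick numeric spot check in Euclidean R⁴ (a sketch; the vectors are arbitrary):

```python
import numpy as np

# Sketch: verify the parallelogram law for two random vectors in R^4,
# where the norm comes from the standard Euclidean inner product.

rng = np.random.default_rng(1)
x, y = rng.standard_normal(4), rng.standard_normal(4)

lhs = np.linalg.norm(x + y) ** 2 + np.linalg.norm(x - y) ** 2
rhs = 2 * np.linalg.norm(x) ** 2 + 2 * np.linalg.norm(y) ** 2
print(np.isclose(lhs, rhs))
```

The law fails for norms that do not come from an inner product (e.g. the max norm), which is exactly why it can do so much work in a Hilbert space proof.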
@vtrandal (4 months ago)
The video does a great job of covering the Hilbert Projection Theorem and its proof but falls short of explicitly connecting this theory to machine learning, as the title suggests.
@JoelRosenfeld (4 months ago)
@vtrandal It is fundamental to machine learning, and establishing that connection is the point of this series of videos. More is coming.
@route66math77 (8 months ago)
Exquisite content -- thanks 🙂
@JoelRosenfeld (8 months ago)
I’m glad you like it! It means a lot to me.