Neural Ordinary Differential Equations

22,366 views

Andriy Drozdyuk

3 years ago

If you would like to see more videos like this, please consider supporting me on Patreon: www.patreon.com/andriydrozdyuk
The PDF of the slides used in this presentation can be downloaded here: github.com/drozzy/Neural-Ordinary-Differential-Equations-KZbin
0:00 - Outline of the presentation
0:38 - Some Cool Results
2:12 - What is a Neural ODE? (Machine Learning Part)
12:15 - Connection to Dynamical Systems
14:26 - Dynamical Systems
20:03 - Pendulum, Example of a Dynamical System
23:22 - Adjoint Method
28:45 - Adjoint Method Proof
30:49 - Gradients w.r.t. theta
32:40 - Complete Backprop Algorithm
34:27 - Concluding Remarks
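Since the outline above walks through the adjoint method and the complete backprop algorithm, here is a minimal runnable sketch of that gradient computation for a deliberately simple scalar ODE, dz/dt = θ·z with loss L = ½(z(T) − y)². Everything here (the names z0, theta, target, the Euler steps, and the choice of f) is an illustrative assumption, not taken from the talk:

```python
import numpy as np

def f(z, theta):
    # Toy dynamics dz/dt = theta * z (a stand-in for a neural network)
    return theta * z

def forward(z0, theta, T, n):
    """Euler-integrate z forward, storing the trajectory for the backward pass."""
    dt = T / n
    zs = [z0]
    for _ in range(n):
        zs.append(zs[-1] + dt * f(zs[-1], theta))
    return zs

def adjoint_grad(z0, theta, T, n, target):
    """Backward pass of the adjoint method:
       da/dt = -a * df/dz,   dL/dtheta = -∫_T^0 a * df/dtheta dt."""
    dt = T / n
    zs = forward(z0, theta, T, n)
    a = zs[-1] - target          # a(T) = dL/dz(T) for L = 0.5*(z(T)-y)^2
    g = 0.0                      # accumulates dL/dtheta
    for k in range(n, 0, -1):
        g += dt * a * zs[k]      # df/dtheta = z for this toy f
        a += dt * a * theta      # step da/dt = -a*theta backwards in time
    return g

z0, theta, T, target = 1.0, 0.3, 1.0, 2.0
g_adj = adjoint_grad(z0, theta, T, 4000, target)

# Closed-form check: z(T) = z0*exp(theta*T), dL/dtheta = (z(T)-y)*z0*T*exp(theta*T)
zT = z0 * np.exp(theta * T)
g_exact = (zT - target) * z0 * T * np.exp(theta * T)
print(g_adj, g_exact)
```

With a linear f the adjoint gradient can be checked against the closed-form solution, which is what the final comparison does; a real neural ODE would replace f with a network and the Euler loops with an ODE solver.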

Comments: 22
@shorray 3 years ago
Herr YEAH! I was fighting for like 2 weeks with the adjoint method and nobody really explained it like this in detail. Thanks a lot, keep going!
@kodfkdleepd2876 1 year ago
Or maybe it just took you 2 weeks to get it and you just happen to be watching this video when it "clicked"?
@keb785 2 months ago
This is very helpful; I appreciate it as it provides a comprehensive review with detailed explanations
@tanmayahmed4622 3 years ago
Thank you very much, Sir. This is by far the easiest explanation of neural ODEs.
@vishwajitkumarvishnu3878 3 years ago
Best video/blog so far on neural ODEs
@mohamedmusa7149 2 years ago
Excellent exposition of the paper! Thank you.
@fbf3628 1 year ago
This is a truly great explanation!
@jishnuak3000 1 year ago
Thanks for explaining the proof; I couldn't find it anywhere else.
@siddharthshrivastava5823 3 years ago
Awesome explanation!!
@mswification 1 year ago
I agree with all the previous comments, this was a terrific explanation. I particularly appreciated that you included details of the proof of the adjoint method.
@Rjsipad 1 year ago
Could you explain the difference between lower-case f and theta? I'm a bit confused as to how they are different.
@hannes7218 11 months ago
great explanation! :)
@gamebm 8 days ago
Thanks for the video and detailed derivation. There is a question/comment that really puzzles me. First, if L is a number which measures the deviation (e.g. the absolute difference) of the estimate z(t) from the real value, then the mapping z(t) -> L is a functional by definition. We would then have \delta L/\delta z(t) = a(t), naturally defined as a functional derivative. However, as I tried to follow the arguments used in this video (29:28), I realized that this changes a lot (as dz(t+\Delta t)/dz(t) does have its counterpart in the context of functionals). So I was forced to understand that the notion of a functional is irrelevant here: one defines the number L as a function of another number z(t), which is a function evaluated at a given time but must not be understood as a function of time. In that enforced context, the derivation then makes sense. PS: please do not use "partial" in the numerator and "d" in the denominator, as I don't believe this is standard.
@gamebm 8 days ago
On second thought, a(t) is indeed reminiscent of the functional derivative defined as a(t) = \delta L/\delta z(t). It is very inviting to state that for z(t) satisfying \dot{z} = f(z,t), one has \dot{a} = -a \partial f/\partial z. Except that in its variation the function z(t) is not fixed at both end points, as the definition of the functional differential requires; therefore, if one follows that path (which mostly should work), one must introduce some proper modifications.
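The adjoint relation discussed in this thread, \dot{a} = -a \partial f/\partial z with a(t) = dL/dz(t), can be sanity-checked numerically: below, a(t) at an interior time is computed once by finite differences through the forward integration and once by integrating the adjoint ODE backwards from a(T). The scalar dynamics and all names here are illustrative assumptions, not from the video:

```python
import numpy as np

theta, T, z0, target, n = 0.3, 1.0, 1.0, 2.0, 2000
dt = T / n

def integrate(z, steps):
    # Forward Euler for dz/dt = theta * z
    for _ in range(steps):
        z = z + dt * theta * z
    return z

def loss(zT):
    return 0.5 * (zT - target) ** 2

k = n // 2                      # pick the interior time t = T/2
z_mid = integrate(z0, k)

# Finite-difference sensitivity a(t_mid) = dL/dz(t_mid)
eps = 1e-6
a_fd = (loss(integrate(z_mid + eps, n - k))
        - loss(integrate(z_mid - eps, n - k))) / (2 * eps)

# Adjoint ODE da/dt = -a * df/dz = -a*theta, integrated backwards
# from the terminal condition a(T) = dL/dz(T) = z(T) - target
a = integrate(z0, n) - target
for _ in range(n - k):
    a += dt * a * theta         # one backward-in-time Euler step
print(a_fd, a)
```

Because f is linear here, the two numbers agree essentially to rounding error; for a general f the agreement would be up to the discretization error of the solver.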
@bwan03 3 years ago
Brilliant! You're really good at explaining, I must say. Excellent job! May I please ask what you used for the presentation and for drawing equations, Andriy?
@AndriyDrozdyuk 3 years ago
Thanks! I think it was GoodNotes with iPad screen recording and an Apple Pencil. (I just cut the surrounding window borders in the final recording.)
@tenkunvan60 1 month ago
28:44, I think the backward equation of the adjoint method might be wrong and the integral term should be negative.
@francoisgauthier-clerc6413 3 years ago
Good job, very clear explanation! However, it's a pity that you didn't show an implementation of the function f. How can we design and implement such a continuous function?
@AndriyDrozdyuk 3 years ago
Oh, that function doesn't exist as a closed-form expression - that's just for explanation purposes. This is basically what the ODE solver does.
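To illustrate the reply above: in practice f is simply a small neural network, and the ODE solver repeatedly evaluates it to produce the "continuous" trajectory. A toy sketch follows, with random untrained weights and a plain Euler loop standing in for a real solver (such as SciPy's solve_ivp); all names here are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.5, (8, 2))   # hidden-layer weights (placeholder, untrained)
W2 = rng.normal(0, 0.5, (2, 8))   # output-layer weights (placeholder, untrained)

def f(z, t):
    """The learned dynamics dz/dt = f(z, t): just an ordinary 2-layer MLP."""
    return W2 @ np.tanh(W1 @ z)

def odeint_euler(f, z0, t0, t1, n=100):
    """Toy fixed-step Euler solver standing in for a real adaptive one."""
    z, dt = np.array(z0, dtype=float), (t1 - t0) / n
    for i in range(n):
        z = z + dt * f(z, t0 + i * dt)
    return z

# The "output" of the neural ODE layer is the state at the end time.
zT = odeint_euler(f, [1.0, -1.0], 0.0, 1.0)
print(zT)
```

Training would then adjust W1 and W2 (e.g. via the adjoint method) so that the integrated trajectory fits the data.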
@XupyachkaX 1 year ago
Hello, Andriy! I still can't understand why we should run neural networks on differential equations. Differential equations are built on laws, while neural networks are, as I understand it, clever interpolations that only work in a narrow, trained range of values. Or can a neural network construct a compact differential equation from observations of a dynamical system, by solving the inverse problem?
@boriscrisp518 18 hours ago
Illegible hand-written scrawl... just like my undergrad days.
@danielschwegler5220 1 year ago
Thanks for the superb explanation!