David Duvenaud | Reflecting on Neural ODEs | NeurIPS 2019

27,168 views

Preserve Knowledge

1 day ago

Comments: 12
@tookerjerbs 5 years ago
I'm afraid this talk gave some people the impression that I was flippant about careful scholarship, or wrote a misleading paper with no substance. I want to be clear that I do take these things seriously, and that the paper made substantial contributions which still stand. The message I intended was that *even though we tried hard to be careful*, we still made mistakes, which we eventually fixed.

Regarding novelty, I glossed over some details in the talk and maybe made things sound worse than they were. The original version of the paper clearly cited previous uses of the adjoint sensitivities method. The claim that Joel was annoyed about was that we (mistakenly) thought we were the first to have done so entirely using general and efficient vector-Jacobian products with autodiff. To check this claim, we had closely examined (and cited) the implementations in Stan, Fatode and Dolfin. After we published, Joel Andersson pointed out that his package, CasADi, did do a general vector-Jacobian implementation, which we cited in the next version. I think I also gave the wrong impression when I said "the only thing we're doing is bringing this to PyTorch" - I was just referring to the adjoint sensitivities method. There is plenty of other novelty in the paper.

I also want to update the story about the MIT Tech Review article. After this talk, Karen Hao explained to me what she had meant about the 'inventing ODEs' claim. She initially thought that we were calling our new method 'ODE Solvers', which is why the first version of the article said we were proposing a new method with that name. As for the line about how we "need to work on our branding", she said: "That line was meant to tease the fact that you simply named your new neural network very literally, after ODEs, instead of choosing a simpler, perhaps more figurative, name. (Similar to if I had invented a new apple cutting device and just called it 'apple cutting device', if you catch my drift.) Of course, I see now why it made it sound like you were the first to ever string together the words 'ordinary differential equations.' Hence why I corrected it upon request." Until that conversation, I didn't realize that Karen Hao already knew about ODEs. The original version of the article did clearly give the impression that we had invented ODE solvers, but I'm sorry for having passed on my mistaken impression in this talk that Karen didn't understand ODEs at all.
@freddiekalaitzis5708 5 years ago
"you can't cross the streams with Neural ODEs" I see what you did there
@Erotemic 5 years ago
This is how you science. Great job at owning your mistakes and valuing truth over profit.
@vicktorioalhakim3666 5 years ago
The fact that he got away with such bullshit HIGHLIGHTS the problems with the modern review process and *doing Science*, especially in ML. Neural ODEs... LOL There's no way you can convince me that this bullshit based on Euler's method is actually useful XDD
@linglingfan8138 3 years ago
It works, and people are using it. That explains everything. I like this work a lot.
@keixi512 5 years ago
Really appreciate his honesty.
@revoiceful 5 years ago
Very valuable to see someone admitting to failures in science. Still, I think the idea is refreshing, and we need more of that, even if there might be problems to begin with. Isn't that how science is supposed to work anyway?
@loremipsum7513 5 years ago
Simply respect.
@dewinmoonl 5 years ago
pay respect in the chat
@impolitevegan3179 4 years ago
F
@bocckoka 4 years ago
his mannerisms are like Elon Musk's