Numerical Differentiation: Second Derivatives and Differentiating Data

  28,443 views

Steve Brunton

1 day ago

Comments
@lgl_137noname6
@lgl_137noname6 2 years ago
First homework @ 2:22, second homework @ 3:55, third homework @ 6:37
@rizkamilandgamilenio9806
@rizkamilandgamilenio9806 1 year ago
Do you know where the solution to the first homework is?
@seabasschukwu6988
@seabasschukwu6988 9 days ago
@@rizkamilandgamilenio9806 There is no solution, brother
@jamesrav
@jamesrav 2 years ago
Better lectures than my professors in college (eons ago), and for free! Admittedly, textbooks are good for this type of material, since it's pretty straightforward math, but it's nice to hear it explained out loud.
@clarenceroopnarine6180
@clarenceroopnarine6180 2 years ago
I appreciate your genuine effort to make these lectures as clear and understandable as possible. They are excellent! Well done!
@byronreams7307
@byronreams7307 1 year ago
Around 33:00, Steve talks about the error at the endpoints; specifically, the error for the derivative of sin(x) was small because sin(0) and sin(pi) have zero slope. To reinforce that, you can change the original function f from sin(x) to cos(x), and then in the plot change the 'true derivative' to np.sin(-x) (minus because the derivative of cos(x) is -sin(x), which is the same as sin(-x)). It clearly illustrates that the backward and forward differences used at the endpoints are not as accurate: the computed derivative at each end deviates significantly from the true derivative.
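That experiment can be sketched in a few lines. This is a minimal sketch, not the notebook from the video; the grid size n = 64 and the interval [0, pi] are arbitrary choices:

```python
import numpy as np

# Differentiate f(x) = cos(x); the true derivative is -sin(x) = sin(-x).
n = 64                                    # assumed grid size
x = np.linspace(0, np.pi, n)
dx = x[1] - x[0]
f = np.cos(x)

dfdx = np.zeros(n)
dfdx[1:-1] = (f[2:] - f[:-2]) / (2 * dx)  # central difference, O(dx^2)
dfdx[0] = (f[1] - f[0]) / dx              # forward difference,  O(dx)
dfdx[-1] = (f[-1] - f[-2]) / dx           # backward difference, O(dx)

err = np.abs(dfdx - np.sin(-x))           # pointwise error vs true derivative
# The first-order endpoint errors dwarf the second-order interior error,
# because the true derivative is steepest exactly at x = 0 and x = pi.
```

The endpoint errors are first order in dx because one-sided differences are used there, while the interior central difference is second order; with f = cos(x) the deviation at the ends is easy to see in the plot.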
@thomasjefferson6225
@thomasjefferson6225 1 year ago
Hot damn. I watched this video because I didn't understand my lecture. Thanks Steve, you helped a lot. I went to look at this week's voluntary assignment (in Europe we don't get graded homework), and the second question, after doing the central approximation for the first and second derivatives, is to multiply delta out by 2! Hilarious! Thanks a million Steve, you're helping this adult learner more than you can imagine.
@smallwang
@smallwang 2 years ago
A reminder about the paper~
@ash9788
@ash9788 1 year ago
25:25 The papers you wanted to link are missing here.
@Eigensteve
@Eigensteve 1 year ago
Thanks for the reminder!! www.pnas.org/doi/10.1073/pnas.1814058116
@woowooNeedsFaith
@woowooNeedsFaith 2 years ago
Links for 25:33, please? I also wonder why most of the videos in this playlist are marked as unlisted. Those videos hardly get any views; apparently they won't be recommended.
@joaopedrorocha4790
@joaopedrorocha4790 3 months ago
I have some experience with the telescope thing... beyond the discrete timing of the pictures themselves, I would add the gaps due to clouds coming in, turbulence in the sky, instruments going nuts while you troubleshoot them, falling asleep, realizing you've fallen asleep and going to get some coffee, bathroom breaks, instrument re-calibration pauses, and so on... hehehe. It's great to know that the schemes work without modification for unevenly spaced points!!!
@klave8511
@klave8511 2 years ago
Is the forward or backward difference error really an error, or is it just a time-shifted (nearly) exact value? In other words, will a small time shift reduce the error over the whole data set? Measured data (y-axis) will have discrete (integer) values, which already puts a limit on the achievable error. Computed data (e.g. FEA) can obviously use floats, so simulations will be more concerned about the theoretical error.
@naturallyinterested7569
@naturallyinterested7569 2 years ago
No, it really is an error. Here we are just talking about a function of one variable, which could be x, y, z, w, g, q, etc., and also t. The error is in df/dx itself, so (viewing error naively as a "spread" of values) it is a spread of the derivative: the real derivative is only "somewhere close" to our calculated value. Now, for a function of time, say a position f(t), this error would be in the velocity (df/dt), which leads to a spread in the future position (which depends on the current velocity we just calculated with error), so it is genuine uncertainty. The future velocity/position we get from a simulation can be off to a degree that depends on this error, so a higher-order error (as in O(dt^n)) is better, since increasing the time resolution shrinks it and gives us confidence in the result.
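Both sides of this exchange can be checked numerically: relative to f'(x) the forward difference carries a genuine O(dx) error, while relative to the half-step-shifted point f'(x + dx/2) it is O(dx^2) accurate. A minimal sketch (the function sin and the step size 0.1 are arbitrary choices):

```python
import numpy as np

dx = 0.1                                  # assumed step size
x = np.linspace(0, 2 * np.pi, 200)
fwd = (np.sin(x + dx) - np.sin(x)) / dx   # forward difference of sin

err_at_x = np.abs(fwd - np.cos(x))              # vs f'(x):        O(dx)
err_shifted = np.abs(fwd - np.cos(x + dx / 2))  # vs f'(x + dx/2): O(dx^2)
# The same numbers are a much better estimate of the derivative half a
# step ahead, but as an estimate of f'(x) the O(dx) error is genuine.
```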
@milos_radovanovic
@milos_radovanovic 2 years ago
Could you analyze in a video the error behavior of first-order numerical differentiation techniques compared with the "Complex Step Differentiation" showcased on MathWorks Blogs?
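For context, complex-step differentiation estimates f'(x) as Im(f(x + ih))/h for an analytic f; since no subtraction of nearly equal numbers occurs, h can be made tiny without round-off blowing up. A minimal sketch (the test function and step sizes are arbitrary choices):

```python
import numpy as np

def complex_step(f, x, h=1e-20):
    """First derivative via f'(x) ~ Im(f(x + i*h)) / h.

    No subtractive cancellation occurs, so h can be tiny without
    round-off error exploding (f must be analytic)."""
    return np.imag(f(x + 1j * h)) / h

f = lambda x: np.exp(x) * np.sin(x)              # arbitrary analytic test
x0 = 1.0
true = np.exp(x0) * (np.sin(x0) + np.cos(x0))    # exact derivative

cs = complex_step(f, x0)                 # accurate to machine precision
fd = (f(x0 + 1e-8) - f(x0)) / 1e-8       # forward difference, for contrast
```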
@apancem7192
@apancem7192 2 months ago
Thank you very much
@chensong254
@chensong254 2 years ago
At 32:12, I think the reason there is a larger error in the middle is that there is a bug. The central difference scheme should actually be dfdx[i] = (f[i + 1] - 2 * f[i] + f[i - 1]) / (x[i + 1] - x[i - 1])
@woowooNeedsFaith
@woowooNeedsFaith 2 years ago
I think you are confusing it with the 2nd-derivative central difference formula at 10:43. (I had a similar false memory/confusion.) The code example at 26:27 is the first-derivative central difference (as the code title says). And the comment at 32:08 originates from the previous lecture (Numerical Differentiation with Finite Difference Derivatives) in this playlist.
@chensong254
@chensong254 2 years ago
@@woowooNeedsFaith Thank you for the clarification!
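To make the distinction in this thread concrete, here are the two central difference formulas side by side; this is a minimal sketch, with the grid and the test function sin(x) as arbitrary choices:

```python
import numpy as np

n = 101                       # assumed grid
x = np.linspace(0, np.pi, n)
dx = x[1] - x[0]
f = np.sin(x)

# First-derivative central difference (the scheme in the 26:27 code), O(dx^2):
dfdx = (f[2:] - f[:-2]) / (2 * dx)

# Second-derivative central difference (the 10:43 formula), O(dx^2):
d2fdx2 = (f[2:] - 2 * f[1:-1] + f[:-2]) / dx**2

err1 = np.abs(dfdx - np.cos(x[1:-1])).max()     # f'  = cos(x)
err2 = np.abs(d2fdx2 + np.sin(x[1:-1])).max()   # f'' = -sin(x)
```

Note the numerators and denominators differ: the three-point numerator with the -2*f[i] term belongs to the second derivative and is divided by dx**2, not by the 2*dx of the first-derivative scheme.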
@marco_burderi
@marco_burderi 1 year ago
Thank you so much for the amazing work that you do. It's just fantastic.
@Eigensteve
@Eigensteve 1 year ago
Glad you enjoy it!
@alyssonrpg
@alyssonrpg 2 years ago
Have you written a book (or do you have one in mind to recommend) that covers this subject, as a complementary source? :)
@enisten
@enisten 2 years ago
He has a book called "Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control", but according to the table of contents you can check out on Amazon, it doesn't specifically cover this subject. It covers more advanced subjects.
@MrHaggyy
@MrHaggyy 2 years ago
He has not, but you might want to pick up a book on numerical methods or numerical analysis if you want a deep dive into the math; Springer and Hugendubel both carry books with those titles. Numerical Python is a book that takes a more applied approach in Python. But once you can code this stuff up in Python, you can code it up in basically any language you want.
@mohammedhassan5571
@mohammedhassan5571 2 years ago
Excellent lecture. Excuse me, I have a question: why haven't you explained in any video how to solve non-linear systems of PDEs? I want to learn about them and model some complex phenomena of heat and mass transfer. I really want to know about the famous solution algorithms SIMPLE, SIMPLER, Coupled, and PISO.
@rushabhyeshwante
@rushabhyeshwante 1 year ago
For learning SIMPLE, there's a YouTube channel named Fluid Mechanics 101 by Aidan Wimshurst.
@elwood.downey
@elwood.downey 2 years ago
What if the data samples are not taken at regular intervals?
@MrHaggyy
@MrHaggyy 2 years ago
XD Shoot the one collecting the data. Jokes aside, there are reasons to store data at non-fixed time points. You get a different timestamp delta t for the forward or backward difference, but it works the same. If you compute a central difference or any higher-order difference, you get different delta t's between timestamps in each calculation: instead of 2*dt you get sums of dt's, like dt_-1 (from -1 to 0) + dt_1 (from 0 to 1). You might want to take a line or a square (something simple) and do it by hand alongside the Taylor expansion before you code it up. Keep in mind that your error will vary over the dataset, so you might want to assume the worst error for all data points, or compute the error alongside the derivative.
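The unevenly spaced case described above can be sketched with the standard second-order weights obtained from a local quadratic fit through (x[i-1], x[i], x[i+1]); the random grid and the test function sin(x) are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, np.pi, 200))   # unevenly spaced sample points
f = np.sin(x)

a = x[1:-1] - x[:-2]                      # left spacing,  dt_-1
b = x[2:] - x[1:-1]                       # right spacing, dt_1

# Second-order interior formula from a quadratic through three points:
dfdx = (-b / (a * (a + b)) * f[:-2]
        + (b - a) / (a * b) * f[1:-1]
        + a / (b * (a + b)) * f[2:])

err = np.abs(dfdx - np.cos(x[1:-1]))      # error varies with local spacing
```

On a uniform grid (a = b = dx) the weights collapse to the familiar (f[i+1] - f[i-1]) / (2*dx); NumPy's np.gradient(f, x) applies the same second-order interior formula when given a nonuniform x.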
@jessicapriscilacerqueiraba3493
@jessicapriscilacerqueiraba3493 10 months ago
Thanks
@NoamWhy
@NoamWhy 2 years ago
Call me lazy, but I just do a polynomial regression on N neighboring points, and the coefficients of this polynomial give me the first, second, third, ... derivatives of the function at the origin. Done! You're welcome 🙂
@aarontoderash6028
@aarontoderash6028 1 year ago
Thank you so much for this comment.
@NoamWhy
@NoamWhy 1 year ago
@@aarontoderash6028 Any time!
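The local polynomial-regression trick described in this thread can be sketched with np.polyfit: fit a degree-deg polynomial to the N samples nearest x[i], shifted so x[i] is the origin, and read the k-th derivative off as k! times the coefficient of t**k. The window size N = 7, the degree 4, and the helper name poly_derivs are all illustrative choices:

```python
import numpy as np
from math import factorial

def poly_derivs(x, f, i, N=7, deg=4):
    """Estimate derivatives of f at x[i] from a local polynomial fit.

    Fits a degree-`deg` polynomial to the N samples nearest x[i]
    (shifted so x[i] is the origin); the k-th derivative at x[i]
    is then k! times the coefficient of t**k."""
    half = N // 2
    s = slice(max(0, i - half), i - half + N)
    c = np.polyfit(x[s] - x[i], f[s], deg)[::-1]   # c[k] multiplies t**k
    return np.array([factorial(k) * c[k] for k in range(deg + 1)])

x = np.linspace(0, np.pi, 101)
f = np.sin(x)
d = poly_derivs(x, f, 50)     # derivatives of sin at x[50] = pi/2
# d[1] ~ cos(pi/2) = 0 and d[2] ~ -sin(pi/2) = -1
```

A side benefit is that the least-squares fit smooths noise (this is the idea behind Savitzky-Golay filtering), at the cost of more work per point than a fixed finite-difference stencil.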
@milos_radovanovic
@milos_radovanovic 2 years ago
Am I permabanned from commenting on your videos? YT keeps deleting my comments and questions.
@arturoeugster7228
@arturoeugster7228 2 years ago
A far better way to differentiate numerically, evading the huge errors due to high-frequency inaccuracy in the data, is to take an FFT, zero out the higher terms, multiply by the indices, and take an IFFT: a reliable, fast way to avoid noise amplification. (FFT = fast Fourier transform.)
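That recipe (spectral differentiation with a low-pass cutoff) can be sketched as follows; the grid, test signal, noise level, and cutoff wavenumber 10 are all arbitrary choices, and the signal is assumed periodic on the domain:

```python
import numpy as np

n = 256
L = 2 * np.pi
x = np.arange(n) * L / n                  # periodic grid on [0, 2*pi)
rng = np.random.default_rng(1)
f = np.sin(3 * x) + 0.01 * rng.standard_normal(n)   # noisy signal

fhat = np.fft.fft(f)
k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)   # wavenumbers (integers here)
fhat *= 1j * k                               # differentiate in Fourier space
fhat[np.abs(k) > 10] = 0                     # low-pass: drop noisy high modes
dfdx = np.real(np.fft.ifft(fhat))

err = np.abs(dfdx - 3 * np.cos(3 * x)).max()
# Plain central differences amplify the noise instead:
fd_err = np.abs(np.gradient(f, x) - 3 * np.cos(3 * x)).max()
```

Multiplying fhat by 1j*k scales each mode by its wavenumber, which is exactly why unfiltered differentiation amplifies high-frequency noise; the cutoff is what tames it.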
@ECHO2LIGHT
@ECHO2LIGHT 7 months ago
Using t instead of x is confusing in handwriting.