What is phi \ x0? What does the backslash do as an operation? I understand these are initial conditions.
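For anyone with the same question: in MATLAB, `b = Phi \ x0` is the linear (least-squares) solve, returning the mode amplitudes `b` that best satisfy `Phi @ b = x0` for the given initial condition. A minimal NumPy sketch (sizes and names illustrative, not the lecture's notation):

```python
import numpy as np

# Hypothetical DMD-style setup: Phi holds modes in its columns,
# x0 is the initial state; sizes are illustrative.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((100, 5))   # 5 modes of a 100-dim state
b_true = np.array([1.0, -2.0, 0.5, 3.0, -1.0])
x0 = Phi @ b_true                     # initial condition built from the modes

# MATLAB's  b = Phi \ x0  for a tall Phi is the least-squares solve:
b, *_ = np.linalg.lstsq(Phi, x0, rcond=None)
print(np.allclose(b, b_true))         # recovers the mode amplitudes
```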
@borisfilipovic525313 days ago
The amplitude grows when the perturbation frequency is half of the natural frequency of the system? Is there a proof?
@3xor3y15 days ago
I don't understand why we need M when we can find the gappy modes directly from the original higher-dimensional modes: U_gappy = P@U_r?
@AhmedGharieb-q9t13 days ago
I think M is the model
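Whatever M denotes in the lecture's notation, the step that gappy POD adds on top of forming U_gappy = P@U_r is a solve: the gappy modes are no longer orthonormal, so the coefficients must be found by least squares and then lifted back with the full modes. A rough NumPy sketch (all names and sizes illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, r, p = 200, 4, 12          # full dim, rank, number of point sensors

U_r = np.linalg.qr(rng.standard_normal((n, r)))[0]  # orthonormal POD basis
a_true = rng.standard_normal(r)
x = U_r @ a_true               # full state (lives in the low-rank subspace)

# P picks out p entries of the state (gappy / point measurements)
idx = rng.choice(n, size=p, replace=False)
P = np.zeros((p, n))
P[np.arange(p), idx] = 1.0

y = P @ x                      # the only data actually observed
U_gappy = P @ U_r              # gappy modes: no longer orthonormal

# The extra solve is the point: least squares in the gappy basis,
# then lift back to the full space with the original modes.
a_hat, *_ = np.linalg.lstsq(U_gappy, y, rcond=None)
x_hat = U_r @ a_hat
print(np.allclose(x_hat, x))
```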
@sdsa00717 days ago
Thank you! Your auto-white board and your emphasis on what is important in the equations helps me focus!
@sajidhaniff0122 days ago
Outstanding explanation with a clear emphasis on eigenfunction expansion using an orthonormal basis.
@erickappel412028 days ago
Nathan Kutz for president! Very nice and informative lecture.
@ehsankeshavarz2187 a month ago
Thank you Professor🙏
@mauriziolazzarini3018 a month ago
What does it cost to compute a QR decomposition or an SVD?
@JRudd a month ago
Wow, great explanations! It makes sense.
@serhatistin3116 a month ago
A gem channel 😊, brilliant guy
@eastindiaV a month ago
That's BS dude, it's just a silencer; those things are extremely loud. They were used as terror weapons in WW2 for a reason. There are ways of using the same thing and making it quieter and more practical.
@Bituman1293 a month ago
I would like to thank you, Professor Kutz, for the tremendous effort and dedication you have put into creating this video and so many others over the years. Your commitment to spreading knowledge to everyone has been truly impactful, as you have opened doors to learning opportunities that many can't afford or access. I sincerely hope you continue this extraordinary work, not only because of its immense value to others, but more importantly because it is clear that you find true joy and fulfillment in what you do. Your passion for research and teaching is motivating and drives so many to follow your paths. Your work truly makes a difference, and I wish you nothing but continued happiness and success in all that you do.
@WenjieZhang-up7gl a month ago
nice
@JI77469 a month ago
Love the video, but I think it can be a bit confusing going from 5:30 where you just have "f(x_k, A_2, ..., A_M)" to 9:40 where you have "f_k(x_k, A_2, ..., A_M)." Looking at this, I initially thought that you might have a new fitting function for each data point, rather than the fitting function with the kth data point plugged in.
@umerhuzaifa4927 a month ago
No audio unfortunately.
@msuegri a month ago
In minute 18, shouldn't it be sqrt of lambda in u(x)?
@kamiljan11312 months ago
My god, thank you so much for that piece of art. I might survive Fourier.
@MuhammadRizwan-em7ev2 months ago
Wow, this explanation is incredible! You truly make complex ideas feel simple and exciting, just like Richard Feynman did. You're the Feynman of this era!
@TheRsmits2 months ago
Watching this again and realized the problem of over-fitting is analogous to the 'no true Scotsman' fallacy in philosophy where someone has given themselves so many parameters that they can fit any data.
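The analogy can be made concrete: give a model as many parameters as data points and it will fit anything, including pure noise. A small NumPy illustration (degree and sizes arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0, 1, 8)
y = rng.standard_normal(8)        # pure noise -- nothing to "learn"

# With as many parameters as data points (degree 7 -> 8 coefficients),
# the polynomial interpolates the noise exactly.
coeffs = np.polyfit(x, y, deg=7)
y_fit = np.polyval(coeffs, x)
print(np.allclose(y_fit, y, atol=1e-6))
```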
@AlloyedFrequencies2 months ago
thank you
@JunhanJiao2 months ago
Absolutely great explanation, great work.
@ojan-qg9js2 months ago
Thanks sweetie pumpkin
@teriyaki86432 months ago
Amazing how simple these concepts seem when explained calmly and in full sentences. Thank you!
@extro46532 months ago
The phases of the stereo channels in the audio are inverted relative to one another.
@gk45392 months ago
30:40
@gk45392 months ago
15:00!
@micahdelaurentis65512 months ago
Choose a different way of explaining than saying "kick"; it was confusing to me. Start out precise, and once it's clear you can say "kick" all you want.
@ThompPL12 months ago
Is the "logistic map" involved in this somehow? i.e. "bifurcation" 🤔
@nurkkr2 months ago
this should be classified.
@abdolreza822 months ago
Thanks so much for sharing this professor! I have a question: at time 5':30" of your lecture you said unlike PCA, this associates every mode with the time dynamics. Is this also true for Singular Spectrum Analysis? Thanks!
@Mohammadkeshtkar-w4d2 months ago
You are amazing👏👏👏👏
@ArkaRoychoudhury2 months ago
Professor, I wasted my time in countless other lectures before coming to yours. How easily you can explain complex things using such basic terminology is wonderful. Mind blown.
@dominicquick1072 months ago
best explanation out there
@commonwombat-h6r3 months ago
nice!
@commonwombat-h6r3 months ago
great video!
@erickgomez77753 months ago
Low-rank structure... or a low-dimensional manifold.
@erickgomez77753 months ago
Make Things Linear Again
@kaipoff71133 months ago
omg nathan r u okay???? 19:45
@NicholasFranz3 months ago
unfortunately, this was the moment he died
@gk45393 months ago
also love that he is wearing a coat!!!
@givemeArupee3 months ago
What book can I use to follow along?
@Mayeverycreaturefindhappiness3 months ago
what camera were they using?
@mohammedelsharkawy65413 months ago
At 4:23 in the continuous-time shifting example, isn't the shifted signal below supposed to be x(t + t0), since the signal is shifted to the left?
@mehdikoochak96782 months ago
It's not mentioned that t_0 > 0; depending on its sign the signal can be shifted to the right or to the left, but I can see what you're saying :)
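The sign convention being debated here is easy to check numerically: for t0 > 0, x(t − t0) moves the signal to the right (a delay), while x(t + t0) moves it to the left (an advance). A small discrete sketch (pulse position arbitrary):

```python
import numpy as np

t = np.arange(10)
x = lambda t: np.where(t == 3, 1.0, 0.0)   # unit pulse at t = 3

t0 = 2
y_right = x(t - t0)   # x(t - t0): pulse moves to t = 5 (shift right / delay)
y_left  = x(t + t0)   # x(t + t0): pulse moves to t = 1 (shift left / advance)

print(int(np.argmax(y_right)), int(np.argmax(y_left)))  # → 5 1
```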
@GeoffryGifari3 months ago
From how you mentioned the "tuning" part, it seems like there's still an "art" to neural networks.
@meloyler93183 months ago
I'm assigning this as a course project for a grad student. Thanks
@individuoenigmatico19903 months ago
At 26:00, in the figures you are representing the real parts of e^(iω·n), i.e. cos(ω·n). What you noticed is a symmetry of the signals around the frequency π: due to the properties of cosine we have, for all integer n, cos([π+ω]·n) = cos([π−ω]·n), even though the complex signals e^(i[π+ω]·n) ≠ e^(i[π−ω]·n), since sin([π+ω]·n) = −sin([π−ω]·n).
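For integer n, this mirror symmetry about frequency π is quick to verify: the cosines at π + ω and π − ω coincide while the sines are negated, so the complex exponentials differ even though their real parts agree. A numerical check (ω arbitrary):

```python
import numpy as np

n = np.arange(8)     # integer sample indices
w = 0.7              # arbitrary frequency offset in (0, pi)

s_plus = np.exp(1j * (np.pi + w) * n)    # frequency pi + w
s_minus = np.exp(1j * (np.pi - w) * n)   # frequency pi - w

print(np.allclose(s_plus.real, s_minus.real))    # cosines match -> mirrored spectrum
print(np.allclose(s_plus.imag, -s_minus.imag))   # sines are negated
print(np.allclose(s_plus, s_minus))              # the complex signals still differ
```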
@thabiso-n6b3 months ago
Wow I am not working hard enough😅
@nagakiranmachiraju44773 months ago
Thank you for a very informative and simplified explanation of the LMS and gradient descent algorithms, and of how they relate to and integrate with neural networks and ML.