
MSE101 L7.2 Non-linear least squares minimisation

29,178 views

David Dye

David Dye

1 day ago

Comments: 16
@SHONSL 3 years ago
Good god, thank you for using proper mic and audio equipment. It makes following MUCH easier for us.
@zacharythatcher7328 4 years ago
This is the first hint I've seen that utilizing the Hessian matrix (the matrix representing the curvature of a vector-valued function) is a way to judge the value of the constant you should use when minimizing the cost function. Thanks for that.
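To make the connection concrete, here is a minimal sketch (my own illustration, not code from the lecture) of one Levenberg-Marquardt step: JᵀJ is the Gauss-Newton approximation to the Hessian, and the damping constant lam blends between a gradient-descent-like step and a full curvature-based step. The model and data below are hypothetical.

```python
import numpy as np

def lm_step(residual, jacobian, params, lam):
    """One damped Gauss-Newton (Levenberg-Marquardt) update."""
    r = residual(params)   # residual vector at the current parameters
    J = jacobian(params)   # Jacobian of the residuals w.r.t. the parameters
    JTJ = J.T @ J          # Gauss-Newton approximation to the Hessian of 0.5*||r||^2
    # Large lam -> short, gradient-descent-like steps;
    # small lam -> full Gauss-Newton steps that trust the curvature in JTJ.
    A = JTJ + lam * np.diag(np.diag(JTJ))
    return params + np.linalg.solve(A, -J.T @ r)

# Hypothetical toy problem: fit y = a * exp(b * x).
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(1.5 * x)

def residual(p):
    a, b = p
    return a * np.exp(b * x) - y

def jacobian(p):
    a, b = p
    e = np.exp(b * x)
    return np.column_stack([e, a * x * e])

p = np.array([1.0, 1.0])
for _ in range(20):
    p = lm_step(residual, jacobian, p, lam=1e-3)
print(p)  # converges towards [2.0, 1.5]
```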
@UseLogicPlease123 8 years ago
Many thanks for the video. Great presentation and all. It would be cool to see you build all the way up to the more complex, faster-solving methods.
@obewanjacobi 4 years ago
Solid video, very helpful refresher for me
@apprentice2101 6 years ago
Thank you so much!! You are really awesome at explaining math!
@absolute___zero 4 years ago
What I can't understand in all these algorithms that convert the *Error* into an *Error Squared* function is: why are you doing that? By squaring the error you are creating another level of complexity, and then you have to differentiate the squared error to do the minimization. Why raise the complexity to a higher degree and then immediately lower that degree in the next step? Can't we just differentiate the error function as it is, in its original form? That seems like the obvious thing to do, and you would go straight to the minimum without any wiggling. And if you then took the second derivative of that, you would converge to the minimum even faster; that would be the equivalent of a third-order derivative of the squared-error function, so a second-order derivative of the error (not the squared error) would be a much more powerful method.
@pedroparamodelvalle6751 4 years ago
The squared-error function is convex, hopefully positive definite, and also smoother. In simple terms, we want to penalize positive and negative errors the same. You can also use the absolute error, but that is definitely not a smooth function.
@zacharythatcher7328 4 years ago
You have to start with the understanding that this calculation is actually computing the squared distance between the observations and the estimating function. The Pythagorean theorem gives you distance; this is just distance squared. And when you think about it, distance is really what you want to minimize. Simply adding non-squared errors no longer gives you an analogue of distance; it would give you some weird measure that might have an incorrect minimum somewhere, one that does not correspond to minimizing the distance once you get into higher-dimensional space. Luckily, the distance equation generalizes to higher-dimensional space, so we know we are still minimizing distance even though we have no notion of what distance really looks like in 4-D.
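To make the point in this thread concrete, here is a tiny numerical illustration (mine, not from the video): for a constant model, the sum of raw signed errors has no minimum at all, while the sum of squared errors is minimised exactly at the mean of the data.

```python
import numpy as np

y = np.array([1.0, 2.0, 4.0])   # observations; fit a constant model c

c = np.linspace(-10, 10, 2001)  # candidate values of the constant
signed = (y[:, None] - c).sum(axis=0)          # sum of raw signed errors
squared = ((y[:, None] - c) ** 2).sum(axis=0)  # sum of squared errors

# The signed error decreases linearly in c forever: no minimum exists.
print(signed[0], signed[-1])            # 37.0 ... -23.0, monotone in c
# The squared error is a parabola minimised at the mean of the data.
print(c[np.argmin(squared)], y.mean())  # both ≈ 2.33
```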
@skylogic 5 years ago
Great lesson! Though it'd be nice if you didn't block the camera's view of the board while writing and explaining the equations :)
@jogomez1988a 3 years ago
Could this same method be applied to a sigmoidal regression? Any book you could recommend?
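On the first part of the question: nonlinear least squares does apply directly to sigmoidal models. A minimal sketch (toy data, hypothetical parameters) using SciPy's least_squares with the Levenberg-Marquardt method:

```python
import numpy as np
from scipy.optimize import least_squares

def sigmoid(x, L, k, x0):
    # logistic model: L / (1 + exp(-k * (x - x0)))
    return L / (1.0 + np.exp(-k * (x - x0)))

# hypothetical toy data with a sigmoidal trend
x = np.linspace(-3, 3, 40)
y = sigmoid(x, 1.0, 2.0, 0.3) + 0.02 * np.random.default_rng(2).normal(size=x.size)

res = least_squares(lambda p: sigmoid(x, *p) - y, x0=[1.0, 1.0, 0.0], method="lm")
print(res.x)  # roughly (1.0, 2.0, 0.3)
```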
@manushanmugam 8 years ago
Very nice lecture. Thank you.
@adrian-ng 7 years ago
That is the nicest whiteboard I have ever seen
@DavidDyeIC 7 years ago
Thanks Adrian! Looking forward to supporting learners with you on Coursera!
@salihuibrahim3853 6 years ago
Hi, please, how can I fit my curve using the Levenberg-Marquardt method in the CurveExpert software?
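I can't speak to CurveExpert specifically, but for comparison, this is what the same Levenberg-Marquardt fit looks like in Python with SciPy's curve_fit, which defaults to LM when no parameter bounds are given. The model and data here are just a hypothetical stand-in for your curve.

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b, c):
    # example model: exponential decay with an offset
    return a * np.exp(-b * x) + c

# hypothetical toy data standing in for the measured curve
xdata = np.linspace(0, 4, 50)
rng = np.random.default_rng(1)
ydata = model(xdata, 2.5, 1.3, 0.5) + 0.05 * rng.normal(size=xdata.size)

# With no bounds, curve_fit uses Levenberg-Marquardt by default.
popt, pcov = curve_fit(model, xdata, ydata, p0=[1.0, 1.0, 0.0])
print(popt)  # fitted (a, b, c), close to (2.5, 1.3, 0.5)
```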
@miguelangeldiazsanchez5337 8 years ago
Great!! Very useful!
@hsugoodman4223 5 years ago
Really well explained!
MSE101 L8 Fitting a Gaussian
29:09
David Dye
9K views
Lecture: Least-Squares Fitting Methods
44:39
AMATH 301
108K views
Linear Least Squares to Solve Nonlinear Problems
12:27
The Math Coffeeshop
30K views
Linear Systems of Equations, Least Squares Regression, Pseudoinverse
11:53
Harvard AM205 video 1.8 - Nonlinear least squares
27:24
Chris Rycroft
10K views
Gauss Newton - Non Linear Least Squares
20:02
Meerkat Statistics
36K views
9. Four Ways to Solve Least Squares Problems
49:51
MIT OpenCourseWare
118K views
MSE101 Data Analysis - L4.1 Integrating the Gaussian #1
15:34
David Dye
3.1K views
Nonlinear Least Squares
10:56
Postcard Professor
45K views
MSE101 Data Analysis - L1.1 Introduction
5:10
David Dye
3.7K views
Statistics 101: Nonlinear Regression, The Very Basics
21:55
Brandon Foltz
141K views