The Vandermonde Matrix and Polynomial Interpolation

  48,784 views

Dr. Will Wood

A day ago

Comments: 44
@robmarks6800 · 2 years ago
Lovely! Would like to see a continuation into the Fourier transform matrix!
@nymiantoft5907 · 2 years ago
That was so elegant
@af9466 · 5 months ago
Thanks for the video, this was a very interesting demonstration of an application of the Vandermonde matrix I'd never heard of before, and all the steps were clear.
@area51xi · 7 months ago
You stated a rule regarding rows, but then the example used columns, which is very confusing.
@xizar0rg · 2 years ago
The framework is essentially the same as what's in linear algebra textbooks (slightly more handwaving on the arithmetic than what I remember) from back in the day, but presented more succinctly. (I got tired of having to code it anew in Fortran every time I needed it... damned kids and their libraries for everything.)
@martinepstein9826 · 2 years ago
Great video. The Vandermonde determinant is really cool, but I don't think it was necessary to show invertibility. If we know that Lagrange interpolation (or whatever) works for every choice of vector y, that means the Vandermonde matrix is surjective, hence invertible.
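For anyone who wants that last step spelled out, here is a minimal sketch (my own summary, not from the video), using rank-nullity for a square matrix:

```latex
V \in \mathbb{R}^{n \times n} \text{ surjective}
  \;\Rightarrow\; \operatorname{rank} V = n
  \;\Rightarrow\; \dim \ker V = n - \operatorname{rank} V = 0
  \;\Rightarrow\; V \text{ injective}
  \;\Rightarrow\; V \text{ invertible.}
```

Lagrange interpolation supplies, for every y, some coefficient vector a with Va = y, which is exactly the surjectivity hypothesis.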
@General12th · 2 years ago
This is a nice little video!
@yaniv242 · 1 year ago
At 7:15, when you subtract the columns, I can't seem to understand why the second row after it only has x0 to the power of 1. Shouldn't the action of subtracting x0^2 on the column above affect the whole column? Sorry, that part got me a bit confused.
@jurrich · 1 year ago
Note the blue text: we are subtracting "x0 * the previous column", for all rows. First we look at the effect that has on the first row, namely that it turns every element after the first into a zero. Then we look at what it does to the second row, so we start with [1, x1, x1², x1³, ...] and apply the same operation: the 1 stays the same; the next column gets "x0 * previous column = x0 * 1 = x0" subtracted, yielding x1 - x0; the next column gets x0 * x1 subtracted, yielding x1² - x0x1; the next column gets x0 * x1² subtracted, yielding x1³ - (x0 * x1²), and so on. The next row [1, x2, x2², x2³, ...] gets the same treatment, turning into [1, x2 - x0, x2² - x0x2, x2³ - x0x2², ...], and so on all the way down the matrix.
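For anyone who wants to check this column operation numerically, here is a small Python sketch (my own illustration; the node values are arbitrary and not from the video). It applies the operation rightmost column first, so each column sees the original previous column:

```python
import numpy as np

# Arbitrary sample nodes x = [x0, x1, x2, x3] for illustration.
x = np.array([2.0, 3.0, 5.0, 7.0])
n = len(x)
V = np.vander(x, increasing=True)   # rows [1, xi, xi^2, xi^3]

W = V.copy()
# Subtract x0 times the previous column from each column, rightmost first,
# so every step uses the *original* previous column.
for j in range(n - 1, 0, -1):
    W[:, j] -= x[0] * W[:, j - 1]

print(W[0])                                             # first row becomes [1, 0, 0, 0]
print(W[1])                                             # [1, x1-x0, x1^2-x0*x1, x1^3-x0*x1^2]
print(np.isclose(np.linalg.det(V), np.linalg.det(W)))   # determinant is unchanged
```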
@plushiie_ · 2 years ago
omg you saved me!
@DrWillWood · 2 years ago
glad it was useful! :-)
@augustusnero5592 · 2 years ago
This was the best explanation I've seen on the Vandermonde determinant.
@DrWillWood · 2 years ago
Thank you very much!
@TESLA_13_DR · 1 year ago
@DrWillWood No, this is the best explanation video I have ever seen on YouTube, bravo!
@neolord50pro77 · 1 year ago
@DrWillWood it's the best explanation in the universe
@drioko · 9 months ago
What a long and complicated explanation. I don't understand this, and this video is making me panic.
@pnachtwey · 1 year ago
Yes, this is long-winded. I just write the 4 equations and solve for the unknowns. This is simple if the points are equally spaced. Sometimes one must use unequal intervals, like t01 for the time between x0 and x1. I get equations in terms of time intervals. I make substitutions for the time intervals so they aren't calculated over and over again.
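As a concrete version of "write the 4 equations and solve", here is a minimal Python sketch (my own illustration; the sample points are made up), setting up the Vandermonde system for a cubic through 4 points and solving it directly:

```python
import numpy as np

# Four sample points (arbitrary, for illustration only).
x = np.array([0.0, 1.0, 2.0, 4.0])   # unequally spaced nodes
y = np.array([1.0, 3.0, 2.0, 5.0])

# Vandermonde system V a = y for a cubic p(t) = a0 + a1 t + a2 t^2 + a3 t^3.
V = np.vander(x, increasing=True)
a = np.linalg.solve(V, y)            # solve the 4 equations for the 4 unknowns

# Check: the cubic passes through all four points.
p = np.polynomial.polynomial.polyval(x, a)
print(np.allclose(p, y))             # True
```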
@1495978707 · 2 years ago
Cool vid! 3:30 I would like to note that, while inversion of the matrix is probably the best way to go about this for a proof, practically speaking it is usually the worst way to solve a matrix equation. The idea is that if the matrix is invertible, A is unique, not that it exists at all. But if A isn't unique, that means the interpolating polynomial isn't unique. Anyhow, this matrix depends only on the choice of node locations, not the values at the nodes; whether the equation has a unique solution depends only on whether the matrix is invertible, not on the y values. And that's why it's useful to do the proof this way.
@slavinojunepri7648 · 1 month ago
Your comment is absolutely correct. The matrix equation can never result in multiple solutions for A because of the uniqueness of polynomial interpolation. Therefore, the polynomial doesn't exist if the Vandermonde determinant is zero. Using the Gauss-Jordan method to solve for A would be better than inverting the Vandermonde matrix. Matrix inversion via cofactors is a very computationally expensive and wasteful process. I cannot recall another way of doing it that would have an acceptable time complexity, say polynomial, for instance.
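To make the "solve, don't invert" point concrete, here is a small Python comparison (my own sketch; numpy's solver uses an LU factorization, which I am using here as a stand-in for the elimination approach mentioned above, and the nodes are random illustrative values):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1, 1, size=12))      # 12 distinct nodes (illustrative)
y = rng.uniform(-1, 1, size=12)
V = np.vander(x, increasing=True)

a_solve = np.linalg.solve(V, y)               # factor-and-solve: preferred
a_inv   = np.linalg.inv(V) @ y                # explicit inverse: avoid in practice

# Both give coefficients, but the residual of the direct solve is usually
# smaller, and Vandermonde matrices get badly conditioned as n grows.
print(np.linalg.norm(V @ a_solve - y), np.linalg.norm(V @ a_inv - y))
print(np.linalg.cond(V))                      # condition number of the system
```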
@mb59621 · 2 months ago
For representing the general formula of the Vandermonde determinant, a double pi (product) notation could be easier to understand, like a nested for loop, with the subscript as the lower limit and the superscript as the upper limit: det V = ∏_{j=1}^{n-1} ∏_{i=0}^{j-1} (x_j - x_i).
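A quick numerical sanity check of that double product, in Python (my own sketch; the nodes are arbitrary):

```python
import numpy as np
from itertools import combinations
from math import prod

x = [1.0, 2.0, 4.0, 7.0]                       # arbitrary distinct nodes x0..x3
V = np.vander(x, increasing=True)              # rows [1, xi, xi^2, xi^3]

# det V = product over all pairs i < j of (x_j - x_i)
det_formula = prod(x[j] - x[i] for i, j in combinations(range(len(x)), 2))
print(np.isclose(np.linalg.det(V), det_formula))   # True
```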
@dmitrypolozkov1335 · 2 years ago
Thank you! Great video, keep the new uploads coming!! Greetings from Russia❤️🧸
@richardfrederick1885 · 9 months ago
The audio was terrible!!
@pianoforte17xx48 · 2 years ago
Finally the Vandermonde determinant, after searching literally everywhere with no hope of understanding it. You are a legend
@djridoo · 2 years ago
Very cool ^^
@kafkayash2265 · 2 years ago
Would really like to see an explanation of the Sylvester matrix too. Your videos are really great!
@joaoheleno3700 · 11 months ago
You're beautiful
@redroth57 · 2 years ago
Awesome video
@henrik3141 · 1 year ago
Nice video. Just a small error at 0:53 in writing that P_n = .... You also missed the chance to prove the "Key fact" using the Vandermonde determinant.
@DAIQRY · 1 year ago
Amazing video!
@BSK_666 · 1 year ago
Very clear, thank you so much doctor..
@uamdbro · 2 years ago
Nice, I always saw the existence proof given by explicitly constructing a polynomial (using Lagrange polynomials) and then using Taylor polynomials or something in actual practical applications. Though now that I am thinking about it, if all we want is an existence proof, doesn't that follow immediately from uniqueness? Seeing as the Vandermonde matrix (once you have fixed the x-values) represents a linear map on a finite dimensional vector space, where injectivity is equivalent to surjectivity?
@nikita_x44 · 1 year ago
The proof of uniqueness assumes existence: if it exists, then at most one exists.
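For anyone curious, here is a minimal Python sketch of the explicit Lagrange construction that gives existence directly (my own illustration, with made-up data; it builds the interpolant as a sum of basis polynomials and checks it hits every node):

```python
import numpy as np

def lagrange_interpolant(x, y):
    """Return a callable p with p(x[k]) == y[k], built from Lagrange basis polynomials."""
    x, y = np.asarray(x, float), np.asarray(y, float)

    def p(t):
        total = 0.0
        for k in range(len(x)):
            # ell_k(t) = prod_{m != k} (t - x_m) / (x_k - x_m), so ell_k(x_j) = delta_{kj}
            ell = np.prod([(t - x[m]) / (x[k] - x[m]) for m in range(len(x)) if m != k])
            total += y[k] * ell
        return total

    return p

x = [0.0, 1.0, 3.0, 4.0]          # distinct nodes (illustrative)
y = [2.0, -1.0, 0.5, 3.0]
p = lagrange_interpolant(x, y)
print([np.isclose(p(xi), yi) for xi, yi in zip(x, y)])   # all True
```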
@surajchess3114 · 2 years ago
But it doesn't say whether, if a node is repeated, the polynomial doesn't exist or multiple polynomials exist... so how do you find the polynomial if a node is repeated? I can repeat the method for the non-repeating nodes and then multiply by (x - xr)^m, where xr is the repeated node and m is the number of repetitions. Is this correct?
@derendohoda3891 · 2 years ago
If a node is repeated, then you have two output values for a single input, which is not a function and so not a polynomial. So what you are doing is not interpolation. You are looking for polynomial regression, probably.
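To illustrate the regression point, a small Python sketch (my own made-up data, not from the thread): a repeated x with two different y values cannot be interpolated exactly, but a lower-degree least-squares polynomial fit still passes between them.

```python
import numpy as np

# A repeated node x = 2.0 with two different y values (illustrative data).
x = np.array([0.0, 1.0, 2.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 4.0, 6.0, 5.0])

# Exact interpolation is impossible: the Vandermonde matrix has two equal rows,
# so its determinant is zero. A least-squares fit of lower degree still works.
V = np.vander(x, 5, increasing=True)
print(np.isclose(np.linalg.det(V), 0.0))        # True: singular system

coeffs = np.polyfit(x, y, deg=2)                # degree-2 least-squares fit
print(np.polyval(coeffs, 2.0))                  # lands between 4 and 6
```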
@AJ-et3vf · 2 years ago
Awesome video! Thank you!
@the_nuwarrior · 2 years ago
It's quite similar to the DFT matrix
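For the curious: the DFT matrix is a Vandermonde matrix whose nodes are the N-th roots of unity. A quick Python check (my own sketch):

```python
import numpy as np

N = 4
omega = np.exp(-2j * np.pi / N)
nodes = omega ** np.arange(N)              # the N-th roots of unity

V = np.vander(nodes, increasing=True)      # Vandermonde matrix on those nodes
F = np.fft.fft(np.eye(N), axis=0)          # DFT matrix, built column by column

print(np.allclose(V, F))                   # True: the same matrix
```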
@bentationfunkiloglio · 2 years ago
Love it
@rauldurand · 2 years ago
Very cool! What software do you use for the animation?
@DrWillWood · 2 years ago
thanks! I use Apple Keynote for the animations
@callmedeno · 1 year ago
Wow that was peak clarity, you are gifted at this
@DrWillWood · 1 year ago
Thank you! Appreciate that! glad it was useful
@kafuu1 · 1 year ago
Nice work!
@DrWillWood · 1 year ago
Thanks!
@thespiciestmeatball · 2 years ago
That was awesome! Great job, Dr. Wood