18: Jacobian Determinant - Valuable Vector Calculus

36,548 views

Mu Prime Math

1 day ago

Comments: 58
@elecmayte4267 1 year ago
Tired of professors who only teach the algebraic way instead of actually trying to describe what happens when we use these concepts. Thanks for making it clear
@thethug1946 4 years ago
A stellar career as an educator awaits you. You are a born communicator
@hotvodka7641 2 years ago
"Hello welcome to multivariate calculus I'm Professor The Thug"
@whitealpha2265 4 years ago
Awesome video, finally understood the intuition behind the Jacobian and not just the algebra
@MGTOW-nn9ls 3 years ago
The best explanation on the internet. Excellent job. Thank you
@GizmoMaltese 3 years ago
This was a very clear and concise explanation of a Jacobian. It's always been a bit of a mystery to me. Whenever they talked about focusing on a tiny area I felt lost. But now that you tie it to single-variable calculus and Riemann sums, it makes sense. Then tying it into transformations, from one-variable to two-variable transformations and the determinant, it all clicks. I feel like I finally understand a lot of the stuff I learned in college.
@trigon7015 4 years ago
Funny thing: I was on the Khan Academy multivariable integrals course learning about triple integrals when this video came up. Perfect timing!
@jonathangrey6354 4 years ago
I took Calc AB last year and I understood u-sub very well, but I didn't understand what the purpose of multiplying by the du factor was. So last year I came across the Jacobian determinant and my mind was blown. It all makes sense that we just account for stretching caused by mapping to different planes.
@dennisbrown5313 2 years ago
Excellent; many sites discuss the 'how' but never really the 'why'. This makes perfect sense
@vasuhardeo1418 3 years ago
Dude, I came for the Jacobian and also learned what's going on in single-variable u-substitution, thanks so much man. That scaling idea is pretty cool, not to mention the transformation. WOW, awesome.
@NoActuallyGo-KCUF-Yourself 3 years ago
The geometry really helps. I've used Jacobians before, but never really thought about what they _mean._
@nournote 4 years ago
Amazingly explained.
@user-fh5km1ic2t 10 months ago
Man you are on a different level for real. Crazy insights!
@herewego8093 2 years ago
4:40 is a really brilliant way to explain why we multiply by g'. Amazing
@farrela6710 4 years ago
Thank you so much! Your explanation is very easily understandable!
@LovepreetSingh-jv4uj 3 years ago
Awesome, brother.. this connects real analysis, linear algebra, and differential equations 👍
@alejrandom6592 2 years ago
I'd been trying for days to understand u-sub geometrically, thanks for this :)
@pendawarrior 4 years ago
Thanks for the valuable intuition 👍
@muntedme203 1 year ago
Excellent presentation. Thanks
@pd7484 3 years ago
amazing teacher omg
@xoppa09 3 years ago
More thinking on this matter... The 'backwards substitution', as you call it (nice name for it), which was x = g(u) in the one-dimensional case, reminds me of the method of 'introducing a parameter' in precalculus. For example, we might look at a complicated curve in the x,y plane made up of points (x,y) and tidily describe it by introducing a parameter t, e.g. (x,y) = (x(t), y(t)); for example the ellipse (a cos t, b sin t).

Similarly, a change of variables can be thought of as a type of 'introduction of parameter(s)', where the same number of parameters are introduced as there are coordinates. Then in the u,v parametric space a rectangular differential area du dv gets mapped to a curvilinear differential area in the x,y space, in such a way that du dv gets stretched/flipped/squished by the factor |J|. Note that I said changing variables is *like* introducing parameters; I understand that it is not exactly the same. I would say it's a special case of introducing parameters, since in general the number of parameters introduced does not match the number of coordinates.

I usually think of a parameter such as 't' as a knob or slider off to the side of an x,y graph that will cause a point P in the x,y space to move around, like a ghost moving pieces around on a game board. For a change of variables, the new variables u,v can be thought of as two sliders inside their own "cartesian" or rectangular slider space. This means that if you grab a point in u,v space and trace the shape of a rectangle, in the x,y space you will simultaneously trace the shape of a circle (in the case of a polar change of variables). There are mathematical ways to formalize this more efficiently.

Also, your images make me think of the chain rule; in 1 dimension, the Jacobian is just the chain rule factor in the derivative of a composite function. Thanks for posting this video!
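The polar-coordinate case mentioned in the comment above can be checked symbolically. A minimal sketch using sympy (the names u, v standing in for r, theta are an illustrative choice, not from the video):

```python
import sympy as sp

# u plays the role of r, v the role of theta.
u, v = sp.symbols('u v', positive=True)
x = u * sp.cos(v)
y = u * sp.sin(v)

# Jacobian matrix of the map (u, v) -> (x, y).
J = sp.Matrix([[sp.diff(x, u), sp.diff(x, v)],
               [sp.diff(y, u), sp.diff(y, v)]])

detJ = sp.simplify(J.det())
print(detJ)  # u, i.e. the familiar r in "dx dy = r dr dtheta"
```

So a du dv rectangle in parameter space maps to a curvilinear patch whose area is scaled by |det J| = r, exactly the stretch factor described above.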
@benastorga1615 2 years ago
Thank you so much... seriously!
@NexusEight 3 years ago
Nice one!
@LucasVieira-ob6fx 2 years ago
That was a very nice explanation!
@sam08090 3 years ago
Excellent
@iyadindia862 4 years ago
Please create a video on the Jacobian of implicit functions...
@donegal79 3 years ago
brilliant. Well done!!!!
@matejpavlovic3469 3 years ago
Hello, can you please tell me what g(x) is in those MATLAB pics? What is your initial integral? Ty!
@hemantbhosale3060 2 years ago
Nice one, I finally understood a concept I hadn't understood before
@BuddyNovinski 2 years ago
I never had this in calculus -- and no geometry to visualize. I am so angry that I gave up on linear algebra in frustration when I could have used a better background and some hints to understand it in 1977.
@gamingfuneducation9934 1 year ago
Thank you for a great video, I have one question. Why does the rectangle transform into a parallelogram in the 2D case? Why not into some more complex shape in the general case?
@MrDlanglois 2 years ago
Thank you
@tsunningwah3471 8 months ago
It would be better if you could show what the horizontal and vertical axes represent (does the horizontal axis represent x or g(x)?)
@liviumircea6905 3 years ago
Algebraically I know why dx = g'(u) du, and hence how to derive the substitution formula, but from your explanation I don't get why multiplying by g' cancels out the squish
@scollyer.tuition 3 years ago
[Note: my notation here is slightly different to that used in the video - I don't mention the function g]

We take a rectangle occupying the interval [x, x+h] on the x-axis; it has width h (= x+h - x).

We now map all points on the x-axis to a new u-axis by u = u(x). In this case, the interval [x, x+h] on the x-axis goes to [u(x), u(x+h)] on the u-axis; it has width u(x+h) - u(x) on our new u-axis.

We want to relate the u-width u(x+h) - u(x) to the x-width x+h - x = h. In general, there will be some stretch factor (that depends on x) that maps the x-width to the u-width; call it S(x), and remember that a stretch of real numbers means "multiply", i.e. u-width = S(x) times x-width. So we have u(x+h) - u(x) = S(x) h by definition of a stretch factor. Rearranging to solve for S(x) gives us: S(x) = (u(x+h) - u(x))/h.

Now consider that with a Riemann sum, we allow the width of the rectangles to go to 0 to ensure that we get an exact area from the sum, i.e. we really want to consider S(x) for a very narrow rectangle near x, i.e. we have to compute: S(x) = lim_{h -> 0} (u(x+h) - u(x))/h. And what is that? Well, it's nothing other than the derivative of u(x), by definition of that limit.

So we have shown that our stretch factor S(x) = u'(x) = du/dx for an infinitesimal rectangle on the x-axis being mapped to an infinitesimal rectangle on the u-axis via a mapping u = u(x). And, of course, to ensure that the Riemann sum on the u-axis spits out the same number as the Riemann sum on the x-axis, we *divide* by that stretch factor (to undo the stretch of the rectangles - remember that the y-displacement doesn't change), or equivalently, we *multiply* by dx/du. So the terms in the Riemann sum on the u-axis must look like f(u(x)) dx/du, and that is the form of the integrand that ensures both x-axis and u-axis integrals produce the same number.

As a trivial example, suppose that we want: \int_0^1 1 dx = [x]_0^1 = 1 (i.e. we are integrating a constant, 1) and we let u = 2x. Then this stretches the x-axis into the u-axis by a factor of 2, so on the u-axis our limits become 0 and 2. We can construct the x-axis Riemann sum with one rectangle, and so the unfixed-up u-axis integral is: \int_0^2 1 du = [u]_0^2 = 2. This is out by a factor of 2, which is our neglected stretch factor. Note that the stretch factor that we have to divide away is du/dx = 2, giving dx/du = 0.5, and the corrected u-axis integral is: \int_0^2 1 dx/du du = \int_0^2 0.5 du = [0.5u]_0^2 = 1 - 0 = 1, as required.
@liviumircea6905 2 years ago
@@scollyer.tuition Thanks :)
@pinkykushwah4173 10 months ago
Very nice
@Ishanka. 5 months ago
The fact that he writes with his left hand tells the story.
@PatiparnPojanart 1 year ago
Thx a lot
@alexgodeye3031 6 months ago
When we are working with the one-dimensional case, why do we not multiply by the absolute value of g' rather than g', if it's only squishing or stretching we care about?
@MuPrimeMath 6 months ago
If g' is negative, then this corresponds to "backtracking" on the graph of f(g(x)). We want to count backtracking as negative so that the backtracked region is not counted multiple times when it only occurs once in the original integral.
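The cancellation described in the reply above can be checked numerically. With x = g(u) = sin u on [0, pi], the path goes out to x = 1 and backtracks to 0, and the signed factor g'(u) = cos u makes the two passes cancel, matching the original integral from g(0) = 0 to g(pi) = 0. A rough sketch (the riemann helper and step counts are illustrative, not from the video):

```python
import math

def riemann(f, a, b, n=200000):
    """Left Riemann sum of f on [a, b] with n rectangles."""
    h = (b - a) / n
    return sum(f(a + i * h) for i in range(n)) * h

f = lambda x: 1.0   # any integrand works; constant keeps it simple
g = math.sin        # backwards substitution x = g(u), non-monotonic on [0, pi]
gp = math.cos       # g'(u), negative on (pi/2, pi): the "backtracking" part

# Substituted integral with the SIGNED factor g'(u) -- not |g'(u)|.
lhs = riemann(lambda u: f(g(u)) * gp(u), 0.0, math.pi)
# Original integral from g(0) = 0 to g(pi) = 0, which is 0.
rhs = riemann(f, g(0.0), g(math.pi))

print(lhs, rhs)  # both approximately 0
```

Using |g'| instead would count the backtracked stretch [0, 1] twice and give 2 rather than 0, which is why the sign is kept in the one-dimensional case.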
@alexgodeye3031 6 months ago
@@MuPrimeMath Thank you for replying.
@AhMedmohamed-ly5td 2 years ago
Is there a proof of why we use the determinant specifically as the Jacobian coefficient?
@hrithiklanghi6418 1 year ago
Nice one
@azmath2059 4 years ago
Why does the determinant correspond with how much the area is scaled?
@MuPrimeMath 4 years ago
In some ways, that's actually the definition of a determinant! I have a video deriving the formula for 2x2 matrices: kzbin.info/www/bejne/pp3FZ5mQp9mSkNE
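For the 2x2 case discussed above, the claim is easy to verify directly: the unit square's edge vectors map to the columns of the matrix, and the parallelogram they span has area |ad - bc|, which is |det A|. A small sketch (the example matrix is an arbitrary choice):

```python
def det2(a, b, c, d):
    """Determinant of the 2x2 matrix [[a, b], [c, d]]."""
    return a * d - c * b

# Example linear map: stretch by 2 in x, shear, stretch by 3 in y.
a, b, c, d = 2.0, 1.0, 0.0, 3.0

# Images of the unit square's edges e1 = (1, 0) and e2 = (0, 1)
# are the columns of the matrix.
u = (a, c)  # A e1
v = (b, d)  # A e2

# Area of the parallelogram spanned by u and v (cross-product magnitude).
area = abs(u[0] * v[1] - u[1] * v[0])
print(area, det2(a, b, c, d))  # 6.0 and 6.0
```

Since any region can be tiled by tiny squares, every area is scaled by this same factor, which is what makes |det J| the right correction in the change-of-variables formula.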
@tomaszsidorczuk7784 2 years ago
Shouldn't it be g(u) and g'(u) instead of g(x) and g'(x) in the MATLAB upper-right and lower-right drawings?
@MuPrimeMath 2 years ago
The function input is a dummy variable because we graph on the same interval either way, so it doesn't matter what letter we use to denote it.
@tomaszsidorczuk7784 2 years ago
Why do we define u and v implicitly rather than explicitly?
@MuPrimeMath 2 years ago
There's no reason that we must define u,v implicitly, but doing so makes it more straightforward to compute the Jacobian, since the Jacobian is defined using the partial derivatives of x,y with respect to u,v.
@xoppa09 4 years ago
All these forward and backward transformations are confusing. Why not just consider a mapping from u,v space to x,y space, where a small square in u,v space gets stretched by the scale factor J = |xu xv ... | into a parallelogram in x,y space?
@MuPrimeMath 4 years ago
That is essentially what we're doing, since the area of any shape is scaled by the same amount by a linear transformation. We talk about backwards transformations to show why it makes sense to multiply by that scaling factor!
@audreydaleski1067 1 year ago
Let's do triple.
@jitendratiwari6886 4 years ago
Please also share your email ID with us. I am learning optimization at my university; will you make videos on the problems associated with it? This is the topic list (in my first semester I studied the Laplace transform from your channel):
1) Optimization Examples, Concepts
2) Methods of nonlinear optimization - unconstrained and constrained case, optimality criteria, descent methods. Methods for differentiable and non-differentiable functions
3) Shape Optimization
4) Topology Optimization (ESO/BESO, SIMP, Level-Set-based)
5) From Structural Optimization Results towards Additive Manufacturing
6) Optimization under Uncertainties (Robust Design Optimization, Reliability-oriented Design Optimization)
7) Linear Optimization - Simplex Method - Operations Research (optional)
8) Optimization in Networks, e.g. Shortest Paths (optional)
@MuPrimeMath 4 years ago
I haven't learned about optimization methods yet! In general, the series that I'm doing will depend on the classes that I'm taking at any given time.
@trigon7015 4 years ago
69,420th
@trigon7015 4 years ago
Yeah, you really deserve more views. Especially because of the quality of this video: it was very concise and helped me understand this topic.
@CDChester 4 years ago
1st