How do you DERIVE the BELL CURVE?

110,489 views

Mathoma

Comments: 315
@MomStillah 6 years ago
Thank god someone cares to explain this equation that just floats around in the math realm with no explanation from teachers other than, "here!"
@univuniveral9713 5 years ago
Come on, it is not an equation floating around. There are many derivations in books. Actually this derivation is rather too long.
@wurttmapper2200 4 years ago
@@univuniveral9713 The best derivations are the easiest to understand imo
@univuniveral9713 4 years ago
@@wurttmapper2200 True
@donegal79 3 years ago
Just because you were too lazy to seek out a proof. Hey, but it's too easy to blame your teachers. Doofus.
@jameslapp554 3 years ago
@@donegal79 The job of a good teacher is to point a student in the right direction, not just hand-wave complex topics. Plus, he did go and look for the proof, as is evident from his watching this video.
@malignusvonbottershnike563 4 years ago
This video made me so happy, best 35 minutes of my day for certain, maybe my whole week. Cheers for this; if you see this comment 3 years later, you're a legend for taking the time to explain this so clearly.
@nurulc 2 years ago
What I love about this presentation, unlike some others I have seen, is that it does not skip over any steps. Each step is very clear and easy to follow. Wonderful job, sir; thank you.
@youtubeyoutubian7099 4 years ago
It's beautiful to observe that the number of YouTube "likes" decreases when a video is educational in nature and not a useless make-up tutorial etc. This itself IS proof that the curious people who actually want to understand, and therefore watch this helpful video all the way through, are from the "other side" of the Gaussian distribution. ;-) Thanks for the fantastic job!!
@marvinbcn2 2 years ago
This video is a brilliant illustration of Einstein's famous sentence: "Everything should be made as simple as possible, but not simpler". The derivation is beautiful, elegant and crystal clear. Thanks so much for sharing your knowledge!
@kerensahardesty9851 5 years ago
This is fantastic! Everything is explained and paced so well; no other video online has derived the normal distribution so clearly as you have.
@Adam-Friended 7 years ago
"We are more likely to find a dot near the bull's eye." You've obviously never seen my wife play darts.
@Math_oma 7 years ago
+FriendEd More likely to strike someone else in the eye...amirite?
@jorgejimenez4325 5 years ago
wait what are you doing here
@HiQuantumSKY 4 years ago
It was a good joke though... But the last equation, the general one, covers that too. Bad at throwing darts? Your σ will be larger. Or if you're good at throwing darts but your shots cluster somewhere other than the bull's eye, your μ will be different. So never lose faith in maths, especially in general maths.
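To make the μ and σ picture above concrete, here is a minimal Python sketch (not from the video; the two throwers and their numbers are made up for illustration) showing that changing μ slides the bell curve sideways while a larger σ spreads it out:

```python
import numpy as np
import matplotlib.pyplot as plt

def normal_pdf(x, mu, sigma):
    # General Gaussian density: exp(-(x - mu)^2 / (2 sigma^2)) / sqrt(2 pi sigma^2)
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

x = np.linspace(-10, 10, 500)

# Hypothetical throwers, as in the comment: a steady one clustered on the bull's eye,
# and a wild one whose shots cluster off-centre with a larger spread.
plt.plot(x, normal_pdf(x, mu=0.0, sigma=1.0), label="steady aim: mu = 0, sigma = 1")
plt.plot(x, normal_pdf(x, mu=2.0, sigma=3.0), label="wild aim: mu = 2, sigma = 3")
plt.xlabel("horizontal miss distance")
plt.ylabel("probability density")
plt.legend()
plt.show()
```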
@HisMajesty99 7 years ago
This video is absolutely brilliant! I've always wanted to know how the normal distribution curve was derived, and your explanation was perfect! Thanks so much!!! Math is beautiful
@stevenson720 5 years ago
"Maths is beautiful" - you're so right. 😁
@stevenson720 5 years ago
@Ask why? Differential equations. It's the maths for working out what happens when more than one thing is changing at the same time, as opposed to ordinary calculus, which only changes one thing - like how far a ball goes if you throw it, when you change both the angle and how hard you throw.
@tzywg3399 5 years ago
J. I can’t really link your comments to your profile photo... it’s so illogical
@knivesoutcatchdamouse2137 4 years ago
@@stevenson720 I know my question is about 11 months too late, but regarding your reply, i.e. "differential equations ... for working out when more than one thing is changing...", isn't that actually what multivariable calculus is for? I suppose it really depends on whether we're talking about ordinary differential equations or partial differential equations, since ODEs deal with one independent variable, while PDEs are for multiple independent variables. I'm no math major, just an independent learner and lover of maths, so my response might not be 100% accurate, but if it's not, anyone may feel free to correct me.
@viniciuscaina8400 6 years ago
I'm from Brazil and I just found this class right here. I asked my professor how to derive this formula and she did not know. This was one of the most impressive classes I have ever seen. Thank you so much!!!!
@Barba_007 3 years ago
I'm so grateful that I found this absolute gem of YouTube; keep posting videos. You probably inspired thousands of people to be more interested in math/science.
@inversemetric 6 years ago
My physics professor from Greece pronounced it "φ"
@yerr234 5 years ago
Wrong, I'm pretty sure it is pronounced "φ".
@wolfpackdot 4 years ago
@@yerr234 What's funny is that my professors here in Germany, even though one is from Russia and the other is a German, both pronounce it "φ".
@bukunmiodukoya5039 3 years ago
You're all funny😂
@naman4067 2 years ago
It's φ not φ
@Jocularious 6 years ago
Dude, this was a great video, keep up the great work!! I love how at 3 am I am bingeing on your videos; goes to show that you have skill.
@maxwall2924 5 years ago
Wow, this was awesome. I'm reading E. T. Jaynes's Probability Theory. In Chapter 7, he performs this derivation, but as is often the case, he assumes the reader is as fluent as he is with functional analysis. This video really helped me fill in the gaps. Can't wait to watch the rest of your videos. Thanks a bunch!
@rileyjeffries1282 2 years ago
I am incredibly new to statistics and have never actually taken a course, but I have taken physics and engineering courses that apply physics; it's pretty neat to see that the Dirac delta function is a limiting case of the normal distribution in which lambda approaches infinity, which I realized as soon as you showed how lambda transforms the shape of the graph. Very cool video!
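A quick numerical sketch of that limiting behaviour (my own illustration with arbitrary widths, not something shown in the video): as the width parameter shrinks - equivalently, as the video's λ grows - the zero-mean Gaussian gets taller and narrower while its area stays 1, which is the sense in which it tends to a Dirac delta.

```python
import numpy as np

def gaussian(x, sigma):
    # Zero-mean Gaussian density with standard deviation sigma.
    return np.exp(-x ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

x = np.linspace(-5.0, 5.0, 200001)
dx = x[1] - x[0]
for sigma in (1.0, 0.3, 0.1, 0.03):
    area = np.sum(gaussian(x, sigma)) * dx   # stays ~1 for every width
    peak = gaussian(0.0, sigma)              # grows without bound as sigma -> 0
    print(f"sigma = {sigma:5.2f}   area = {area:.4f}   peak height = {peak:8.2f}")
```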
@Dawg89Kirkland 6 years ago
Great vid. Cleared up much of the mystery of where normal distribution comes from. Have forgotten most of my calculus but was still able to follow along!
@jackhuangly89 3 years ago
Fantastic explanation - step by step, and it does not assume the viewer knows any particular mathematical derivation. Best video I've seen on the derivation of the normal distribution!
@spanishlanguageeducational3737 6 years ago
Brilliant!!!! This is a MUST for any student of statistics. The way statistics is normally taught leaves many assumptions unexplained... and this video answers them.
@carlossp1 4 years ago
This is the most clear and complete derivation of the Normal distribution I've seen. Thanks for sharing
@andreweinhorn 6 years ago
One of the best videos on the internet, love it!
@HareSrinivasa 6 years ago
I have always wondered about this formula. Your explanation is the most concise and understandable one even to a novice like me. Thanks a million.
@libertarianspirit 7 years ago
I'm a software developer. I'm pretty comfortable with discrete math and terrible with calculus, but your explanation is so clear that I was able to understand most of it. Thank you for such high-quality content. This video deserves more views! I'm subscribed (which happens rarely!).
@wyw4466 6 years ago
Great work in this video! After watching it, I just can't appreciate enough the original inventor of this function, Carl Friedrich Gauss!
@sreejanichatterjee9399 6 years ago
I don't know what to say, but you just saved my life. I have been looking for a proper derivation, and not the 5-minute ones, for months. Thank you thank you thank you so so so much.
@imrematajz1624 4 years ago
Beautiful derivation of a ubiquitous formula: the hat trick comes at 22:48! To complete the discovery of this treasure, I am now deep diving to revise my Euler integral. Thanks so much!
@jacoboribilik3253 6 years ago
I love this video, this derivation is spectacular; I love it when mathematics links various seemingly unrelated concepts with each other and yields this beauty. That being said, I also think Gauss' proof in the article you provide is far easier and more accessible to students like myself.
@lucasmoratoaraujo8433 2 years ago
Thank you! The simplest and best-demonstrated video about the subject that I've seen. Congratulations!
@alpha_kappa_357 2 years ago
I was so confused about how the "2" gets into the formula; now I finally understand, thanks :)
@fedexos11 6 years ago
Amazing, finally someone that explains completely and holistically how to derive the Gaussian density function. Thank you!
@willdavis2053 3 years ago
Man, what an incredible video! Loved your derivation.
@lilagraham6481 4 years ago
For the transition from y = 0 to the general case, the 1D equation can be generalized to 2D due to radial symmetry, which makes the x axis equivalent to any other line going through (0,0). Regarding the number of dimensions, a minimum of two is necessary to specify coordinate independence and radial symmetry, which together give the form of an exponential. Lovely, unique video. Thanks!
@giantneuralnetwork 7 years ago
Really well explained. I'm relatively new to this type of thinking and it was illuminating! On the fly definitions of new functions that lead you in any direction you'd like, seems really powerful, but also like a puzzle. Thanks for making this video!
@diegosantoyo3322 3 years ago
This is beautiful; it helped me in an impressive way with my experimental design class. Simply, thank you! I would not have achieved it without you.
@adityakhedekar9669 3 years ago
Best video for the derivation of the Gaussian distribution ever.
@soffer 6 years ago
Seriously Mathoma... thank you so much sir. Thank you. You are doing God's work. You will ride shiny and chrome in Valmatha. Excellent video.
@guvencagil5043 3 years ago
It's a very good video. Thank you. I just wish the frequent commercials weren't as loud.
@standidderen9270 4 years ago
Wow, amazing explanation that cleared up everything about the normal distribution for me! Your calm way of teaching is very clear and enjoyable, thank you!
@justforknowledge6367 5 years ago
Thank you for the brilliant derivation from nearly first principles. Thank you indeed. I really wish there were a "History of Mathematics" YT channel that also gave the reasons for the clever decisions taken at crucial steps in deriving historically important equations. These steps have nothing to do with computation; they are just a "leap of intelligence", because of which mathematics has prospered for so long - beginning with Pythagoras's proof that √2 is an irrational number.
@atakan716 9 months ago
Thank you for taking your time and explaining it beautifully!
@paula.agbodza5135 5 years ago
Absolutely brilliant! You make Mathematics look like what it's meant to be, simple. Thank you for this great video.
@zairoxs 7 years ago
It took me 2 days to understand the concept behind this topic for my statistics class. This video cleared everything up for me. Thx!!
@rineeshparai1780 4 years ago
I actually enjoyed watching this video. I expected to learn, but I never expected to enjoy the derivation of a PDF. This was fun!
@akenny4308 4 years ago
I have been looking for a video like this for so long - a clear derivation from scratch of the normal distribution.
@prajwalchauhan6440 3 years ago
Amazing... This video really helps to understand the Gaussian distribution a lot better. Thank you.
@thattimestampguy 3 years ago
0:42 Thought experiment: dartboard
1:57 Probability density function, φ (phi)
@SK-ck3qb 4 years ago
This is an outstanding video. You have explained all the details with clarity. Thank you!
@kg6395 3 years ago
We need great explainers like you... Awesomely explained.
@libertyhopeful18 7 years ago
Hey, great idea for a video. I found in undergrad that my stats prof didn't really care to elaborate on this, and so stats ended up being my least favorite math class. Turns out you need to understand this stuff to do my actual job, so this is much appreciated.
@vemana007 5 years ago
Awesome explanation... I have been trying to understand the function behind the normal curve all this while, and this is so beautifully explained... thanks a ton.
@sureshapte7674 7 years ago
This is a fantastic lecture and literally reveals the mathemagic of the normal distribution curve; I am going to watch this again and again and...
@BoZhaoengineering 4 years ago
Class teachers often omit the derivation of the normal distribution. I always wondered how the bell curve formula is derived. Here I am with the answer. Thanks a lot.
@vijayakrishna07 3 years ago
I have a neat trick for resolving 14:00: replace the unknown g(·) by (u∘h)(·), where h(·) is the squaring function.
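For anyone who wants to see how that composition trick plays out, here is a sketch in LaTeX (my own notation, not necessarily the symbols used at 14:00): writing the unknown as a function of the squared argument turns the radial-symmetry condition into Cauchy's additive functional equation, whose continuous solutions are linear.

```latex
% Sketch: reduce the radial-symmetry condition to Cauchy's functional equation.
% Assumes g is continuous and positive, and writes h(x) = x^2.
\begin{align*}
  &\text{Let } u(t) = \ln\frac{g(\sqrt{t})}{g(0)}, \quad\text{so that } g = \bigl(g(0)\,e^{u}\bigr)\circ h .\\
  &g(x)\,g(y) = g(0)\,g\!\left(\sqrt{x^{2}+y^{2}}\right)
    \;\Longleftrightarrow\; u(x^{2}) + u(y^{2}) = u(x^{2}+y^{2}),\\
  &\text{i.e. } u(s+t) = u(s)+u(t) \text{ for } s,t\ge 0
    \;\Longrightarrow\; u(t) = A\,t
    \;\Longrightarrow\; g(x) = g(0)\,e^{A x^{2}} .
\end{align*}
```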
@labalimbu829 4 years ago
Wow, I am watching your video on 12/25/2020, and I suppose it is my Christmas gift. Such a beautiful explanation.
@Willzp360 7 years ago
Amazingly satisfying video; I've enjoyed your videos for a long time and this was particularly good. Everything is explained at just the right level and the derivation was so logical it all felt obvious by the end. Thank you for putting so much effort into your videos!
@Math_oma 7 years ago
+Will Price That's very kind of you to say. My pleasure.
@HarshitaSolanky-bc1yf 11 months ago
We love you from the bottom of our hearts. Everything is soooooooooo clear, unlike other videos on YouTube. I am trying to learn data analytics on my own from YouTube, for free, because I want a new skill. Your standard deviation...
@skeletonrowdie1768 6 years ago
You nailed it with this video though! It's so cool to see that this derivation is actually an insight derived from a multivariable case.
@davidmwakima3027 4 years ago
I like how you write the integral signs! Anyway, this is a super clear video, thank you so much! I'm reading Maxwell's 1859 paper and I wasn't sure where the Ce^(Ax^2) came from.
@dopplerdog6817 1 year ago
Well, good job explaining this, but it leaves a big question unanswered (as do most "derivation" videos of the Gaussian distribution). Namely, you explained how to derive f(x), the pdf of the x coordinate of the dartboard, and this is "a" bell curve. We haven't shown it is also "the" Gaussian bell curve of the central limit theorem - it's conceivable that f(x) only roughly looks like the Gaussian but is not identical. How do we show f(x) is the Gaussian of the CLT?
@hdmat101 5 years ago
This video is quite interesting. I got bored one afternoon and searched for the derivation of the normal distribution, because at school I was learning probability questions involving normal distributions, where we use a table to find values for the standardized normal distribution (mean 0, standard deviation 1). This is really good, except that I lack the attention span to understand most of it.
@leafyleafyleaf 4 years ago
So now, 26 years after I first encountered this equation in college, I finally know where it came from. A cool thought experiment coupled with some “cosmetics” which (true to their name) conceal its true identity.
@charliebrannigan1275 7 years ago
Thanks so much for the clarity of the video.
@walterwhite28 3 years ago
14:15 How can squaring remove the root? Shouldn't it be sqrt(x^4+y^4) if x-->x^2 and y-->y^2?
@yuhjikuioj7112 3 years ago
He said he was exponentiating them.
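A sketch of the step in question (my own phrasing of the idea, not a transcript of 14:15): nothing is substituted inside the square root; rather, a new function of the squared variable is introduced, and the square root cancels against the square.

```latex
% Assumes the functional relation reached earlier in the video, up to constant factors.
\begin{align*}
  &g(x)\,g(y) \;\propto\; g\!\left(\sqrt{x^{2}+y^{2}}\right),
   \qquad \text{define } G(t) = g(\sqrt{t}) \text{ for } t \ge 0, \text{ so that } g(x) = G(x^{2}).\\
  &\Longrightarrow\; G(x^{2})\,G(y^{2}) \;\propto\; G(x^{2}+y^{2}),
   \qquad\text{with no root left: the argument is } x^{2}+y^{2},\ \text{not } \sqrt{x^{4}+y^{4}} .
\end{align*}
```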
@duckymomo7935 7 years ago
Oh, that's how you get e^(-x^2).
@Math_oma 7 years ago
+Mi Les Indeed.
@AbdullahPunctureWale-BePgCxx 5 years ago
😮😮😮👏👏👏👌 Awesome... for years it has been troubling me how to derive the normal distribution equation... trust me, I had nightmares... 😅😅😅 I am too bad at just mugging things up... thanks a ton... I used to wonder how we are able to connect two independent sample spaces through the mysterious Z... now it's quite clear. 1. The sample space has to behave like a normal distribution phenomenon. 2. In any case, the probability density curve will have area 1... and that helps me understand the t distribution even better... thank you...
@mnsh6313 5 years ago
27:21 I am unable to get the intuition behind the variance integral part - how did the formula come up?
@andy_lamax 4 years ago
The formula measures how much the data varies from the mean value. The square is there so that deviations below the mean (negative) and above it (positive) are treated the same way.
@shafqat1138 4 years ago
So the formula for variance is the probability-weighted sum of all the squared deviations, i.e. Sum((X − X̄)²·p(x)). In a continuous setting, we integrate rather than sum, therefore we integrate (X − X̄)² multiplied by the PDF. As X̄, the mean, is equal to 0 here, we only end up integrating x²·f(x) from negative infinity to positive infinity. Hope this helps, man. en.wikipedia.org/wiki/Variance#Absolutely_continuous_random_variable
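If a numerical check helps, here is a small Python sketch (mine, not from the video or this thread) confirming that for a zero-mean Gaussian the integral of x²·f(x) really does return σ²:

```python
import numpy as np
from scipy.integrate import quad

def gaussian_pdf(x, sigma):
    # Zero-mean normal density with standard deviation sigma.
    return np.exp(-x ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

for sigma in (0.5, 1.0, 2.0):
    # Variance = expected squared deviation from the (zero) mean.
    variance, _ = quad(lambda x: x ** 2 * gaussian_pdf(x, sigma), -np.inf, np.inf)
    print(f"sigma = {sigma}: integral of x^2 f(x) dx = {variance:.6f} (sigma^2 = {sigma ** 2})")
```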
@DeadlyCuddlez 6 years ago
This was wonderful, thank you, I'm excited to see the rest of your channel now!
@IAmTheWaterbug 5 years ago
Terrific derivation! The only potentially ambiguous part is at 21:07 when "A" is written as the total area under the curve, whilst "A" is still written at the top right as "A = -h^2". Probably not confusing to anyone who understands the rest of the derivation, but it still bothers me a bit to have "A" on the screen twice, with two different usages. 😉
@azimuth4850 8 months ago
Loved it. And to think that's just the beginning....
@Kriegsherr2 5 years ago
Yes. This video Is perfect. I've been looking for this for years. You are wonderful. Thank you so much. Excellent explanation!!!
@Huiando 5 years ago
Beautiful! Thank you so much. I wish this was around when I went through my MS.
@ahmedidris305 5 years ago
Crystal clear explanation, thank you Sir for the great work!
@GypsyNulang 7 years ago
Excellent explanation, and a channel I'm looking forward to digging deeper into! I'm a fellow medical student, with an engineering background, so these videos really do intrigue me. Got a few questions...
- What field within medicine interests you? I wonder if there's any area which is more conducive to your kind of conceptual understanding and deductive reasoning... I kind of put that part of my brain away these last few years, and only use it as a hobby, like watching these videos haha
- The video's derivation makes sense. And there's a sort of beauty in setting up the normal distribution from a darts analogy (i.e. probability falling off with distance). A rotational symmetry also makes sense. But what's not intuitive to me is why it is a 2D darts setup, and not a 1D or 3D darts setup? Not asking for a derivation, but just curious for any intuitive insights here.
@Math_oma 7 years ago
+GypsyNulang I have the intuition that I'll become a pathologist someday although I've been told I have a surgeon's personality. I like pathology because the hours are regular and I'd have time to teach too. We could do a 3D board if you wanted but I like the 2D example because we have experience with that. Maxwell in his derivation works in three dimensions, thinking of gas diffusing from a central source in all three directions. The derivation is conceptually the same, save for a few constants: statistical independence of all coordinates and dependence only on distance from origin.
@GypsyNulang 7 years ago
Ah, thanks - yeah, the multiple axes demonstrate the statistical independence.
@stevenson720 5 years ago
Clear, clean, well described, well paced - excellent. Thank you.
@zacklee5787 6 months ago
In the actual multivariate normal, x and y can be correlated (linearly dependent) so that instead of all points in a circle being equally likely, it's all points in an ellipse. Does the math still work out in that case?
@46pi26 6 years ago
In your opinion, what looks nicer when written out: Gaussian distribution or the Schrödinger equation? And in general, what formula do you think is most aesthetically pleasing? Also, if you haven't seen it yet, look up Gauss's signature. It's one of the best things ever.
@toniiicarbonelll287 3 years ago
Somebody give this man a Nobel Prize.
@NickVetter 4 years ago
I understand the formula now, but I guess I don't know where a normal distribution comes from - like, how do they figure out where the dots would be?
@marcossiqueira-mat 2 years ago
Mathoma, could you include your bibliographic sources in the video description? Thanks.
@CBenky 1 year ago
Beautiful explanation
@madhukareppalapelly4239 5 years ago
You are awesome, bro. Finally someone cared about the proof.
@andrewbetz535 3 years ago
How did this derivation ensure an inflection point 1 standard deviation from the mean?
@invisi. 6 years ago
this is so well presented, thank you so much for this
@zhengyuanwei4544 5 years ago
Most beautiful proof I have seen!!!
@aBigBadWolf 3 years ago
So basically, we want a function that is rotationally symmetric (which in 2D requires only 1 variable to describe) but we force a decomposition into 2 functions, just like we decompose the norm into x and y. I don't find it intuitive that such a decomposition has to exist. If I choose another rotationally symmetric function, should I always expect such a decomposition to exist? E.g. we can change the motivation slightly by describing the dart target not as a plane but as a curved surface (e.g. a ball); then the norm would not be appropriate anymore. Is there still a valid decomposition that gives us f(x)f(y)=f(x+y)?
@grantarneil8142 2 years ago
Not quite. You're discussing the fact that this is a special case, but missing the reason why this derivation works generally for data which conforms to a normal distribution.

What I think needs to be touched on more to convey the generality (which I admittedly had to think through during the video - it did not initially seem intuitive) is that this property of the function,

g(x)·g(y) = g(√(x² + y²)),

works for any n dimensions; the way it is written here is only the special case of the data being distributed in 2 dimensions. Generally, no matter how many coordinates we use to represent a data point (i.e. for whatever n applies), the normal distribution should hold, and so the function actually satisfies (up to constants)

g(x₁)·g(x₂)·g(x₃)·…·g(xₙ₋₁)·g(xₙ) = g(√(x₁² + x₂² + x₃² + … + xₙ₋₁² + xₙ²)),

where xₖ is the coordinate along the k-th axis. Pythagoras can always be used to find the distance of a point from the origin in any number of dimensions (i.e. the magnitude of a vector).

Notice that while it is necessary for this particular function to maintain this property in any number of dimensions, huge dimensions are not required for it to be applicable (it is always going to work). So if you were to simply plot points on a 1-dimensional line - the very common example being people's heights, which we know follow a normal distribution - the densest region will be around the mean (and if we standardize the data, it will sit around 0), and so it follows that the probability of finding someone with a height in a given range is highest there. We can then use the same probability density function as we normally would to describe a continuous set of data.

The only reason he chose to derive this with the 2-dimensional special case is that it is the smallest number of dimensions in which this property is of obvious necessity (in 1 dimension it pretty much just states that the function exists). I hope that helps. I'd honestly be more interested in how you could prove the Gaussian function is the only solution which holds for all dimensions.
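A quick numerical sanity check of that product property (my own sketch with arbitrary test points, and with the proportionality constants written out explicitly as powers of g(0)), using the standard normal density in 2 and 3 dimensions:

```python
import numpy as np

def std_normal_pdf(x):
    # Standard normal density: mean 0, standard deviation 1.
    return np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)

rng = np.random.default_rng(0)
for n in (2, 3):
    points = rng.normal(size=(5, n))             # arbitrary test points in n dimensions
    for p in points:
        lhs = np.prod(std_normal_pdf(p))         # g(x1) * g(x2) * ... * g(xn)
        r = np.linalg.norm(p)                    # Euclidean distance from the origin
        rhs = std_normal_pdf(0.0) ** (n - 1) * std_normal_pdf(r)
        assert np.isclose(lhs, rhs)
print("g(x1)...g(xn) = g(0)^(n-1) * g(r) held at every test point")
```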
@rhke6789 9 months ago
Excellent presentation of the derivation. Congrats.
@rudyorre 3 years ago
Great explanation! I followed everything except for the part where you introduced the integral for variance (around 28:00). Could someone clarify where the integral of x^2 * f(x) * dx comes from?
@DMarci2001 3 years ago
Basically, the variance of a random variable X is defined as the expected value of the squared deviation from the mean of X: Var(X) = E((X − mean)²), where E is the expected value.

The expected value is basically the same as the weighted mean, only the weighted mean is for discrete values of x, while the expected value is for continuous functions of x that can be integrated. The formula for the weighted mean is the sum of all the occurring values multiplied by their weights (numbers of occurrences), divided by the total of the weights (the total occurrences of all values). For expected values of continuous probability density functions, the weights are not counts of occurrences but rather the probabilities of each value, f(x)·dx, and since the probabilities add up to 1, the division by the total weight disappears, leaving just the integral of x multiplied by the probability of x over all values: E(X) = ∫ x·f(x) dx.

Since the mean of this normal distribution is 0, the squared deviation from the mean is just x², so Var(X) = E(X²) = ∫ x²·f(x) dx.

I hope that makes it a bit clearer, although I think it was quite confusing, so you should read this through several times and really focus on every part if you want to understand it :D.
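To back up the last line of that explanation, here is the integral done explicitly for the zero-mean normal density (a standard integration-by-parts computation, included here as a sketch rather than a step from the video); the boundary term vanishes because the exponential decays faster than x grows:

```latex
\begin{align*}
E(X^{2}) &= \int_{-\infty}^{\infty} x^{2}\,\frac{1}{\sigma\sqrt{2\pi}}\,e^{-x^{2}/(2\sigma^{2})}\,dx
  && \text{(parts: } u = x,\ dv = x\,e^{-x^{2}/(2\sigma^{2})}\,dx,\ v = -\sigma^{2}e^{-x^{2}/(2\sigma^{2})}\text{)}\\
 &= \frac{1}{\sigma\sqrt{2\pi}}\left(\Bigl[-\sigma^{2}x\,e^{-x^{2}/(2\sigma^{2})}\Bigr]_{-\infty}^{\infty}
    + \sigma^{2}\int_{-\infty}^{\infty} e^{-x^{2}/(2\sigma^{2})}\,dx\right)
  = \frac{\sigma^{2}\cdot\sigma\sqrt{2\pi}}{\sigma\sqrt{2\pi}} = \sigma^{2}.
\end{align*}
```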
@ytfiseveryhandlenotavailable 6 years ago
This was truly beautiful. Thank you so much for the great content!
@ambicagovind6986 3 years ago
I am not sure if the equation written at 21:37 is correct; isn't φ(x) the probability distribution function and not f(x)? Why is f(x)dx being integrated?
@grantarneil8142 2 years ago
It is correct, but only because it doesn't matter what he chooses to call the particular probability function (whether a distribution or a density function, for discrete or continuous data respectively) or the proportionality constant "lambda".

I suppose if you really wanted to be pedantic and make sure that everything he is saying is correct, then after the line of working f(x) = λ·e^(Ax²) we would then say ⇒ φ(x) = λ²·e^(Ax²). Now redefine f(x) and our constant λ (which, remember, do not depend on how we choose to name them) such that f(x) := f(x)/λ ⇒ f(x) = φ(x), and thus, from here on, our f(x) represents the probability function which we want. Also, redefine λ := λ^(1/2) (remember: both are still constants) ⇒ φ(x) = λ·e^(Ax²), and thus we may now return to his line of working where f(x) represents this particular probability function: f(x) = λ·e^(Ax²), allowing us to set the integral of f(x) from negative infinity to positive infinity equal to 1.
@rodrigoserra2112 5 years ago
Thanks for this! Really, really helpful! Will you make videos about other distributions?
@radiant_rhea 3 years ago
Both the Gaussian integral and the integral of the PDF follow the same pattern, ∫ A·e^(−x²) dx, but one of them integrates to one and the other integrates to √π. I understand the error function and its derivation. What's the intuitive understanding behind the relationship?
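One way to see the relationship being asked about (a standard change of variables, sketched here rather than quoted from the video): the √π produced by the bare Gaussian integral is exactly what the PDF's prefactor cancels.

```latex
\begin{align*}
\int_{-\infty}^{\infty} e^{-x^{2}}\,dx = \sqrt{\pi}
\quad\Longrightarrow\quad
\int_{-\infty}^{\infty} e^{-x^{2}/(2\sigma^{2})}\,dx
  = \sigma\sqrt{2}\int_{-\infty}^{\infty} e^{-u^{2}}\,du
  = \sigma\sqrt{2\pi}
\qquad \bigl(u = x/(\sigma\sqrt{2})\bigr).
\end{align*}
```

So choosing the prefactor A = 1/(σ√(2π)) turns an integral worth σ√(2π) into an integral worth exactly 1; the two integrals follow the same pattern, and the constant out front decides whether the answer is √π or 1.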
@bhargavpatel2054 5 years ago
Lots of assumptions, but worth it!! However, in the multidimensional case x² + y² ≠ r², so I think the Gaussian distribution might need improvement.
@pianoman47 1 year ago
This was a great explanation! Thank you!
@michaellewis7861 4 years ago
How do you intuit the idea that P(x₁)P(x₂)P(x₃)…P(xₙ) = P(0)^(n−1) · P(√(x₁² + x₂² + … + xₙ²))?
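One way to make that identity feel inevitable (my own check, not a step quoted from the video): plug in the exponential form the derivation arrives at, P(x) = P(0)·e^(Ax²), and both sides match.

```latex
\begin{align*}
P(x_{1})P(x_{2})\cdots P(x_{n})
  &= P(0)^{n}\,e^{A\left(x_{1}^{2}+x_{2}^{2}+\cdots+x_{n}^{2}\right)}\\
  &= P(0)^{n-1}\cdot P(0)\,e^{A\left(\sqrt{x_{1}^{2}+\cdots+x_{n}^{2}}\,\right)^{2}}
   = P(0)^{n-1}\,P\!\left(\sqrt{x_{1}^{2}+\cdots+x_{n}^{2}}\right).
\end{align*}
```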
@avmathtech6162 2 years ago
How is the probability φ·dA??
@gyeonghunkang7215 5 years ago
Great video!! Helped me immensely in understanding where the normal distribution PDF came from. Thanks a lot 😆😆
@xoppa09 7 years ago
8:00 I am a bit confused here. How did you go from φ(x) = λf(x) to φ(√(x² + y²)) = λf(√(x² + y²))? Also, according to your argument, φ(x) = λf(x) is only true for points on the x-axis, since φ(x) = φ(√(x² + 0²)).
@Math_oma 6 years ago
+xoppa09 All these functions are of a single variable, no matter what name you give to that variable - call it 'x', call it '√(x² + y²)', or 'y', or whatever. The observation we make when we see φ(x) = λf(x) is that whenever something is evaluated by φ, this is the same as it being evaluated by f times some multiplier λ. It probably would have been better if I had written φ(•) = λf(•), or dropped the argument and written φ = λf instead of φ(x) = λf(x), to emphasize this.
@larry_the 3 years ago
Does anyone know which book the second "further reading" item comes from? ------ Edit: For everyone wondering, the book is "Probability Theory: The Logic of Science" by E. T. Jaynes.
@Fr0zenFireV 6 years ago
30:51 Why does that satisfy the normalization condition? Could you explain?
@rajendramisir3530 6 years ago
Absolutely! I enjoy this fresh look at the derivation of this class of Gaussian functions. I like the way you explained it.
@Caradaoutradimensao 5 years ago
no words to thank you enough
@lutfilutfi3310 4 years ago
Excellent derivation, very intuitive; I needed it for understanding Gaussian regression.