The better way to do statistics

256,916 views

Very Normal

1 day ago

357 comments
@fernandojackson7207 5 months ago
I went out with a Bayesian probabilist. She had a great posterior, but too many priors.
@mathboy8188 5 months ago
That must be a well-known joke in stats circles. If you came up with that, dude, that's brilliant.
@exentrikk 4 months ago
Ba-dum tschhh
@cameronhill7769 7 months ago
I used to be a frequentist, but then I updated my beliefs.
@tracywilliams7929 6 months ago
Lol! Very good!
@hoppybrewologist 2 months ago
Just Mean
@ricafrod 7 months ago
As a PhD student who has used frequentist statistics for as long as I remember, I’d only ever heard gossip and rumours about Bayesian statistics, but your video hooked me from start to finish on such a fascinating subject! Great video!!!
@very-normal 7 months ago
Thanks! Despite the title, I think having both under your belt is better than just choosing a "side" in this nerdy debate lol
@rtg_onefourtwoeightfiveseven 7 months ago
I'm an astrophysicist, and in our field Bayesian statistics is the way. Great and all, except everyone seemingly expected me to know what an MCMC analysis was (it wasn't mentioned anywhere in the refresher lectures at the start of my PhD) despite never having heard of it before I started. This video was a massive help.
@very-normal 7 months ago
If that isn't the PhD student experience, I don't know what is. Also, I'm using your comment to brag to my friends that I spoke to an astrophysicist
@rtg_onefourtwoeightfiveseven 7 months ago
@@very-normal Haha, glad I count as clout to someone. I'm just a humble 1st-year PhD student, but no need to tell them that. ;-)
@very-normal 7 months ago
Oof, first year is tough, it was definitely the most character building I’ve done in a short span of time. Hang in there, and best of luck!
@kadaj131313 7 months ago
Half my professors would fight you over this title, the other half would agree with you
@very-normal 7 months ago
😈
@jasondads9509 7 months ago
I swear the title changed, what was it before?
@very-normal 7 months ago
nah it didn’t change
@ThatOneAmpharos 6 months ago
@@very-normal what was the probability it would have changed if the probability of change was the probability of it not changing?
@tracywilliams7929 6 months ago
Lol!
@RomanNumural9 7 months ago
Math finance PhD student here. Great video! Just so you know there's a book called "Deep Learning" by Ian Goodfellow et al. It covers Bayesian stats, including MCMCs and other things. It's a great resource and if you wanna know more about this stuff I found it a pretty reasonable read! :)
@Nino21370 7 months ago
🔥
@JetJockey87 7 months ago
Ray Kurzweil's book "How to Create a Mind" goes into this as well, with his description of the Markov chain Monte Carlo models he used to build Dragon Naturally Speaking, dictation software that has been an absolute staple in the medical industry and still continues to outperform Transformer models. Which is really hilarious if you're the kind of nerd who knows that software as well as who Ray Kurzweil is and what he's more famous for: Singularity-esque neohuman futurism propaganda
@lupino652 7 months ago
Yep, a classic, well paced for someone with a background
@raphaelscaff2399 7 months ago
Cool
@luisjuarez7291 7 months ago
Hey, out of curiosity: if you have a doctorate in math finance, do you find that a lot of job opportunities are still based on whether you have a CFA or CPA on top of your degree? (If you don't pursue being a professor or researcher)
@qwerty11111122 7 months ago
As an introduction to Bayes' theorem, I think 3b1b really helped me form an intuition about these statistics, using his "Bayes ratio": multiplying your prior by a ratio formed from the likelihood and the marginal to form the posterior, a new prior
@barttrudeau9237 7 months ago
3b1b was where I first heard of Bayes' theorem. I've been hooked ever since.
@leassis91 7 months ago
this video from 3b1b is a life saver
@fetilu0975 25 days ago
The video about screening for diseases from a Bayesian perspective is especially enlightening. The prior distribution is of such importance, even from a frequentist point of view.
@avenger1825 7 months ago
I always get excited when I see one of your uploads; I've been studying heavily about statistics coming from a pure mathematics background, and your videos are always very helpful to build the conceptual foundations that textbooks often obscure in favor of specialized, theoretical language. This has already cleared up several things I didn't quite understand about Bayesian statistics, so thank you (for this and your other videos)! :^)
@very-normal 7 months ago
These kinds of comments are the ones that get me really fired up (in a good way). The videos are doing what I want them to do — thanks for letting me know and taking the time to comment!
@tomalapapa100 7 months ago
I studied math as a degree and specialized in statistics and finance. I had the same experience: numerous frequentist classes but few Bayesian ones. I've studied on my own and with a couple of classes that were available to me in grad school. Struggled a lot to get the gist of Bayesian statistics. This video is perfect for people with knowledge of the frequentist view who wish to then learn Bayesian
@KokomiClan 2 months ago
The MCMC approach is one we used with Stan to generate models for the returns of financial time series (daily feeds). We had our GARCH models set up, and every parameter of each model had its own posterior distribution. The power of this was that we could a) make forecasts continually every day and update them, but more importantly, b) detect changes across all the models when new data arrived. Further to this, being able to run scenario analysis was important, and we used this during Covid to estimate recovery times for the assets (i.e., when to switch back into risky positions, out of cash into equities), and it worked really well. We didn't write the model to forecast returns; rather, it proved useful when forecasting volatility. It avoids the approach of fitting a model with fixed parameters and then updating it with a new set every 6 months. Instead, uncertainty was baked into the model via its parameters being distributions. The benefit of the Bayesian approach was that we could account for uncertainty better, and it worked with our risk management approach: deploy, monitor, collect, update, test, deploy... etc.
@charlesbwilliams 7 months ago
It's so cool to see MCMCs get some love. The only use I've ever seen of it in my field (Psychology) is in Item Response Theory. Awesome video!
@Bayesian_Wrapper 7 months ago
It's also used quite a bit in industrial organization in economics, and in quantitative marketing!
@elinope4745 7 months ago
YouTube put this video in my recommended feed. Only about one in twenty videos is any good; the odds were low that someone would make a worthwhile video. Subbed, liked.
@joelbeeby866 7 months ago
My university UG finance course has taught me rudimentary statistics, but not to the level that I want. Your videos are genuinely amazing for self-study and really bring out the logic in statistics, which textbooks almost never do. Thank you! Please keep it up!
@very-normal 7 months ago
Thanks for watching! It's always very encouraging to see they're helping people out — thanks for taking the time to tell me
@danielerdody160 3 months ago
I took a class on stochastic models which relied heavily on Bayesian methods. This video is helping me better understand my old notes. Thank you!!
@nzt29 7 months ago
Best video I've seen on this so far. I like the comparison between the two methods and the fact that you map the data and parameter variables back to the typical A and B seen in the definition of Bayes' theorem. edit: I should have phrased this instead as how you connected Bayes' theorem to distributions.
@adw1z 5 months ago
I'm in my final year of undergrad, and my exams on all these topics (Bayesian inference and constructing credible intervals, Bayes decision rules, minimax and admissibility, and sampling methods such as MCMC/Metropolis-Hastings) are next week 😭 Thank u for explaining this so simply in a way anyone could understand and enjoy, you really did earn a new subscriber
@very-normal 5 months ago
Good luck with your exam!
@mmilrl5768 7 months ago
I'm currently finishing up my first of many statistics courses. We spent the first month on Bayesian statistics, then started focusing more on the frequentist side of things. I had no idea these were often even considered different things. Very cool video!
@user-jn7ic7un1e 7 months ago
Check out conformal prediction
@justdave9195 7 months ago
Could you please make a video on Generalized Linear Models too? These explanations are soooo helpful.
@gapsongg 5 months ago
Broooo you are insanely good at explaining stuff. Really nice video! Perfect speed. Perfect visuals. Everything made so much sense.
@RohanKuntoji 4 months ago
Incredibly well explained with realistic and easily relatable examples! Highly recommend watching this video for a quick & easy grasp over basics, or even to refresh fundamentals.
@hyunsunggo855 7 months ago
The cool thing about variational inference is that it converts the problem of computing the intractable integral into a more manageable optimization problem of, with respect to the parameters, optimizing some quantity, the variational free energy! This not only makes the problem often easier (through the more flexible variational graphical model) and more tractable (than e.g. MCMC, etc..), but also enables borrowing insights from mathematical optimization theory, to solve the particular formulation of the problem. By the way, this connection to mathematical optimization is why it is called "variational" inference in the first place, directly connected to calculus of variations! Also, VI has amazing applications in deep learning, namely, variational autoencoders (VAEs), in which it's applied to the latent space for the induced probability distribution, for explaining the data distribution, to become much, much more complex, compared to the classical examples you've shown in this video. For example, diffusion models, that can create those amazing images, can indeed be seen as an instance of VAE! Thank you for this great video! I learned a lot! :)
@waylonbarrett3456 7 months ago
I'm developing an AI model based on variational inference
@hyunsunggo855 7 months ago
@@resnon Well, LDMs such as SD make use of VAEs to reduce resource requirements but that's not what I was talking about. You see, mathematically, you can think of the noising & denoising steps themselves as an instance of a VAE. Which includes non-LDMs that operate on the original data domain such as the pixels themselves.
@hyunsunggo855 7 months ago
@user-ju2pu8cf2l LDMs such as SD indeed utilize VAEs to reduce resource requirements, but I wasn't talking about that specific use case. The noising and de-noising steps as a whole are also an instance of a VAE under a certain interpretation — not limited to LDMs, but also pure non-LDMs that operate on the original data domain, such as the pixels themselves. (Idk why but my previous reply was deleted. 🤔)
@xanmos 7 months ago
A very comprehensible video about Bayesian statistics. I have seen most of your videos and will recommend them to my students. I am teaching undergraduate basic statistics and I must say, your videos are very well made. I will put them on my course sites so my students can learn from you as well. ❤
@very-normal 7 months ago
Thank you! I’m honored!
@laitinlok1 2 months ago
10:43 this is essentially the law of total probability applied to a continuous probability density function; for discrete probability distributions, the integral becomes a summation.
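A quick sketch of that point with made-up numbers (7 heads in 10 flips): the marginal p(D) is an integral over a continuous prior, and a sum over a discrete one.

```python
import numpy as np
from math import comb

# Likelihood of D = 7 heads in n = 10 flips, as a function of theta
def binom_lik(theta, k=7, n=10):
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

# Continuous uniform prior on theta: p(D) = integral of p(D|theta) p(theta) dtheta
theta = np.linspace(0, 1, 100001)
evidence = float(np.sum(binom_lik(theta)) * (theta[1] - theta[0]))  # grid approximation
print(round(evidence, 4))  # 0.0909, i.e. 1/11

# Discrete prior on three candidate values: the integral becomes a sum
support = np.array([0.25, 0.5, 0.75])
weights = np.array([0.25, 0.5, 0.25])
evidence_discrete = float(np.sum(binom_lik(support) * weights))
```

Same quantity, two priors: only the way you average over theta changes.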
@cameronkhanpour3002 7 months ago
Great video once again! You mentioned MCMC algorithms; to add some details, they work by constructing a regular/ergodic Markov chain that has a unique stationary distribution, and we want that stationary distribution to be the target distribution so you can, say, sample from it for inference. So the real question now is: how do you design a transition kernel that (if it converges) leads an initial vector to the proper stationary distribution you wish to sample after some burn-in time? I know this from a probabilistic graphical models perspective, where this is used extensively in the form of Gibbs sampling, a special case of the Metropolis-Hastings algorithm, and Rao-Blackwellized particles, which sample certain (more complex/loopy) parts of a network and then do exact (analytical) inference on the rest. Variational inference is a form of inference as an optimization problem, such as in mean-field approximation, where you choose a simpler distribution Q and compute new parameters that get it close to your actual distribution P. The main way I have learned is to minimize the KL divergence (relative entropy) between Q and P (in that order, since KL is not symmetric). If anyone would like to read more, Bayesian Reasoning and Machine Learning by David Barber is really good (IIRC variational inference in particular is in chapter 28).
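For anyone curious, the transition-kernel idea above can be sketched with a minimal random-walk Metropolis sampler. This is a toy example (targeting a Beta(8, 4) posterior from 7 heads and 3 tails under a uniform prior), not code from the video:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized log target: Beta(8, 4) posterior, i.e. theta^7 (1-theta)^3
def log_target(theta):
    if theta <= 0 or theta >= 1:
        return -np.inf  # zero density outside (0, 1)
    return 7 * np.log(theta) + 3 * np.log(1 - theta)

theta, samples = 0.5, []
for _ in range(20000):
    prop = theta + 0.1 * rng.normal()  # symmetric random-walk proposal
    # Accept with prob min(1, target(prop)/target(theta)); symmetry cancels q
    if np.log(rng.uniform()) < log_target(prop) - log_target(theta):
        theta = prop
    samples.append(theta)

burned = np.array(samples[5000:])  # discard burn-in
print(burned.mean())  # ≈ 8/12 ≈ 0.667, the Beta(8, 4) mean
```

The normalizing constant never appears — that's exactly why MCMC sidesteps the intractable integral.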
@KirinDave 7 months ago
What's funny about this presentation is that it makes it look like the MCMC approach is the one only the deep practitioners use. In reality, non-statisticians who need to use this stuff prefer the MCMC approach because it's *very flexible* and just requires that we provide a model and priors, and then an optimizer and the data fight it out in the computation. So in a very real sense, the MCMC approach is easier for non-statisticians and preferred.
@very-normal 7 months ago
Ah yeah, in hindsight my “levels of Bayes” framing does give off this feeling. Not intended, but something for me to think about for future videos. Thanks for your insight!
@derWeltraumaffe 7 months ago
I'm still learning frequentist statistics in university right now (first-year psychology), so this video still goes way over my head, but it was a really nice overview to get a general idea of the concept of Bayesian statistics. All we learned about it is "btw there's a thing called Bayesian statistics. ok... let's move on." Not kidding.
@very-normal 7 months ago
This was my experience word for word in undergrad too
@figmundsreud9363 7 months ago
Very nice introduction to Bayesian statistics. What I like about Bayesian statistics is that in many contexts one can interpret frequentist methods as just a special case of Bayesian methods with uninformative priors. Also, from my experience, many people are Bayesian simply for pragmatic reasons, because for many problems Bayesian methods just work better (frequentists are also just discovering the importance of shrinkage estimation, and the most general way to apply shrinkage is through Bayesian priors). So I think the philosophical debate between frequentist and Bayesian methods is somewhat overhyped. The biggest downside of Bayesian methods is their computational cost. I'm currently working with a model that doesn't even have a computable expression for the likelihood. MCMC somehow still works (magic), but estimation even for a relatively small model takes several hours
@very-normal 7 months ago
I think the debate is overhyped too, it’s truly just nerd drama I feel your MCMC pains, best of luck 🫡
@postblitz 3 months ago
This video is fairly difficult because of the amount of jargon. I got the sense that the conjugate, or the property of conjugacy, is when the prior distribution has the same form as the posterior distribution, i.e., updating on a new observation yields a posterior in the same family as the prior, just with updated parameters, so the chosen family stays stable as new evidence arrives. I may be wrong on this but that's the gist. Arbitrarily choosing a prior distribution is still a fanciful and mysterious thing given the information in this video.
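For what it's worth, conjugacy fits in two lines of arithmetic: with a Beta prior and Binomial data, the posterior is Beta again, only the parameters move. The numbers below are an arbitrary illustration:

```python
# Beta-Binomial conjugacy: a Beta(a, b) prior plus k successes in n trials
# gives a Beta(a + k, b + n - k) posterior -- same family, updated parameters.
a, b = 2, 2        # arbitrary prior pseudo-counts
k, n = 7, 10       # observed data
a_post, b_post = a + k, b + (n - k)
print((a_post, b_post))                      # (9, 5)
print(round(a_post / (a_post + b_post), 3))  # posterior mean: 0.643
```

So the prior does change with the data; conjugacy just guarantees the update never leaves the family, which is why no integral needs to be computed.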
@fetilu0975 25 days ago
Look up 3blue1brown's two videos on Bayes. One is about the general idea behind Bayes' theorem, and the other is about the importance of the priors and of the Bayes factor (which weren't explained here but are at the core of Bayesian inference).
@hugomsouto 3 months ago
This video is a work of art. Thank you very much!
@CakeIsALie99 2 months ago
Being Bayesian gives you the ability to conjure up a prior that perfectly matches the result you are trying to demonstrate, truly miraculous
@very-normal 2 months ago
if only it worked liked that lol
@DataTranslator 3 months ago
"None of them were sufficient"... *clears throat*... I see what you did there 😀
@TheOnlyNightmare 7 months ago
Loved this! Definitely a subscriber now 🎉 I got confronted with Bayes in a seminar where we used various machine learning and deep learning models, with the expectation of already knowing all this prior to starting. It left me with no confidence in the model results, even though they outperformed some other approaches.
@jamesvaughan748 4 months ago
The sound bite at 8:16 is what made me subscribe 😂
@very-normal 4 months ago
wooooow
@z_wulf 7 months ago
This video was great. As someone studying data science on my own online (I'm 32 right now), I had a hard time understanding Bayes' theorem for a bit there. I wish I'd seen this video way earlier; it would have saved me weeks of wrapping my brain around it.
@very-normal 7 months ago
Thank you! Best of luck on your data science journey, I hope I can help you out more with the statistical portion of it
@tobiaspeelen4395 6 months ago
When I first saw the concept of Bayesian statistics, I thought: "wow, that's dumb, having probabilities rely only on your own belief." But now I see that it's a way of deducing what is likely the real probability, and I'm like "WOW, AMAZING"
@ottoludewig1244 7 months ago
I began studying Bayesian statistics last year to develop the tools for an applied study of climate change model data, and the main tool that facilitates the posterior calculation is somewhat recent, called INLA. The theory that gives it structure is pretty dense and difficult, but it is by far the easiest to implement and has the lowest computational cost for the cases it applies to. A year deep into this, I've found it fascinating how these perspectives open up so much compared to the restrictive nature of frequentist analysis. I recommend the Statistical Rethinking course for all audiences and the Gelman book (Bayesian Data Analysis) for a more mature audience with a background in math and statistics.
@marcovitturini9481 7 months ago
Thanks to your channel I'm considering choosing a biostat MSc. Thanks for explaining and inspiring
@andrewhancock2451 2 months ago
At 6:40, a Binomial distribution is described as a "probability that an event will happen". I think that's the Bernoulli trial. If one draws from a random variable with a Binomial distribution, one gets an integer count rather than a head/tail or zero/one. This ambiguity has consequences for my understanding later on. I'm a bit confused by the slide at 10:20. The data D is described as the series of n Bernoulli trials in the 1st equation, but the last equation shows that D is a count of the heads/successes in the n Bernoulli trials. I suspect that D must be the count itself if the distribution is going to be parameterized by p. If so, however, then D is just a randomly drawn count (i.e., a scalar number) and I suspect that there is no "joint probability" associated with D. In the general case, perhaps for a more complicated example, D could be a collection of scalars (i.e., measurements, "observations", or observed categorical/quantitative data) and there is a joint probability to think about. As one possible example, I suppose that D *could* be chosen to be a sequence of n Bernoulli trials, but unlike the above case where D is a sampling of a random variable with a Binomial distribution, there are different arrangements of a given number of heads and tails, i.e., one sample value of a Binomially distributed random variable corresponds to many possible sequences of Bernoulli trials. Each such arrangement would be considered a different D. I was intrigued by the reference to non-analytical priors and likelihood functions. Douglas Hubbard has a book "How to Measure Anything" where he has spreadsheets of formulas and data representing distributions. It's tough to grind through, but I found that it was possible to catch a glimpse of the Bayes formula in them. Not enough to make me confident, which is why I've been prowling the internet for years to solidify my intuitive understanding.
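On the sequence-vs-count question above: one specific sequence of n Bernoulli trials and the Binomial count differ only by the C(n, k) arrangements factor, so as functions of p they are proportional and carry the same information about p. A toy check with made-up numbers:

```python
from math import comb

p, n, k = 0.6, 10, 7

seq_lik = p**k * (1 - p)**(n - k)  # probability of one specific sequence with 7 heads
count_lik = comb(n, k) * seq_lik   # Binomial probability of observing the count 7
print(round(count_lik / seq_lik))  # 120, i.e. C(10, 7) possible arrangements
```

Because the C(n, k) factor doesn't involve p, it cancels in Bayes' rule, which is why treating D as the count or as the full sequence gives the same posterior for p.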
@entivreality 7 months ago
Really great explanation! Love the progression from elementary to more advanced topics. A video on empirical Bayes methods could also be cool :)
@Y45HV1N 6 months ago
Really cool video, and it's easy to follow. I just think it would be better without the frequentist bashing. I think the parts about what frequentists hope they could do or fool themselves into thinking are more a reflection of misinformed/poorly trained frequentists. Ultimately, frequentism itself has two main approaches, the Neyman-Pearson NHST approach and the Fisher compatibility approach. The search for p
@very-normal 6 months ago
Bayesians aren’t allowed to take two steps without making fun of frequentist methods
@jamesmcadory1322 7 months ago
It's funny because when I was a physics major, two professors in the department argued over whether frequentist or Bayesian stats were better, and would teach their labs differently based on their preference.
@mathieudespriee6646 2 months ago
Nice video, it helped me, thanks. Your explanation of "conjugate prior" at 12:40 just made things click. I read those words so many times without understanding...
@TerabyteTy300 6 months ago
For some reason I always love when someone says “hi mom” in a video. It’s just wholesome and nice to know they are getting their mom’s support.
@q-tuber7034 1 month ago
Today I learned: some people pronounce Bayesian like “beige” rather than “Bayes”
@very-normal 1 month ago
beigians
@lbognini 19 days ago
Me again! Yet more trivialities 😂🤣 At 04:00, using S and W instead of A and B in the formulas would have helped many people out there. This may seem banal, but it's important to move from the academic notation that most people struggle with to something closer to the real world. It also helps in thinking about the different events, their likelihoods, and conditional probabilities, since we're switching them around. P(S|W) is more intuitive than P(A|B)
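To make the S/W version concrete, here's Bayes' rule with purely made-up numbers (S = subscribed, W = watched; none of these probabilities come from the video):

```python
# Hypothetical numbers, just to make P(S|W) concrete
p_S = 0.10             # prior: fraction of viewers who are subscribers
p_W_given_S = 0.80     # subscribers who watch a new video
p_W_given_notS = 0.20  # non-subscribers who watch

# Law of total probability for the denominator, then Bayes' rule
p_W = p_W_given_S * p_S + p_W_given_notS * (1 - p_S)
p_S_given_W = p_W_given_S * p_S / p_W
print(round(p_S_given_W, 3))  # 0.308
```

Reading P(S|W) as "chance a viewer is a subscriber, given that they watched" is exactly the switch-around the comment describes.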
@trevorgalvez9127 7 months ago
I used Bayes' theorem for a simple learning model for establishing categories for various phrases that were similar but not exactly the same. Going through thousands of records manually was possible, but using this allowed me to do it in a day with the help of Excel and Python.
@thegimel 7 months ago
Great video as the rest of your content. You have a pleasantly simple, intuitive and concise way of presenting the D :) I would very much like for you to dive deeper into the Likelihood in particular, and why it isn't a real PDF even though it can look like one. Cheers!
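On the likelihood question above, a quick numeric check (a toy example, not from the video): viewed as a function of the parameter p, the binomial likelihood doesn't integrate to 1, so it isn't a probability density over p even though its curve can look like one.

```python
import numpy as np
from math import comb

n, k = 10, 7
p = np.linspace(0, 1, 100001)
lik = comb(n, k) * p**k * (1 - p)**(n - k)  # likelihood L(p); a PDF in k, not in p

area = float(np.sum(lik) * (p[1] - p[0]))   # grid approximation of the integral over p
print(round(area, 3))  # 0.091 -- about 1/11, not 1
```

The same expression sums to 1 over the data k = 0..n; it's only over the parameter that normalization fails, which is why Bayes' rule needs the marginal in the denominator.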
@simonpedley9729 7 months ago
What I love about Bayesian statistics is the way I can get almost whatever results I want by changing the prior
@very-normal 7 months ago
lol i wish that’s how it worked
@jrlearnstomath 6 months ago
Looking forward to more on variational inference, it's really doing my head in
@stevenjackson8226 6 months ago
Cool. Nice overview. This is the most rigorous presentation I've seen. I get Bayesian statistics at an intuitive level, but was curious about how it works mathematically. And there it is.
@frankjohnson123 7 months ago
Love the videos, bit of friendly criticism: when working with a specific example (like 3:35 on), it would help to switch from generic variable names like A and B to something more specific to the example (e.g., S and W in this case). It could even be looser for pedagogical purposes, like replacing A with "Sub" and B with "Watch", though I know not everyone likes that.
@very-normal 7 months ago
Ah, that's a good idea. I also find it helpful to see the events in the expression but didn't make the connection there. Thanks for pointing that out!
@davidl.e5203 6 months ago
If my understanding is correct, Bayesians are basically frequentists plus moving average. Frequentists draw conclusions about probabilities based on historic observations and take for granted of its probability. Bayesians subset the historic observation by time-scale, then make predictions for the next time-scale. If the next time-scale probabilities don't match the historic probabilities, update probability.
@logaandm 5 months ago
I started off with "yeah, yeah, conditional probability, meh" and ended with "this is how the world works. I must learn this!" A great introduction to why Bayesian statistics is so important.
@BrakeForLoop 6 months ago
Very helpful! I did get a little lost when trying to think about applications from my experiences.
@VTdarkangel 6 months ago
I don't know much about Bayesian methods beyond the most basic premises of it. However, even at that basic level, I can see the power of it. As an engineer, I was really only taught the fundamentals of frequentist statistics. While it has proven useful to understand that (despite my grumbling at the time I took the class), I could see the problem of assumptions being required in the analysis. Bayesian methods seem to account for that.
@alishermirmanov5608 7 months ago
Amazing video, provides great intuitive understanding!
@anecetcetera7861 1 month ago
This almost feels like it could be used as a really transparent way to disclaim biases.
@very-normal 1 month ago
how can something disclaim bias but also be transparent about them lol
@idrankwhatt 7 months ago
Fantastic as always. I am starting on the biostatistics track myself!
@very-normal 7 months ago
Best of luck!
@francomarchesoni9004 5 months ago
The best book on this is Jaynes' "Probability Theory: The Logic of Science"
@brainsify 7 months ago
A pdf is a density function, not a distribution function. At least in the textbooks I've taught from. I understand this isn't a big deal, but you made a whole thing.
@very-normal 7 months ago
Based on my experience, the terms can be used interchangeably, but I can see where the confusion can come from
@James-bv4nu 7 months ago
Isn't a distribution function a density function? The area under the curve, f(x), is probability; therefore, f(x) is the probability density. That is, f(x) dx is in units of probability; that makes f(x) in units of probability per unit x.
@very-normal 7 months ago
For me, it’s mostly a semantics thing. The pdf f(x) can be referred to as a “probability density function”or a probability distribution function. When I refer to the cumulative distribution function, I’ll make sure to say “cumulative” instead of just saying “distribution”. This is one of those topics where it’s really easy to get lost in the sauce. If I say “probability density function”, then someone will invariably say that I should also include “probability mass function”. It just becomes too wordy for the script, so I stick to probability distribution. As long as I show what I’m referring to, my hope is that people will get what I’m saying
@girishm5880 2 months ago
3:26 😊 Cute way to ask for a subscription using prior probability
@ResearchStatisticsCorrectly 3 months ago
Wonderful presentation, definitely better than I have been able to do so far. However (maybe it's somewhere down there in the comments): Bayes did it in 1763, not '1963.'
@KinomaroMakhosini 7 months ago
What a coincidence, I am starting Bayesian analysis next week in my Stochastic Probability class 😂
@RabbitLLLord 7 months ago
To understand variational distributions better, understanding the variational autoencoder can be a good start
@very-normal 7 months ago
Thanks! I’ve heard about it vaguely, but I’ll look more deeply into it!
@me5ng3 7 months ago
I had to take Bayesian statistics for my machine learning classes. I didn't know that they aren't taught as much in other places, since in Germany they're fairly popular. They are even taught in high school
@PR-cj8pd 7 months ago
No, everyone will see Bayesian probability, not Bayesian statistics
@huhuboss8274 6 months ago
I am from Germany too, and I don't think Bayesian statistics is taught in high school. Maybe you're confusing it with Bayes' theorem, which also has a frequentist interpretation?
@NoMoreToxicRule 6 months ago
So, I never knew the thing I hated most about statistics was called the frequentist approach, and I always hated p-values and the null hypothesis, which in my opinion is worthless. I knew of Bayes, but when I went to school, it was never taught. Great video, and now I'm subscribed.
@barttrudeau9237 7 months ago
This is a great video on the subject. I really hope you produce more content on Bayesian stats (maybe dive into PyMC?). Thank you!
@mikestein5983 6 months ago
Anyone wanting a deep dive into Bayesian stats as it applies to research should consider the book Statistical Rethinking by Richard McElreath. He has also prepared a semester's worth of lectures, available on YouTube. This is not a quick fix, but essentially a graduate course. It is IMHO quite accessible and doesn't assume too much in terms of math background.
@dandandan18 7 months ago
My university teaches both approaches to statistics, but professors and lecturers don't make the distinction known (at least for a bachelor's degree). Bayes' theorem is taught, including how probability densities may differ, how we arrive at the prior belief (or distribution), and the difference in philosophy that affects where Bayesian statistics is often used. Ultimately, though, we employ frequentist approaches for theses, since they're easier to teach and more common for bachelor's degrees, where there's less credibility to design models (i.e., there's little mastery of the field to justify the prior beliefs that will affect the distribution). However, as a civil engineering student, I most often see studies that do not incorporate prior beliefs when modeling real-world phenomena where they would greatly affect data interpretation. For instance, studies on flooding, landslides, groundwater flow, the structural health of bridges and concrete buildings, and project management are heavily time-dependent, which makes prior beliefs more significant, but I only encounter risk reports and models that focus solely on the current data. I do think that for most studies, frequentist methods are more applicable and would quite suffice, given the more theoretical nature of the data and of the methodologies, since most call for single-parameter hypothesis testing. But given the cost of testing materials and tools, I believe the Bayesian approach (incorporating expert knowledge and credible intervals instead of "typical confidence intervals") could be incredibly helpful for handling smaller sample sizes.
@bringbackthedislikecount6767
@bringbackthedislikecount6767 7 ай бұрын
Currently taking statistical physics, and I noticed some similarities between the two, such as the prior probability in Bayesian statistics and the postulate of equal a priori probabilities for the microstates of a system in statistical physics. Interesting to learn about Bayesian statistics from a physics major's perspective nonetheless
@wolfzbyte
@wolfzbyte 3 ай бұрын
Interesting. As it turns out, my undergrad was heavier into Bayesian probability.
@ulrichtietz1327
@ulrichtietz1327 6 ай бұрын
In the realm of chance where priors dwell,
Bayesian paths weave complex spells.
With data new, we seek to blend
Beliefs of old, to truths we bend.
A prior's whisper, soft and slight,
Adjusts with evidence brought to light.
Yet minds may twist and turn in vain,
To grasp the likelihood's arcane chain.
Conjugates and posteriors deep,
In nested sums that never sleep.
A dance of numbers, hard to track,
Where certainty, forever, lacks.
Through the haze of dense equations,
We seek insight, past frustrations.
A Bayesian view, so broad, so vast,
Yet understanding, comes not fast.
@hopefullysoonaweldingengineer
@hopefullysoonaweldingengineer 7 ай бұрын
haha a brilliant example in the introduction. subscribed.
@feifeizhang7757
@feifeizhang7757 4 ай бұрын
I need to listen again for understanding 😂❤
@realalehomebrewer8273
@realalehomebrewer8273 5 ай бұрын
Did my Masters Degree using the concepts from Bayesian Reliability Analysis applied to radiation carcinogenesis.
@Jaylooker
@Jaylooker 7 ай бұрын
The Bayesian method sounds similar to how neural networks update their weights in response to new data. There are Bayesian neural networks that implement Bayes' theorem, which allows them to give a confidence percentage instead of just a single answer to some presented data.
@very-normal
@very-normal 7 ай бұрын
Oh that’s cool, I didn’t know that was a thing. I really do be learning a lot from my comment section nowadays
@lbognini
@lbognini 4 ай бұрын
Great video. A few small things about the format:
- distracting images
- it's a bit hard to read the text and focus on what you say, since the phrasing is often different
- the fonts with borders (outlines) are hard to read
@maltez6446
@maltez6446 7 ай бұрын
Where was this video two weeks ago when I was writing my exam project on HMMs? This shit is too hard to comprehend for a second-year bachelor student :( such a great video!!
@Helios-1337
@Helios-1337 5 ай бұрын
Bayesian statistics is common in computer science, I’ve studied it at least twice during my degree.
@RexAstrum
@RexAstrum 6 ай бұрын
Thank you for this!
@tommys4809
@tommys4809 6 ай бұрын
Using p notation sometimes indicates a discrete random variable, in which case the marginal is calculated by summation; f would denote the probability density function of a continuous random variable, whose marginal is calculated by integrating with respect to theta.
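The summation case in the comment above is easy to show concretely. A minimal sketch with a discrete prior over a coin's bias theta (the prior values and the 7-of-10 data are invented for illustration):

```python
# Marginal likelihood p(x) = sum over theta of p(x | theta) * p(theta)
# for a discrete prior; the continuous analogue replaces the sum with
# an integral over the density f(theta).
from math import comb

def binom_pmf(k, n, theta):
    # Probability of k heads in n flips for a coin with bias theta.
    return comb(n, k) * theta ** k * (1 - theta) ** (n - k)

# Hypothetical discrete prior over the coin's bias.
prior = {0.3: 0.25, 0.5: 0.5, 0.7: 0.25}

k, n = 7, 10  # observed 7 heads in 10 flips
marginal = sum(binom_pmf(k, n, t) * p for t, p in prior.items())

# Posterior over theta via Bayes' theorem; it sums to 1 by construction.
posterior = {t: binom_pmf(k, n, t) * p / marginal for t, p in prior.items()}
print(marginal, posterior)
```

As expected, seeing 7 heads shifts the posterior mass toward theta = 0.7 and away from theta = 0.3.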
@laitinlok1
@laitinlok1 2 ай бұрын
I have learned both ways in uni, it is interesting
@laitinlok1
@laitinlok1 2 ай бұрын
I think in some ways the Bayesian method for testing vaccines makes sense: the probability of getting COVID given that someone has taken the vaccine is often used as a metric of how effective the vaccine is, rather than the unconditional probability of getting COVID.
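The metric the comment above describes is just a conditional probability computed from counts. A toy sketch with entirely made-up numbers (these are not real trial data):

```python
# Toy contingency counts (entirely invented for illustration).
vaccinated_and_covid = 30
vaccinated_total = 10_000
unvaccinated_and_covid = 300
unvaccinated_total = 10_000

# Conditional probabilities from the counts:
# P(covid | vaccinated) and P(covid | unvaccinated).
p_cov_given_vax = vaccinated_and_covid / vaccinated_total
p_cov_given_unvax = unvaccinated_and_covid / unvaccinated_total

# A common effectiveness summary is 1 minus the risk ratio.
effectiveness = 1 - p_cov_given_vax / p_cov_given_unvax
print(p_cov_given_vax, p_cov_given_unvax, effectiveness)
```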
@jpnye
@jpnye 2 ай бұрын
Thanks!
@very-normal
@very-normal 2 ай бұрын
Wow thank you, I appreciate it!
@chaosenergy1990
@chaosenergy1990 Ай бұрын
Can we use the frequentist model to generate the prior knowledge if there is none?
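One standard response to the question above is empirical Bayes: estimate the prior's hyperparameters from the data itself (a frequentist-style fit), then carry out the usual Bayesian update. A rough sketch under a normal-normal model with simulated data; all numbers, including the true prior N(5, 2²) and noise sd 3, are invented:

```python
import random
import statistics

random.seed(0)
# Simulate 20 groups, each with a true mean drawn from N(5, 2^2),
# and 10 noisy observations per group with known noise sd 3.
true_means = [random.gauss(5, 2) for _ in range(20)]
data = [[random.gauss(m, 3) for _ in range(10)] for m in true_means]

group_means = [statistics.mean(g) for g in data]
sigma2 = 3 ** 2 / 10  # sampling variance of each group mean

# Empirical Bayes step: estimate the prior's mean and variance from the
# spread of the observed group means (method of moments).
prior_mean = statistics.mean(group_means)
prior_var = max(statistics.variance(group_means) - sigma2, 0.0)

# Posterior (shrinkage) estimate for each group under the normal-normal model.
shrink = prior_var / (prior_var + sigma2)
eb_estimates = [prior_mean + shrink * (m - prior_mean) for m in group_means]
print(prior_mean, prior_var, eb_estimates[:3])
```

Each raw group mean gets pulled toward the estimated prior mean, with noisier estimates pulled harder; this is the idea behind classic shrinkage estimators.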
@jayceh
@jayceh 7 ай бұрын
As head of experimentation at a tech company, we have to disabuse people of frequentist-only interpretations of A/B test results. Turns out once you care about being right and predictive, rather than academically accurate, you have to become Bayesian
@very-normal
@very-normal 7 ай бұрын
Is this head of experimentation hiring 👀 (jk bayesians rise up tho)
@jayceh
@jayceh 7 ай бұрын
@@very-normal in our early days we had some frequentist disasters from the product management team Wasted a lot of time making things blink because one blinky element accidentally showed significant results Thankfully we've come a long way since then (thanks Thomas Bayes!)
@Unaimend
@Unaimend 7 ай бұрын
Another good one. Thanks
@kef103
@kef103 2 ай бұрын
The probability of a 185-foot yacht named Bayesian sinking at anchor in some freak combination of circumstances? I think some quantum-entanglement double-slit craziness had to happen.
@0xoRial
@0xoRial 5 ай бұрын
15:31, it's not 1963, it's 1763!!
@EkShunya
@EkShunya 7 ай бұрын
my good sir . u have my attention and subscription thank you
@TheThreatenedSwan
@TheThreatenedSwan 7 ай бұрын
Updating my priors right now 🤖
@Megasteel32
@Megasteel32 7 ай бұрын
lmao I'm taking the required entry-level stats/prob class for my comp sci major and this was all our last test was about, good to know that me being super confused was normal.
@very-normal
@very-normal 7 ай бұрын
it’s very normal
@HiVisl
@HiVisl 5 ай бұрын
So I should frequently use Bayesian statistics? 🤔
@waylonbarrett3456
@waylonbarrett3456 7 ай бұрын
Isn't it interesting that P(A|B) is directly proportional to P(B|A)? This seems to imply that when one finds a sequence running one direction, they are likely to also find it running in the opposite direction. Everything is an oscillator.
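The proportionality the comment above mentions is just Bayes' theorem rearranged: P(A|B) = P(B|A)·P(A)/P(B), so the two conditionals differ only by the ratio of the marginals. A quick numeric check with a made-up joint distribution:

```python
# Joint distribution over two binary events A and B (invented numbers).
p_joint = {(True, True): 0.2, (True, False): 0.3,
           (False, True): 0.1, (False, False): 0.4}

# Marginals P(A) and P(B) by summing out the other variable.
p_a = sum(v for (a, _), v in p_joint.items() if a)
p_b = sum(v for (_, b), v in p_joint.items() if b)

p_a_given_b = p_joint[(True, True)] / p_b
p_b_given_a = p_joint[(True, True)] / p_a

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B).
assert abs(p_a_given_b - p_b_given_a * p_a / p_b) < 1e-12
print(p_a_given_b, p_b_given_a)
```

For fixed events the two conditionals are proportional, but the constant P(A)/P(B) can make them very different in magnitude, which is exactly the base-rate effect Bayes' theorem corrects for.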
@varbias
@varbias 7 ай бұрын
TRIGGERED! Frequentist for life 😤 nice video though, as usual
@very-normal
@very-normal 7 ай бұрын
d r a m a (and thank you)
@angelmarauder5647
@angelmarauder5647 5 ай бұрын
It's important to note that priors are not subjective; subjectivity is purely a semantic problem solved by definitive terminology prior to priors priorization 🤓😋😉
@ufuoma833
@ufuoma833 7 ай бұрын
What a time to be alive!
@very-normal
@very-normal 7 ай бұрын
👀 two minute papers watcher?
@nyx211
@nyx211 7 ай бұрын
Ice cream for my eyes!
@provocateach
@provocateach 7 ай бұрын
Likelihoodists: are we a joke to you?
@bokehbeauty
@bokehbeauty 7 ай бұрын
This video could be a nice kick-off to a series where you explain each of the layers and the related methods. As is, it is too packed for me. The assumptions and limitations of the methods would need explanation, as in my experience this is where even profs get sloppy.
@jrlearnstomath
@jrlearnstomath 6 ай бұрын
Great video, thanks!!