The Replication Crisis: Crash Course Statistics #31

98,210 views

CrashCourse

A day ago

Comments: 76
@jamestaylor8577 5 years ago
This topic is very underrated in the sciences at the moment. It's probably one of the most important, since replication is the whole point of science.
@loganv0410 6 years ago
As a minor history note: the claim that carrots improve night vision was a WWII cover story to conceal the deployment of radar.
@BugRib 6 years ago
Honestly, I think this has been an open secret in the social sciences for quite some time. Even a layperson like me can often read about studies done in psychology, for instance, and see that the methodology used was sorely lacking, especially compared to the certainty expressed about the results.
@tobiasvanbeekhuizen9018 6 years ago
Hey, I have almost finished my major in methodology and statistics in the behavioral and medical sciences... and indeed, methodology is often lacking in social science research. However, let me assure you that statistical knowledge among biomedical and biology students, to name a few, is often even worse (I was surprised to find out). Luckily, statistical training is starting to become increasingly important at most universities, at least in the Netherlands. And I think most progress will start with improved education.
@TommoCarroll 6 years ago
I agree: the more repetitions we can feasibly do, the better. It's the basis of good science, in my opinion! Anyone else?
@TommoCarroll 6 years ago
Flaming Basketball Club Yeah for sure! That's what my channel is all about! Well, science mainly! Why do you ask?
@AbCDef-zs6uj 5 years ago
I disagree. If a study cannot be reproduced, it means that the original was extra lucky, and that isn't something you should take lightly or fail to appreciate. Studies with irreproducible results are like diamonds: everything gains value on the basis of being rare and special. Replicating scientific results would be like turning your precious diamonds into cruddy lumps of coal, and you wouldn't want that, would you?
@km1dash6 6 years ago
A big problem with cutting the alpha level from 0.05 to 0.005 is that it makes a Type II error (a false negative) more likely.
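To see that trade-off concretely, here is a minimal Python sketch (an illustration, not anything from the video or the comment) using a normal approximation for a two-sided, two-sample test; the effect size (d = 0.3) and group size (n = 50) are made-up values.

```python
# Power and Type II error of a two-sided, two-sample z-test (normal approximation).
# d and n_per_group are made-up illustration values.
import numpy as np
from scipy.stats import norm

def power_two_sample(d, n_per_group, alpha):
    """Approximate power of a two-sided, two-sample z-test with effect size d."""
    z_crit = norm.ppf(1 - alpha / 2)      # critical value for the chosen alpha
    ncp = d * np.sqrt(n_per_group / 2)    # mean of the test statistic under H1
    # Probability the statistic falls beyond either critical value under H1
    return norm.sf(z_crit - ncp) + norm.cdf(-z_crit - ncp)

for alpha in (0.05, 0.005):
    power = power_two_sample(d=0.3, n_per_group=50, alpha=alpha)
    print(f"alpha={alpha}: power={power:.2f}, Type II error={1 - power:.2f}")
# Same data, stricter alpha: fewer false positives, but noticeably more false negatives.
```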
@Petch85 6 years ago
Could you add links to the sources for the studies you reference in the videos to the descriptions? That would make it much easier to find them. It would also be good practice :-)
@THEMithrandir09 6 years ago
If every respectable journal required authors to reproduce another study before their own is considered, this problem would go away quickly.
@deebmonkey23 6 years ago
Possibly one of the biggest and bestest Crash Course episodes of all time. Dabs and applause.
@frankschneider6156 6 years ago
The prime problem is publish or perish. If you only pay a scientist (i.e., prolong the grant) when he publishes, so that he has to publish to not starve, then he will publish no matter what. That's exactly as intended. Unfortunately, scientific discoveries can't be forced and are often just luck, but the scientist has to publish something, anything, to not end up as a hobo, so he will find something, anything, to publish. If necessary, he'll make something up. Bending the truth is bad; being without a place to live is worse, so guess what people do? What do you expect, ruining your life just to be honest? Seriously, just drop the measurement that doesn't fit the intended explanation and everything will be fine. That's not even lying or faking data, it's giving people what they want.

The same, but even worse, is true for PhD students. Without a PhD you are academically dead meat and can look forward to driving a taxi for Uber or working as a callboy to not starve. To avoid these unpleasant alternatives you'll do anything to find something, somehow, that's publishable. Whether it is reproducible or relevant is completely irrelevant, because you need the money to not starve and pay the rent.

In the end it works exactly as it was intended: financial incentives (not starving) lead to an increase in the number of papers (formally, scientific progress) and everyone is happy. The scientists don't starve, the people funding it get their papers, so everything is dandy. Well, until they find out 20 years later that 90% of all the stuff published isn't worth the paper it used to be printed on, because it is scientific junk (and that's not only true for pseudo-"sciences" like social "science", but also for the hard natural sciences). In economics, such an unintended outcome due to wrongly set incentives is called the cobra effect.

The real problem here is that scientific progress can't be forced. If you try to force people to do something they can't really influence, they'll find another way to somehow comply with the requirements, and the result is scientific junk. As long as the incentives point toward mass production of papers rather than toward real scientific research, this won't change. As a result, every scientist knows that a newly published paper isn't worth anything until it has been reproduced several times by independent (unaffiliated) labs.

The major problem here is not science. The problem is people who don't understand science ruining it by thinking that something unknown and never done before can be solved by just throwing money at the problem and putting pressure on people. That's NOT how science works. That's how paper production works.
@AdamantRecluse 5 years ago
Frank Schneider Any chance this is something you've studied in some depth, or perhaps firsthand? I'm currently working on a research project around this whole mess, from how we ended up here to the whole human nature/incentives side of it, and found your comment aligned well with my present take on the matter while providing new insights and angles I hadn't considered.
@bakunaut6255 6 years ago
It's also important to point out that when this is studied, different areas (of, say, psychology) have different replication rates. Cognitive psychology does substantially better than social psychology, for example.
@Mister.Psychology 4 years ago
I was just whining about her explaining a stat with that slow walking experiment in a former video. And here she is disputing the study she used as an example. Good.
@teunvandenbrand1324 6 years ago
A point missed in this video is publication bias. Rejections of null hypotheses are more likely to get published than failures to reject null hypotheses. Finding no statistically significant effect is not very interesting reading material, but it might help prevent other researchers from wasting time and money on the same fruitless hypothesis.
@jiholee5237 4 years ago
She is by far the best at presenting a seemingly boring subject to the widest possible audience. Thank you for making this freely available!
@mariksen 1 year ago
THANK YOU SO MUCH for explaining such a complicated topic in a simple and understandable manner, and in less than a quarter of an hour!! 😘😘😘😘😘
@Brainstorm69 6 years ago
Great topic choice. In neuroscience and psychology a lot of good things are happening to increase reproducibility. But it's important to remember that a lack of replicability does not immediately mean something is wrong. It's hard to do everything exactly the same way.
@Desrtfox71 6 years ago
It does mean something is wrong. Reproducibility is a core element of the scientific process. Yes, it's hard to do everything the same; that's why all data should be open and the methodology has to be very accurately described and recorded.
@irwainnornossa4605 6 years ago
I see it a little bit differently. The problem is how people think about published results. The reaction should be something like "Hmm, there might be something here." Only those studies which have been replicated, re-tested, and analyzed to death should get published. Just like software development: when you first create something, it's buggy.
@MusicalRaichu 5 years ago
The problem is that journals don't like replication studies because they're not novel. A published paper is, supposedly, new research, not something done before.
@4Goalkeepers 6 years ago
An absolute gem. Thank you for this true Education!
@jessephillips1233 6 years ago
I wish there were a journal dedicated to re-running experiments. It would only accept papers that were expressly attempting to reproduce papers that had already been published somewhere. Ideally, there would even be funding grants awarded to researchers reproducing results.
@unleashingpotential-psycho9433 6 years ago
I am going to start using the power pose at work and see what happens.
@abcabc-uv6ce 6 years ago
The importance of understanding math should be pointed out a lot more and taken more seriously.
@victorjozwicki8179 6 years ago
12:22 Did the animation just dab?
@JEOGRAPHYSongs 6 years ago
♫ We need to replicate ♫ We need to deliberate! ♫ We see data repeat ♫ It is such a treat! ♫ To see science at work ♫ That is just one perk ♫ Of replication ♫ I feel elation! ♫♫
@lystic9392 5 years ago
"No single study is going to show us the way the world REALLY is. But that study and the studies that follow it that do and don't find the same relationships will get us closer and closer." Well said. Though if there is a political or intellectual dogma/homogeneity in universities and work environments, it could be that doing more studies will only lead to further misdirection. I do think looking critically at the level of diversity of thought and political interest is going to be part of a solution to improve the accuracy of studies.

I also see a point raised repeatedly in comments here and elsewhere: that it doesn't matter what you publish as a scientist as long as you can keep publishing. I have no experience with this, but if true, that doesn't sound like a system that inspires honesty in science. Maybe there is something to be gained by taking a closer look at how scientists keep their jobs. Be that as it may, this was a well put together, thought-provoking video.
@learninglockit 6 years ago
Is the replication crisis something that occurs intentionally, or is it more likely the result of simple human error? One might also consider whether a replication crisis is subconsciously the result of scientists attempting to achieve results that they already perceive as true.
@Hahahahaaahaahaa 6 years ago
One major issue that has always been present and widely known is that it is much easier to publish positive results than negative results. If 20 different scientists run the same study and one of them gets results by random chance, all of a sudden their published results look conclusive. But especially for something like psychology, a 0.05 significance level could be fine. There's also the pressure not to try to replicate your own findings, because then you're basically losing publications for yourself.
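A minimal simulation sketch of the "20 labs" scenario above (an illustration, not anything from the video): every lab tests a true null effect at alpha = 0.05, and we count how often at least one lab comes out "significant". The group size and number of repetitions are arbitrary.

```python
# Simulate 20 labs running the same study on a true null effect.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
labs, n, alpha, reps = 20, 30, 0.05, 2000

at_least_one = 0
for _ in range(reps):
    # Every lab samples two groups from the SAME distribution (no real effect).
    pvals = [ttest_ind(rng.normal(size=n), rng.normal(size=n)).pvalue
             for _ in range(labs)]
    at_least_one += min(pvals) < alpha

print("Simulated P(at least one 'significant' lab):", at_least_one / reps)
print("Theoretical value: 1 - 0.95**20 =", 1 - 0.95**20)  # about 0.64
```

If only that one lucky lab publishes, the literature looks far more conclusive than the underlying evidence warrants.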
@stevieinselby 6 years ago
So the survey found that lots of people thought there was a problem with replication. But when they ran the survey again, they couldn't get it to show there was a problem. Meh. ;-)
@dijonmoutard6647 4 years ago
New survey indicates that surveys are meaningless.
@KommentarSpaltenKrieger 5 years ago
If we are speaking about the "Study finds: People who drink beer are smarter than people who drink wine" type of research, then I'm not baffled. But still, this is a huge problem. Maybe there needs to be a different vetting process, or some sort of reshaped peer review in which reproducibility is required. This would weed out all of those studies which everybody seems to buzz about. Quality could win against quantity.
@simplisummarizify 6 years ago
You define replicability and reproducibility clearly at the start, but from minute 5 on, you use "reproducible" when you mean (according to your own definition) "replicable".
@drmcgene 6 years ago
I think it's not intentional and, although simple human error does occur, it is not simply a mistake. The dogma we live by is "publish or perish". To be tenured or promoted or get funding, you have to have a steady output of papers. So in many labs there is the concept of the LPU, or least publishable unit. The smallest amount of data for a publication is probably just a couple of p-values less than 0.05. Also, in animal studies, there is a commitment to using the smallest number of animals that provides the statistical power necessary to generate a statistically significant result. To demand that all animal studies be replicated will require a societal consensus that we are probably going to triple the number of animals used in experiments. In other fields, funding increases will be required.
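For readers unfamiliar with how "the smallest number of animals that provides the statistical power" is chosen, here is a minimal sketch of a standard a priori power analysis (an illustration, not the commenter's actual numbers); the assumed effect size and targets are made up.

```python
# A priori sample-size calculation for a two-sample t-test using statsmodels.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.8,  # assumed large effect (Cohen's d)
                                    alpha=0.05,       # conventional significance level
                                    power=0.8,        # conventional 80% power target
                                    alternative='two-sided')
print(f"Animals needed per group: {n_per_group:.1f}")  # roughly 26 per group

# A stricter alpha (as some replication reforms propose) raises the requirement,
# which is the resource trade-off the comment describes.
strict_n = analysis.solve_power(effect_size=0.8, alpha=0.005, power=0.8,
                                alternative='two-sided')
print(f"Per group at alpha = 0.005: {strict_n:.1f}")
```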
@frankschneider6156 6 years ago
Exactly. If you force your PhD students to make a decision between taxi driving for Uber for life or optimizing the data so that they look better, guess what they'll do.
@valchonikolov266 6 years ago
Hey CrashCourse, I am a fan of your channel and I have one idea. I would really love it if you could add recommended literature to the courses, or at least for Statistics.
@Ms1robby 5 years ago
Peer review and subsequent publication also need to be buttressed. Many poorly constructed studies with indifferent support are nonetheless positively reviewed and published simply because they seem topical, shocking and, in short, make great leading articles. Having read many poor papers, I've come to believe that evidentiary standards should be most stringent for new and unexpected results. One bad paper can be the pillar for 100 even worse ones. Finally, fewer bad papers would be written in the first place if we remembered Karl Popper: theories can only be DIS-proven! Theories should be accepted only after two conditions are met: the theory can be disproven in principle, and it has survived all attempted and conceived falsifications. We need fewer and better papers.
@flamenco108 6 years ago
Thank you for this Crash Course. It was a really mind-opening piece of statistics ;-)
@aNytmare 6 years ago
DFTBAQ! Thank you Adriene! You are awesome!
@AndroidOO3 6 years ago
How about contested studies? Two researchers do the same test in isolation at the same time, then present their findings and debate each other to come up with a third, combined test.
@AndroidOO3 6 years ago
NoisyCricket42 It attempts to fix the analytical problems caused by the wide diversity of subjective analysis, through a debate toward that goal, to create a third test that is superior to the other two. It doesn't fix flukes, but it gives us a higher grade of analysis through reflection and interaction.
@rich8304 5 years ago
The real issue is the political use and abuse of this problem. A brief study of how our early sailing explorers were able to find their longitude is a great read and should make everyone sceptical of anything they hear, read, or even witness first hand.
@AndroidOO3 6 years ago
Not all scientists are good scientists, a lesson that nobody learns.
@TommoCarroll 6 years ago
Unfortunately, it's something that people should be more aware of. I've just got a degree in science. I don't practise science (I just communicate it nowadays via the y'tubezzz), but the number of times I see people talking about adjusting the experiment to get the results they want or need in order to qualify for something infuriates me! _Getting 'bad' results is part of GOOD science; collect data on everything._
@acetate909 5 years ago
This is really a problem with psychological studies. The various hard-science fields aren't having this issue to the same degree.
@acetate909 5 years ago
@TommoCarroll I agree, though. Bad science containing false data is worse than no science. Document all studies, even the ones that seem to fail, and utilize the entire spectrum of research data to make scientific progress.
@pozzowon 5 years ago
What is DFTBAQ???? Don't forget to be asking questions?
@Vliegendehuiskat 6 years ago
DFTBAQ? I don't know that one.... Anyone here to enlighten me?
@Tuckems 6 years ago
Vliegendehuiskat Don’t forget to be asking questions, or something like that
@tenou213 6 years ago
Yes, statistical reliability instead of significance needs to be emphasized. Luckily we can now ask these questions even with the knowledge that we will get a lot of pushback. First off: Vitamin C does nothing for your cold.
@lorettap.925 4 years ago
The heck is even "power posing"?
@anakagung7613 4 years ago
Kenneth Rothman has already dismissed the use of p-values.
@6thwilbury2331 6 years ago
My favorite scientist makes a cameo at 3:08.
@robberyyyyy 4 years ago
How come their browser is in Dutch?
@lofatmo 6 years ago
@CrashCourse where is the data on the red cards study? I've been looking for more data on this for years!
@brianh2771 6 years ago
“Power Poses”? Geez, the bar on what can be called “Science” is pretty damn low these days.
@frankschneider6156 6 years ago
Brian H At least that has a proper, solid foundation in behavioral biology. Body language IS an important factor in social behavior. But it gets a lot worse when people talk about "power ties". Seriously.
@EMES365 6 years ago
I wonder if people take into account that darker skin draws more attention. It makes one look stronger, more aggressive, and more toned; hence bodybuilders tanning to look more cut and to draw attention to their body definition. My, the things you think about when you're bored...
@armorsmith43 6 years ago
Carrots only help you see Messerschmitts.
@seaxneat9154 6 years ago
I promise you, power posing is BS.
@Eirikr430428 6 years ago
Stand up straight with your shoulders back.
@mikewilliams6025 4 years ago
The major reason that psychology, in particular, is having a reproducibility problem is that psychology, in particular, isn't science.
@TriaMaxwell 6 years ago
I thought the title was "The Republican Crisis"
@lordshell 5 years ago
Let's be accurate here: it's government-funded, academic science that has a replication crisis.
@marcopolo2395 6 years ago
This series is going awfully.
@peterbard 6 years ago
It seems like you're conflating statistics and science. I don't see them as equivalent.
@heris6091 6 years ago
They're related: psychology and other social studies use statistics to measure the effect of behavior or 'other things' on humans, or whatever is related to that :v. If you don't believe me, you can read journals from ScienceDirect or other social science journals and find that many of them use statistics to justify their research results.
@rparl 6 years ago
Breaking news: Math Makes Students Taller!