The Reproducibility Crisis

92,496 views

Sabine Hossenfelder

A day ago

This is an interview with Dorothy Bishop, Professor for Psychology at the University of Oxford, UK. We speak about the reproducibility crisis in psychology and other disciplines. What is the reproducibility crisis? How bad is it? What can be done about it and what has been done about it?
You can read Prof Bishop's comment for Nature magazine (which is mentioned in the video) here:
www.nature.com/articles/d4158...

Comments: 850
@brianfoley4328 4 years ago
An outstanding interview...this should be mandatory viewing for Graduate and Post Graduate level students.
@buzz-es 4 years ago
And climatologists
@myothersoul1953 4 years ago
@@buzz-es And climate change skeptics.
@buzz-es 4 years ago
@@LuisAldamiz Which makes one wonder why all of their predictions and models are consistently wrong.
@tofu-munchingCoalition.ofChaos 4 years ago
@@buzz-es Cite a scientific source which proves exactly this claim (peer reviewed). Should be no problem if you are informed.
@buzz-es 4 years ago
@@tofu-munchingCoalition.ofChaos Why don't you watch the video a second time and draw some conclusions on the validity of the "peer review process". Better yet there's a good book you can read about the rampant academic fraud going on ...... "Intellectual Imposters". You guys are turning Science into ideological dogma.
@isabelab6851 4 years ago
My college professor in mathematical statistics class... said to be careful... not to torture the data until it confesses.
@u.v.s.5583 3 years ago
Here in Statistical Gestapo we have methods and means to extract exactly the information that we want to have!
@Juscz 3 years ago
THAT is brilliant!!!!
@The_Worst_Guy_Ever 8 months ago
The data was under duress
@AndrewBlucher 4 years ago
It's ubiquitous. A student of mine in IT wanted to reproduce some interesting published results. We wrote to the authors and asked if we could have their software. They replied that their paper was a thought experiment! Absolutely not what their paper said.
@SabineHossenfelder 4 years ago
!!
@bernhardschmalhofer855 4 years ago
@rrobertt13 Your comment confused me. The double slit experiment has been done many times. Here is a paper showing diffraction patterns using molecules with a mass of over 10000 hydrogen atoms: pubs.rsc.org/en/content/articlelanding/2013/CP/c3cp51500a#!divAbstract
@vicore5647 4 years ago
Dr Ioannidis weighed in on this issue 15 years ago: "Why Most Published Research Findings Are False", journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.0020124
@cottawalla 4 years ago
Interesting. The few times I've been contracted to translate a research scientist's model from, for example, a spreadsheet into another form, I've always found fundamental and sometimes significant math or logic errors that emerged from the coding itself, not necessarily from the maths that they had developed on paper. These models had already been used as the basis for published papers. Although I'd always note the errors and their consequences in detail and provide what I considered a correct version for them to review, the version I delivered had to be consistent with the model as provided, of course.
@robharwood3538 3 years ago
@@benschreyer8295 Yes, certainly! The point is that you run it on *new data* that you've collected yourself. The software encodes the 'method' by which the original researchers analyzed *their data.* So, if you run the exact same, deterministic analysis program on new (not pre-determined) data, you should (i.e. will probably, within a calculable margin of error) get very similar results. If there was some methodological problem with the initial research, however, and their data was biased in some way, then the 'results' they found may very well not be 'reproducible'. The analysis software is deterministic, but the experimental data which it is used to analyze is not! Hope that clears things up a bit. Cheers! 😊
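A minimal Python sketch of the point in the reply above, with entirely made-up data and effect sizes: the analysis code is deterministic, the data are not, so a replication re-runs the same pipeline on freshly collected samples.
```python
import numpy as np
from scipy import stats

def analysis(treated, control):
    """The fixed, deterministic analysis pipeline: here, a two-sample t-test."""
    return stats.ttest_ind(treated, control).pvalue

rng = np.random.default_rng(0)
true_effect = 0.5  # assumed real effect in the population (illustrative only)

# "Original study": one draw of data
orig_p = analysis(rng.normal(true_effect, 1, 100), rng.normal(0, 1, 100))

# Replication: identical code, newly collected data
repl_p = analysis(rng.normal(true_effect, 1, 100), rng.normal(0, 1, 100))

print(f"original p = {orig_p:.4f}, replication p = {repl_p:.4f}")
# With a real effect and adequate power both runs tend to come out significant;
# if the original "effect" was a sampling fluke, the replication usually does not.
```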
@Prayukth 4 years ago
Dr. Sabine is such a wonderful interviewer. She hardly interrupts Prof. Dorothy who brings up some interesting points on why this has become such a huge problem.
@yareps 4 years ago
I came to say the same thing. It is the rare interviewer who lets their guest tell their story uninterrupted. At most, Sabine merely asked pertinent questions to spur Professor Bishop.
@rupertchappelle5303 4 years ago
I agree, she allows the person to talk. The big problem is that when these things are not reproducible, we are loath to dismiss them as bogus, fake and wrong. Convince enough people that they are mentally ill and get them taking powerful drugs and your society fails. Podlings are not very useful. Especially to science.
@jasperlawrence5361 4 years ago
@@rupertchappelle5303 hear, hear
@kyberuserid 4 years ago
Even more impressive considering that she has well known similar issues with the physics community which were barely referred to at all.
@theultimatereductionist7592 3 years ago
@@rupertchappelle5303 Drug consumption is individual liberty & freedom of choice. It doesn't physically hurt others. Eating animals (murdering and torturing them by refusing to go vegan) and burning fossil fuels, forcing Anthropogenic Global Warming (AGW) onto others, do physically hurt others.
@bazoo513 4 years ago
The significance of this cannot be overestimated. Thank you, Drs. Hossenfelder and Bishop!
@JonFrumTheFirst 4 years ago
When I studied genetics and evolution in the late 1990s, my browsing through journals eventually led me to review articles in ecology that pointed out most of these problems. In my stats classes, power was never mentioned. In our lab, I saw data gone over with different statistical tests until the p threshold was passed. It was clear to me then that there were many problems with the papers that were being published in my field. The truth is that those who knew just kept their mouths shut, and those who didn't know made sure they stayed innocent. Within psychology, they had a different problem - what I call the plausibility problem. That is, they claim to have 'proven' things that didn't pass the 'sniff test' in the first place. Before you do a statistical test, you have to know you're dealing with a hypothesis that is at least plausible - I'm thinking of the famous 'leaning in' study, for instance. Ridiculous on the face of it. What they had was lemmings leading each other off the cliff, all going happily, because 'that sounds cool.' The truth is that many people got Ph.Ds with no training in scientific rigor at all. As we said in grad school, 'there is no science in social science.' But of course, there has been little science in much of academic science for decades. Careerism has reigned supreme.
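The "different statistical tests until the p threshold was passed" practice described above is easy to simulate. A hedged illustration in Python (all data random, numbers purely illustrative): with no real effect at all, testing several outcomes and reporting whichever one "works" inflates the false-positive rate well beyond the nominal 5%.
```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_sims, n, n_outcomes = 5000, 30, 5
false_positives = 0

for _ in range(n_sims):
    # Two groups with NO true difference, measured on several outcomes
    a = rng.normal(size=(n_outcomes, n))
    b = rng.normal(size=(n_outcomes, n))
    pvals = [stats.ttest_ind(a[i], b[i]).pvalue for i in range(n_outcomes)]
    if min(pvals) < 0.05:          # report whichever outcome "worked"
        false_positives += 1

print(f"nominal alpha: 0.05, realized false-positive rate: {false_positives / n_sims:.2f}")
# Roughly 1 - 0.95**5, i.e. about 0.23, even though nothing real is there.
```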
@oo88oo 4 years ago
JonFrumTheFirst, I look forward to universities being bypassed by other ways of learning. Obviously subjects like computer programming can already be completely studied outside the university system.
@simonxag 4 years ago
I disagree about the "sniff test" - most real science fails it - the spinning earth (when 1st proposed) - Newton's laws (if you quiz most students you'll find they don't believe these now !!) etc. But your last 2 sentences are truer than they ought to be - I can remember stuff from the early 1980s :-(((((
@GregerWikstrand 4 years ago
If it doesn't pass the sniff test - what stops you from falsifying the claim?
@JonFrumTheFirst 4 years ago
@@simonxag The best way for me to reply is to send you to this article about a controversial 'priming' study in psychology: statmodeling.stat.columbia.edu/2016/02/12/priming-effects-replicate-just-fine-thanks/ Much of this kind of priming work just makes no sense - there's no explanation for how you get from here to there - it just happens, due to statistical magic.
@kreek22 4 years ago
The careerism issue is akin to the ancient problem of bureaucratization: bureaucracies are always interested foremost in expanding their size and power. Politicization of research also reinforces the careerism problem. If, as in sociology, everyone is a Leftist, there is no one to call out the "politically desirable" but unsupported research papers.
@rc5989 4 years ago
Back in university a long time ago, my sociology professor had only 2 rules for our questionnaire project: 1. The hypothesis comes first; then only run the data against that hypothesis. 2. NEVER run 'everything against everything' (fishing for correlations).
@kreek22 4 years ago
Fishing for correlations strikes me as a useful way to find new research directions--but, it is corrupt to hunt for correlations unrelated to the original conception of the project, then publish those.
@mirkomueller3412 4 years ago
At least number 2 sometimes generates very funny correlations...
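Rule 2 above ("never run everything against everything") is easy to demonstrate with a short simulation; this is a sketch on purely random data, not any real study. With enough variable pairs, some correlations come out "significant" by chance alone.
```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_subjects, n_vars = 50, 20              # 20 mutually unrelated random variables
data = rng.normal(size=(n_subjects, n_vars))

hits = []
for i in range(n_vars):
    for j in range(i + 1, n_vars):       # 190 pairwise correlation tests
        r, p = stats.pearsonr(data[:, i], data[:, j])
        if p < 0.05:
            hits.append((i, j, round(r, 2)))

print(f"{len(hits)} 'significant' correlations out of {n_vars * (n_vars - 1) // 2} tests: {hits}")
# About 5% of 190 tests (roughly 9 or 10) pass the threshold even though every
# variable is pure noise - which is why the hypothesis has to come first.
```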
@JonnyDeRico 4 years ago
Anyone that ever tried to build / reproduce something published in a paper, knows the problem that even the most important parameters are often wrong or completely missing. Pressures, flow rates, temperatures, voltages - just guessing everything? 😤
@DavidSmith-kd8mw 4 years ago
Congratulations to Dorothy Bishop. I imagine you are not greeted warmly in all quarters.
@MaryAnnNytowl 2 years ago
Women who bring up issues with a situation, event, or field generally aren't, even more so than men who do the same. Having spent over 2 decades in a very male-heavy job, I can attest to that.
@buybuydandavis 4 years ago
In the internet age, *raw data* should be published and made universally available. Post hoc data analysis is extremely useful for pointing to questions for further studies.
@chuckschillingvideos 3 years ago
Even that is only a partial solution because the public has no idea what subsets of raw data may have been filtered/excluded.
@dtibor5903 5 months ago
@@chuckschillingvideos Data should be organised and labeled correctly.
@chuckschillingvideos 5 months ago
@@dtibor5903 Yes, but if it has been excluded, it has been excluded. You cannot collate what has already been excluded.
@MrJesseBell 4 years ago
Sabine, you are quickly becoming my favourite YouTube personality.
@keithtait3214 4 years ago
MrJesseBell, she certainly is unique. She always seems to have a skeptical view of any well-founded data; she seems doubtful of accepted science like dark matter, dark energy and the Big Bang. She is a muckraker in the best way possible. Go girl, go!
@logaandm 4 years ago
Michelson-Morley was a null result. One of the most important experiments in all of science.
@RockHoward 4 years ago
A null result that went against existing theory, though. That made it important.
@StefanTravis 4 years ago
So a null result is one that goes against the null hypothesis?
@StefanTravis 4 years ago
@Cosmic Landscape Just for clarity, why don't you tell us what you think the null hypothesis of the experiment was - and why.
@WestOfEarth 4 years ago
You touch on the question I had. Suppose you submit your pre-proposal, and it's approved. But in gathering data and analyzing it, you discover your original hypothesis and proposal was incorrect. What happens with publication in this scenario?
@fewwiggle 4 years ago
@@WestOfEarth The results are still published (if you are talking about her new paradigm)
@xqt39a 4 years ago
In my probability class in the 1960s, we were told that the misapplication of statistics was a systemic problem in medical studies. Maybe this information will become understood better because of videos like this.
@Number6_ 8 months ago
People misapply statistics to cover, to protect or to gain. Very rarely is it a mistake. Professional researchers cover up fraud to protect reputations and to gain money and position. This is not going to change until scientists stop pussyfooting around this issue.
@coppice2778 3 years ago
It is quite bizarre that statistics is not a core course in so many areas of science. These people are doing work they lack a core competence to do. In my daughter's biology degree course she was taught to use R, but wasn't taught statistics. This is insane.
@MyMy-tv7fd 2 years ago
deep statistical concepts require real, deep thinking and wrangling - following a 'recipe' is infinitely easier
@annamyob4624 1 year ago
Every scientist should have intro stats and methodology, but not every scientist has a head for it. What we need is for all research to be overseen and reviewed by stats/methodology experts. I've worked at major institutions that required this, but apparently this is not a widespread practice. It should be mandatory. Both the institutions and the funders need to take responsibility for this.
@maalikserebryakov 1 year ago
@@annamyob4624You are correct and this is an excellent idea. I wonder if the institutions have not thought of this before?
@dislikepineapples 6 months ago
@@maalikserebryakov I think there is a lack of statisticians as well. Almost all uni courses that are not closely related to psychology have much worse statistics classes at my place. At least that is my impression.
@mello.b3373 4 years ago
Exactly the same problems exist in nutritional science.
@johnathancorgan3994 4 years ago
Nutritional science is arguably worse, depending so much on observational studies and "questionnaire"-based self-reporting.
@chuckschillingvideos 3 years ago
The problem rears its ugly head in ALL science where financial remuneration is a consideration.
@gbBakuryu 3 years ago
@Tarzan Well, social sciences are skewed by politics, I'm not sure which is worse.
@anthonymccarthy4164 3 years ago
@@johnathancorgan3994 I doubt that nutrition has worse methods than psychology regularly allows itself to get away with. Though it's not especially successful.
@anthonymccarthy4164 2 years ago
@@SimonWoodburyForget That's true, but it's no less true in psychology and especially psychiatry which is intimately tied to the pharmaceutical industry. While it is difficult to do science around something as complex as nutrition, where there are actual physical manifestations that could be studied if it were possible, that isn't true in psychology, never was and never will be. Psychology is science due to the fact that science is whatever scientists let be called that but it can never follow valid scientific methods, nor does it seem to even try to.
@hrperformance 4 years ago
I actually think data/information analysis is something that should be taught much earlier than university. It is a key skill to sidestep the bullshit of so many people/organisations that try to misinform others in order to get what they want. Hopefully then we can raise a generation truly capable of seeing the world as it is!
@dawnrock2391 2 years ago
Hey, could you elaborate a bit? How is your above writing relevant to the topic of the reproduction crisis? I just don’t understand exactly what you mean, but am curious to see more of your logic
@icecreamcancer 8 months ago
The bosses will just fire those academics who try to maintain truth seeking principles
@benwitt6902 4 years ago
There was a time when a person got respect for the achievement of becoming a professor. Not any more; putting politics ahead of science has squandered public trust. Sabine seems fearless and has no sacred cows; she's restoring the faith.
@hankhafliger482 4 years ago
Are you kidding me? She clearly states her sacred cow is environmentalism and "climate change", somehow trying to steer the conversation about the failure of science away from the abomination of truth that was once "global cooling", then "global warming", now rebranded as all-encompassing "climate change".
@obsidianjane2267 4 years ago
The problem is that a large portion of the public accepts the politics and takes the "science" claims at face value without any critical thought. So any one can make any claim they want today, and it won't receive any question or criticism as long as it doesn't counter the political narrative in both academia and public policy.
@cyberf1sh 4 years ago
@@hankhafliger482 Hank, you sound like a well-meaning guy who has been misled by fake news. There was never any scientific consensus for "global cooling". In fact, most predictions made about global warming in the 1970s by climate scientists have been proven true. Source: journals.ametsoc.org/doi/abs/10.1175/2008bams2370.1 I hope you educate yourself on the topic and stop spreading this misinformation.
@obsidianjane2267 4 years ago
@@cyberf1sh That paper exhibits the same selection bias as Hank and the Deniers (cool band name). There were plenty of theories published, so it's easy to pick and choose based upon hindsight and political bias. The reason why (some) '70s climate predictions have proven more accurate was because they were conservative estimates based on the acknowledgement of the error inherent in contemporary and geological climate measurement. That is no longer true. Prior work that stated that its data was abstracted from indirect measurement or even completely speculative is now cited as absolute bedrock fact. It's quite apropos that the paper characterizes modern climate science as an "enterprise". When climate science became its own thing in the '80s and '90s, its politicization led to ever increasing hyperbole, propelled by both the derivative nature of historical climate data that is easy to "interpret" and the need to publish and find funding. And so we find ourselves here, where people are polarized to either completely dismiss climate change or hyperventilate about a crisis that is always a decade and election cycle away.
@jasperlawrence5361 4 years ago
@@hankhafliger482 You are wrong about climate change, nor does a past error have any bearing on the current theory, except perhaps to make the scientists more careful to get it right the next time.
@LouisGedo 4 years ago
I agree with the other commenters who praise the importance of this interview
@buzz-es 4 years ago
LOL.....I see what you did there
@LouisGedo 4 years ago
@@buzz-es Darn.....you caught me! :)
@uhmnope4787 3 years ago
I'm glad that in my psychology textbook there was an entire section devoted to explaining why the study we just learned about doesn't explain this or that thing, reminding us that psychology is full of theories and that finding substantial proof that something works one way and not the other is incredibly difficult.
@dougm275 2 years ago
They did this in my organizational psych book as well.
@bhangrafan4480 3 years ago
As someone who worked in scientific research (molecular biology) for 10 years I can say I fully expected this problem to arise. I'm just surprised it's taken so long to be recognised. The scientific community are a community of human beings like any other, they have pressures, pressures to publish, pressures to produce positive results and foremost of all pressures to get funding. The ways these problems come about varies from sloppy science, i.e. selection bias, to out and out fraud. I often worked with people who would reject experiments which did not give the results they wanted by saying, ("Something went wrong with that one."), while keeping the results that did what they wanted ("It worked properly that time."). From this approach very weak, in fact borderline supporting evidence can be winnowed into very strong supporting evidence for some hypothesis. Some people are just dishonest and know they have to get results if they want to stay on in their careers, so they make results happen. The problem in science is that in theory negative results are equally important to positive results, but in terms of career development the 'great scientist' is the one who comes up with the big idea which is 'proved' correct by experiment, not some nameless person who spent years disproving things but never got a theory or effect named after themselves. Careerism and financial pressures are reasons why people bend their results, it is a structural problem in science. Added to this is the greatly increased commercialism of society, especially as experienced in Universities which are now highly PR focused and want positive stories to publish etc. It is not even only careerism and financial pressures. When I switched to teacher training I mixed with a whole cadre of people from social sciences backgrounds who seemed to share the view that it is okay to fake results as long as you do it for the right political reasons.
@ho-mw6qp 3 years ago
Wonderfully put. This is exactly what is happening.
@mikelouis9389 4 years ago
My good gods, Isaac Asimov basically predicted this decades ago. Or, perhaps, he merely incorporated his own experiences in academia into his fiction. I remember this being a major issue in the decadent, soon-to-implode empire that his protagonist in his magnificent Foundation series, Hari Seldon (oddly enough, a psychologist/mathematician, lol), was struggling against. I think that I need to reread his masterpiece with wiser eyes. By the way, Mr Asimov was also a professor of Biochemistry at Boston University; his insights are not to be taken lightly.
@thorin1045 4 years ago
It happened in physics, and to a lesser degree in other sciences, at the end of the 19th century: the idea that we know everything. Then quantum mechanics and relativity happened, and most scientists realized we know almost nothing. Psychology has the added issue of having no hard facts based on unchanging rules; it works with humans, who change over time without any control.
@vikramkrishnan6414 3 years ago
@@thorin1045 Not really. Physics had new experimental facts come in that upended theory. Here the experiments don't even reproduce
@dextrodemon 2 years ago
he's a psychohistorian not a psychologist :p
@jwarmstrong 2 years ago
@@dextrodemon Hari was a mathematician who develops & works on psychohistory & in the real world Lloyd deMause developed a formal psychohistorical approach from 1974 onwards
@kubhlaikhan2015 1 year ago
When I did my Psychology degree, the reproducibility problem was solved by penalising you for doing genuine research and rewarding you for making the results up in the library.
@yv6eda 4 years ago
Great conversation! Thanks Sabine!
@erichodge567 4 years ago
Absolutely first-rate, and very much needed. Thank you!
@jdenmark1287 2 years ago
Thank you, thank you, thank you. I found this so distressing as a non-traditional undergrad, seeing this going on every day in my psychology department. I still have to deal with it at the graduate level in public health, mostly with people who think referencing a paper is an adequate argument without having ever actually read and understood the methods section of the paper.
@dbudelsky 4 years ago
Reminds me of the "Study Finds that Childhood Leukemia Rates Double Near Nuclear Power Stations", which was a big hype at the beginning of 2012. The hysterical reports never stated what "doubled" means, only that the stats were bad. I think in fact it was 4 positives instead of 2 in a comparison group. The study was done in France, where there were 2,700 total cases of the type of child leukemia in question over 5(?) years, so the correct answer would simply be "not measurable".
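A quick back-of-envelope check of the figures quoted in this comment (4 cases versus 2; these numbers come from the comment, not from the original study): the standard conditional exact test for comparing two Poisson counts shows the "doubling" is statistically meaningless.
```python
from scipy.stats import binomtest

# Given 6 total cases, under equal underlying rates each case is equally
# likely to fall in either group, so test 4-of-6 against p = 0.5.
result = binomtest(k=4, n=6, p=0.5, alternative="two-sided")
print(f"p-value = {result.pvalue:.2f}")   # ~0.69: "doubled" is indistinguishable from noise
```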
@keithtait3214 4 years ago
Kaspar Hauser without sensationalism the newspapers wouldn't have anything to print
@TheEulerID 3 years ago
There were several such studies (one around the nuclear facility at Windscale aka Seascale). One of the main problems is that the researchers would adjust the size of the area around the facility to find something which appeared to be statistically significant. What's more, they didn't include control areas; when it was done, it was found that there were other childhood leukaemia clusters not associated with nuclear facilities. What's more, the same leukaemia clusters were not found round other nuclear facilities. www.ncbi.nlm.nih.gov/pmc/articles/PMC4146329/
@nova9651 4 years ago
I love how this is a perfect intro for your next video, Sabine
@jaimeriveras 4 years ago
Excellent interview. Confidence is a delicate thing; it can be easily lost. Efforts to preserve it, such as this one, are well worthwhile.
@celiacresswell6909 1 year ago
You’re not wrong! After the last 2 years, what can we trust? Tortured stats, obscure and unpublished method and data, complacent professionals….in many situations I think anecdote and personal experience is more reliable.
@tofu-munchingCoalition.ofChaos 4 years ago
A very simple hack: Consider a first paper (usual p-values...) not as a "proof" (not as statistical inference) but as a paper using exploratory statistics. Now papers testing - rejecting or not rejecting - the hypothesis of the exploratory paper are the "proofs" or "disproofs" of the hypothesis. Beneficial side effect: Robustness of the result.
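A sketch of what the confirmatory step proposed above could look like in practice: size the replication study from the exploratory effect estimate, shrunk somewhat because exploratory effects tend to be inflated. This uses statsmodels; every number here is an illustrative assumption, not taken from any particular paper.
```python
from statsmodels.stats.power import TTestIndPower

exploratory_d = 0.5                 # effect size reported by the exploratory paper
planning_d = 0.5 * exploratory_d    # conservative shrinkage for planning the test

n_per_group = TTestIndPower().solve_power(effect_size=planning_d,
                                          alpha=0.05, power=0.90)
print(f"~{n_per_group:.0f} participants per group for a 90%-power confirmatory test")
# Pre-registering this sample size and analysis before collecting data is what
# turns the second study into a genuine test rather than another exploration.
```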
@raminagrobis6112 4 years ago
I wholeheartedly endorse the treatment of data reproducibility, Sabine. I have started noting trends in medical science where the "conclusions" (not just the method) of a given research project are announced in advance. Such bogus research is merely confirmation of an already prevalent or intuitive notion of the answers to a prospective question and is completely antithetical to "good" research. The worst part is that such projects may succeed better in getting funded, since they appear like a "safer" bet on where to inject money into various disciplines.
@raminagrobis6112 4 years ago
@@thealienrobotanthropologist You are absolutely right with your assessment. Being a biophysics- and biochemistry-trained biologist with a strong interest in physical chemistry as a framework for analyzing questions in cell biology, I have often faced (1) deep misunderstanding of the true nature of problems at hand among colleagues with a "softer" training in chemistry as well as (2) a flawed evaluation (imho) of the relevance of the topics on which I published. What I'm most sorry about for people who respond this way is how much excitement and sheer satisfaction one gets when one obtains a "hard" answer to a complex biological phenomenon thanks to a "pure science" approach instead of the mumbo jumbo and buzzwords that drain all the attention publicly. Because that's how the real world works: easy answers are preferred because they hurt your brain less, if you struggle to put some order in a more cluttered mind, which sometimes describes biologists' minds rather accurately as compared to mechanistically oriented ones....
@custos3249 4 years ago
Welcome to what happens when business is allowed to pollute and displace objectivity. A result is right not because of its verified correspondence with reality but by how much pizzazz marketing can smear on it.
@chuckschillingvideos 3 years ago
@@custos3249 BUSINESS? Try gummint. The vast, vast, vast majority of pure science research (ie physics, for example) is funded by government research grants. These grants are always awarded to scientists conducting certain types of research considered likely to reach politically desirable conclusions.
@canonwright8397 4 years ago
Thanks for this. Glad to know someone somewhere is working on this problem.
@djbabbotstown 4 years ago
Thanks for the vids, Sab. I know you're busy; keep them coming.
@esmenhamaire6398 1 year ago
Superb! Thank you so much for this, Drs Bishop and Hossenfelder!
@MarkSMays 4 years ago
I like your science posts. This review of credibility is very timely. Thanks.
@katg-gk5ox 3 years ago
It is so good to see topics and interviews like this at the level you can bring to them. I am so tired of seeing topics about the sciences written by people who have a journalism degree and nothing else.
@RonnieD1970 4 years ago
Excellent discussion and VERY IMPORTANT interview! Two superstar scientists!
@TheTrumanZoo 4 years ago
great interview. amazing honesty.
@tom23rd 3 years ago
This video restores quite a lot of my regard for scientists, and thus science. Tackling such problems in an open way tells me that science does indeed hold itself accountable and seeks integrity. Thank you for this.
@theultimatereductionist7592 3 years ago
I agree with your hatred of the wrongness of calling an experiment that confirms the null hypothesis a "failed experiment". I hate that stupidity, too. However, I am highly skeptical about all your other claims about what all other scientists do.
@rbarnes4076 4 years ago
It was highly apparent to me years ago that very little actual science was done in certain branches. I'm glad there is a push to improve rigor in some of these disciplines. We might actually start making real progress. Sabine, thanks for shedding a light on this issue. It is one of the best things I've seen in years in science.
@steveroberts 4 years ago
Fantastic stuff Sabine. Thank you
@mirkomueller3412 4 years ago
Didn't even notice that this kind of problem exists - even though I always had the feeling that sometimes not the whole story was told, or - at least - was embellished towards desired results. Great step forward. This will make the world a better place. Hope in humanity restored...
@threedot141 4 years ago
Fantastic interview. I think every working scientist should watch this. It certainly makes me think more carefully about my own work.
@kevalan1042 2 years ago
Very good! The technique of pre-publishing based on methods is an excellent idea.
@DoctorZaeus 4 years ago
This was excellent. Thank you.
@williammorton8555 4 years ago
The issue of reproducibility is one of the major problems facing not only psychology but many different fields of study such as sociology, nutrition, economics; all of the soft sciences. The soft sciences have wreaked havoc on cultures in the last 100 years. Excellent interview scratching the surface of a profound problem.
@judgeomega 4 years ago
hence why they are called 'soft', and why some have such disdain for them
@MaeV808 4 years ago
I've read even biomedical sciences and medical/phamaceutical studies are affected not just soft sciences. It's good people are willing to acknowledge and address it. I fear some people use this crisis as a means to dismiss science, soft or not.
@wesbaumguardner8829 3 years ago
I love Sabine's courage and audacity. She is a real truth seeker.
@toddq6443 4 years ago
"We have a problem, but we're working on it." Yes my brilliant friend, that's why we keep watching your videos. 😃 TNQ
@dennisdonovan4837 4 years ago
It’s great to see, listen and learn from two (obviously) well educated women … ❤️👍🏽❤️
@jasperlawrence5361 4 years ago
Sabine is a wonderful interviewer, and presenter for that matter.
@robharwood3538 3 years ago
Prof. Hossenfelder, I just watched this video a second time (after already having given it a thumbs up some months ago, but being a bit hazy on what had been covered in it) and I wish I could give it *another* couple of thumbs up! Wonderful interview! I actually used the Closed Captions (CC) to read along to make sure I would retain all the information better in the future. 😅 I'm glad I did, because it really gave me a much deeper appreciation for Prof. Bishop's points and various ideas for solutions. I feel like at this point in human/scientific history, we have so many journals and so many scientific articles going through the publication process that it may now be worthwhile to start to *formalize* the study of the scientific method (especially the whole peer-review and publication aspect of it) into a kind of Scientific Study of the Methods of Generation of Scientific Knowledge from Scientific Research -- whatever such a field should be called: perhaps something like Meta-Science, or Scientific Epistemology, or -Scientolo- (err, forget I said that! 😉), or Researchology, or something or other. This comment got kind of long. 😅 Skip ahead to "TL;DR: Main Idea" if you prefer to get to the main idea first. Otherwise, a little bit of ... *Preamble and Justification* I know there's already a couple of existing fields, Philosophy of Science, and History of Science which purport to study the methods of science, but what I'm proposing is something more like a well-grounded _Science_ of Science, based on actual scientific research into observations of the natural phenomenon of 'knowledge generation via various well-defined methods'. Of course initially this would be focused on how we humans ourselves perform the scientific method, but presumably the same 'meta-science' could be applied to: * machine learning and automated science (which is already a thing), * potentially to other animal species developing their own cultural knowledge base(s) (which arguably already occurs at least in chimps, though of course far more limited than human science, but still a potentially interesting field for comparative study), and, * who knows?, maybe one day to the methods of science we observe in some extra-terrestrial intelligent species/culture (even if it happens to have gone extinct long ago and all we have to go on are some artifacts) discovered somewhere out in the universe, assuming we ever get there. Personally, I think Scientific Epistemology would be a decent name, though perhaps it doesn't emphasize the specific kinds of 'meta-research' I'm envisioning quite enough. Which leads to the ... *TL;DR: Main Idea* For example, my main idea of what could be done, inspired mostly by this specific interview, is to start a *journal* (online, printed, or perhaps/probably both) which a) itself employs the 'best practices' of research that the new field of (for the sake of discussion) Scientific Epistemology has established, and b) publishes various actual studies of *other* past, present, and future scientific publications (journals and papers) in various existing scientific sub-fields: Physics, Chem, Bio, Psych, Sociology, Anthropology, Archaeology, etc., and perhaps even borderline subjects like History (ref. "Note on History" below). 
Such studies in this new journal could, for example, specifically perform 'reproducibility reviews' of specific fields (like Psychology, which sparked this interview), or specific journals or publishers, to establish a kind of 'credibility' or 'reproducibility' rating for different fields or publications. Such studies might be done by randomly sampling from among the previously published papers in a specific field or publication, and then performing reproduction 'sub-studies' on each of the results. I believe Prof. Bishop mentioned a study which did just this within Psychology, and that's where the 35% reproducibility figure came from, if I'm not mistaken. My main idea is thus to expand this into an entire field of science on its own, specifically with the establishment of a new journal (or many) specialized around publishing such studies (and many other kinds of inter-related studies besides) for *all fields of science,* not just ones that have such glaring problems like Psychology. It could even perform the same kinds of studies on Scientific Epistemology *itself,* so that we can eventually learn how well and/or effective Scientific Epistemology itself is! *Conclusion* Thus, this new field (whatever it's called) and new journal(s) focused specifically within this field would be able to *quantify* and even develop working hypotheses and theories about the natural phenomenon of *science* itself, and all of its sub-fields, and thus be able to provide us with insight on: which sub-fields have reproducibility crises, how bad (or mild) they might be, and (especially) what changes might be undertaken to improve reproducibility within those specific fields (different sub-fields may be in need of, or simply are more amenable to, different methodological 'fixes'). Of course, developing such a field and journal(s) would require significant public investment, but I could imagine a near future where governments and/or research institutes would become sufficiently aware of the reproducibility crisis (and not just within psychology), and could be motivated enough to invest in further study. Also, there's the issue that some studies are initially so time consuming and/or expensive that literally 'reproducing' their results would be far outside the budgets of a fledgling new scientific field. However, as the field itself develops, and establishes new ideas about how the concept of 'reproducibility' can be achieved in different ways (I imagine the field of Information Theory (e.g. information entropy, Bayesian Inference, etc.) could help untangle this), I would wager that even establishing the theoretical 'reproducibility' (or some analog or approximation of it) of hugely expensive experiments such as from the LHC, or even unique historical events (see "Note on History" below) could be adequately established. Anyway, I could go on with many more ideas about the broader aspects and applications of 'Scientific Epistemology', beyond just the one issue of reproducibility crises, but I think the comment is too long already. 😅 Hope you found it worthwhile to read, and apologies if not. Cheers! 😊 *Note on History* Check out Dr. Richard Carrier's exploration of applications of Bayes' Theorem to resolving questions about unique historical events and reforming the traditional methods and criteria of the study of History; in his books _Proving History_ and _On the Historicity of Jesus,_ for example.
@kerryjlynch1 3 years ago
Thank you for exploring this important issue.
@PaulBrassington_flutter_expert 4 years ago
Great video and another example of showing how difficult the Truth is to find, well done!
@mylespowers3965 3 years ago
Thank you. Hopefully more light on this topic in all disciplines will make significant changes.
@billwehrmacher3842 4 years ago
How refreshing to listen to really intelligent people.
@lawrencejohnson3259 4 years ago
Excellent video, interview!
@FB0102 3 years ago
super important work, great job!!!
@joba1560 4 years ago
Thanks for tackling that issue. As a part of the public I always assumed science tries to get rid of bias as well as possible. Once that is done, it will boost progress for sure, even if it is harder at first, because knowledge will be built on hard facts, not conscious or unconscious wishful thinking.
@williamkacensky1069 4 years ago
Excellent reference point.
@wanderingquestions7501 2 years ago
Sabine continues to demonstrate she is not only a good speaker but she is also a very good listener
@hyrocoaster 4 years ago
19:56 wonderful. I remember discussing this weird practice with my fellow students in my bachelor's. We thought we were expected to deliver something "exciting" rather than something reliable, that could really be backed up with data or be disproved by it. Negative outcomes were not wished for. So everyone either avoided choosing a topic that might not create the desired results, or we just randomly collected information first and then created a story around it.
@sonomabob 4 years ago
It is so pleasant to listen to intelligent people speak. Soothing. And rare.
@xreed8 4 years ago
Thank you for answering why there's a reproducibility crisis early and very succinctly.
@applewoodcourt 4 years ago
As an undergrad, majoring in psychology, I did a field study class with two professors. The experience turned me off to psychological research and studies, and caused me to have a skeptical view of any scientific studies. It was obvious to me, with only basic statistics classes under my belt, that the study was fundamentally flawed, yet our study and results made it into a journal publication. My "lowly" associate professor (sarcasm) taught Statistics 101 with the companion book "How to Lie With Statistics" and it profoundly influenced my view of research studies. There is also a "peer review crisis" but that is a topic for another video. Cheers!
@elitav5491 3 years ago
Thank you, a great and interesting interview!
@brianarbenz7206 3 years ago
Many institutions today are under pressure to "score a big hit," meaning to get their college's name in sound bites and social media posts. Hence, a finding's evident impact is treated as more important than the quality of the methods.
@geanderbacheti2724 7 months ago
Nice interview, love your videos Sabine, thanks!
@matthewbrightman3398 3 years ago
Couldn’t get past first base? Not a phrase I would have expected! Love it!
@iramkumar78 2 years ago
Thanks Sabine for talking to a Psychologist. This is good.
@dr.nityasagarjena4512 7 months ago
Publication of negative results should also be encouraged and counted as scientific work. Hopefully that would reduce the pressure to make everything look positive even when it isn't.
@josdelijster4505 4 years ago
Very well done, very interesting
@hrishikeshbalakrishnan3762 3 years ago
Wonderful interview
@HarshColby 4 years ago
People only have confidence in science because it's rigorous. The moment it becomes "whatever you want to show", that's when confidence goes away.
@jaimeduncan6167 4 years ago
It's not rigorous in general. Did you pay attention to the video? People have confidence in science because they believe it's rigorous and that many individuals and groups reproduce the studies, but it is not, and they are trying to fix it. Can you imagine, as a student starting a career, trying to reproduce a result from an important group? 1. It won't be important. 2. Do you really want to anger the people in the important group? What if you are a male and the head of the important group is female? Do you want to be the misogynistic dude that goes after women? What if it's an old venerable dude? Do you want to be the dude that went against the luminary? What happens if you do, and you get published, and then you are proven to be wrong? It's not the science, but the fact that science is done by bureaucracies (that is necessary; today's problems are too complex for the lone wolf).
@HarshColby 4 years ago
@@jaimeduncan6167 It's rigorous, in general. It's not perfect and needs constant attention. 1. A student like Wassim Dhaouadi or Srinivasa Ramanujan? The reason students don't regularly get known for breakthroughs is they don't generally make breakthroughs. Even Einstein had problems at first. Prove you're right and you get recognized. 2. Discoveries by female researchers (Williamina Fleming, Vera Rubin, etc) often have been suppressed, but it's a problem being addressed. I'll agree today's problems are too complex for one person, but even a group of researchers need to present their data for peer review. Some bad papers make it through the process, but it's far from "not rigorous in general".
@jaimeduncan6167 4 years ago
@@HarshColby You don't have to be a student (by the way, PhD students do lots of research); you can be a young researcher. I don't know if you are in research, or have close friends that are, but if you are you will agree with me about the importance of connections, the importance of having your papers quoted (citations, as they say). This is a complex *social* problem. It needs to be addressed so science deserves the reputation it has. By the way, I am fully aware that science is the best model of the world we have, and I am aware of its power. Also, some disciplines are more rigorous than others, but we have a major issue. People go to jail and laws get written because of bad science, and actions are sometimes not taken because some nut-head doesn't understand that scientific models are the best we have.
@HarshColby 4 years ago
@@jaimeduncan6167 I agree with you here.
@MarkRaymondLuce 4 years ago
A great interview with Dr. Bishop at Oxford Sabine, I had no idea the reproducibility crisis was also rampant in the field of psychology - I did know that it was rampant in the field of physics and medical studies, but it only makes sense that it is found in the field of psychology also, and now I suspect in all sciences; I get a very strong feeling that this video interview is the foundation of which is going to lead into a very revealing well thought out in-depth statement on the state of particle physics in your next or one of your upcoming videos. It's nice to see and hear you laugh.
@toddq6443 4 years ago
I wouldn't be altogether surprised if your suspicion is correct. TNQ
@keithtait3214 4 years ago
Mark Raymond Luce, small sample sizes and the cost of getting complex data... Sounds like they need more money.
@kennethcastle847 4 years ago
This was a wonderful discussion. I had no idea that there were problems of this sort affecting scientific papers. I had naively assumed that refereed journals would correct the problems. I agree that it is better to talk openly about as it shows how science does try to make sure that results are valid and reproducible, as a process unto itself. I like the idea of defining the problem first, with the analysis methods defined ahead of the data collection, as a good way to re-program the thinking on fallibility. It is OK to fail, in the sense that you don't get the result you thought you should. This is how science moves forward as it then makes you think deeper about the problem to see why the original hypothesis was erroneous.
@zero132132 4 years ago
Some kinds of problems aren't things a journal can do. You can't ensure that everyone announces that they lost the lottery, and it'd be hard to sort out the odds of winning or losing when you don't know how the lottery is organized and you don't know how many losers there are.
@grahamnumber7123 3 years ago
Yes, it's the big "climb down"; gently does it. These people have reputations to keep :-)
@richb2229 3 years ago
Great interview and an issue that’s been obvious in science for some time.
@patomalley55 4 years ago
As a lay person, a wonderful insight into academia's continual self-correction to establish the truth.
@whoopstheregoesgravitas 4 years ago
I'm not sure if it's already been mentioned, or even done, but a quick way to reference studies where reproducibility has been achieved must surely help. The mention that _some_ low quality work made it into text books had me thinking, it would be nice to verify if the problem is in the textbooks to the same degree as journals. If not, it might be worthwhile having a sound way to prove that case.
@LuKo3x5066 3 years ago
From my experience, the problem is well known in economics - during my studies I was taught and warned about these biases. Let's hope the science will prevail!
@mikehart6708 1 year ago
I am not a scientist but I grew up in a time when science was highly revered and it has become virtually a religion for many of us lay people. It is what we believe in. It is encouraging that some scientists, such as yourselves, are honest enough and concerned enough to try to deal with this problem. I guess it is naïve to think that if a scientist sticks steadfastly to the scientific method, then all will be well and yet, some of this problem seems to stem from some very basic violations of the very basic scientific method that we were all taught.
@georgekhumalo5283 4 years ago
5 Seconds in and I already clicked like.
@KeithCooper-Albuquerque 4 years ago
This was an excellent interview! As several people have commented, the significance of this problem cannot be overestimated or overstated. It should be mandatory viewing at all student levels. With all of the science-denying, flat-earth believing sites on the web, these issues become fodder for their arguments. Scientists need to clean house for the good of the scientific method and get on board with steadying reproducibility -- ensuring that up-and-coming scientists take these points to heart and approach their papers with this in mind.
@SimonSozzi7258 4 years ago
This is really important.
@rapauli 3 years ago
Certainly an important question. Wish I could hear from a psychology researcher.
@CaptainGuntu 4 years ago
I would also add that in psychology, there has been an excessive focus on central tendencies in data (primarily as indexed by the mean, but also the median and so-called robust measures of central tendency), when that may be the least interesting and informative aspect of the data for many types of distributions.
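A small simulated illustration of this point (made-up data): two samples with nearly identical means can have completely different shapes, so reporting only a central tendency hides most of the story.
```python
import numpy as np

rng = np.random.default_rng(3)
a = rng.normal(loc=10, scale=1, size=10_000)      # tight, symmetric around 10
b = np.concatenate([rng.normal(5, 1, 5_000),      # bimodal mixture, mean also ~10
                    rng.normal(15, 1, 5_000)])

for name, x in [("A", a), ("B", b)]:
    print(f"{name}: mean={x.mean():5.2f}  sd={x.std():4.2f}  "
          f"5th pct={np.percentile(x, 5):5.2f}  95th pct={np.percentile(x, 95):5.2f}")
# Both means are ~10, yet sample B has almost no observations anywhere near its own mean.
```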
@istvansipos6395 4 years ago
Sabine... love you... period... 💗💗💗💗💗
@Knervik 1 year ago
There is no understanding of general psychology without respect for parametric statistics. Confirmation bias and institutions' expectations of quick results sully our understanding. It's good to hear that this is changing.
@patrickegan8866 2 years ago
You get a sense of how bad the problem is when there's such resistance to registering studies with open science. I know one single researcher who uses it, and I've been on non-psych research teams where publishing the paper funded by a govt grant was reliant on finding a significant result. We should be registering studies on a blockchain so that anyone can look at what RQs have been investigated already to narrow down what needs to be done next (replicate, modify or just move on). There are tens of thousands of dissertations around the world sitting in drawers or on hard drives doing nothing. Also, we learn lots of great stats during undergrad, but then for our dissertations we're only allowed by the university to run surveys, and then we go try to publish that to tick KPIs.
@js3883 3 years ago
Good honest interview and responses. It is good Bishop is doing this. Many large biology companies require properly designed experiments in the discovery phase far before it has to go to the FDA. Wasting time = Wasting money.
@js3883 3 years ago
Of course there was the time a researcher came to me with hundreds of repeated measurements in time on two chimps each with a different treatment. Chimps are very expensive. Did I say the researcher was also a Chimp. I used him for the control. Don't let chimps run experiments!
@sebastianschulz6531 4 years ago
Thanks Sabine, great video. I'm glad I didn't continue after my degree from a university of applied sciences; doing a PhD would have driven me insane. I spoke with several people who did a doctorate, and the fact that a paper was only written once you had a success, with the research question then framed around it afterwards, was bizarre. For me that is 1984, where the speaker can switch enemies in the middle of the speech, or like Jeopardy on TV.
@mello.b3373 4 years ago
Funny thing, because psychology is probably one of the most unscientific fields in science. A lot of the information that is being taught in this field has no scientific basis whatsoever; this includes the astounding numbers of drug prescriptions which are essentially useless for most people in the long term.
@LuisAldamiz 4 years ago
True: it's one of the "softest" sciences, but you'd be surprised at how other disciplines rank: leherensuge.blogspot.com/2010/04/genetics-and-biolog-rather-soft.html Next in that study are materials science and pharmacology, followed by clinical medicine, biology, economics and genetics, all of which (except surely the very political "economic science") sound like "hard science", just because they are not humanities. Ecology and geosciences are very "hard" instead, only behind space science. Physics in general is only hard-ish.
@thstroyur 4 years ago
I agree to an extent; I believe that the so-called social sciences are 'soft' at least in part because they involve certain aspects of the human condition that are not really amenable to investigation by the scientific method. The misguided effort to still do so, manifested here as bad science - a symptom rather than the root cause - may nonetheless have deleterious social effects, because it pushes a reifying narrative of human nature, and the layman merely consumes this uncritically.
@LuisAldamiz 4 years ago
@San Diego Rideshare Driver - In what sense do you say that? You could be saying something serious and even important but I don't get it. On the other side it could be just quantum uncertainty sarcasm.
@mello.b3373 4 years ago
@San Diego Rideshare Driver i'm referring to the actual scientific evidence which is obtained via experimental research. Within that framework, psychology, same as nutritional science, is basically a junk pseudoscience.
@thstroyur 4 years ago
@San Diego Rideshare Driver Nonsense; the whole 'QM interpretations' debacle merely showcases how low modern physics has gotten in its ill-advised attempts to deny the relevance of its natural philosophical roots in methodology. AFAIC, at this point and after _all_ experiments, anyone vindicating a psi-ontic position has the extraordinary burden of the proof, and that about sums it all up...
@davidturner9827 2 years ago
In a particular program (to remain unnamed) the first thing I was taught was, when reading a paper, to look at the references first and “see if they’re respectable”. This attitude is a major problem. While the views of the big names in a field may on average correlate more closely with the truth, the positive feedback loop that trust networks create can steer an entire discipline awry. Some prominent examples in psychology come to mind.
@Alexander_Sannikov 4 years ago
This "in-principle acceptance" does sound like a very interesting way to eliminate publication bias. I wish something like that would become popular in other areas of science.
@jpphoton 4 years ago
Brilliant minds. Excellent.
@haroldbridges515 3 years ago
What's striking is how unusual such a discussion between two scientists from different fields is. Much better than listening to science journalists whose grasp of the subject may in fact be tenuous.
@alannolan3514 4 years ago
Thank you so much. Real police!
@wbwarren57 4 years ago
Businesses are committing all of these faults in data analysis, big time! Businesses collect huge amounts of data on consumers and then analyze it every which way, without really any hypotheses, in order to find what they think are patterns. Unfortunately, it may well be the case that they are finding things that really don't exist and then applying those "insights" to the consumers' detriment.
@nziom 4 years ago
That's a really good interview
@MyKharli 4 years ago
It's so nice to see scientific philosophers face the human condition to try to get better results.
@rickfearn3663 4 years ago
Excellent interview, thanks Dorothy and Sabine. Now about the pharmaceutical industry and their drug testing. When will the truth ever come out about the vested interests throughout the development, distribution and medical sectors. Thanks to both of you. Sincerely Rick