Functionalism

77,898 views

Jeffrey Kaplan

Comments: 222
@johnnydrydenjr
@johnnydrydenjr 4 жыл бұрын
A cat can be a mousetrap.
@elisha2358
@elisha2358 2 жыл бұрын
And this proves the point that a mouse trap is multiply realized. However, it does not prove that a cat can be multiply realized.
@elisha2358
@elisha2358 2 жыл бұрын
@UCCvINqJHA036L_BpiSQAjIQ I'm not sure I agree that a lion can be a cat.
@Simulera
@Simulera Жыл бұрын
@@elisha2358 The use of the word “cat” means it can be multiply realized, but there are type distinctions to be careful about here.
@GermanZorba
@GermanZorba Жыл бұрын
Most cats are mousetraps😂
@gorgzilla1712
@gorgzilla1712 Жыл бұрын
Well yeah actually
@JAYDUBYAH29
@JAYDUBYAH29 2 жыл бұрын
You are a fantastic teacher. This lucid common sense conversational style is so often missing from how many teach philosophy.
@hoagie911
@hoagie911 Жыл бұрын
One of the issues with the attack on the brain-mind identity theory is that, while perhaps octopuses and other animals feel "pain", they may not do so in exactly the same way as we do. That is to say, while the mental states may be similar, they are not the same, and thus there is no issue with the brain states being different. This gives rise to questions of how we can know if mental states are similar or the same, and how rather different brain biologies can lead to (what we assume to be) similar mental states. But, importantly, these seem to be open questions, while the functionalist needs them to be closed in order for their attack on identity theory to hold.
@L4wr3nc3810
@L4wr3nc3810 Жыл бұрын
Good point, though I think you can still argue that no two human brains are really the same, yet we feel the same stuff.
@VladimirGluten47
@VladimirGluten47 11 ай бұрын
This was exactly my thinking. Rather than 'pain' being both brain state B and brain state X for humans and octopuses respectively, isn't it more likely that human pain is caused by brain state B, but not X, and octopus pain is caused by brain state X but not B? And I feel like we can get as specific and atomic as we would like. My pain is brain state 1, yours is brain state 2, and my pain tomorrow, when I am hurt differently than today, is brain state 3.
@hoagie911
@hoagie911 11 ай бұрын
@@L4wr3nc3810 But do we feel exactly the same stuff? We certainly feel very similar things, but then our brains are all very similar.
@nikhilweerakoon1793
@nikhilweerakoon1793 7 ай бұрын
That’s a good point. It seems to me that in that case it depends on how you define similarity, and on your metaphysics of properties. Even if mental states aren't one and the same, perhaps a relevant sense of overlapping similarity between the mental states constitutes realizing the same function, and hence being a mind. It also seems to depend on how you define a “brain”: if a brain is strictly biological, a functionalist would better explain the possibility of artificial intelligence coming to have consciousness in some sense. In my view, at least. At the end of the day, though, the debate seems to me entirely irresolvable, due to the epistemic inaccessibility of other people's mental states, let alone animals'.
@bonafide_ok
@bonafide_ok 6 ай бұрын
“Pain” doesn’t even have a clear definition to begin with. The identity theory is trash
@a.a.h.s.2345
@a.a.h.s.2345 3 жыл бұрын
I can't thank you enough for these videos. I've been really struggling with an essay regarding the Identity Theory and Functionalism, but these videos have practically saved my life (and grades)! Keep up the great work!
@profjeffreykaplan
@profjeffreykaplan 3 жыл бұрын
You're very welcome!
@jamesboswell9324
@jamesboswell9324 Жыл бұрын
"The function of a cat is to be grumpy and scratch your furniture and stuff like that..." LOL Said like a true dog-lover!
@mattmerc8513
@mattmerc8513 3 жыл бұрын
Thank you for making this complex theory so easy to understand in one go. The quality of these vids is top notch.
@novas.6814
@novas.6814 3 жыл бұрын
This was great. Thank you for taking your time to help us out.
@profjeffreykaplan
@profjeffreykaplan 3 жыл бұрын
My pleasure!
@gusbozeman8337
@gusbozeman8337 3 жыл бұрын
this man just saved me with this video 2 hours before I have to turn in my paper. Mr. Kaplan is a godsend
@captainscarlett1
@captainscarlett1 Жыл бұрын
I'm reminded of my training to be an NCO in the army: functional leadership. A leader is someone who carries out the functions of a leader by addressing needs: group needs, individual needs, and task needs. This is teachable and doesn't require individual qualities or situational expertise.
@mariuszpopieluch7373
@mariuszpopieluch7373 4 ай бұрын
This channel is excellent. I teach philosophy of mind and it’s my go to intro to each topic. Really like the energy and enthusiasm-it’s a pleasure to watch and listen.
@GeneralPublic
@GeneralPublic Жыл бұрын
An octopus has what is called a distributed nervous system instead of a central nervous system. All 8 tentacles contain stuff similar to brain stuff, a network of neurons throughout the body. If you chop off a tentacle, the tentacle can survive for a little while on its own, and it shows signs of being sentient, and can react to things just like an octopus, even without a brain. While an octopus does have a central brain, their 8 tentacles act like 8 additional brains, sort of, so they are sort of like 9-brained creatures, except the network of neurons is all one big network, and as long as it’s all connected, the entire octopus functions like one giant brain. Since an octopus can be big and its entire body is part of this distributed nervous system that is sort of like a brain, that makes octopi pretty smart compared to other invertebrates. Imagine if your arms and legs had brain matter in them and could think and act immediately without having to send nerve signals all the way to your head and wait for a nerve signal from the head to tell the muscles how to move, if there were some brain-like structures in your limbs that could respond to sensory stimuli and tell your muscles how to react really fast. Technically humans have a little bit of this ability but not very much, in the spine, like your spine can react to certain nerve signals from your legs and tell your legs how to react, but this is much more limited than in an octopus, this is part of how reflexes work, like when a doctor hits a certain location on your knee and your leg goes up without you telling it to do that consciously. But most of what humans do is controlled by the brain, whereas in an octopus, all 8 tentacles can pretty much operate on their own. They are connected the same way the right brain and left brain in a human are connected. Most people just have one consciousness instead of 2. But in some people, the right brain and left brain aren’t really connected well enough, and end up operating separately, with 2 separate consciousnesses. There are theories about this stuff, how if you cut a consciousness into parts it splits into multiple consciousnesses, and if you connect multiple consciousnesses up together well enough, they merge into a single consciousness. So if I hooked my brain up to yours with electrodes in the same part of the brain in millions of wire connections, theoretically, according to neuroscience, our consciousnesses would merge into one single shared consciousness for both of us that knows everything either of us knows, senses what either of us can sense, controls all our muscles in both bodies, but operates as a single mind, a single consciousness. This is from science, not philosophy. And if you completely disconnect the 2 hemispheres of the brain in a person or another mammal with a similar type of brain, you end up with two minds, two consciousnesses. So with an octopus, technically it has one mind, one consciousness, but it is not centralized, but distributed throughout its body, and can be split apart. And if you cut off a tentacle, well, the octopus just lost not just part of its body but part of its mind too, so it is very upset. 
And the tentacle’s mind is now separate from the rest of the octopus’s mind so it is upset too, and it is still sentient and can still think and feel pain and have emotions, unlike for instance if you chop off a human arm or leg, the arm or leg has no consciousness or mind, because in humans, almost all of that happens in the brain, with a tiny bit happening in the spine instead. Although maybe the spine has a little mind of its own that is separate from the larger mind in the brain. We don’t really know for sure. I just felt like talking about how octopus tentacles are sentient and can still think even if you chop them off, and scientists have proved this in experiments. Please don’t hurt any octopi in experiments. It might not be very ethical, now that we know even their tentacles are sentient. Personally I have witnessed something similar in lizards. There are lizards where if you catch them and are holding them by the tail, their tail detaches from the rest of the body and the lizard escapes, bleeding a bit from where its tail came off, and then it can grow a new replacement tail. And the tail that came off wriggles around for awhile on its own, not even connected to the main body. The tail seems to have a bit of its own nervous system that can, at least, tell it to wriggle around a lot if it is disconnected from the main body of the lizard. Is the tail sentient or conscious at all? Who knows? It might be. It can certainly move on its own after being detached. Also if you chop the head off a chicken, the chicken can run around for awhile without a head or a brain, still knowing how to run on its legs, which it typically does until it bleeds to death. Is a headless chicken sentient? Maybe. Some people think so. But a chicken or lizard doesn’t have quite as distributed a nervous system as an octopus, nor is it as centralized as a human or other mammal like a cat or dog. This thing where the brain does all the thinking is mostly just a mammal trait, not as true for other animals. Birds have tiny brains but that doesn’t make them dumb, crows are fairly smart, some of the thinking isn’t done in the brain. A distributed nervous system is sort of like massively parallel cloud computing where multiple regular computers networked together act like one huge supercomputer. It’s a similar phenomenon but not exactly the same because computers don’t seem to be sentient or conscious... not yet. Maybe in a few years they might be. If a simulation of a neural network is just as smart as a real brain, then according to functionalism, it has a mind too. AIs seem like they might get to that point in the next few years, at least a little bit. Like if you think a fruit fly is conscious and sentient, AIs will almost certainly reach that level, at least if you consider an emulated or simulated consciousness to be just as real as the real thing. Which you should. In software, you can play the same game directly on hardware, or in an emulator emulating that hardware, for instance playing Super Mario Brothers either on an actual original NES game console, or on a program that emulates an NES. In either case, whether on original Nintendo hardware or a Nintendo emulator, you are still playing the same game, Super Mario Brothers, and functionally it is exactly the same. So if you could emulate or simulate a brain, this would also be just as much a mind as the real thing, according to functionalism, because a mind, like Super Mario Brothers, isn’t defined by what it’s physically made of but instead by its function. 
Super Mario Brothers is a functional kind, just like a brain, you could play it on the latest smartphone and it would still be Super Mario Brothers. Maybe you could even train an octopus to play it, with the right interface and control system designed for an octopus to use. Different tentacles could control different buttons, up down left right A B select and start, 8 buttons for 8 tentacles, a perfect match. Each tentacle would either be pushing a button or not pushing it depending on the position the tentacle points in. Then we would just need an interface to communicate the state of the game to the octopus because maybe vision might not be its main sense. With enough funding, we could create a Nintendo for octopi, and use the Nintendo-playing octopus to solve all of philosophy. Or at least the octopus would have that experience, like in Robert Nozick’s experience machine. Or we could put it in a Chinese room. I guess the Nintendo would be like a Chinese room because it would raise the question, does this octopus actually know how to play Super Mario Brothers, just like asking, does the person in the Chinese room actually know Chinese? The answer is yes, obviously, because this octopus is really smart. All of its tentacles can think, after all... its entire body is like one giant brain.
@severussin
@severussin Жыл бұрын
I went on the journey with you and enjoyed it. Much appreciated!
@L4wr3nc3810
@L4wr3nc3810 Жыл бұрын
This was real fun to read, thanks!
@eliasrose3842
@eliasrose3842 11 ай бұрын
great comment, thanks.
@perplexedon9834
@perplexedon9834 Жыл бұрын
Functionalism, or at least the octopus argument, seems kind of circular to me. We have no way of knowing if there are different kinds of subjective ways that it is to "be like" a creature, because any organism necessarily only has access to one kind. It could be that the form of consciousness in Brain State O is fundamentally different from Brain State B, that both are ways in which a being can "be like" something, but that they are qualitatively, meaningfully different in some way. The idea that they are qualitatively the same IS the conclusion of functionalism. One of the premises alone entails the conclusion, therefore the argument is circular. This applies especially well when considering AI, because we may well have to conclude that there is no test by which we could identify whether an AI is having an experience comparable to Brain State B. We as individuals can only conclude that the human mind is capable of producing consciousness, and really only our own. Interestingly, if an AI were to be similarly conscious, it would have to conclude the same about us. I would actually like to believe that a bunch of gears and string, or some rocks rolling down a hill, or the superstructure of the universe, could have a conscious experience, but no physicalist theory bridges that gap. Functionalism best explains the easy problem of consciousness for sure, as we can call conscious any set of information-processing matter that functionally meets certain criteria, but you need something unfalsifiable like panpsychism to bridge the hard problem.
@FR-kb1fc
@FR-kb1fc Жыл бұрын
Another great lecture. A couple comments. First, gold is the most electrically conductive element that does not easily oxidize, so it functions well as a conductor in oxygen-containing environments. Second, there is a different sense of functionality that I've been trying to understand. Prof. Kaplan says that gold has a certain number of protons; but really, we can't know that, what we can know about gold is that it functions in a certain way. By this I mean, if we put gold ions in a machine called a mass spectrometer, the gold will behave in a certain way, and from this behavior, we infer that it has a certain number of protons. And it isn't just gold, it's everything in the material universe. We can only know how a thing functions. We can never know what a thing "is". That is the kind of functionalism that I'd like to understand better; however, based on this lecture, that seems quite different than the meaning of "functionalism" in philosophy.
@bonafide_ok
@bonafide_ok 6 ай бұрын
In the extreme, all concepts are a functional kind
@magnesiumbutincigarette2271
@magnesiumbutincigarette2271 2 жыл бұрын
It is a great success that you explain these arguments in an easy way, even though texts in the philosophy of mind are so hard to follow. Thank you so much 🙂✋
@calorion
@calorion Жыл бұрын
Somebody is really misunderstanding the identity theory. It's either Putnam or me. My understanding of Place's argument (in the previous lecture) is that *a given instance of pain* in a human mind is identical to brain process B. This is a way to explain how physicalism works, not intended to be some sort of overarching neuroscience claim about brain processes. Place isn't claiming that every instance of pain in every person (or being!) looks identical on a neuroscience level! He's saying that in a particular brain, brain process B gives rise-is identical-to the experience of pain. So…I don't see how there's anything for Putnam to object to here. Have I just completely misunderstood Place?
@BlazeOrangeDeer
@BlazeOrangeDeer 9 ай бұрын
Yes, the possibility of another brain state that also contains pain seems inherent in Place's argument. The simplified version of Place's argument he presented naturally extends to this case, because if two people each experience pain they might be in brain states B1 and B2 (presumably different people are not ever in exactly the same brain state). So pain could not have been a single brain state but was always a collection of brain states that are alike in some way. Functionalism is then a clarification of what it is about this collection that justifies labeling them as painful.
@gm2407
@gm2407 Жыл бұрын
I am so glad that, before I came across this video, I had independently used functionalism to describe cognition: it requires something that experiences, and an apparatus of any form to abstract the interactive experience, where cognition is the mind's processing of that experience. I feel like I am starting off on the right path for philosophical thinking.
@universe36
@universe36 3 жыл бұрын
Awesome! You are a great teacher!
@profjeffreykaplan
@profjeffreykaplan 3 жыл бұрын
Thank you. That's very nice of you to say!
@DarisDamaris
@DarisDamaris 3 жыл бұрын
I'm so impressed. Mr. Kaplan, you did such a great job explaining something complex and made it so easy to follow. You are a perfect educational role model! Cheers
@bthomson
@bthomson Жыл бұрын
I did have some teachers like this but you can NEVER have too many! 😍🙏👍❤💎🎖🏆🏅🎯
@williamgarner6779
@williamgarner6779 Жыл бұрын
Only in modern times could someone discuss mousetraps and cats in this way and not at least in passing ask whether a cat is a mousetrap. That was their main purpose (as far as humans were concerned) for a thousand years.
@thomassoliton1482
@thomassoliton1482 Жыл бұрын
Functionalism: “If you can do the job, then you are the mental state.” Does this make any sense if there is no “you” in that context? Could a machine do this? Would it be “conscious”? Which is the whole point of this proposition? Yes, but that would not necessarily guarantee that the “entity” with the “mental state” would be aware of that state, which is required for consciousness. An ant recognizes scents and executes specific behaviors as a result, but is it aware of what it is doing? Certainly not in the manner we consider to be awareness. The ant has little choice in its response to the scent, while humans have many choices, what we call “freedom of choice”. In fact, it cannot be proven that “we” really have a choice. What seem like multiple choices are actually many outputs (options) which we can envision and evaluate. Our brains are designed to weigh those options rapidly and make a choice. There is no “me” making a choice, just me watching the choice being made. Perhaps I choose door “B”, but then I smell something bad near the door that “persuades” me to choose another door instead. Free will is just an illusion hiding the fact that most of our choices are subconscious. All we can really do is observe our choices and learn from them. That is what consciousness fundamentally is: a constant process of observation and evaluation underlying adaptation. Awareness is the process of mental evolution.
@homesteaddetroit117
@homesteaddetroit117 Ай бұрын
Being aware of the "state", is the "state". What is awareness? Ask the computer if it's ever had a (thought, idea, conclusion) it hasn't told anyone about. Independent thought, without reason, prompt or destination
@MrGking1303
@MrGking1303 2 жыл бұрын
Absolute legend, just saved my exam
@johnnygate3399
@johnnygate3399 Жыл бұрын
Looks like a form of behaviourism to me. Jealousy needs behaviours such as whimpering. What about emotional spartans who feel jealous but do not whimper or betray their feelings?
@atmansoni6406
@atmansoni6406 4 жыл бұрын
Very well explained, thank you for the animated examples :)
@profjeffreykaplan
@profjeffreykaplan 4 жыл бұрын
You're very welcome!
@markelmobuenaobra7047
@markelmobuenaobra7047 2 жыл бұрын
Thanks for this, Mr. Kaplan!
@nathanialblower9216
@nathanialblower9216 3 жыл бұрын
If Super Spartans are a problem for behaviorism, aren’t they a problem for functionalism?
@bthomson
@bthomson Жыл бұрын
Your expression when you are trying to think up what makes a cat is priceless! A cross between disgust and long forgotten memories! I do not think that cats are your thing!
@anuragc1565
@anuragc1565 3 жыл бұрын
Wow ... thanks for such a clear and succinct explanation.
@profjeffreykaplan
@profjeffreykaplan 3 жыл бұрын
You are welcome!
@Menschenthier
@Menschenthier 9 ай бұрын
Since I wasn't in the course, which text by Bradley is it?
@jonstewart464
@jonstewart464 2 жыл бұрын
Jealousy is a great example to expose why functionalism is wrong. We all know that jealousy has *qualia* that we experience while our brains carry out the function of processing the inputs and churning out the outputs. When we use the word "jealousy" we're talking about the qualia - what it *feels like* to be jealous - and not the function nor (as in identity theory) the physical goings-on in the substrate, i.e. certain neurons firing in a certain pattern. Let's say I have a dream, where there's some character that I don't know in my normal life, and they're kissing a man on a beach. I don't see who they're kissing, but at that moment I wake up, and I'm filled with this overwhelming feeling of jealousy. There's no input, just images in my dream that don't make much sense. There's no output, I don't exhibit any whimpering, I don't tell anyone "I'm fine". I just wake up with this overwhelming feeling of jealousy conjured by my dreaming mind. There are no inputs, no function, no outputs - but there's definitely jealousy. So functionalism is false, and so is any other theory that does not account for qualia.
@jonstewart464
@jonstewart464 2 жыл бұрын
@@arletottens6349 I don't think it's enough for waking with jealousy to *sometimes* have a function. If sometimes there's no function at all, then it cannot be true that (waking with) jealousy *is* a functional state, it must be that jealousy is something else (a feeling) *usually/sometimes associated with* some function or other. If we take away the function are we left with nothing? No, we still wake up with all the qualia of jealousy.
@jonstewart464
@jonstewart464 2 жыл бұрын
@@arletottens6349 That's exactly what I would be saying if I believed in a theory of cars that defined them purely by their ability to move and had no knowledge or experience of engines. But I don't; I believe in engines, and I know what they do.
@jonstewart464
@jonstewart464 2 жыл бұрын
My reading is that if no inputs and no outputs can be identified, then no function can be identified. I don't see how an internal state of the mind/brain can be regarded as an input (e.g. a dream). And I don't think it makes sense to say that an experience *might sometimes* have an output, each experience is unique. The language of inputs and outputs isn't sufficient to capture what we care about.
@jonstewart464
@jonstewart464 2 жыл бұрын
@@arletottens6349 That's really interesting. But doesn't functionalism say that it's only when you do probe the internal state that it is conjured into existence? If functionalism allows for all kinds of internal states to exist when not being probed by inputs, then how is it a theory of mind rather than an experimental paradigm for finding out about minds?
@okamisensei7270
@okamisensei7270 2 жыл бұрын
I'm not a functionalist either, but just to probe your argument: I think you're assuming an atomic entity in the mind. The agent experiencing the images, might be different from what is creating and presenting it. Dreams tend to blur the line between the one experiencing and the experience, but you could still say that the image of the people kissing is an input that is separate from the agent that receives this input and then experiences jealousy. Functionalism could still be a valid theory without an atomic agent
@udyret28
@udyret28 2 жыл бұрын
I can’t find this primer by Bradley that is in the readings. Can anyone help?
@theedj007
@theedj007 Жыл бұрын
I cannot for the life of me locate the primer/reading by Adam Bradley; my internet kung fu has utterly failed at locating this resource. Is it a chapter in a book? A journal article? Any help in pointing me in the correct direction would be appreciated!
@lizazhbankova4070
@lizazhbankova4070 6 ай бұрын
Any luck finding the text? My google kung fu doesn't work with it either.
@checosa777
@checosa777 Жыл бұрын
can we have the pdf readings? i wanna read more about it
@tAntrik18
@tAntrik18 4 жыл бұрын
One can say that the mental aspects of my life, consciousness, pain, etc., are some function of neurons in my brain. And this is similar to the position of identity theory. And if that is true, then the argument for multiple realizability of mental states is not clear. My conscious experiences at any point are just an outcome of a particular way my brain fires neurons. And I happen to call some of these firings pain. In the same way, I am labelling some of the brain states of the octopus as pain. But the mental aspect of the octopus's life is still given by the particular way its brain works. I am not sure though if identity theory can be cast this way. Edit: I have already watched the video on identity theory. Still cannot solve the problem. Great videos though.
@SitcomedyCD
@SitcomedyCD 3 жыл бұрын
I think that the motivating objection that brought up functionalism against identity theory isn't to deny that consciousness and sense experiences are a function of neurons and your brain, but that identity theory does *not* allow for an understanding of mental states as multiply realizable. If the identity theorist says that pain is identical to c-fiber firing, then things that feel pain without firing c-fibers aren't really being recognized as being in pain under the theory. This is regardless of whether what you're talking about is actually having the experience of pain, and surely the important part of pain is the sense experience and not how it came about! The functionalist theory doesn't have this problem because it allows that consciousness and sensations could be multiply realizable. It doesn't have to be c-fibers; it can be alien gelatin heating up, or light hitting a receptacle, or whatever. It doesn't matter what type of being you have: as long as it has something "like" a brain that fulfills the role of being the thing that the experience arises from, then that being has a mind with sensations and experiences that correspond to its own unique mechanisms.
@Brian.001
@Brian.001 2 жыл бұрын
@@SitcomedyCD Yes, but in the middle of that you assumed that 'pain' is multiply realizable, which has not been established. All we know is that an octopus and a person can get into a similar functional state - not that the experience each of them has is the same. They could be two unpleasant but quite distinct experience types. Saying it is 'just obvious' that an octopus can experience pain, judging by the functional states it exhibits, is just begging the question. I have never seen a demonstration that pain can be experienced identically across species. If we were to accept functionalism, we would be holding that token brain states of distinct types can each realise a single type of experience. Token-type identity is not even comprehensible, given that pain is an experience, not a function. If an experience is identical with a brain state, they must share their type.
@SitcomedyCD
@SitcomedyCD 2 жыл бұрын
@@Brian.001 nice, thanks
@alancosgrove4728
@alancosgrove4728 2 ай бұрын
I enjoy your lectures very much for getting to the salient issues and recapping prior to making the main points. You mention various pages in the book or notes but on your web site I can only find a few of them in the course materials and no mention of a reading list or a course book. Can you tell me if you prepared extensive notes for all of the course or is there a specific book(s) to accompany this excellent philosophy of mind series please? Thank you.
@AlexCebu
@AlexCebu 5 ай бұрын
Looks like Putnam didn't realise that pain itself is a multiple state, a group of states. We have toothache, we have stomach pain, etc. Pain itself, regardless of specifics, is an abstraction and a general idea.
@msmd3295
@msmd3295 10 ай бұрын
Even if mental states are multiply realizable that’s not really the core of brain-mind identity. The central element of importance is the fact that without the brain, mind cannot be realized. One would have to demonstrate that mind can exist without brain for there to be any duality. It is well known scientifically that mind ceases when the brain ceases. Thus they’d have to be elements of the same thing… a physical brain.
@ignaciocorralesbriceno8783
@ignaciocorralesbriceno8783 3 жыл бұрын
Much obliged, my king, keep it up, man!
@robertochacon5338
@robertochacon5338 Жыл бұрын
your channel is simply great! :) thanks for your hard work!
@matthewgingell3792
@matthewgingell3792 3 жыл бұрын
Under the functionalist view, if a super stoic receives jealousy-inducing inputs but those inputs have no impact on their behavior/outputs, would we still say they experience a mental state of jealousy?
@mithrae4525
@mithrae4525 Жыл бұрын
Putnam: "Behaviourism is wrong because Super Spartans don't exhibit behaviours associated with pain." Also Putnam: "The outputs/functions associated with pain are where the real deal is at. Can't think of a counterexample to that, no siree!" I suppose one could argue that the function of pain, jealousy etc. are informative rather than behavioural; a super spartan who feels pain can then decide how to respond or modify their actions, such as removing their hand from the fire that's causing tissue damage (without showing any typical pain responses like crying out etc.).
@christiangreff5764
@christiangreff5764 Жыл бұрын
@@mithrae4525 I have to agree with you that functionalism does, at least on a basic level, seem pretty much like behaviorism with extra steps. The Super-Spartan counterexample to behaviorism is flawed in the first place. The Super Spartans, as described in the text itself, have the disposition to act in the typical 'pain-behavior-like' manner; they are just suppressing it. That is, to convince us readers that the Super Spartans are indeed feeling pain, they had to be described as having the very disposition they are supposed not to have, in order to prove that mental states do not equal dispositions. We can, of course, imagine an overhauled example where mad neuroscience has led to totally changed pain responses in the subject (victim) of Dr. Evil. But then again, if in this new scenario truly none of the typical pain behaviors are shown (like, at the very least, avoidance of future repetition of pain-causing stimuli), how would we ever identify what we observe as pain behavior? Fundamentally, is there a point at which the underlying perception, and therefore the mental state itself, has to be changed to cause a certain behavior? I mean, if the unfortunate subject, after Dr. Evil's operations, willingly seeks out what are to us causes of pain and carefully avoids what we would expect to cause fun or pleasure, then does it still make sense to classify what they are feeling from the former as pain and what they are feeling from the latter as fun?
@mithrae4525
@mithrae4525 Жыл бұрын
@@christiangreff5764 It seems you're saying that the 'disposition to exhibit pain-related behaviours' is simply a subjective mental state that recipients of pain have, including Super Spartans. But then how is that behaviourism any more? How is it even a physicalist theory any more? Even Cartesian dualists would agree that pain is a subjective mental state which disposes people to act in certain ways. Surely the whole point of physicalism is to associate mental phenomena with something objective; in this case with their consequent behaviour, not merely with a mental disposition.
@christiangreff5764
@christiangreff5764 Жыл бұрын
@@mithrae4525 Ah, sorry, it seems I did not manage to clearly communicate the point I was trying to make. Indeed, as you put it, mental phenomena are inferred from their consequent behavior. So much so that, unless an entity is at least consciously suppressing the associated reactions, we would probably see any claim of that entity experiencing a specific mental phenomenon (such as pain) as highly dubious.
@chrisw4562
@chrisw4562 11 ай бұрын
Thanks for another great lecture. I don't buy the multiple realizability argument. In the example, that argument would require human pain to be identical to octopus pain. Really? It feels to me like a lot of philosophers are making things up to prove their point, describing it in a language that nobody can understand, and then it takes generations to debunk them. Brilliant.
@genec9560
@genec9560 4 ай бұрын
I am enjoying your channel immensely, Jeffrey. This video got me thinking... What if a cat had a bad accident and needed 51% modern robotic mechanical parts in surgery to regain full function? Is it still a cat? What about 75%, or 99%?
@4_P3R50N
@4_P3R50N 10 ай бұрын
How does conscious experience fit in with this? If pain is solely defined by input & output, then the actual experience doesn’t matter if we don’t define the experience as a required output. Does Putnam express an opinion on this?
@AlexCebu
@AlexCebu 5 ай бұрын
Why do people call jealousy A MENTAL STATE? When someone is jealous (26:22) they are tense and aggressive, etc., and all those are BODY STATES.
@terekrutherford8879
@terekrutherford8879 Жыл бұрын
Great content and series. But I'm really having trouble understanding how mental states are multiply realizable. If human pain is brain state B and an octopus can only experience octopus pain in brain state O, that seems reasonable and still explains octopus behavior. I don't understand the assertion that different beings experience pain in the same way. Each brain should experience things differently due to its different biology and the sensory organs of the body providing information to the brain, and each mental state should be unique due to these variations.
@REDPUMPERNICKEL
@REDPUMPERNICKEL Жыл бұрын
Seems to me your understanding is just fine. The only entity which you can know with absolute certainty to be conscious is the entity that is your self, and even then, only while your self is conscious. I am certain that you are conscious while you are reading this sentence, but I cannot be *absolutely certain* the way I am about my self as I type it. That I may be typing in a dream does not change the fact that I am conscious of the doing. The difference between waking consciousness and dream consciousness seems to be that one's understanding of what is possible may not be fully participating in the process while dreaming, hence my dream belief that I have levitated... The other night, very oddly, I dreamed that I was completely another person. Subsequently, my appreciation for the power of imagination is much greater. Cheers!
@realbland
@realbland Жыл бұрын
But obviously an octopus's brain experiences pain differently from a human's, so it's still "pain" but it's a different version of pain than the pain we experience, because they have brains that work differently. It's the same problem as the argument from what it's like to be a bat; we just happen to both have the sense for when the body is being harmed in some way, which we call pain.
@xin9458
@xin9458 Жыл бұрын
This is a good point! Extending this, it can be argued that no two humans experience pain the exact same way either - we might all exhibit neuronal patterns that roughly resemble a textbook "Brain State B," but it's never going to be completely identical. Every person's pain is different because every mind is unique (just as every table is unique? His might be an old packing case while mine is made of wood), but the fact that pain is a mental process associated with a specific physical state is conserved...
@realbland
@realbland Жыл бұрын
@@xin9458 Exactly! Consciousness is no more a physical kind than a cat is, but that's not because consciousness manifests identically across all life forms with the capacity for it; rather, in the same way that there can be many different kinds of cat, and many forms that a cat can take, there are many "ways" that consciousness appears.
@xin9458
@xin9458 Жыл бұрын
@@realbland That makes sense! Also very in line with the philosophy of biology in general - within the frameworks presented in this video series, though, would this position be something in between the Mind-Brain Identity Paradigm and Functionalism? As in, mental states definitely correspond to brain states, but there are multiple brain states that functionally create consciousness...?
@realbland
@realbland Жыл бұрын
@@xin9458 I would say it's closer to the identity theory, just while recognizing the fact that biology and chemistry in reality are somewhat complicated, and generalizations necessarily can't reflect that.
@jamesforgason5341
@jamesforgason5341 2 жыл бұрын
you are saving my life for my test tomorrow
@CaedmonOS
@CaedmonOS Жыл бұрын
I love the way you start your videos so much. Also, apparently I'm a table.
@DevonBagley
@DevonBagley Жыл бұрын
The functionalism argument doesn't really address the claim of identity theory. There are many types of computers that all operate on completely different physical laws but still produce the same output. It may be wrong to say that a computer produces result A specifically by doing specific process B, but it is not wrong to say that all computers produce result A by doing physical processing. The fact that the process is different for different computers doesn't negate the fact that result A is always given for input B, in spite of the physical differences in the process that takes place.
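To make that input-output point concrete, here is a minimal Python sketch (the function names are invented for illustration): two procedures with completely different internals compute the same mapping from inputs to outputs, so at the functional level they count as the same kind of thing, an "adder".

```python
# Two "adders" realized in very different ways; illustrative sketch only.

def add_by_arithmetic(a: int, b: int) -> int:
    """Adds using the language's built-in addition."""
    return a + b

def add_by_counting(a: int, b: int) -> int:
    """Adds by repeated incrementing, like counting beads on an abacus."""
    total = a
    for _ in range(abs(b)):
        total += 1 if b > 0 else -1
    return total

# The internal processes differ, but the input-output behavior is identical,
# which is all that matters for classifying both as adders (a functional kind).
for a in range(-5, 6):
    for b in range(-5, 6):
        assert add_by_arithmetic(a, b) == add_by_counting(a, b)
print("Functionally equivalent on all tested inputs.")
```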
@johnokazaki7967
@johnokazaki7967 Жыл бұрын
9:48 If anything that "fulfills the function" becomes that thing, then we can say that a cow that gives a driving course is an instructor, since it plays that role. However, for an instructor to "instruct" you first need a set of characteristics, for example, being able to speak the target's language. In this scenario, a cow, though it's fulfilling the role and purpose of teaching, can't truly teach, since it lacks a vital characteristic, which is talking like humans do. I think that something fulfilling a function doesn't really mean it is the thing described; instead, it is required to have characteristics that -allow- it to fulfill that function, thereby making it the thing we say it is. Thus, what makes something what it is, is not its ability to fulfill a purpose, but the individual characteristics that create the emergent property that allows it to fulfill a function.
@Bhuyakasha
@Bhuyakasha Ай бұрын
How is the last example of jealousy different from the behaviorist explanation though? Seems like they would give the same account there.
@mmkTTS
@mmkTTS Ай бұрын
If the jealousy function is realized in brain state c, then that brain state c is jealousy, according to functionalism. According to behaviorism, on the other hand, the behaviour, like whimpering, is jealousy, not brain state c. At least, that's how I understood it anyway.
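A toy Python sketch of that contrast (nothing here comes from the video; the names are invented): the behaviorist keeps only the stimulus-response mapping, while the functionalist identifies jealousy with the internal state that sits between the inputs and the outputs, whatever that state happens to be made of.

```python
# Toy model: "jealousy" as whatever internal state plays this causal role.

class Agent:
    def __init__(self):
        # Internal state; its physical realizer could be neurons, silicon,
        # or gears and string - functionalism doesn't care which.
        self.state = "calm"

    def perceive(self, stimulus: str) -> str:
        # Input side: a jealousy-inducing stimulus puts the agent into the state.
        if stimulus == "partner kissing someone else":
            self.state = "jealous"
        # Output side: the state, in turn, causes the typical behavior.
        if self.state == "jealous":
            return "whimpers and insists 'I'm fine'"
        return "carries on normally"

agent = Agent()
print(agent.perceive("partner kissing someone else"))

# A behaviorist description would keep only the input -> output table;
# the functionalist says the mediating internal state itself is the jealousy.
```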
@4_P3R50N
@4_P3R50N 10 ай бұрын
Are causes and effects defined as physical, or would a change purely in one's conscious experience count as such? (If something like that even exists according to functionalism)
@pepedestroyer5974
@pepedestroyer5974 2 жыл бұрын
You are a great teacher. Your videos are gold. I hope you get more subscribers.
@mateoromo5587
@mateoromo5587 Жыл бұрын
Thank you so much for this video!!!
@none8680
@none8680 Жыл бұрын
How are we concluding that pain in humans is the same as pain in octopuses? Or jealousy in one person is the same as jealousy in another?
@chrisw4562
@chrisw4562 11 ай бұрын
Thanks for the lecture. Functionalism seems easy to debunk. A computer program can perform many of the same functions as the mind. Does that mean the computer has a mind? I don't think many would agree. However, will a computer eventually be able to have a mind? I think the answer is yes.
@farafonoff
@farafonoff 10 ай бұрын
'Debunking' functionalism involves unmeasurable stuff like 'mind', 'understanding', 'consciousness'. If we try to define and measure them, then we can build a computer to have these things. Different words, same issues as with 'god', 'angels', and 'soul'.
@BlazeOrangeDeer
@BlazeOrangeDeer 9 ай бұрын
Obviously if a machine can do some but not all of the things a mind can, a functionalist would not call it a mind. It's missing the complete functionality. You might still be able to consider it a partially functional or defective mind
@mmkTTS
@mmkTTS Ай бұрын
Wouldn't functionalism face the same objection as behaviourism, namely the Super Spartans objection? Suppose that when Super Spartans get kicked, they endure the pain and do not shout or cry or exhibit any other pain behaviors. If this is so, then even though they actually feel pain, they wouldn't realize the pain function. So functionalism seems to face a counterexample. Is this right?
@kristianvarga9845
@kristianvarga9845 Жыл бұрын
really good explanation
@pierrelabrecque8979
@pierrelabrecque8979 Жыл бұрын
I read enough comments to understand any further venerating will be redundant. Perhaps because your content is so captivating, it took a couple of videos to realize you write like Leonardo da Vinci. You likely talk like him too.
@MrDanDant
@MrDanDant 6 ай бұрын
It occurs to me, if "brain" is a functional kind, functionalism and identity theory are identical. Am I wrong?
@ad4id
@ad4id Жыл бұрын
Great video. Well done
@charlesb2222
@charlesb2222 Жыл бұрын
"If you ain't made of the right stuff, you ain't gold" -MC Kaplan
@MugenTJ
@MugenTJ Жыл бұрын
Well, if anything, this functional theory clarifies identity theory; it's not a refutation. As far as the human mind goes, it is still just the brain. A computer can exhibit a mind made of different materials, just like you can have a biological cat or a robotic cat. These two theories are complementary, not opposing. In fact, the material generating the function matters a whole lot. I got an itch every time he said it doesn't matter what the thing is made of! 😅😅
@CMVMic
@CMVMic Жыл бұрын
Isn't this making a category error? Functionalism has to do with the mind of a thing. If we define humans and cats as substances, then functionalism doesn't apply to these definitions. Functions apply to a substance's identity, what it does that grants it subjectivity. These are just semantic distinctions, but a cat can be defined according to its functions, and so can a human being. In this sense, a philosophical zombie and a human can be identical. It really just depends on how we define these things. A robot that behaves and looks exactly like a cat can be defined as a cat. I think the bigger issue is with how one defines a thing as identical. To be identical, a thing must not have any numerical or spatial distinctions, i.e. A=A, but then to claim something is identical in degrees or has identical parts is to infer that the two things are not identical in every aspect. Also, a table, whether it is made out of wood or steel, is still the same fundamental substance, arranged in specific ways to fulfil certain functions. I think we can claim mental states can be physical events.
@popcarvalho
@popcarvalho 2 жыл бұрын
If I can do the job, I am the mental state. But how could this statement solve the mind-body problem? I'm not sure if I understood properly...
@ioannaantonaki4883
@ioannaantonaki4883 Жыл бұрын
can you provide the readings in pdf?
@dwinsemius
@dwinsemius Жыл бұрын
Great stuff, but I find myself trying to figure out how Kaplan learned to write backwards on thin air. Does that mean I'm not really a philosopher or scientist, but really just an engineer? How can I put trust in the ideas of a guy who doesn't know how a mousetrap works?
@northernbrother1258
@northernbrother1258 Жыл бұрын
Another problem with the dualist theory of mind is that if the brain is damaged, the "mind" also suffers.
@sirreginaldfishingtonxvii6149
@sirreginaldfishingtonxvii6149 Жыл бұрын
The most common counterargument I have heard about this is likening the brain to a computer setup. If the display is damaged what is shown on screen can be abnormal, yet the graphics card, processor, and motherboard are all intact and sending the same signals they always were. The point being that there can be a sort of mental "chain of command" where some things can't go through if one part is damaged. This does of course bring up further questions. Like "if this is the case, then does our brains also limit our "minds" purely by existing, and binding the mind to a very limited meat computer? Are brains a limiting interface?" There's also the argument that the mind, as the "software" or OS the brain is running, can still be messed up if you damage the physical parts of the computer it is stored within.
@originalandfunnyname8076
@originalandfunnyname8076 Жыл бұрын
I feel like functionalism is just an extension of identity theory - instead of identifying mental states with concrete brain activity and neurons in humans, we can say that there are multiple possible ways to construct some mental state, still using only a physical brain, but not necessarily a human brain. Just like a table can be made of wood, metal or any solid matter, mental states can also be made of brain process B OR brain process O. But they can only be made of brain processes, just as a table couldn't be made of water, for example. So I'm not fully convinced by this counterexample to Identity theory.
@GregoryWonderwheel
@GregoryWonderwheel Жыл бұрын
Descartes got his dualism from Aristotle who divided reality into the Physical and the Metaphysical. Science then said the Metaphysical was merely superstition. Descartes was attempting to rehabilitate Metaphysics by talking about mind, but he didn't have a psychological perspective to make sense of metaphysics so he used his physical perspective to describe mind. The idea that material things can't be moved by immaterial mind is a category error based on false premises for "material". It's a mistake to identify mind with brain matter, and it's equally mistaken to identify mind with mental states; though each mistake has a relatively good reason for making the mistake based on the assumed categories that result in the category errors.
@ivanilyic6492
@ivanilyic6492 2 жыл бұрын
Thank you!
@jessewilley531
@jessewilley531 11 ай бұрын
The sad thing is... I put a cardboard box in my college apartment between my writing desk and living room. I always intended to get a real table to go there but it wound up staying there all four years I was there.
@adeelashraf7366
@adeelashraf7366 2 жыл бұрын
If mental states are multiply realizable, then how is a cat or a man not?
@landongonzales9076
@landongonzales9076 Жыл бұрын
11:04 this aged like fine wine
@glenrotchin5523
@glenrotchin5523 Жыл бұрын
How do you write backwards so well?
@robbiekatanga
@robbiekatanga 2 жыл бұрын
But how does functionalism address the mind-body problem?
@aidenheffernan7556
@aidenheffernan7556 Жыл бұрын
The differences between the kinds seem arbitrary... like, gold has multiple isotopes, so are there multiple ways to be gold?
@AdolfoLeija-id3tz
@AdolfoLeija-id3tz 4 ай бұрын
I think it is an error to say that pain is a brain state. I think pain is a brain process. For me a brain state is like a picture and a process is like a movie. I guess saying that pain is a dynamic brain state would be better for my understanding. I'm just a little bit confused by the word "state", which invokes a static system.
@GibsonMumba-k4j
@GibsonMumba-k4j 7 ай бұрын
With this explanation I'm passing exams tomorrow, lol😁
@farhadmodaresi4182
@farhadmodaresi4182 Жыл бұрын
20:20 had me cracking lol
@jonthomasspears2255
@jonthomasspears2255 2 жыл бұрын
if our table is broken (oh no!) is it still a table?
@NotRelatable2u
@NotRelatable2u Жыл бұрын
@Jeffrey Kaplan Is Garfield not a cat?
@LoveAndLustInc
@LoveAndLustInc Жыл бұрын
I dig it. The AI community would have a field day with this theory!
@xyzoopsie7804
@xyzoopsie7804 3 жыл бұрын
I heard "vending machine out of Plato"😂
@denizsarkaya5410
@denizsarkaya5410 3 жыл бұрын
I have the opposite problem: whenever someone says "Plato" I hear "Playdough" ahaha
@muralidharanv9634
@muralidharanv9634 9 ай бұрын
Objection to the objection to the identity theory: Of course a table is multiply realisable, but it can only be made of things which exist in this universe. So, the things a table can be made out of can be many, but the number is still limited. Similarly, of course your mental states are multiply realisable, but they can only be made out of billions of neural connections in your brain. So I don't see any problem with the Identity theory. I don't see any threat the Octopus argument poses against the Identity theory. We've just got a different material (Brain state O) to produce mental states, which are multiply realisable. It doesn't say that Brain state O can't produce mental states just as Brain state B does. It's just an added material from which the mental states can be made.
@DatDack
@DatDack 3 жыл бұрын
I don't quite understand how multiple realizability disproves the identity theory. Identity needn't be necessarily exclusive. Many things can all have property A, we'll call it catness. For example a calico and a black cat are both cats. Saying that a calico is a cat isnt disproven by a black cat also being a cat. So if Place asserts the mind-brain identity theory and says pain is the result of human brains in humans, that can be true while also being the result of other things in different cases (the octopus brain, or an AI, etc). The octopus brain causing pain for octopuses doesn't make the human brain not cause pain in humans! Place is making an empirical claim, and the fact that mental states are created by other things in other organisms doesn't make them NOT caused by the brain in humans. Can someone explain where I'm misunderstanding? Thank you!
@DatDack
@DatDack 3 жыл бұрын
HELP PLS
@filipkaraivanov8158
@filipkaraivanov8158 2 жыл бұрын
In the identity theory mental states just are brain states. So, the mental state of pain just is nociceptive neurons firing. The claim is not just that they correlate but that they are identical. Hence, pain is necessarily nociceptive neurons firing. Here the multiple realizability objection applies because a thing can have the mental state of pain without nociceptive neurons firing and hence the two cannot be identical.
@Jensen8918
@Jensen8918 2 жыл бұрын
@@filipkaraivanov8158 Sounds like both the octopus and the human brain have analogous ways of experiencing pain, even if their architecture is different. Their architect remains the same: biology through natural selection. Seems so simple to me, but I may be missing something. I think people are taking Place too literally, or if he is being that literal he is stupid. I call this theory Strong Illusionist Identity Functionalism.
@MsJavaWolf
@MsJavaWolf Жыл бұрын
There are always several interpretations of those theories, even by professional, academic philosophers. I don't think it disproves the type of identity theory that I'm familiar with. I just see the concept of a brain in a broader sense, it's just a physical information processing object. Octopi still have a sort of nervous system that might be able to produce mental states, and since their nervous system differs from ours, they might also be in slightly different mental states, I don't see any contradiction. I mean, if you had such a strict definition of what a brain is, then why even assume that cats have mental states? Their brains are already significantly different from ours, it seems arbitrary to include all sorts of animals with different brain structures but exclude octopi.
@yungzed
@yungzed 6 ай бұрын
How is a robot cat not a cat, by the definition of a functional kind??
@aceofspades25
@aceofspades25 2 жыл бұрын
Is the non-functional kind "mammal" not multiply realisable? Dogs, cats, mice and apes are all realisations of mammal.
@Phylaetra
@Phylaetra Жыл бұрын
Are all functional kinds multiply realizable? Certainly some are (your examples all are). But just because some are doesn't mean _all_ are. OK - so, the mind is identical to the brain _in_ _human_ _beings_. The wood arranged in a certain way is identical to the table _for_ _wooden_ _tables_. Sure - a mind (should we want to take the road that (1) mental states are functional kinds (are they really? I mean, maybe, but that needs to be shown) and (2) that all functional kinds are multiply realizable (also yet to be shown) and (3) that a mind is nothing but a collection of mental states (what if it is a collection of mental states and something else that is not multiply realizable?) - OK minds are multiply realizable, meaning that physical brains are not the only substrate for minds. BUT - that doesn't mean there is something other than a physical substrate - which we can call a 'brain' for any mind that inhabits it. And changing that substrate necessarily changes the mind. Currently, the only minds we have to observe (that we are reasonably sure about) are our own, which are uniformly housed in human brains - although, if you know of an exception, I would love to hear about it. Even minds we can imagine have some physical substrate - even if they are patterns of energy.
@AexisRai
@AexisRai Жыл бұрын
I feel bamboozled. Isn't the "mind-brain identity theory" (MBIT) just saying "every mind state is in fact some brain state"? Why would the octopus be a problem for MBIT? The octopus objection seems to merely be saying this: "Hah, the octopus certainly can't embody pain as B like a human, only as O. Yet it is still "the same state" functionally, speaking because functionally it's pain. So MBIT is wrong because B and O are both "the same mind state" but they aren't "the same brain state" ! Checkmate!" Why is this a problem for MBIT? Does the original statement of MBIT actually say something stupid like "any given mind state has EXACTLY ONE particular realization as a brain state"? The whole "multiply realizable" thing felt like a ruse; it felt obvious the whole time that there are multiple states that could realize a mind state, well before introducing the octopus, simply because _human_ brains are different.
@xenoblad
@xenoblad 2 жыл бұрын
20:21 This may seem childish, but I want someone to make a gif of Prof. Kaplan just making an octopus sound.
@coffeeisgood102
@coffeeisgood102 Жыл бұрын
My cat is not grouchy, but its job is to annoy.
@Einzefugen
@Einzefugen 4 ай бұрын
New brain identity theory: a mind can be made of any type of neuron-like structure. For a specific creature, a specific mental state represents an emotion like pain.
@HeDeRust
@HeDeRust 3 жыл бұрын
What about eliminative materialism?
@MusingsFromTheJohn00
@MusingsFromTheJohn00 Жыл бұрын
While functionalism, as far as I can understand it, sounds generally correct to me, there is an important point here. If we make an AI to which we give a number of text or verbal inputs, and its text or verbal outputs are the same as a human's given the same inputs... that does not mean it is the same... because those text or verbal inputs and outputs are vastly reduced summaries of the actual full inputs and outputs of the human. In other words, a feeling like "pain" in a human is extraordinarily complex, far beyond being fully contained within a small summary of that feeling through some written text or verbal words. Now, I do believe more advanced AI systems will eventually be developed which can truly feel pain like a human does, but current AI like Chat GPT 4 cannot, even if it may seem like it on the surface.
@rickwyant
@rickwyant Жыл бұрын
Mind is a process? Generated by the brain?
@mikeg.6590
@mikeg.6590 2 жыл бұрын
So, is a human shopkeeper a vending machine?
@jamesboswell9324
@jamesboswell9324 Жыл бұрын
That's by far the best response I've come across in the comments here. Totally nails it in an amusing way.
@homesteaddetroit117
@homesteaddetroit117 Ай бұрын
Was the girl in the beach picture hot? I'm checking if I'm understanding the function correctly. The whimpering is just like everything else in life. You can struggle against it or flow with it. To flow with it you have to engage but you can't try to guide it. Nothing can last forever. You just ride it until the destination and do your best wherever it drops you off.
@waynr
@waynr Жыл бұрын
I'm pretty sure I can do the job of a table! The question is, will anyone hire me? 🤔
@MrGeometres
@MrGeometres Жыл бұрын
The brain is the hardware, the mind is the software.
@parheliaa
@parheliaa Жыл бұрын
A good old "Duck principle" form the software development world.