We're on a journey to learn about life through The Twilight Zone. Consider joining us. But even if you don't, thanks for taking some time to be with us. Do you think robots will become an answer to our loneliness problem?
@LifeofWalk (9 months ago)
Nice doc and analytical discussion! I like the factual tidbits, like how the lady is still alive! There's a good chance we are a simulation, so it's not that much different. Arguably we can have feelings and free will in the confines of a simulation, because we only know what we know, just like a robot. My favorite part of the episode was how he kept calling her "RoButt" instead of "Robot" 😁
@thekeyofimagination (9 months ago)
I miss us saying "robutt." 😆 Thank you for the interesting thoughts. You've made me want to go back and listen to Bostrom's thoughts on simulations again.
@CSiri-cc2hq (9 months ago)
We haven't failed as a society; we've simply extended life beyond what's reasonable. So a "machine companion" makes as much sense as a pacemaker or an artificial heart valve.
@jackiekjono (9 months ago)
Jean Marsh was also a companion on Doctor Who.
@thekeyofimagination (9 months ago)
I didn't know that. She has had a wonderful career. I'll have to check that out. Thanks.
@melissacooper8724 (9 months ago)
I mostly remember her as Mombi in the 1985 film Return to Oz.
@thekeyofimagination (9 months ago)
I haven't seen that movie in at least 2 decades. I'll have to check that out.
@totorod (9 months ago)
Downward inflection, perfect.
@MarkFlavin1 (9 months ago)
I really like the lens you presented here. With the epidemic of loneliness we are seeing, there is no doubt that synthetic companions, be they AI chatbots or robots, will start to fill the gap. But what happens when we start to care for them and they can't care for us? Their models may mimic feelings, but the gap between feeling and mimicry is essentially infinite. Scarier still is when they can feel and, for whatever reason, your companion doesn't like you. Can you compel feelings? Where does the line between ownership, companionship, and selfhood lie?
@thekeyofimagination (9 months ago)
I think you're asking a lot of the important questions here, especially about compelling feelings and ownership. Those are going to be really difficult conversations that are probably not as far off as we might think.
@MarkFlavin1 (9 months ago)
@@thekeyofimagination I feel like the conversation is just starting. With the rise of virtual-companion dating apps, people are starting to form attachments to chatbots and other synthetic relationships. One of the key things you said in the video was your concern that by automating and systematizing everyday transactions we are further devaluing authentic person-to-person interactions. You also mentioned the value and effort relationships require. The biggest loss I worry people will face in the near future is these daily interactions. It is amazing how much emptier your day feels when you don't have to go to the office and your work becomes a series of tasks rather than interactions. Add to that the shift of shopping, dining, and other day-to-day micro-interactions from person-to-person to person-to-app, and is it any wonder people feel out of practice when socializing? Sorry for the ramble, but this is a huge topic, and I think your closing hit the mark.
@thekeyofimagination (9 months ago)
@@MarkFlavin1 I agree, especially about the daily interactions problem. I think many people feel a pull toward believing they want fewer interactions with others, but we often don't realize how potentially damaging this can be for our overall health--it can very quickly and easily become social isolation and depression. I see this a lot in my students. I used to have to remind them to spend some time in their dorms, but now I have to convince a lot of them to get out of their dorms and socialize.
@MarkFlavin1 (9 months ago)
@@thekeyofimagination There was a great book written in 1909 called The Machine Stops. It imagines the cost of isolation and looks into the future, not so much to predict social media as to see where the predictable outcome of chasing novelty and headlines will take us. I see this more and more, especially in younger folks: they do everything for the picture or the share first and forget to be in the moment. The episode you highlighted showed the exact opposite, where he had no one to share his moments with, so he personified his attachment to his companion.
@sladen3884 (9 months ago)
It'd be even more fun if your entire script were written by ChatGPT. The weird part of the moral quandary is that if Alicia looked like Bender or C3P0, leaving 'her' behind would somehow be more palatable.
@thekeyofimagination (9 months ago)
Corry asks at one point in the episode, "Why didn't they make you look like [a robot]?" It's a fair question, even if we already know the answer. I actually wonder if it will make a difference in the end. Can someone fall in love with a robot that looks like Bender or C3P0? The answer is probably yes.
@BobbiCodes (4 months ago)
These episodes actually make me incredibly angry, because I think they helped plant the seed in people's minds for desiring artificial companionship, and I believe this to be misguided and even delusional. As far as I'm concerned, we might as well be talking about time travel, because in my opinion we are so terribly far away from developing human-level AI that society is likely to tear itself apart before we get anywhere close. But that won't stop companies from exploiting this misguided desire for profit, and so we have the situation we see now: fake AI shoved into every tech product, making the experience worse, wasting resources, and setting the stage for continually fueling this obsession. All because people see episodes like this one and get the wrong idea that this is something that is a) possible and b) a good idea. If anything, we should have taken away the opposite message and interpreted it as a warning! We weren't supposed to take it as something that could ever happen; it's a science fiction story, inviting us to temporarily suspend our disbelief in order to consider something new. So in this story, we are to assume that the robot *is* a person. And at the end, we witness a murder. After this, the victim's partner acts in a way that I think ruins the story, because if all of this is true, that she is a woman, has emotions, feels pain, and they are in love, then it shouldn't make any difference whether the inside of her head is wires and components or blood, fat, nerves, and connective tissue. I think the episode betrays itself here by making him react so unnaturally. Allenby is the one who should be staying on the asteroid, in my opinion, either as a prisoner or a corpse. That would be, in my mind, the only satisfying ending.