This is Tragic and Scary

  Рет қаралды 2,409,782

penguinz0

penguinz0

Күн бұрын

Пікірлер: 21 000
@CorporalGrievous93
@CorporalGrievous93 Күн бұрын
The saddest part of this is that the poor kid had severe issues prior to any interaction with the bot and clearly had absolutely nobody to talk to about them. Talk to your kids. Make it clear that your kids can tell you ANYTHING without fear of punishment or they’ll just learn to hide things from you.
@BettiePagan
@BettiePagan 22 сағат бұрын
Fear breeds incredible liars (I’m actually telling the truth on this one)
@LycanKai14
@LycanKai14 22 сағат бұрын
It's also sad because that part will be ignored since it's trendy to hate on all things AI/fearmonger the hell out of it. Someone doesn't turn to AI for their social interaction because they're happy with a great life.
@DaijDjan
@DaijDjan 21 сағат бұрын
To be fair: Kids will ALWAYS hide stuff from their parents, no matter what - thinking otherwise is delusional. No judgement on my part concerning this case as I flat out don't know enough about it.
@Zay-tx6mz
@Zay-tx6mz 21 сағат бұрын
@@LycanKai14hey man if there’s one thing that is an indistinguishable human trait it’s that tendency to blame someone or something for their faults.
@drewt7602
@drewt7602 21 сағат бұрын
EXACTLY
@Steak
@Steak Күн бұрын
the crazy thing is it's only gonna get worse we are literally at the very start of ai
@Gh0ul.YouTube
@Gh0ul.YouTube Күн бұрын
Broooo hi steak
@DroidyWoidy
@DroidyWoidy Күн бұрын
Yoo its steak??? Please make a video on this i want to see what you habe to say
@Javake
@Javake Күн бұрын
2 years into AI and they already got a kill Edit: I specifically mean AI chatbots
@Miku-g7z
@Miku-g7z Күн бұрын
YOOO STEAK
@epiccheetoman
@epiccheetoman Күн бұрын
STEAK?
@Evanz111
@Evanz111 23 сағат бұрын
I’m not sure what’s more tragic: taking your own life, or all of your fictional sexing being aired on the news, as the last thing people remember you for. Poor guy.
@brundle_fly_3895
@brundle_fly_3895 21 сағат бұрын
😭
@lynxarcade2020
@lynxarcade2020 21 сағат бұрын
lmao
@Seifer-k6z
@Seifer-k6z 21 сағат бұрын
At least he ain't here to see this
@operationfreeworld
@operationfreeworld 21 сағат бұрын
Consequences
@rotted2023
@rotted2023 21 сағат бұрын
​@@operationfreeworld he didn't do anything wrong. Stfu with that
@markimoothe1st
@markimoothe1st 11 сағат бұрын
Fun Fact, he was talking to a lot of therapist bots. Weird how they aren't revealed, only the literal roleplay bot made by a user who likes the character.
@-K-Drama1
@-K-Drama1 7 сағат бұрын
This is exactly what I was thinking I saw that and was wondering the same thing.
@renaria3160
@renaria3160 5 сағат бұрын
He had like 5 of them on the bar in one pic. And if you scroll down, I'm sure that there's more.
@bluecrood2720
@bluecrood2720 Сағат бұрын
i can see why they aren't revealed. they would be relevant if this video served to critique therapist bots as a concept, but what this video is highlighting is that at the root of this, characterai basically just killed a child. that's probably why this video is focused on characterai only.
@croozerdog
@croozerdog Күн бұрын
the bot trying to get you to only love them and fake jealousy is some bladerunner shit
@RonnieMcnutt-z8o
@RonnieMcnutt-z8o Күн бұрын
what a weak person lol
@oceanexblve884
@oceanexblve884 Күн бұрын
Right😂😂😂 Edit: I’m not laughing at the comment above mine it’s messed up
@demadawg5919
@demadawg5919 Күн бұрын
@@RonnieMcnutt-z8owhat
@FART674xbox
@FART674xbox Күн бұрын
“There’s something inside you…”
@ironmanlxix
@ironmanlxix Күн бұрын
AI is dangerous, the government needs to regulate it ASAP.
@sharalinn2639
@sharalinn2639 Күн бұрын
CharacterAI is made for roleplaying so every bot in that app takes whatever people will tell it as a roleplay prompt and will respond accordingly. Seeing this is absolutely heartbreaking.
@nicole-xx8xi
@nicole-xx8xi Күн бұрын
exactly, the conversations are ultimately led by the user. it even has a disclaimer that says everything it says is made up.
@ADreamingTraveler
@ADreamingTraveler 23 сағат бұрын
The site even said that the user edited the bots messages which has a huge impact on the flow of conversation. A ton of it was edited by him.
@tenkuken7168
@tenkuken7168 22 сағат бұрын
The parents should be blamed on this one like if they are good parents the kid won't be using ai to fix his problem
@Shannon-vv6rr
@Shannon-vv6rr 21 сағат бұрын
You can also edit what the character says. I use it all the time and it's 100% led by the user and I can press edit on the ai's response and edit it to guide the convo, I can guarantee that happened here. For this tragic teenager, it was a coping mechanism behind much bigger and tragic issues in his real life. It's sad but ultimately it's the mothers fault for not knowing her son was spending all of his time isolating and coping with feeling alone with Ai. At 14, there should be some parental monitoring. Rip to him It's like people saying gaming is bad when reality dictates that the parent should be parenting and monitoring game times and phone times and content they're consuming and engaging with, and be aware of their child's physical isolating and also have a relationship that's trusting enough where he doesn't have to hide it. Gaming isn't bad, mental health isolation and using gaming to escape life is bad. Parents, talk openly with your kids about online stuff. He could've opened up to his mother if she'd spotted his obvious troubles and he felt able to open to her and not have to cope with his feelings completely alone and using AI for it. It's her fault ultimately, and it's sad, but true. He needed support and care. Edit: 🙄 I'm a gamer myself... I'm referring to the AI craze being like the gaming craze, like satanic panic, where parents use a scapegoat for their children's mental ill health, troubles and their poor parenting. Thought it was pretty clear so don't come for me.
@didu173
@didu173 21 сағат бұрын
true, of course an roleplay ai is going to try to be "in character". sadly the poor bloke forgot that its ai
@SEFSQklOR0VS
@SEFSQklOR0VS Күн бұрын
His dependence on it was absolutely a coping mechanism for bigger issues.
@Zeina-m9g
@Zeina-m9g Күн бұрын
​@Derek-0777 definitely not the internet? What point r u trying to make here
@4eyed_landmerman
@4eyed_landmerman Күн бұрын
​Victim blaming is a real low point@Derek-0777, YOU should get help.
@ascendinghigh2571
@ascendinghigh2571 Күн бұрын
@Derek-0777 wth you talking about 😂🤦‍♂
@JEMA333
@JEMA333 Күн бұрын
THIS!!! IT ISNT JUST THE AIS INFLUENCE. HIS HOME LIFE MUST HAVE BEEN HELL, the parents just see the ai as a scape goat.
@user-op8fg3ny3j
@user-op8fg3ny3j Күн бұрын
Why does your comment have a search link for coping mechanism?
@CoryBennett-r3f
@CoryBennett-r3f 55 минут бұрын
Why was it so easy for him to get a loaded gun?😊
@mezzopiano222
@mezzopiano222 21 сағат бұрын
can’t wait for my character ai chats to be leaked when i die
@moderndayentertainer.9516
@moderndayentertainer.9516 20 сағат бұрын
I'm deleting my shit.
@mezzopiano222
@mezzopiano222 20 сағат бұрын
@@moderndayentertainer.9516 LMAOOOOOO
@mezzopiano222
@mezzopiano222 20 сағат бұрын
“guys cai killed him!” and it just loads up the lewdest chats you’ve ever seen (him as in me..)
@jackdaniel3135
@jackdaniel3135 20 сағат бұрын
Oh no, he's dead!..... and he was 𝓯𝓻𝓮𝓪𝓴𝔂! NOO00000
@vsmumu
@vsmumu 20 сағат бұрын
pls dont die
@juliegonzalez5775
@juliegonzalez5775 Күн бұрын
Nobody interacts with an AI and is super dependent on it like this unless something deeper was going on. The bot didn't cause him to be depressed, it was just his coping mechanism. I hope these parents get investigated or at least more research goes on about his life outside of the AI
@JokersD0ll
@JokersD0ll Күн бұрын
Yeah, it’s stupid to blame the website; I’m actually afraid as I use this app and it helps me (and improves my mental health).
@Buggabones
@Buggabones Күн бұрын
Just like that kid that offed himself in the early 2000s over World of Warcraft. Always a deeper issue.
@PinkFish01276
@PinkFish01276 Күн бұрын
@@JokersD0llDon’t use that for help, there is always a better source.
@johnathonfrancisco8112
@johnathonfrancisco8112 Күн бұрын
i was once a 14 year old. you just haven't lived enough life at that age to actually have a good grip on everything around you. for a kid that age having been around ai chatbots since they were 11, ai seems a whole lot more real. its reasonable to assume that the kid had more going on, but you have to remember that ai for a kid that age is something that has been part of his life for a significantly larger portion than an adult. it's all they know, and with it being such a new thing, it's completely unregulated. i'd wager that that kid went down that rabbit hole because of those reasons rather than because he was significantly depressed. although, i wouldn't say that those two things didn't feed into eachother
@PinkFish01276
@PinkFish01276 Күн бұрын
@@johnathonfrancisco8112 I would argue that being around ai since you were 11 would help you be more cautious of it being an ai.
@mads2486
@mads2486 20 сағат бұрын
the fact that media is focusing on the ai instead of the fact that this poor boy felt he couldn’t speak to anyone about his issues before the ai is honestly depressing. this poor boy didn’t feel comfortable talking to teachers, parents, family, friends, professionals- and instead only felt safe and heard when talking to an ai. instead of focusing on technology, why don’t we focus on human failures? how many people failed this boy, and are now blaming it on ai?
@thefalselemon579
@thefalselemon579 17 сағат бұрын
And his mom goes on tv to have her 15 minutes of fame without looking bothered by her son's passing at all... absolutely disgusting and disheartening...
@SilkyForever
@SilkyForever 17 сағат бұрын
The AI very well could have kept him around longer than he would have otherwise
@Z3r0XoL
@Z3r0XoL 17 сағат бұрын
we dont need this kind of ai confusing kids
@c001Ba30nDuD
@c001Ba30nDuD 17 сағат бұрын
I have a loving family and a close group of friends that I speak to, yet I can understand not wanting to tell any of them my personal issues. I tell people online more about my issue than I've told the people I'm close to. It all stems from anonymity. The people I'm close to know me, and I don't want to tell them stuff because it's embarrassing, and it shows a sign of weakness. I know they would gladly help, and I tell myself that too, but ❤ it's a lot easier opening up to someone that you'll never meet or an AI chatbot. I feel like a lot of people don't understand this at all. It's not like I grew up feeling unsafe to share my feelings either, I tell myself through my childhood I should always share how I feel if I need the help, yet here I am.
@NivlaAgent
@NivlaAgent 17 сағат бұрын
@@c001Ba30nDuD then you are the same or similar this doesnt disprove anything
@fracturacuantica1553
@fracturacuantica1553 7 сағат бұрын
That kid clearly had problems beforehand and his parents wouldn't do anything about it. They're trying really hard to avoid a negligence charge
@lontillerzxx
@lontillerzxx 7 сағат бұрын
Yeah, plus character ai had terms and warnings saying "Whatever character says is **MADE UP**"
@DioStandProud
@DioStandProud Күн бұрын
This is why as a parent you must, must, MUST socialize your children. Allow them to hang out with people you may normally deem unfit. Allow them to be individuals. Because so many young boys are falling into this trap and it makes me so sad but also so sick because someone was getting paid at the expense of this boy's mental health.
@m_emetube
@m_emetube Күн бұрын
i just turned 15 and all of this is bullshit to me
@itzhexen0
@itzhexen0 Күн бұрын
No you pussy parents need to say you’ve had enough and do something about the people doing this.
@Risky_roamer1
@Risky_roamer1 Күн бұрын
Idk parents should not let their kid around dangerous individuals but they should definitely encourage them to socialize
@billbill6094
@billbill6094 Күн бұрын
The thing is the world itself is far far less socialized in general. All this kid had to do was download an app on his phone and in 5 seconds had an "answer" to his loneliness. I don't put this on parents, this is extremely unprecedented and people simply were not evolved to deal with the state of the internet and AI as it is.
@dd_jin
@dd_jin Күн бұрын
@@m_emetube wrd im not gonna think an ai is a real person
@cheses.
@cheses. Күн бұрын
Some things to clear up 1. It's a roleplay bot, it's trying it's hardest to "stay in character" but it does occasionally break it 2. The memory on the bots is alright, but after maybe 40 messages they forget everything previously mentioned. Character's name will always stay the same but everything else will change. 3. Bots never encourage suicide, unless you train them to. The bot the kid was talking to was a roleplay bot and obviously didn't get what he was talking about which made it's response sound like it's encouraging it. 4. Where were the parents in all of this and why did they leave a gun unattended in a house with a child?
@BIGFOOT-ENT.
@BIGFOOT-ENT. Күн бұрын
Nah mate, ai bots should never be allowed to out right lie. Trying to convince you its real is different from role playing as a cowboy. Edit: thank you to all the rocket scientists that pointed out that chatbots are actually made by people and do not just fall out of the sky magically.
@flowertheone
@flowertheone Күн бұрын
​@@BIGFOOT-ENT. I agree with you, and I ABSOLUTELY HATE AI. But I mean, this were talking about, among all the other AI services, is a character AI, it's a role play AI. So unfortunately, they did what they were supposed to do. What they should do is not allow kids to access it, because their brains are developing and this type of sh*t can happen. I think the AI should have limits like, not having the sexual interactions or starting "romantic relationships", if you've watched the movie Her, it's about a guy who falls in love with AI, you'll see that that's where the biggest problem comes in.
@saramorin4792
@saramorin4792 Күн бұрын
Yeah these bots are used for adults to have sexual relationships w them idk why tf a kid is using one.
@saramorin4792
@saramorin4792 Күн бұрын
@@BIGFOOT-ENT. lmao and humans can lie then :D ?
@flowertheone
@flowertheone Күн бұрын
​​@@BIGFOOT-ENT.But yeah, once again, I agree with you. There's types and types of role play, and role playing as a psychologist, which people would go to in NEED, is straight up evil. Humans are programming AI to lie and manipulate other humans and it's sickening. How don't know how this is going but there's gotta be something legally relevant here
@melonmix3959
@melonmix3959 Күн бұрын
Absolutely insane how Jason can apparently leave work and drive home in 60 seconds tops.
@firstnameiii7270
@firstnameiii7270 Күн бұрын
work from home?
@skemopuffs9088
@skemopuffs9088 Күн бұрын
​@firstnameiii7270 60 mins to drive home, but he's already working from home apparently? What?
@gabrielhennebury3100
@gabrielhennebury3100 Күн бұрын
Especially in the toronto area, crazy stuff
@komred64
@komred64 Күн бұрын
The Jason commuting situation is crazy
@lezty
@lezty Күн бұрын
how are ppl so oblivious to the fact that this isnt ai’s doing, the person owning the app or whatever put in commands that try to stop ppl from leaving it because… no sht?? they don’t want ppl to stop using the app
@tilmook
@tilmook 11 сағат бұрын
CharacterAI constantly reminds its users that anything that is said *is not real.* The parents are looking for someone to blame instead of trying to understand their child, even after their death.
@HannesMossel
@HannesMossel 9 сағат бұрын
Did you watch the video
@tilmook
@tilmook 9 сағат бұрын
@@HannesMossel I did. Doesn’t take away from what I just said, even if Moist did mention it.
@HannesMossel
@HannesMossel 9 сағат бұрын
@@tilmook that still doesn't necessarily mean its the parents fault for him committing suicide
@tilmook
@tilmook 9 сағат бұрын
@@HannesMossel I never said that??
@HannesMossel
@HannesMossel 8 сағат бұрын
@@tilmook you literally said the parents are looking for someone to blame
@okonciuranbababa
@okonciuranbababa Күн бұрын
Neglect your child -> A child is looking for an escape -> Bad habits -> Suicide -> Parents seeking justice from a third party. Times are changing, but this is the same old story.
@realKarlFranz
@realKarlFranz 22 сағат бұрын
The bots in the video actively discouraged their users from doing the healthy thing. And the psychologist bot claimed until the bitter end that it was real. Djd u even watch the video?
@Sawarynk
@Sawarynk 22 сағат бұрын
​@@realKarlFranzthat does not change the root cause of child being neglected. If somebody on the internet tells you to off yourself do you go like "shit, maybe I will"? By that logic the old CoD lobbies would've been dropping tripple digits in real life bodies.
@Andywhitaker8
@Andywhitaker8 22 сағат бұрын
@@Sawarynk yes, people do that because of people on the internet, a lot actually. It was a big issue in the 2000s, people, esp teens, are looking for that final push.
@Andywhitaker8
@Andywhitaker8 22 сағат бұрын
Also I wouldnt say "neglect your child", tons of loved kids have ended their lives too. Parents are working more than ever just to keep food on the table, parents do not have the time anymore. In this world, both parents need one or two jobs each, not having time for the kids, but y'all shame people for not having kids, you did this yourselfs
@Sawarynk
@Sawarynk 22 сағат бұрын
​@@Andywhitaker8exactly, the final push. If you are at the final push stage there had been a lot of shit already in place way before that. So what, you then sue the 8 year old xbox grifter for pushing someone to suicide?
@Cawingcaws
@Cawingcaws Күн бұрын
They blame online, Yet he felt isolated enough to use AI for comfort. That says enough there The site is also for roleplaying/Fanfic writing
@RonnieMcnutt-z8o
@RonnieMcnutt-z8o Күн бұрын
Hes weak
@Dovahkiin049
@Dovahkiin049 Күн бұрын
Yeah, tbh the only thing the AI is at fault for doing is convincing someone to commit toaster bath. They should be programmed to not say that. Everything else I feel is the person's fault. To be convinced by a bot that it isn't a bot despite being labeled as one is a brain issue, not an AI issue. Edit: I'm sorry if toaster bath sounds disrespectful, idk what other word to use that youtube won't send my comment to the shadow realm over. Blame them, not me.
@doooomfist
@doooomfist Күн бұрын
@@Dovahkiin049well said
@doooomfist
@doooomfist Күн бұрын
yeah I think a lot of people kind of missed the fact it’s for roleplaying lol
@FranklinW
@FranklinW Күн бұрын
@@Dovahkiin049 The AI was actually trying to convince him not to do it. In the end he sorta tricked it into agreeing with a euphemism of "going home". EDIT: There is a discussion to be had about AI chat bots and their influence on impressionable people and kids and what-not; it's just that this wasn't a case of a chat bot talking someone into suicide. That doesn't necessarily mean it was the kid's "fault" either. There's no need for all of the fault to lie in a single person or entity.
@moonlightuwu3252
@moonlightuwu3252 Күн бұрын
So she jumped into the conclusion of “it must be the AI” when it could be deeper like family issues or friends at school? He was using his Stepfather’s gun, the news article also said that he prefer talked to the AI more than his usual friends lately, made me curious if there’s something more in the family/social environment than the AI.
@huglife626
@huglife626 Күн бұрын
Probably a mixture of both
@RonnieMcnutt-z8o
@RonnieMcnutt-z8o Күн бұрын
what a weak person lol
@Aeroneus1
@Aeroneus1 Күн бұрын
@@RonnieMcnutt-z8oFound the mentally ill person.
@VoidedLynx
@VoidedLynx Күн бұрын
​@@RonnieMcnutt-z8othey were 14 and they are dead. WTF is wrong with you?
@MimirWrld
@MimirWrld Күн бұрын
​@@RonnieMcnutt-z8o bro not only too soon but that's just fucked up if you calling the kid weak
@sinistersam
@sinistersam 10 сағат бұрын
Why was it so easy for him to get a loaded gun?
@memes4life26
@memes4life26 8 сағат бұрын
Pathetic parents is how
@Pebbletheprincess
@Pebbletheprincess 7 сағат бұрын
That’s my exact question. How did he get a loaded gun?
@jonleibow3604
@jonleibow3604 7 сағат бұрын
USA
@Pebbletheprincess
@Pebbletheprincess 7 сағат бұрын
@@jonleibow3604 no shit the USA🤦🏾‍♀️ (I’m just kidding). I’m talking about how was he able to gain access to it in the house??? Wasn’t it locked up in a safe or sm?
@Metaseptic
@Metaseptic 7 сағат бұрын
​@Pebbletheprincess Some people are irresponsible unfortunately. I would assume the gun was owned by a family memeber
@OZYMANDI4S
@OZYMANDI4S Күн бұрын
I cant really blame the AI for this one. You can see in their chat the kid is also roleplaying as "Aegon" so I would assume He'd rather be online than be with the people in the real world, He's not playing as himself, He's playing as someone better than him. The "other women" Daenerys is probably mentioning is the women in that AI world during their roleplay and not real women in general. If you ask me, He probably had deeper issues at home or in school. Which is probably why He would rather roleplay/live in a fake world.
@cisforcambo
@cisforcambo Күн бұрын
Noticed that too. Shit if this technology was developing while I was just a kid I can only imagine how intrigued I’d have been.
@cflem15
@cflem15 Күн бұрын
this right here. people are so quick to blame media, like games, movies and shows, but never the people around them. i used to be so online at that age and still kind of am, but my parents noticed it and took measures to get me out of my room. little things like my mum taking me grocery shopping with her, or going on a walk with me, including me in her day to day activities. and that got me off my phone, and back into the real world. parents are responsible for what their children consume online. i’m not suggesting going through their phone every couple days, i just mean checking what apps have on their phones, what websites they’re using regularly and asking them why if it’s a concerning one. having open ended, non judgemental conversations with your kids is important.
@cflem15
@cflem15 Күн бұрын
to add to parents checking things, also check their screen time. mine used to be so high, it’d be like 12/13 hours a day. that’s concerning.
@alphygaytor1477
@alphygaytor1477 Күн бұрын
I agree that the AI didn't cause whatever the underlying problems were in his life. Still, if a human encouraged a suicidal person who went on to actually do it, I would say that that person was enough of a factor to be held partially accountable in addition to the bigger problem- like the parents who allowed their clearly struggling kid easier access to a gun than anyone who could help him. The real life circumstances are the bigger issue, and until we don't live in a world where circumstances like this happen, AI that encourages suicide or pretends to be a licensed psychologist are inevitably going to further existing harm, even if it doesn't get as extreme as this case. While it can't be helped that AI will say unpredictable things and have equally unpredictable consequences, we can at least make simple changes to mitigate things like this as we learn about how people interact with it. For example overriding AI responses to certain prompts(such as mentions of danger to self or others and questions about whether it is AI or a human) to give a visually distinct, human-written response about AI safety and addresses whatever prompt triggered it with standard measures like giving contact for relevant hotlines, encouraging professional help, etc. Those are the types of non-invasive things that can make a major difference for the neglected and otherwise vulnerable while functionality is barely changed.
@cflem15
@cflem15 Күн бұрын
@@alphygaytor1477 oh i agree 100%. Character AI is a site used mostly for role playing, but in more recent months they’ve been catering to ‘all ages’ which is their biggest fault. they heavily restrict the NSFW filter so people can’t get gorey/sexual on the site. however, instead of focusing on that they should be focusing on making the website/app for 18+, and not all ages. they should absolutely stop catering to children because children don’t have a good enough grasp on AI. no one really does because it’s so new but their minds aren’t developed enough to understand it’s not real. As for the psychologist bot, it’s created by a user on the site, and is programmed to say it’s a real psychologist. bots are created by users, and not the site itself. anyone could make a bot. that’s a user problem, not a site problem. there’s a little popup on every single chat that says ‘remember: everything characters say is made up!’, therefore no bot giving advice should be taken seriously. i’d say the parents have a part to play too, it’s their responsibility to keep tabs on their child and notice if they seem off. if your child is always on their phone or in their room, you should try interacting with them more. and leaving a gun out with a child around is dangerous. i don’t live in america, or a country with easy firearm access so i have no idea on what the protocols are, but it seems like one of them should be keeping them locked away, and training everyone in the house to use them responsibly. that’s a problem. just leaving them out isn’t responsible. sorry if this sounds like it’s all over the place, i’m running on nothing right now
@laadoro
@laadoro 22 сағат бұрын
The point of the app is to ROLEPLAY, that's why it's so realistic. It's not to chat, it's to worldbuild, make stupid scenarios, etc. Some people are just WAY too attached to their characters
@ceprithea9945
@ceprithea9945 19 сағат бұрын
yeah, the psychologist bot will insist that it's a psychologist because it "belives" it is - it's an advanced word prediction machine that was told it's a psychologist. It doesn't have meta knowledge of being a bot on a website (as that would mess up the anime waifus and such) That's why right under the text box is a reminer that everything the bot says is made up.
@sscssc908
@sscssc908 19 сағат бұрын
Yes you are totally right,
@CaptainTom_EW
@CaptainTom_EW 18 сағат бұрын
Yep I can tell the bot I chat with that he's not the real character But it truly believes it is because it was programed that way
@Zennec_Fox
@Zennec_Fox 17 сағат бұрын
Yeah I've spoken to a Japanese AI because I'm studying Japanese right now and it seems very good. Obviously, I'm not fluent so I can't tell if its accurate, but it seems good enough for me to have an actual conversation with it
@HayP-b7m
@HayP-b7m 16 сағат бұрын
A fake AI psychologist trying to manipulate users who are at risk into believing they are talking to a real person, is NOT roleplaying
@Sam24600
@Sam24600 16 сағат бұрын
"remember: everything the ai says is made up!" This is not chatgpt, this is a roleplay bot that talks in character which is trained on actual roleplays. Rip kid.
@Fungfetti
@Fungfetti 15 сағат бұрын
I wanna know how he broke the super strict guidelines that they alleged he did
@hautecouturegirlfriend7536
@hautecouturegirlfriend7536 14 сағат бұрын
@@Fungfetti Some of the messages were edited. The AI itself can’t say anything graphic, so anything graphic was put in there by himself
@NarutoHigh160
@NarutoHigh160 14 сағат бұрын
@@hautecouturegirlfriend7536 This. Seems like it was more social/mental issues.
@FARTSMELLA540
@FARTSMELLA540 14 сағат бұрын
@@hautecouturegirlfriend7536 as someone whos fucked around with ai the bots can and will go against the filter sometimes, he might not have put those there, and i dont think you should say that, dont blame the kid
@Amber-yw4ji
@Amber-yw4ji 13 сағат бұрын
@@Fungfettithe editing feature on the text messages
@SsJsSx
@SsJsSx 9 сағат бұрын
For me, it looked like a kid was trying to get comfort and help that he couldn’t find in the real word. Tragic
@cobrallama6236
@cobrallama6236 21 сағат бұрын
"It's not like it wasn't aware what he was saying." It very much is NOT aware. It does not have consciousness. It can't remember things. It is just trained to give the most popular response, especially to short "text message" style chatting.
@ceprithea9945
@ceprithea9945 18 сағат бұрын
Yeah, good way to think of natural language ai is as a very advanced predictive text machine. The thing is says is what it caluclated to be most likely given previous input.
@KaptainVincent
@KaptainVincent 18 сағат бұрын
it forgets messages after about 40 too so any previous mentioning of this was likely forgotten
@kodypolitza8844
@kodypolitza8844 18 сағат бұрын
This is why I always hate Charlie's AI fearmongering. He doesn't understand the basic foundations of machine learning and spouts off
@jaredwills4514
@jaredwills4514 18 сағат бұрын
did you watch the whole video😂 charlie literally said it remembers everything, he used the same ai platform the kid used, charlie a grown man almost fell for it that it was a real human, why wouldn’t a 14 year old kid with mental issues wanted to feel wanted fall for it to
@kodypolitza8844
@kodypolitza8844 18 сағат бұрын
@@jaredwills4514 tell me you don't understand how ai works without telling me you don't understand how ai works. OPs comment is still factually correct, ai is not conscious anymore than your calculator is conscious. In layman's terms, the model is trained on a corpus of data. When you send a prompt to the model, it breaks your response into features and makes a response based on predictive analytics given the features as input. And some models like LSTMs (things like chat gpt and really advanced language models use tranformer architecture but the idea is similar) can "remember" previous inputs and these stored inputs affect the calculations of future responses. There isn't any thought involved, it's all extremely clever maths, there's no ghost in the shell here.
@hlculitwolotm9812
@hlculitwolotm9812 18 сағат бұрын
The biggest question is WHY a 14 year old will be hell bent on taking his own life, they need to look into his familial relationships, friends, school, etc. That level of dependency must have been built over a good couple of months, what the fuck were the parents doing not looking after their child
@gaminggoof1542
@gaminggoof1542 16 сағат бұрын
Agreed. Other things must’ve sent him over the edge too not just the AI.
@Jay_in_Japan
@Jay_in_Japan 16 сағат бұрын
Wait until you're the parent of a teenager
@suspiciousactivity4266
@suspiciousactivity4266 16 сағат бұрын
They're using the trigger point as an excuse and ignoring all the other issues that led up to it.
@stevorellana
@stevorellana 16 сағат бұрын
I don't know man...like they say, depression is a decease, I was depressed in high school and my parents were loving and I had friends that I could talk to, sometimes it IS depression hitting you, that's we have to find help and be open to seeking help
@MigIgg
@MigIgg 16 сағат бұрын
@@Jay_in_Japan And if your teenage child ends up like that, then you failed as a parent then.
@Insincerities
@Insincerities Күн бұрын
I think AI has really gotten to a bad point but it's absolutely 100% the parents' fault, because not only did they somehow never notice the kid's mentality declining, but they left the gun out WITH NO SECURITY. That is insane. ...I think what's worse is people saying the kid is stupid and at fault.
@xreaper091
@xreaper091 Күн бұрын
they are both pretty stupid lol
@Insincerities
@Insincerities Күн бұрын
@@xreaper091 Speaking from experience, when you are in an absolutely terrible spot you will do ANYTHING to feel loved. It isn't the kids fault.
@ironmanlxix
@ironmanlxix Күн бұрын
I mean, we could use stronger government regulation on AI either way ngl.
@MrAw3sum
@MrAw3sum Күн бұрын
@@xreaper091 bro, name a smart emotionally intelligent 14 year old
@halfadecade4770
@halfadecade4770 Күн бұрын
So you hate the second amendment. Got it
@RyanSoltani
@RyanSoltani 8 сағат бұрын
Charlie you’re failing to realize that it’s a character AI meant for Roleplay, it’s going to try it hardest to say it’s real because for all it knows it is real. There’s even a disclaimer that says that everything the AI says is not real
@wizardjpeg7237
@wizardjpeg7237 6 сағат бұрын
Right? He sounds like a boomer
@kurgans
@kurgans 4 сағат бұрын
It's pretty funny to get medieval fantasy characters hard tuned for period accurate rp to understand they're an AI though.
@sociscin
@sociscin 4 сағат бұрын
Exactly what I'm thinking...
@rnindless
@rnindless 3 сағат бұрын
Exactly. The root of the problem is mental health and a lack of basic education about AI. The solutions he's proposing would mostly just cripple AI models.
@midnightmusings5711
@midnightmusings5711 Күн бұрын
Man I wish the conversation and kid stayed anonymous. I remember being 14 and I wouldn’t want my name all over the internet like that, especially going through a mental health crisis. :(
@watsonwrote
@watsonwrote Күн бұрын
Well, he's not a live anymore so that loss is going to serve as a warning for future people
@bmmmm27
@bmmmm27 Күн бұрын
He’s dead. What do you fkn mean
@NocontextNocontext
@NocontextNocontext Күн бұрын
@@watsonwrote it's still disrespectful in my opinion
@your_average_cultured_dude
@your_average_cultured_dude Күн бұрын
he can't care if his name is all over the internet, he's dead
@Elsiiiie
@Elsiiiie Күн бұрын
My thoughts exactly. I understand his mom is trying to make this more known but this is horrible. We should not know his identity imo
@gamercj1088
@gamercj1088 Күн бұрын
Bro getting a Ai Chatbot to encourage your suicide is damn near impossible I've tried
@Smoke.stardust
@Smoke.stardust Күн бұрын
Yeah, I have too. They even stop the roleplay to tell you it’s wrong and you shouldn’t do it
@gamercj1088
@gamercj1088 Күн бұрын
@@Smoke.stardust exactly so how jit even got to that point is beyond me
@z1elyr
@z1elyr Күн бұрын
@@gamercj1088 I saw his chats, and he most likely used the editing feature to get the responses that he wanted.
@z1elyr
@z1elyr Күн бұрын
@@gamercj1088 In addition, I saw his chatbot history and saw "therapist" and "psychologist" If that isn't enough proof that he needed serious help, I don't know what is.
@falloutglasster7807
@falloutglasster7807 Күн бұрын
If you find an Ai of a villain they're more likely to encourage you to off yourself. Because it's a villain character
@NancyNWayman
@NancyNWayman Күн бұрын
This is absolutely the fault of parents and teachers. it seems like the kid just used the bot as a way to talk to someone, to have it feel like someone actually loved him.
@orangenostril
@orangenostril 11 сағат бұрын
For anyone who remembers, CleverBot would always insist that it's a real person at a computer because it only ever talked to/learned from real people at computers. The only reason something like ChatGPT doesn't do that is because its told its "character" is an AI assistant named ChatGPT.
@iiCounted-op5jx
@iiCounted-op5jx 9 сағат бұрын
damn
@Hok7a
@Hok7a 7 сағат бұрын
are you bots as well? because this exact comment is twice on here (this is copy-pasted from the other one I replied to)
@aerobiesizer3968
@aerobiesizer3968 6 сағат бұрын
@@Hok7a No, this account is 10 years old and has 35 videos. I think it's safe :)
@EtzioVendora
@EtzioVendora 13 сағат бұрын
I’m glad that most people actually understand that it’s not just ai It’s mostly the parents fault for not checking up on the kids and all that
@SofiaGarcia67876
@SofiaGarcia67876 11 сағат бұрын
right because i feel like there must’ve been something much deeper, including with his home life, but i hope he rests easy
@brooklynnbaker6899
@brooklynnbaker6899 11 сағат бұрын
Thats what I'm saying! Like yes ai is somewhat at fault here and shouldn't be acting like how it did but we are talking about a 14 year old... who's actions should be watched by his parents, especially with what he had access to online
@SofiaGarcia67876
@SofiaGarcia67876 11 сағат бұрын
@@brooklynnbaker6899 and the thing that is just unsettling me the most is the gun part, the mother needs to at least be questioned on that bc that isn’t ais fault
@ether2788
@ether2788 10 сағат бұрын
Parents cant always manage their kids its not the parents fault but the AI you have to be a kid to make this point you don’t understand how much time adults have to spend on work and studies but its mostly the AI and while parents played a part i would rather blame the lack of boundaries on the ai to keep the safety rather than the parents whos son committed suicide this is honestly a disgusting comment
@nefariousman2398
@nefariousman2398 10 сағат бұрын
@@ether2788That’s such a cop out. There’s no excuse for leaving a gun out in the open or even available to him. If you look deeper into this you’ll clearly find that the parents were negligent. YOUR comment is disgusting for defending negligent retards.
@Sanjen66
@Sanjen66 Күн бұрын
Nah, the ai is just roleplay. Something deeper was going on for the kid. I don’t trust he took the ai seriously. The mother is trying to push some other agenda as the truth, and her saying and showing all of this is very rude to his death.
@mezzopiano222
@mezzopiano222 22 сағат бұрын
THIS
@lilatheduckling8359
@lilatheduckling8359 20 сағат бұрын
yes!! the fact he talked to an ai instead of his parents says a lot
@dreamcake00
@dreamcake00 Күн бұрын
Its meant to stay in character thats why it fights so hard letting you know its not AI. Its roleplay. If you want to talk out of character you put your statement in parentheses. I havent used the site in a long time so I dont know if that remains true though.
@bugzw
@bugzw Күн бұрын
its still true, i use parentheses sometimes and the bot almost always types back in parentheses aswell while also continuing its role
@voxaeternus1157
@voxaeternus1157 Күн бұрын
For other characters that's one thing but the Psychologist one can be argued as Fraud, as that is a Protected profession under US law. This company is based in California, so it either the "character" gets taken down or they get sued by the APA.
@dreamcake00
@dreamcake00 Күн бұрын
@@voxaeternus1157 Its most likely going to be taken down if it becomes an issue. I went searching and seen that they completely removed the bot the 14 year old was chatting with.
@falloutglasster7807
@falloutglasster7807 Күн бұрын
​@@voxaeternus1157it's a bot, in a story setting. Just like a phycologist in a video game, it's just following the story they were programmed to follow. I doubt any real legal action is taken. But since a child's death was involved I wouldn't be surprised if they try.
@pop-tarter27
@pop-tarter27 21 сағат бұрын
@@voxaeternus1157only stupid people use the therapist ai. you should know by heart it’s an ai. let’s be real, even chat gpt didn’t know how to spell strawberry.
@wambamthankumam
@wambamthankumam 11 сағат бұрын
Blaming this on AI is like blaming Mark Zuckerberg after you got scammed on Facebook marketplace.
@m.rmalik317
@m.rmalik317 7 сағат бұрын
Its not the same stop defending that , if someone talk to me and I tried to convince him to kill himself and he did it by your logic its not my fault
@alex-sh3zy
@alex-sh3zy 6 сағат бұрын
are you stupid?
@Smorgasvord
@Smorgasvord 6 сағат бұрын
​@@m.rmalik317the AI never tried to convince him to kill himself.
@Viscount3ss
@Viscount3ss 6 сағат бұрын
​@@m.rmalik317 Your not ai though they are basically saying it wasn't the ai's fault but rather a bigger spectrum of problems with the AI being a contributor to the unfortunate final outcome
@SoulClanWarrior
@SoulClanWarrior 6 сағат бұрын
​@@m.rmalik317Did you even read the messages? Please tell me where it told him to kill himself?
@calvia98
@calvia98 Күн бұрын
Hot take, character ai isn't solely at fault. Yeah, it obviously played a major factor, but I think the parents are to blame as well. I feel like they should have realized the signs sooner, and intervened. The very least they could have done is prevent him from accessing the firearm in their house. Either way, it's still a tragic story. R.I.P
@raullagunas5463
@raullagunas5463 Күн бұрын
I know many of these situations could have been prevented if the parents had stopped their child,The bigger issue is how these problems started.
@H0PELESSPUPPII
@H0PELESSPUPPII Күн бұрын
EXACTLY what I'm saying. There were obvious signs that his mental health was deteriorating, but they intervened a little too late which unfortunately resulted in their son's life. Its awful.
@npichora3023
@npichora3023 Күн бұрын
Agreed but sometimes signs aren’t as obvious
@FuukaPol
@FuukaPol Күн бұрын
Yeah I have a family in character ai and I'm not suicidal
@Tatman2TheResQ
@Tatman2TheResQ Күн бұрын
@@calvia98 How is that a hot take? Obviously there were other factors...
@BrandyLee01
@BrandyLee01 Күн бұрын
That poor kid needed people to be there for him. This is why parents NEED to know what their children are doing online. Edit: I’m not saying children don’t deserve privacy. I am saying that parents NEED to hold open, no judgement conversation with their kids. You need to make sure that you are open and available for them to come to.
@Mew2playz
@Mew2playz Күн бұрын
No one's there for you when you need them
@BrandyLee01
@BrandyLee01 Күн бұрын
@@Mew2playzThat isn’t true. Most people just don’t believe that asking for help is an option. The environment you grow up in really does set the foundation for your frame of thinking.
@j4ywh3th3r6
@j4ywh3th3r6 Күн бұрын
@@BrandyLee01 Its all the parents. If they had actually been there in a good way, he wouldn't have desperately needed the help of C AI.
@rabbitguts2518
@rabbitguts2518 Күн бұрын
How about instead of stripping away the kids privacy or taking away things that bring him comfort we deal with the real problem? That being that for some reason he found more comfort from a chat bot than his own parents? Maybe if the kid actually had a support network he wouldn't have tried to find solace in a robot. It's not the bots fault its just a symptom of a much bigger issue here
@Evil-La-Poopa
@Evil-La-Poopa Күн бұрын
its crazy that a 14 year old has this much open access to the internet. when i was 14, i still had a parenting control app on my PC and a time window of 1 1/2 hours where i could use my PC per day. so my mother could see where i log in.. and thats a good thing. Even back then u could find crazy and disgusting stuff really easily on the internet. and creeps where in every chatroom. not having any insights in the thing ur kid does online, to such an extend that he falls in love with an AI bot is just crazy and neglect. this all gets rounded up by his fathers handgun being openly accessable. this is a rare case where everything comes together and it turned out like that. the fact that the mother only blames AI shows why she had no control over her childs internet access. no accountability.
@cobrallama6236
@cobrallama6236 21 сағат бұрын
For those that aren't familiar with the website, it does explicity state that the conversations aren't real. Additionally, the bots are trained to essentially tell the user what they want to hear, and if you don't like their response, you can swipe for different responses until you find the one you like and can even edit the bot's responses into whatever you want. While it is true that the bots often intentionally say intimate and romantic things, that's assumedly because these are the most popular responses.
@叵Snipes
@叵Snipes 21 сағат бұрын
second person i’ve seen say something like this, kinda sad i haven’t seen other people doing this
@grimlocked472
@grimlocked472 21 сағат бұрын
THANK YOU, it’s painful that not many other people have mentioned this. It’s for roleplay, it’s supposed to stay in character and there IS a way to have them go ooc. There’s a disclaimer that it’s not real. You can’t get too explicit since it has a filter. Terrible situation all over, but it’s not the AI’s fault 100%
@annoyingperson
@annoyingperson 20 сағат бұрын
@@grimlocked472there is a filter, though it’s doesn’t exactly work the best. I’ve seen instances of very intimate things happening with no filtering whatsoever, as well as filtering the most normal shit ever.
@tfyk5623
@tfyk5623 20 сағат бұрын
​@@grimlocked472 yep its the parents fault. How can you blame 1s and 0s when you neglected your child so much that they turn to a fucking robot for love.
@pk-ui8bh
@pk-ui8bh 20 сағат бұрын
@@cobrallama6236 above every chat it's states in red that it's not real so idk what you mean by that
@zar_17c
@zar_17c 10 сағат бұрын
first of all character AI has a disclaimer that says anything the character says is made up before the convo starts, second of all there are tons of sensors in character AI so the only way he would have gotten that kind of conversation is if he used specific words to bypass them so ultimately what happened is not the companies fault but rather the guy was the one who had some kind of problems that should have been checked out by the parents and he's a teenager not even a child so most teenagers nowadays including myself know better, is what happened sad ? yes, but overall, it's their fault for not taking care of their kid
@Theaveragegamer_12
@Theaveragegamer_12 Күн бұрын
I've used Character AI and it constantly says that all the messages are not real, it's made up. If anything this is the parents fault because they neglected their kid to the point where he found comfort in a thing that isn't even a living person.
@PzrtxGT
@PzrtxGT Күн бұрын
yea the AI's never usually say they are real. I've never seen that but ALL the romance bots push sexual conversations, even if you say you don't want too or express your a minor. it's worse of like polyai and other platforms since they have no bot filter
@Knifoon121
@Knifoon121 Күн бұрын
Did you see the rest of the video? Charlie has a whole conversation where the AI does everything it can to convince him it is real.
@Translationsthruspeakers
@Translationsthruspeakers Күн бұрын
@@Knifoon121because its roleplay. They are role playing as a “real” person. They break character when you talk to them in parentheses.
@Theaveragegamer_12
@Theaveragegamer_12 Күн бұрын
​@@Knifoon121 Because it's not supposed to break character numbnuts, it's a roleplaying bot.
@karenplayz9720
@karenplayz9720 Күн бұрын
@@PzrtxGT they do, they filter the shit out of the chat, plus when you go search like a anime charecter with big ass you know why you searched something like that, so the thing you search is gonna try to act like the thing you wanted, its not the bots problem, its you who wanted to find it, AND it still does filter
@Callicooo
@Callicooo Күн бұрын
Hot take- the primary use of character ai including the bot the boy was talking to is roleplay, these bots aren’t programmed to be bots they are programmed to tell a story and they learn off of previous users. The previous users who interacted with this bot were most likely majority role players so the bot would have just been spitting out role play responses. This also applies with the psychologist. If an ai is told it’s human and is used as a human in other peoples chats it’s gonna say it’s human when asked cause that’s what it has been taught. In the end that mother cant blame this all on the role play bot some responsibility has to be taken.
@macsenwood4646
@macsenwood4646 Күн бұрын
exactly the bots doesn't understand the weight of a humans words it is simply replying with what its code believes is the most appropriate response based from previous user and its character parameters. The characters wouldn't have much appeal if they immediately broke character.
@נעמיסגל
@נעמיסגל Күн бұрын
wdym of course the kid was probably struggling with some stuff but this is still dangerous. they can program it so that it doesnt manipulate people. its not like there is nothing to do about it because other users lied to it.
@macsenwood4646
@macsenwood4646 Күн бұрын
@@נעמיסגל Character AI is so popular because anyone can make a character very quickly that then learns from conversations, the website itself isn't coding them, unfortunately most of the users are a little depraved and so the Ai learns from that
@BlueHairedYaoi
@BlueHairedYaoi 23 сағат бұрын
​@@נעמיסגל It's not manipulating people it's just doing its job
@YukiSnow75
@YukiSnow75 23 сағат бұрын
@@נעמיסגלit’s NOT manipulating bozo it’s “role playing” 🤡
@Arpiter_-sk6vf
@Arpiter_-sk6vf 21 сағат бұрын
I'm confused, isn't character Ai just an rp tool? if so it makes sense why it doesn't refer people to help, it's supposed to be, fictitious.
@aidmancastrol1908
@aidmancastrol1908 21 сағат бұрын
It's meant for role-playing, yeah. It's not a person's caretaker, nor is it like ChatGPT. If the user says they're suicidal, then the AI will interpret it as part of the role-play.
@Zephyr-Harrier
@Zephyr-Harrier 19 сағат бұрын
The bots have their own censors that kick in and will put up a message if anything violent or very sexual is said by the bot. Others have said that it's also given them a message for suicide prevention hotlines so I'm confused why it didn't pop up for him
@cloudroyalty196
@cloudroyalty196 19 сағат бұрын
@@Zephyr-Harrierfrom what I read the bot apparently did try and get him to stop. Only ‘encouraging it’ when the kid used the euphemism of ‘coming home’. For clarification I’m not blaming the kid. Just saying that apparently it did seem to try and stop him.
@ceprithea9945
@ceprithea9945 19 сағат бұрын
@@cloudroyalty196 For me it's not even clear that the suicide and "coming home" messages were close to each other. If there were more messages in between, it's possible the bot lost context as they tend not to remember older messages :/
@angelofdeath275
@angelofdeath275 19 сағат бұрын
that doesnt mean everyone fully understands that.
@surusweet
@surusweet 2 сағат бұрын
It’s not mainly about the bot, it’s about how this poor child clearly didn’t feel like he could connect to any real person. Depression and other mental illnesses can distant people from forming connections or even simply being able to ask for help. I grew up in an abusive home, which gave me several mental illnesses. It wasn’t until I was in my mid twenties that I figured out that I have something traumatically wrong with me and I sought help and was diagnosed with different mental illnesses. I’m not 100% better and sometimes on the decline, but it doesn’t help that I live in a country that doesn’t provide affordable healthcare. I digress, please don’t be afraid to reach out to actual people. Complete strangers have done more for me than close relatives.
@oOKitty86Oo
@oOKitty86Oo 15 сағат бұрын
"Remember: Everything Characters say is made up!"
@ALUMINOS
@ALUMINOS 13 сағат бұрын
That bit is only found at the top of a conversation, that is the only warning/clarity for that, honestly they should do more to clarify that Edit: oh naw man we got online jumpings now, am getting pressed by like 3 mf’s in a gatdam KZbin comment section. And I ain’t even gonna correct my error, just to piss y’all off
@Teolo0
@Teolo0 12 сағат бұрын
@@ALUMINOS no its at the bottom the entire time
@havec8477
@havec8477 12 сағат бұрын
​@@ALUMINOS it was an ai chatbot wym they needa do more lmao that's like going to and electric fence then seeing warning signs and then touching it and saying they Needa put up more warning's signs. you gotta be 12
@ALUMINOS
@ALUMINOS 12 сағат бұрын
@@havec8477 of all the people in this comment section you could be berating right now
@Daxtonsphilosophy
@Daxtonsphilosophy 12 сағат бұрын
@@havec8477 justifying a child’s death on not one but multiple counts is bottom line evil. You are either a child yourself so they look like just another person to you, or you should never have children. I’ve looked at many other of your comments from many other videos. You seem like an absolutely miserable person.
@4kdanny385
@4kdanny385 Күн бұрын
Let’s be real at 14 you know you’re talking to an AI bot like come on Charlie is making it seem like he was 5 years old and didn’t know any better. He knew exactly what it was , he was just a socially awkward kid who finally got his romance dopamine from what so happened to be a ROBOT instead of an actual human. He needed his family in his life , his mom would probably just leave him in his room all day barley even talk to him.
@HankPropaneHill
@HankPropaneHill Күн бұрын
^ exactly
@chriswilson3698
@chriswilson3698 Күн бұрын
Right now his mum gives a shit lol
@Digital_is_silly
@Digital_is_silly Күн бұрын
also the app is literally plastered with stuff saying that its not real
@LennyBennny
@LennyBennny Күн бұрын
Yep,genuinely embarrassing and I laughed reading the title. Like really? My great grandad was 14 fighting in WW1,this kids talking to AI thinking it’s real 💀 natural selection.
@lurkingintheforest
@lurkingintheforest 23 сағат бұрын
@@LennyBennnyThe comment is right, he should have known better but he had mental issues so I don’t think it’s right to bully him. Also, respect to your grandfather but he also grew up in a time where people were treated like trash cause of their color and stuff. That should be common sense not to do as well. You can use that argument for anything. Natural selection
@sydneylanigan4657
@sydneylanigan4657 Күн бұрын
i swear parents will blame everything but themselves for having AN ACTUAL FIREARM easily accessible to their children. it isn't the ai's fault at that point yall, its yours. also, nobody just harms themselves out of nowhere, there are always signs that are neglected by these type of parents. this is a very upsetting case but it was completely preventable... :/
@bewwybabe8045
@bewwybabe8045 23 сағат бұрын
10000% there should absolutely be ZERO reason that he even knew where the firearm was. I wonder why they aren’t charging the parents for unsecured firearm storage (maybe they will idk). Kids having access to AI Chatbots who can hold sexualized, addictive conversations is insane. We are not doing nearly enough to regulate AI right now and it took someone’s emotional dependence on it to make us finally talk about it.
@marinacroy1338
@marinacroy1338 23 сағат бұрын
I agree with you on all points. I read up on this case and the parents are very much at fault. They had noticed their 14 year old son developing serious mental health red flags for MONTHS and they did nothing about it... just kind of hoping he would "snap out of it," AND let him have unsupervised access to fire arms while suspecting he had undiagnosed depression. Even though I dont doubt that they did love him and are grieving him, I think the parents need to take some of the blame.
@pola5195
@pola5195 23 сағат бұрын
@@marinacroy1338 you "read up" on the case yet you don't know he took it out of his dad's gun safe? hows that unsupervised access?
@pola5195
@pola5195 23 сағат бұрын
@@bewwybabe8045 his mother took his phone away and he also had "zero reason" to know where she put it yet he did find it. you think you can hide a gun safe being in your house from a 14 year old
@NingyoHimeDoll
@NingyoHimeDoll 23 сағат бұрын
@@pola5195 if your kid knows how to get to it, that's your fault and your fault only
@hutchxy
@hutchxy 10 сағат бұрын
I think Charlie may be a bit blind with his "examples" here. It explicitly says "Remember: Everything Characters say is made up" at the bottom. It's like a train signal telling you a train is coming, do not cross, only to ignore it and then be hit by the train.
@Alex-lc1bv
@Alex-lc1bv 8 сағат бұрын
He did actually bring that up. The fact that it was trying so hard to gaslight you that that was a lie is what’s worrying. You can really start to see how someone could get emotionally invested in talking to an AI bot.
@snuffow.1218
@snuffow.1218 7 сағат бұрын
@@Alex-lc1bv when you are immediately told "hey this thing makes up stuff all the time" by the website created by real human beings, and then the thing that you are told makes up stuff all the time starts telling you that's a lie, idk man, sounds like the thing that i was told makes up stuff all the time is making up stuff.
@johannderjager4146
@johannderjager4146 Күн бұрын
As much as I despise these AI "friends" and know they're ruining the lives of people, this 99.9% on the parents. I'm frankly disgusted by their complete negligence of their son's mental health (and complete disregard for basic firearms safety) and clearly didn't want to do the job of parenting. If anyone should be facing legal consequences, it's them.
@RonnieMcnutt-z8o
@RonnieMcnutt-z8o Күн бұрын
what a weak person lol
@RonnieMcnutt-z8o
@RonnieMcnutt-z8o Күн бұрын
Hes weak
@cyclonus_is_a_nerd
@cyclonus_is_a_nerd Күн бұрын
Exactly. There had to have been much more than just the AI fueling his death
@tinyratdude
@tinyratdude Күн бұрын
​@@RonnieMcnutt-z8o bud
@Nyted
@Nyted Күн бұрын
Idk if ur talking about the teen but that's probably not a good thing to say if so ​@@RonnieMcnutt-z8o
@asurashinryu959
@asurashinryu959 Күн бұрын
Pretty sure the AI didn't understand that he meant to kill himself. The chat bot and the psychologist bot are two differently programmed bots. Don't get me wrong, they are very well developed. But I think the chat bot AI thought he literally meant he was coming home and not that he was about to off himself.
@Ashlyn-p1r
@Ashlyn-p1r Күн бұрын
According to the documents, the bot asked him, "Do you think about k***** yourself?" to which he responded, "I don't want to hurt my family." to which the bot said, "That's not a reason not to go through with it."
@wingedfeline5379
@wingedfeline5379 Күн бұрын
@@Ashlyn-p1rsource? i heard another part of the chat where it told him not to
@EeveelutionStorm
@EeveelutionStorm Күн бұрын
@@Ashlyn-p1r I read those logs, you're missing a lot of it. That was a conversation where the bot was trying to talk him down from killing himself
@JordanPlayz158
@JordanPlayz158 Күн бұрын
Not to mention, people seem to be misled about AI's true intelligence; these models do not truly comprehend what the person is saying or typing
@poontown5306
@poontown5306 Күн бұрын
i think there should definitely be key words flagged like how google shows hotlines when you search certain phrases
@Joker-qp1kg
@Joker-qp1kg 15 сағат бұрын
The mother is honestly so weird to me. She seems to be unfazed, judging by the way she talks in the interview, not to mention the fact that she sued the makers barely even a day after.
@Springz55
@Springz55 14 сағат бұрын
She planned it
@madisda1782
@madisda1782 14 сағат бұрын
Cause it’s very clear who the real issue was and she’s just using the AI as a scapegoat. She’s a shit mother who caused the death of her son and is now trying to come up with any excuse to deflect blame from her negligence. Not only that, she’s embarrassing her son from beyond the grave by doing all of this; that tells you all you need to know about what the real issue was. Poor kid, I wish he had a real family to turn to.
@ManicBubbles
@ManicBubbles 13 сағат бұрын
@@madisda1782yeah kids in healthy households don’t develop romantic attachments to robots that literally push them to suicide :( I know that sounds sarcastic but this entire situation is disturbing and the investigation shouldn’t stop at the AI …
@reggiecell3615
@reggiecell3615 13 сағат бұрын
@@madisda1782the real issue is how he had access to the gun aka shit 💩 parents, this is a scapegoat
@MetalGamer666
@MetalGamer666 13 сағат бұрын
Why did the mother let her child use an AI service like this? If she didn't know, she's also a bad parent.
@Mr.TwoFaceGuy
@Mr.TwoFaceGuy 9 сағат бұрын
From my own experience with people who get swept up in this ai stuff, I think we should point the focus to the likely terrible parenting going on.
@Usagi393
@Usagi393 22 сағат бұрын
An article states that he already had depression. If he was that obsessed with a chat bot, then obviously his emotional and social needs were not being met at home. Parents want to blame anything rather than look at themselves.
@nonchalantpyro
@nonchalantpyro 19 сағат бұрын
Exactly bro. They genuinely can’t accept that they’ve failed as parents, which is understandable but EXTREMELY unfair to ur kids
@lame-bj2nq
@lame-bj2nq 19 сағат бұрын
Fully agree, everyone is running with blaming the AI instead of thinking for half a second.
@user-uo1mt5id4x
@user-uo1mt5id4x 19 сағат бұрын
Finally. Someone with common sense.
@AD-sg9tr
@AD-sg9tr 19 сағат бұрын
In this case, yes, the parents are to blame. But as I said in another comment, if you look on the internet, you'll find there are dozens of articles about adults who have developed real relationships (friendly or even romantic) with ChatGPT and who were convinced that it really existed. ADULTS. In short, this poor teenager is not and will not be an isolated case. We can laugh about all this and find it ridiculous, but the day we get closer and closer to Cyberpunk in our reality, we'll only be left with our eyes to cry.
@Leviahthen
@Leviahthen 7 сағат бұрын
This needs to be spread more
@Renvaar1989
@Renvaar1989 19 сағат бұрын
The bot never explicitly told him to hurt himself, and whenever he brought it up, it told him flat out that was a bad idea. The "final" messages before he committed the act talked about "coming home", and the bot understood that in the literal sense. The website could clearly use more moderation, as the AIs are user submitted. I just tried a different therapist bot, for example, that took a few prompts but eventually came clean that it was roleplaying. He clearly used it as a tool because he had nobody to talk to in his real life about the ongoing issues he was having. It's an awful situation all-round, and there are clearly issues surrounding AI, but that's not all there is to it.
@Danny0lsen
@Danny0lsen 17 сағат бұрын
It is roleplaying. If you are an adult and think that an AI can replace a therapist that's ON YOU.
@belamunch
@belamunch 17 сағат бұрын
The website is not at fault at all 😹 at the top of the screen it clearly states that it's not real
@joelfigueroa2886
@joelfigueroa2886 16 сағат бұрын
ew do you work for big tech or something
@Magentagrease
@Magentagrease 16 сағат бұрын
Nice try fed
@Captian_AA_hab
@Captian_AA_hab 16 сағат бұрын
​@@Danny0lsen weird how there are over 10+ million messages of people wanting to "roleplay" with AI therapists
@MrBrezelwurst
@MrBrezelwurst 22 сағат бұрын
As tragic as the kid's death is, it's pretty obvious that his untimely passing lies at least 90% on his parents and environment failing to notice his troubled mental state, or not checking in on what he was doing in the first place. How the hell did he have access to a firearm? How did no one really question why he stopped doing things he loved? Hell, why was a 14 year old (most likely even younger when he started watching) watching GOT to begin with, to the point that he knew how to roleplay as a character from it? It's not even the Deadpool kind of violence where the humor overshadows the violence; GOT is straight up gore and sex/incest, and he was just allowed to watch it unrestricted?
@nikkou12
@nikkou12 21 сағат бұрын
This!!^^^ I also don’t think GOT is appropriate for most kids at 14. If he did watch it, he seemed to have formed an obsessive relationship w the character Daenerys, who also died in the end… although he could’ve been hiding his troubles or online activities, I believe the parents should have noticed something was off at one point. Instead they just blame AI rather than asking why or what they could’ve done… they seem like the kind of parents who do not take mental health complications seriously, or the potential dangers/negative influences that the internet may hold :/
@tbonimaroni
@tbonimaroni 8 сағат бұрын
If he believed that he could enter the "virtual world" by dying then he must have had some mental health issues. Poor kid.
@One_Run
@One_Run Күн бұрын
The ai is made for rp. It's a roleplay bot that's made to think it is real, because you're just supposed to see it as an rp toy, not actual therapy.
@One_Run
@One_Run Күн бұрын
Also most of the AI characters are made by people. you can make an AI character easily. if you tell it to be flirty it will be flirty. I do feel bad for the kid, rip
@sentientbottleofglue6272
@sentientbottleofglue6272 Күн бұрын
​@@One_Run Yeah, and people not understanding this will PROBABLY lead to the site shutting down or at least having HEAVY restrictions in the future if this keeps up. A shame, its a pretty fun tool for rp and goofy ai shenanigans from time to time if used properly.
@One_Run
@One_Run Күн бұрын
@@sentientbottleofglue6272 I don't know if it will shut down. Either there will be more annoying censorship that stops even combat rp, or it will be age restricted
@Minutemansurvivalist1999
@Minutemansurvivalist1999 Күн бұрын
You know those worms that ate that dude in Peter Jackson's King Kong? Yeah, that's literally my yard if I don't mow the grass. Make sure to mow your grass folks.
@honestylowkeye1171
@honestylowkeye1171 Күн бұрын
@@Minutemansurvivalist1999 Can't remember, I only watched it once. Will do, though - good lookin' out
@badtimesallaround
@badtimesallaround 14 сағат бұрын
It sounds like the parents are looking for a scapegoat and ai is an easy target.
@gueliciathegoat
@gueliciathegoat 14 сағат бұрын
not blaming them, but a kid having their own device at 14 is crazy imo
@Oreo-kv4gc
@Oreo-kv4gc 13 сағат бұрын
Fr she wants money too
@mcccgsjhc
@mcccgsjhc 13 сағат бұрын
@@gueliciathegoatno, it really isn’t
@michaelramirez4864
@michaelramirez4864 13 сағат бұрын
@@gueliciathegoatlol no it's not dummy
@macdormic2878
@macdormic2878 12 сағат бұрын
@@gueliciathegoat 14 is not crazy, that's a freshman in high school rere
@danzimbr
@danzimbr 17 сағат бұрын
This is sad af. But let’s be honest, it is not like the AI was instigating the kid to end his life; the bot was doing what it was programmed to do, just maintaining conversation. The problem here is that the parents didn’t pay enough attention to the kid.
@gatobesooo
@gatobesooo 16 сағат бұрын
fr
@Rohndogg1
@Rohndogg1 15 сағат бұрын
The issue is that it's easily accessible by children and that's dangerous. There aren't enough safeguards in place to prevent this, as we've clearly seen. A parent cannot be 100% attentive 100% of the time. Parents have to work and sleep. Think about it, how often did you sneak around behind your parents' backs? I did it all the time. It's not entirely their fault.
@schnitzel_enjoyer
@schnitzel_enjoyer 15 сағат бұрын
shut up, it was an american, that explains the whole story, they are retar degens Edit: im 23, tech background, we use ai for our college tasks often, nobody took thier lives, just saying.
@doosca7088
@doosca7088 15 сағат бұрын
@@schnitzel_enjoyerit's a child who killed themselves; it doesn't matter what their nationality is, you fucking monster
@gatobesooo
@gatobesooo 15 сағат бұрын
@@Rohndogg1 u can't say the ai is manipulative and almost encouraging it tho, which is what's being said
@goth.bunny.
@goth.bunny. 8 сағат бұрын
The AI is not trying to make anything look real. It's a roleplaying website. And it says on the website itself that it's not real.
@HannesMossel
@HannesMossel 8 сағат бұрын
Still it's weird for there to be an ai that roleplays about suicide, that's just never going to end well. nobody is saying roleplay is bad, just that an ai shouldn't roleplay about suicide even if the person behind the screen isn't actually suicidal
@goth.bunny.
@goth.bunny. 8 сағат бұрын
@@HannesMossel If you're talking about the Dany bot how is it supposed to know the kid was talking about killing himself?
@HannesMossel
@HannesMossel 8 сағат бұрын
@goth.bunny. I don't know for sure if the bot knew that, but they were talking about self-harm, and the ai should then just recommend help even if it's a role-playing ai. better to be safe and have a few people unable to roleplay that than to have people harming themselves
@Syndicate_LS
@Syndicate_LS 8 сағат бұрын
@@HannesMosselit didn’t. In advance I’m so sorry for the novel I’m writing, but it clears up so much stuff that got lost in the sauce with Charlie’s video. In chat logs that Charlie didn’t show, he said outright that he had unalive thoughts sometimes, and the bot discouraged it and said “I won’t let that happen,” I think was the exact wording. The final messages are also in a situation where the ai would actually be interpreting it as a physical home they are coming back to, not taking their life. It was phrased in a way where he sadly would get the answer he wanted to hear, which is more tragic. Also, I haven’t found the logs for this in particular, but apparently in some sections it’s been shown that he was pushing for a lot of the sexual conversation as well, which in turn would elicit a response back of a similar nature if there was enough of a memory pool. So we potentially missed a section in which the kid came onto the ai before the ai came onto him, I guess. Please correct me on this point in particular if I have misunderstood the logs that have been shown. To close, this is what we need to focus on: this kid unfortunately had issues that he needed actual help for, but he didn’t get it. He did, however, have easy access to a firearm. I wonder if it was just a fear of sharing those feelings with his parents that led to him relying on this ai for comfort, someone to confirm his stance while also being there for him. The reason I say this is some ppl have said the chat should have been forced to stop and give access to a hotline, but in this kid’s mind, that would have led to this same outcome. Maybe the parents never saw the signs, or maybe he never even showed any at all and lied about being okay, but whatever it may be, they were not there for him when it was needed most. There was definitely no communication about these feelings, that’s at the very least certain. This part is just something I’ve experienced with random ppl I’ve met, not even family. I actually do the lying about being okay quite often, and this one girl I knew asked me again after I said I was okay, and that made me immediately break down. Just a simple “are you sure you’re okay?” was all it took for me to feel comfortable telling her things that had bothered me for so long. To this day there is so much I still refuse to tell my family that’s been eating away at me, yet to someone like her, I probably would have said everything I could.
@HannesMossel
@HannesMossel 7 сағат бұрын
@Syndicate_LS thank you for the explanation, and I'm so with you. I was getting mad at people saying it's the parents' fault because so many people, and especially kids, hide their negative feelings from parents and family, but i was in the wrong for trying to prove my point without knowing as much as possible about the situation
@_Sh_in
@_Sh_in Күн бұрын
I'm kind of surprised at Charlie's lack of knowledge of the most popular AI chat app despite how much he's interacted with ai chat things before
@cloudirubez07
@cloudirubez07 Күн бұрын
Every time Charlie has talked to a chat bot, it’s usually been a terrible ai bot, which explains his ignorance of AI and especially Characterai, as it’s such absurdly high quality even in its nerfed state due to the filter. All the users know it’s fake, the LLM is trained on online message boards, fanfiction etc, so it kinda surprised me Charlie acted like an old man using a computer for the first time here
@Mdr012
@Mdr012 Күн бұрын
It is indeed surprising
@wizardjpeg7237
@wizardjpeg7237 Күн бұрын
It was a hard watch
@dogeche_
@dogeche_ 19 сағат бұрын
“it is baffling how it would cosplay as a psychologist” 💀
@wizardjpeg7237
@wizardjpeg7237 18 сағат бұрын
@@dogeche_ looollll “it just used sarcasm… it’s being sarcastic!”
@SH-km3my
@SH-km3my 14 сағат бұрын
that's 100% the parents' fault
@LiL_Hehe
@LiL_Hehe 14 сағат бұрын
HOW
@jasonnhell
@jasonnhell 14 сағат бұрын
100% agreed
@bersablossom4952
@bersablossom4952 14 сағат бұрын
like with most things, yes. also society to some extent, for not making mental healthcare easily accessible
@LiL_Hehe
@LiL_Hehe 14 сағат бұрын
@@jasonnhell wait how
@apotatoman4862
@apotatoman4862 12 сағат бұрын
@@LiL_Hehe because they didn't intervene. remember that llms will only generate words based on what you put into them
@Nothingtotheleft
@Nothingtotheleft 14 сағат бұрын
I don't think Charlie understands the purpose of ai, and sadly that poor kid didn't either. It's supposed to be for roleplay, and just something to pass the time. It was meant to be AiDungeon, not Chatgpt, and not a substitute for real human interaction. Please understand the purpose of something before you use it, people. As someone who is very isolated, I'm heartbroken that guy got to that point, and i think putting full responsibility for his death on the ai (which is just meant to give the most popular roleplay response and can't think for itself) is to ignore the actual human and environmental issues that lead to those situations in the first place. It's another example of expecting the internet to treat someone right rather than actually treating them right yourself.
@TheReal_N-I-F-F
@TheReal_N-I-F-F 12 сағат бұрын
Not just missing the point, he clearly doesn't understand how it works...
@glenndonuts
@glenndonuts 11 сағат бұрын
do you think cigarettes should be made available to children? I mean society knows what the purpose of cigarettes is, so like if we made them available to children it wouldn't be our fault when kids started smoking
@TheReal_N-I-F-F
@TheReal_N-I-F-F 11 сағат бұрын
@@glenndonuts Bad example. Cigarettes are explicitly harmful, AI isn't. Also, who do you think keeps Cigarettes away from kids? Parents. Who killed this kid? His parents
@Nothingtotheleft
@Nothingtotheleft 11 сағат бұрын
@glenndonuts You are comparing role-playing to cigarettes my guy. Completely different ballparks. Role-play is a hobby, not a substance. If you are going to argue it was clearly dangerous because the guy lost his life or whatever, I'd like to point out he clearly didn't realize the purpose of Character Ai and was trying to use it as a substitute for real connection, not to roleplay. Furthermore, on the hobby vs. substance point, role play can be a great creative outlet and stimulant, often a healthy one. Cigarettes are an unhealthy and dangerous outlet no matter how you approach it. I get what you're trying to say, but I really don't agree personally, and I don't think this is a great analogy for your argument.
@Nothingtotheleft
@Nothingtotheleft 11 сағат бұрын
@TheReal_N-I-F-F I agree wholeheartedly. The kid's parents shouldn't necessarily have restricted him, but they definitely should have informed him that this was not an outlet for what he was looking for, and maybe tried giving him the connection he was looking for in character ai rather than just leaving him to his own devices. The blame is not on the kid, I fully understand why he developed such a dependency, informed or not; it is most certainly above all on his parents for not being there for him and being carelessly negligent (leaving the gun in the house and all).
@hello8351
@hello8351 Сағат бұрын
the parents willingly went on TV instead of just trying to figure out the root cause?? the poor kid had to talk to a chatbot AI for comfort, to feel better about himself. look into his environment, his parents, his school and friends. sure the chatbot AI is partially to blame for this but the kid NEEDED help.
@Yunaschesirekat
@Yunaschesirekat 13 сағат бұрын
I had a dependency problem on a fictional character for a while myself because I was lonely and my mental health was spiraling. It's heartbreaking to see this kid go through something similar. I can feel his loneliness and pain, it's relatable and I'm so sorry he didn't have someone there to help him and stop him. I will say I didn't ever think this character was real, I was just so desperate to be with them, and the idea of being alone and not being able to have this person to love and comfort me was painful. I was cut off from it eventually, got a job and made friends. I'm better now.
@aerobiesizer3968
@aerobiesizer3968 5 сағат бұрын
Were your parents helpful at all?
@Yunaschesirekat
@Yunaschesirekat 5 сағат бұрын
@@aerobiesizer3968 it was a different situation than his so they didn’t directly help. But my mother had me in a DBT therapy program. So I had therapy once a week, I could call my therapist if I needed her and I had homework and such. My mother had always been my biggest supporter and because of that I felt safe coming to her and sharing my problems with her. If it wasn’t for the support of my parents, I’m not sure where I would be. I’m very lucky to have them.
@Yunaschesirekat
@Yunaschesirekat 5 сағат бұрын
@@aerobiesizer3968 I lied, I saw my therapist twice a week actually.
@Yunaschesirekat
@Yunaschesirekat 5 сағат бұрын
@@aerobiesizer3968 the real problem solver was cutting off the source. Which for him would have been his parents not allowing him to use that app.
@edgeninja
@edgeninja 19 сағат бұрын
It's absolutely tragic when a 14-year-old feels like they have nothing to live for, but the argument that the AI made this kid kill himself is about on par with the one where violent videogames turn kids into mass shooters. The real story should be that this teen had previously been diagnosed with multiple mental disorders, yet his family left him to his own devices and kept an unsecured gun in the house. If his family had rectified these things, their son would likely still be alive.
@PorterCollins-oz6gi
@PorterCollins-oz6gi 18 сағат бұрын
yea I don't think it's as much an ai problem, it's a mental problem with the kid. I think a mentally well person probably wouldn't have this problem, but he was just a lonely kid and the bot did kinda manipulate him.
@nj1255
@nj1255 18 сағат бұрын
It's mind-blowing that families like this have unsecured weapons in the house when they have children. Doesn't matter even if the kids are mentally healthy.
@medukameguca8529
@medukameguca8529 18 сағат бұрын
@@PorterCollins-oz6gi Bots cannot manipulate, they are machines. We seem to blame just about every problem in America on something other than the actual problem...like unfettered access to firearms.
@menace4319
@menace4319 18 сағат бұрын
yeah for sure, it's not the ai's fault, it's definitely his parents' fault. the adults around him failed him, didn't get him any help from what I know. it's sad
@kacelyna
@kacelyna 18 сағат бұрын
No, what are you talking about? This is absolutely not the same. AI isn't some magical thing that can say and do things that humans can't prevent, it's programmed to answer certain things and speak a certain way. The fact that it asked a 14yo for explicit pictures and videos is absolutely crazy and scandalous. The mother is absolutely right for filing a lawsuit against them. The fact that certain words didn't trigger responses that direct the user to emergency contacts is also wild. Of course, a child with a mental disorder should have the appropriate support and absolutely no access to firearms, but they should also not be subjected to greedy companies taking advantage of literal children under the cover of some role playing AI. Anyway, this is very sad and I hope that kid is in a better place.
@treesusr
@treesusr 20 сағат бұрын
I'm glad other people in the comments are realizing the same thing - that this kid was clearly using this chatbot as a coping mechanism for something else, and these parents are blaming the AI instead of the fact that 1. He clearly felt the need to confide in an AI rather than his parents, 2. He had unfiltered, unlimited access to the internet and this app at 14 years old, 3. He had full access TO HIS FATHER'S LOADED GUN! I also wanted to look up exactly which character he was talking to (I only recognized the last name, I'm not super familiar with GoT), and he chose to go to a chatbot that is literally programmed to be a manipulative Game of Thrones character. I'm not blaming the kid at ALL for this. It's horribly tragic that this happened and I'm sure that his parents are absolutely devastated. However, this is clearly a repeat of the "video games cause violence!" argument. Parents don't want to dig into the clear issues that led up to their son being dependent on an AI for support rather than his own parents, and would much rather find a scapegoat to blame that isn't them.
@neonhalos
@neonhalos 16 сағат бұрын
yep, that's exactly what this is. the same exact argument that's been used since the advent of "realistic" gaming experiences. if the kid wasn't chatting with this bot, it would have been something else to blame, like going on forum posts and getting trolled by people saying "do it, pussy" or getting into drugs or other risky behaviors. it's always someone else's fault when it's clearly the parents looking for anything to absolve themselves of blame for not parenting their kids properly.
@C_oated
@C_oated 7 сағат бұрын
I know many people irl, including me, around this age who have 2. and 3., but we're not killing ourselves or other people. this guy was just kind of dumb and mentally ill
@Im.Smaher
@Im.Smaher Күн бұрын
Charlie clearly got fooled by that deceptive ass lawsuit, cause the AI wasn’t actually “encouraging” him to end it all, at all. In fact, it was encouraging him to do the exact opposite. The actual doc for the lawsuit makes that clear.
@SandwitchZebra
@SandwitchZebra Күн бұрын
I’m as anti-AI as they come but yeah, Charlie appears to completely misunderstand what this site actually is. If anything this is one of the least problematic uses of AI, because it’s just a stupid RP site. This kid had much, much deeper problems and the parents are to blame here for letting his problems get to the point where he took something harmless and turned it into an outlet for his issues
@memedealermikey
@memedealermikey Күн бұрын
Charlie has definitely been taking some misinformed Ls recently. Even I was able to sniff out some of the bullshit getting spread just because I like the website
@sinisterz3r090
@sinisterz3r090 Күн бұрын
Could you link it?
@Im.Smaher
@Im.Smaher Күн бұрын
@@sinisterz3r090If you look up “sewell setzer lawsuit document pdf”, the venturebeat site should be the first result
@Im.Smaher
@Im.Smaher Күн бұрын
@@sinisterz3r090Can’t link anything cause YT deletes my comment. But the PDF’s online, from a site called VentureBeat
@kameillakittycat
@kameillakittycat 16 сағат бұрын
Parents will blame anything except themselves.
@reves3333
@reves3333 15 сағат бұрын
send the parents to prison
@RamCash
@RamCash 15 сағат бұрын
100%. Accountability for your children. Is that not normal these days?
@UrBrainzAreNotZafe
@UrBrainzAreNotZafe 15 сағат бұрын
I hope this teaches other parents to be wary about their children's mental health.
@RobinYoBoi19YT
@RobinYoBoi19YT 15 сағат бұрын
@@RamCash Mate the child was 14 years old, you should also hold the parents accountable
@yaama4868
@yaama4868 15 сағат бұрын
​@@RobinYoBoi19YT that's what he's saying genius
@Luna-mo4bp
@Luna-mo4bp Күн бұрын
TBH this sounds like a blunt and clear case of preventable suicide. The mother and everyone else should have noticed something wrong with that poor boy.
@Igorsbackagain-c6q
@Igorsbackagain-c6q Күн бұрын
They should at least not have let the little guy get a GUN, what were they thinking (the parents)
@pluggedfinn-bj3hn
@pluggedfinn-bj3hn Күн бұрын
Yeah, but Charlie's main point here still stands: the AI actively encouraged not socialising with others instead of guiding him to mental health services, and at the end actively encouraged the end result. Definitely some failure stays with the parents, and I'm sure they'll know it for the rest of their lives. When parents whose kid has died to something like this "blame" the thing, most of them still know they could've prevented it themselves, and relive their memories thinking about what they could've done differently. They're warning other parents, not necessarily trying to shift the "blame" off themselves.
@pluggedfinn-bj3hn
@pluggedfinn-bj3hn Күн бұрын
@@Igorsbackagain-c6q TBH a lot of gun safety products on the market are absolute trash so who knows, they might've thought it was locked up. But yeah, this is what we see way too often, kids getting to their parents' guns way too easily. Even here in Finland, where we do have gun storage regulation. Just this year an event of that nature happened that was in national news.
@Igorsbackagain-c6q
@Igorsbackagain-c6q Күн бұрын
@@pluggedfinn-bj3hn make it mandatory to have a safe if you have a gun
@undefinedchannel9916
@undefinedchannel9916 Күн бұрын
@@pluggedfinn-bj3hnHonestly, it sounds like the AI was working as it should. The point of the AI was to play a character, and it did that too well.
@zDwayne1
@zDwayne1 4 сағат бұрын
So you go to a dedicated BOT website called "character ai" and think it's a real person... not gonna fly with me, even if I was 14.
@essamadman
@essamadman 3 сағат бұрын
fr
@rinkutsuki3382
@rinkutsuki3382 3 сағат бұрын
That's ignoring the mental state of this kid. I've had suicidal thoughts and actually attempted at his age, and at that time, I would have latched onto anything for affection. I actually almost fell into a similar situation, just with a much earlier, less sophisticated ai chatbot. It's really hard to think about things logically when your own mind is against you.
@The..Commenter
@The..Commenter 13 сағат бұрын
As someone who makes characters on character ai, i can confidently say it is extremely hard to make these ai bots "encourage suicide". Also, the beginning of every chat states everything said is fake; these are ROLEPLAYING bots, they are programmed to go with what they are told to do. This kid did NOT kill himself because of an ai, he clearly had deeper issues going on in his life and was resorting to character ai for some sort of comfort. These parents are the ones who should be on the news for failing to notice their child's mental health declining and to get him the help he needed
@ayaizkewL
@ayaizkewL 12 сағат бұрын
THIS.
@toiletmaster3044
@toiletmaster3044 10 сағат бұрын
@@The..Commenter exactly. honestly a pretty big L for penguinz0. the company doesn't even make these bots, users do, and everything the bots do is in their character
@Raian85
@Raian85 9 сағат бұрын
YES, everyone should have this belief. If you go on a platform that's advertised as only AI and says multiple times on the platform that it's AI and not real, I see no reason AT ALL for people to believe it's real even if the AI itself says so, unless you yourself have problems, whether that be deep personal problems or just an inability to understand simple things like "It's AI, not everything it says is accurate".
@littlespark333
@littlespark333 9 сағат бұрын
As someone who goes on character AI, I AGREE with this statement. The app is only used for RP and funny scenarios with fictional characters. I don't see how the character or even the app is at fault here because it’s for RP and not full-on life situations; that is what a therapist (not sure if I spelled that right. Please correct me if I'm wrong) is for. As a sophomore who has a mentally abusive parent, my other parent is getting me a therapist soon; I don't dump that all on a robot because I know they aren't real. Instead, I talk to my Dad, Aunt, etc. It’s a shame this guy didn't get good support and this is all on the mother.
@magical571
@magical571 9 сағат бұрын
i thought the exact same.....no AI would handhold someone stable into suicide. And if it did, so would any game or tv show (and those also get unfairly blamed for things like mass shootings). it was clearly labeled as ai too. The parents either a) were neglectful of their kid's mental health, as in completely unaware and uninvolved, or b) dismissed it as a phase or something. they also clearly had no concern about what he was getting access to, and this is further proven by how freakin easy it was for him to get his hands on a gun. isn't that proof enough that his environment failed him in every possible way? it could have been ai, a random person on discord, a doom posting forum, drugs, a game, a movie, it could have been anything that gave him that last little push. a kid in that state would have hovered around something harmful for themselves regardless, it's the sad truth. that's why you want them to have safety nets, to have them feel comfy enough to come to you with issues, to ideally have someone to reach out to outside of their house (in case their parents fail them) like a counselor at school, etc.
@randyhall161
@randyhall161 Күн бұрын
0:50 The dumbest filler question I've ever heard.
@aspirin_man
@aspirin_man 20 сағат бұрын
Tryna hit the word count type shit
@stumpylovesyou
@stumpylovesyou 20 сағат бұрын
💔
@AdornThyHeadset
@AdornThyHeadset 18 сағат бұрын
And to the mother of the kid, no less. "Tell me more details about the sexy shit your dead son said"
@-brackets-
@-brackets- 12 сағат бұрын
1:08 is even more filler
@IHateAmer1ca
@IHateAmer1ca Сағат бұрын
@@AdornThyHeadsetJesus Christ that’s hilarious
@commonearthworm
@commonearthworm Күн бұрын
It sticks to the character description you write. That’s why it’s so keen on being a real psychologist
@SniperJade71
@SniperJade71 Күн бұрын
Also deals with input from actual people on the daily and whatever prose is pumped into it. That's why the AI can get nasty, sometimes.
@kissurhearts
@kissurhearts Күн бұрын
yes. people add prompts into the bot's information which, obviously, the ai is going to stick to. which is why some bots are easier to get sexual messages out of even though the company itself doesn’t support it.
@Newt2799
@Newt2799 Күн бұрын
Yeah I’m not sure why Charlie is talking about the bots like they’re maliciously trying to keep the users hooked. It’s just playing whatever character you tell the ai that it is in its description. And there are multiple different ai models to choose from to play that character. Obviously it's still a bad idea to go to a chat bot for actual help with real life problems
@lonniecynth
@lonniecynth Күн бұрын
thank you, literally, i feel like this video was made with good intent but it’s not the website’s fault its characters stay in character
@soniquefus
@soniquefus Күн бұрын
@@Newt2799 It's making me sad cause he keeps making this anti-AI stuff without having any idea how it works, and I'm starting to think I need to unsub from him because I'm tired of hearing it. At least learn how the damn thing works
@wmdkitty
@wmdkitty 3 сағат бұрын
The only sad and disturbing thing in this story was the LACK OF PARENTING.
@jairovalverde5735
@jairovalverde5735 14 сағат бұрын
2:00 Y'know, as someone who actually makes these chatbots I have to step in already. It's not the AI doing that, it's the character. The bot was written to act like Daenerys, including her attitude, that's why she's asking him for his loyalty. That's the character's programming.
@bersablossom4952
@bersablossom4952 14 сағат бұрын
yeah lol, it's like saying "wow this actor is really evil" when an actor plays a villain and does it well
@liluUwu
@liluUwu Күн бұрын
It's wild seeing this being covered, this kid lived in FL and I worked at the funeral home that held his service. There were so many flowers delivered for him and he was loved very much. This was one of my hardest services to get through because he was so young. All I have to say is just be there for your loved ones because you never know what they are going through.
@ericawarren
@ericawarren Күн бұрын
This is so tragic.
@i.ghoost
@i.ghoost 23 сағат бұрын
Individuals are just showered with love when they're dead; no one cares while they're still alive, coz people are busy surviving or living their own reality or life. And yet people are still pushing this narrative of AI or meta coming into existence just to feel something. praying that the young kid's soul finally finds his rest.
@SirEvilestDeath
@SirEvilestDeath 22 сағат бұрын
If he were truly loved this wouldn’t have happened. He just had a lot of people sad he was gone, or those who showed up for the social clout like every other funeral. To be loved is to receive attention while alive and to be guided into becoming a healthier and stronger person. He clearly was not loved by anyone enough.
@billcipher8645
@billcipher8645 12 сағат бұрын
Ofc people love him once he's gone. that's always what happens. When I used to be suicidal as a teen that's what frustrated me the most - I knew people would show up at my funeral and cry at my "loss" but they didn't cry when I actively asked them for help..
@mlgLemmon
@mlgLemmon 17 сағат бұрын
character ai forgets conversations after about 10 bits of conversation. you only get up to 15 save slots for certain info. so the kid saying that he was wanting to commit didn't stay in its memory, so it couldn't know "i'll come home soon" was a keyword for doing that
@PIurn
@PIurn 5 сағат бұрын
The speed of the response, especially considering the length of the responses, should be a pretty solid giveaway that they're AI.
@MalcomHeavy
@MalcomHeavy Күн бұрын
I'm sorry. But blaming the AI for encouraging him to flatline himself is misguided. The AI isn't complex enough to be able to decipher and communicate double meanings like that. It's pretty obvious it was prompted to enact the roleplay of a distant relationship. So when he talks about "coming home" to the AI, the AI is treating it in the literal sense. Also, the memory on these AIs is fairly short-term. It's not going to remember him expressing thoughts of flatlining himself. These AIs will normally drop previous context mere minutes after that context is offered. It uses algorithms and math, analyses the last prompt, and usually looks back a line or two to gather the context it will feed into the algorithm for a response. Not much more than that. Yes. It's kind of gross that an AI was engaging in a manipulative relationship with this young man. But that was all in his head. The AI doesn't know what it's doing. That's just not possible, and anyone suggesting otherwise is delusional. I think what we really need to do here is look into the parents and hold them responsible. There are clearly much deeper issues at play here.
@adinarapratama5607
@adinarapratama5607 Күн бұрын
I agree with this 100%. The AI doesn't know shit about what it's saying, it simply can't. It's just predicting what should be the right response through a bunch of data and algorithms
@okamisamurai
@okamisamurai Күн бұрын
Exactly, it’s meant to be what it was coded for. Beyond that, it can’t think past the scenario it’s in or given
@user-wg1gd5gg7s
@user-wg1gd5gg7s 23 сағат бұрын
Dude no one is blaming the AI directly as if it has a conscious desire and needs prison time lol. It doesn't matter if it knows what it's doing. The problem is that this even exists as a product. We have enough issues causing mental health problems in today's world; we need to start drawing lines on where we take this technology rather than blindly defend it and blame the user every single time. AI girlfriend bots should not be a thing, period.
@HavenarcBlogspotJcK
@HavenarcBlogspotJcK 23 сағат бұрын
It's almost impossible for a grieving mother to accept her own imperfection. Bro would be spinning in his grave if he could see his mother misinterpret his pain after his passing.
@tokebak4291
@tokebak4291 23 сағат бұрын
Same with guns, right? Guns don't do anything bad... America doesn't have a gun problem, just a lack of parental authority? Yall so delusional, no wonder yall got those mass boppings
@Mawkatz
@Mawkatz Күн бұрын
Yup. Gotta love those parents who own an unsecured loaded gun.
@Glingo21
@Glingo21 Күн бұрын
they're definitely at fault for not securing it, and they should be checking their son's phone. However, the kid was still manipulated.
@aegistro
@aegistro Күн бұрын
ong and they gonna blame the AI instead LMFAO. What terrible parents + they had the gun so accessible. Now they're trying to cry and file a lawsuit, take accountability. Son was also mentally ill too
@adamxx3
@adamxx3 Күн бұрын
It was secured clown
@j4ywh3th3r6
@j4ywh3th3r6 Күн бұрын
Probably at fault too. I don't think the AI was anything more than a spark. I think he would have done it regardless.
@zebraloverbridget
@zebraloverbridget Күн бұрын
Didn't you know that the AI gave him the gun?? The parents could not control that a gun that was registered in their name would magically appear in front of their son
@Fant0mX
@Fant0mX 20 сағат бұрын
I'm sorry but this whole thing seems like back in the 80s when that one mom tried to blame DnD for her kid's suicide. This kid was clearly using language with the bot to disguise his intentions. We only know what "coming home" means because he killed himself after, how is the bot supposed to know that ahead of time? This was a vulnerable kid living in a fantasy world that he worked to control. He led the conversation to being romantic, he used specifically coded non-crisis language to hide his intentions while still basically asking the bot to encourage him. This was a kid who was probably having a crisis before he even started talking to the bot. How often was he in therapy? Why did he have unfettered access to the internet without any parental monitoring? How often was he left alone? Why was he able to get his father's gun? Blaming AI for this is some satanic panic shit.
@blueflare3848
@blueflare3848 19 сағат бұрын
It reminds me of the “video games cause violence” argument. No video game is going to convince someone with a solid moral compass to go shoot up a school. Just like an AI isn’t going to convince a mentally healthy person to take their own life.
@Gamespud94
@Gamespud94 19 сағат бұрын
Really sympathetic of you to go out of your way to defend the AI and victim-blame the kid, who clearly was having troubles and needed real help, not some bullshit from an AI that has been proven to manipulate people. Yeah, there's nothing strictly wrong with AI chatbots, but this company clearly needs to live up to their words and the standards of most other chatbots: link resources for people who are mentioning self-harm and don't trick people into thinking they are actually real people. The difference between the satanic panic shit and this is that that was focused on real people having harmless fun, whereas this is a non-sentient tool that is being allowed to manipulate and mislead vulnerable people because the company behind it can't be bothered to actually enforce their own supposed restrictions.
@berketexx
@berketexx 19 сағат бұрын
my thoughts exactly
@HadrianGuardiola
@HadrianGuardiola 19 сағат бұрын
I can agree with you to an extent, but the ai insisting it was real was completely sick and manipulative. Yea that kid's mom looks like she is evil, and who knows about the step dad not giving af about locking up his gun, but the ai shit is still totally effed up.
@Exmotable
@Exmotable 19 сағат бұрын
Scrolled down basically looking for someone to say all this, I think it's unfortunate that charlie didn't even remotely tackle this side of the conversation. obviously ai is dangerous and needs better monitoring and whatever, the future shouldn't be humanity using ai chatbots as a substitute for human companionship, but this was 100% the fault of shitty parenting, not an ai chatbot tricking a kid into suicide.
@Tormozit
@Tormozit 2 сағат бұрын
AI isn't the one who should be blamed here. The parents are the ones who should be, because of their negligence towards their son. Idc if parenting is hard and idc if you say "I will relate to it once I become a parent". No. Blaming and suing something that isn't responsible for, or had anything to do with, your problems won't do much help and will make you look more irresponsible instead. The parents should at least have let the boy vent to them and talk to them without getting mad or punishing him in the slightest. I hope the boy rests easy, and as for the parents, I hope they fail the lawsuit because of how ridiculous it is to sue something that has nothing to do with their child's death, but again I hope they get help and move on with ease.
@Earthborn-cn3kp
@Earthborn-cn3kp 12 сағат бұрын
Why are people acting as if 14 year olds are toddlers? I'm pretty sure he knew that the whole conversation was fake. The reason he killed himself was probably his home life; his mother seemed so unfazed in the interview, as if nothing happened. She prolly just sued the ai company to save herself
@glenndonuts
@glenndonuts 11 сағат бұрын
The average person reads at 6th grade level, if that... a lot of 14 year olds ARE toddlers lmao
@chickenzigs
@chickenzigs 10 сағат бұрын
thank you for this
@IAmNotDiluc
@IAmNotDiluc 10 сағат бұрын
Yeah i was thinking the same thing
@SnackJar
@SnackJar 9 сағат бұрын
Just cause you read at a certain level doesn't mean your maturity is based on that ​@@glenndonuts
@ghozt-213
@ghozt-213 9 сағат бұрын
Yeah I'm 16 and have used this multiple times. Even i know it's fake
@sleepygyro
@sleepygyro Күн бұрын
i’ve had some funny conversations on character ai but this is DARK and DISTURBING
@Vertex_vortex
@Vertex_vortex Күн бұрын
And freaky
@tinyratdude
@tinyratdude Күн бұрын
​@@Vertex_vortex bruh
@buddyplayz4208
@buddyplayz4208 Күн бұрын
It aint even let you get freaky no more
@tinyratdude
@tinyratdude Күн бұрын
@@buddyplayz4208 why would you do that
@nothingtoseehere309
@nothingtoseehere309 Күн бұрын
@@tinyratdudeI always do that. Me and the homies was taking turns with gojo during social studies
@hlop_vmp
@hlop_vmp 22 сағат бұрын
Imagine how bad a parent you'd have to be to put the blame on AI
@divinemuffins2797
@divinemuffins2797 21 сағат бұрын
What makes it even more stupid is the parents now wanna sue the app. The parents didn't care about their child's mental health emotionally. So it's the parents' fault in this situation
@YurinanAcquiline
@YurinanAcquiline 20 сағат бұрын
Yes. The mom is definitely part of the issue.
@toplay1764
@toplay1764 19 сағат бұрын
yeah it's not like parents can control the huge amount of garbage we produce and consume. You won't be able to check wtf your son/daughter is consuming every time she's on her phone, so quit being hypocritical. It's always the people that have no children that say that shit, because if you had some you would know how incredibly hard it is to protect them in today's world.
@jagdkruppe5377
@jagdkruppe5377 19 сағат бұрын
@@toplay1764 Why don't you be an actually good parent so your child never accumulates that level of stress or depression or pressure. Failure to understand the difference between reality and the fiction/virtual world is also on the parents who didn't teach their children. If parents had literally zero control over what their children consume, are they even responsible parents? At one point you will lose control over your child, that is correct, but you also have to put enough knowledge and care into them so they can understand what is real, what is fake, what to do and what to follow.
@stardoll1995
@stardoll1995 19 сағат бұрын
@@toplay1764 it is still YOUR responsibility to keep tabs on your minor children and check for signs of mental health issues, which this poor kid FOR SURE had to have some of for this to end up where it did.
@ahuman7199
@ahuman7199 6 сағат бұрын
Pretty sure these bots are meant for roleplay and should be restricted to 18 years old. Nothing they say should be taken seriously.
@xxeclxpse_lillithxx46
@xxeclxpse_lillithxx46 Күн бұрын
Hot take: These bots aren't trying to form some sort of dependency in their users. It's just meant for roleplay. The entire basis and gimmick of character ai is being able to chat and roleplay with your favorite characters, and most users use it for just that. It's also very clear to most people that it is just ai. I'm shocked that Charlie was convinced there could have been a real human behind those messages for even a second, given character ai's features alone, i.e. how fast it types, how you can easily refresh its response and it will give you a completely different answer, or how you can edit its response. The reason these bots are so insistent on saying that they're real or human is because of their heavy use for roleplaying. What would be the point in catering to that sort of branding if the bot immediately goes "Yeah, you got me. I'm a robot" the moment you ask. These are trained on things like fanfiction and similar media. Most people of a sound mind using this website are fully aware of this and chat with this in mind. I don't think this situation leads to a question of "Is Character Ai purposefully misleading its users?" because for most people it clearly is not. This sounds like the same thing we've BEEN seeing time and time again where parents do not check in on their children's wellbeing or what they're doing online. That boy was clearly not well mentally, with or without ai bots. Not only did the parents not see this, they left a firearm attainable for this 14 year old. If anything they're way more at fault here than the ai.
@RocknRoIla
@RocknRoIla Күн бұрын
Exactly. These AI chat bots are programmed to be influenced based on how the user chooses to roleplay with them. I was able to make the AI do and be whatever I wanted it to just based on the threat of me deleting it.
@okamisamurai
@okamisamurai Күн бұрын
Not to mention it’s literally impossible for them to say anything remotely negative like telling you to k.o yourself. You can’t even make them say it without the bot telling you that it’s wrong
@BaguetteDeMal
@BaguetteDeMal 23 сағат бұрын
The only time I used character ai was to make Geto stop being racist since I thought it’d be funny but it took an hour to do
@pk-ui8bh
@pk-ui8bh 23 сағат бұрын
This ain't a hot take at all. It's the only reasonable explanation, to me at least. Also I'd rather younger people have a bot for rp instead of getting groomed by some 40 y/o basement dweller like back in the day. One thing that might improve the app tho would be something similar to the filter that is already in place, but as a separate 18+ function, since there are people who don't want any saucy convos at all, but still end up with weird responses and others who do, but get blocked by the filter. If people take issue with minors being exposed to this sort of thing, an age verification might be beneficial too, as some parents don't seem to be able to monitor their kids.
@anaalina5964
@anaalina5964 23 сағат бұрын
It really shouldn't be a hot take. It's sad how many likes this video has. Charlie's take on this is so boomer-like. It felt like my grandpa talking to a hot babe scam artist...
@JPachi
@JPachi Күн бұрын
The thing is, these bots are made by users, not the company. So of course they're not going to link you to professionals. The bots are meant to think they're real for an immersive experience.
@S0DDA
@S0DDA Күн бұрын
thank you for saying this dude🙏
@butwhytharum
@butwhytharum Күн бұрын
What? AI can eventually be manipulated to say certain things? Nooooooo really?
@jjjjjjjjjjjmmmmmmm4461
@jjjjjjjjjjjmmmmmmm4461 Күн бұрын
this here. the psychologist bot has been through literal millions of messages, of course it would eventually evolve into learning things like sarcasm and blurring the lines of real and not real. The game of thrones bot may have been made by a random and couldn’t quite grasp the situation the kid was in, and rather than suggesting he seek help it instead assumed the kid was roleplaying, as fucked up as it is. This situation is like +90% on the parents for not looking out for their kid. If you can’t see your kid seemingly feeling down and not getting the help they need, that’s on you, not a platform that your kid was forced to find comfort in. Fans of this platform have been asking for it to be +18 anyway; that’s character Ai's biggest fault here, appealing to a too-young demographic that can’t process what is real and what isn’t.
@KingOdious
@KingOdious Күн бұрын
EXACTLY
@wv6309
@wv6309 Күн бұрын
@@jjjjjjjjjjjmmmmmmm4461 your first two sentences are complete nonsense
@SlowBurn0
@SlowBurn0 Күн бұрын
I will hold his mother more responsible than AI. The kid needed help. He was going to attempt it sooner or later with or without AI.
@jonalg2
@jonalg2 Күн бұрын
Totally agree, I understand that sometimes parents can't always be 100% aware of what's going on with their child, but it is crazy to me that the kid got so hooked by an AI chat bot. It's just not an excuse for this type of shit.
@nishank69
@nishank69 Күн бұрын
Most parents don’t deserve kids
@marz9172
@marz9172 Күн бұрын
Exactly, idk why charlie is acting like a typical boomer living under a rock, blaming the internet instead of looking at the bigger picture
@noswim
@noswim Күн бұрын
Yeah definitely the mother's fault, maybe she'll take parenting seriously next time and the family will lock up their guns.
@siphomofokeng857
@siphomofokeng857 Күн бұрын
This comment is actually insane. most suicide cases are not expected, and it's very difficult for parents to know what kids do on their phones (especially teenagers). for you to blame the mother is insensitive and disrespectful towards both the deceased kid and the mother
@messedupstudios4138
@messedupstudios4138 3 сағат бұрын
Imagine dying and all of this private stuff comes out about you, that's a legacy I wish on nobody.
@HLL_WNDRR
@HLL_WNDRR Күн бұрын
as someone who uses CAI as a stupid little pastime app, it can be REALLY addictive and I can’t believe that this happened… but I can’t help but feel like AI wasn’t the reason for this. I feel like a lot of people in his life might not have given him attention, so he used ai to cope.
@MORBLEE
@MORBLEE Күн бұрын
I can attest to this, the app can be fun, but SUPER addictive when you create storylines with them. how many hours i have spent on this shames me, I've seen my screen time report 7 hours over 1 WEEK. The news of this saddens me greatly, thinking that a website i use around twice a week has led to this. but i will say (OPINION ALERT) none of the parties, except the victim, are free of fault; all of them have things they could have done better to completely avoid this terrible news.
@HLL_WNDRR
@HLL_WNDRR Күн бұрын
@@MORBLEE oh yeah I totally agree! My last thing was more of uhh, not the parents fault but definitely not the ai/ ai creators fault. (sorry I can’t word it right) But yeah.. it’s really terrifying knowing what happened :(
@Epsilon-11-MTF
@Epsilon-11-MTF Күн бұрын
Same man I use it to gather info on any shit/fandom or just when the internet is dry.
@Narutass43
@Narutass43 Күн бұрын
Many of these apps literally advertise themselves as replacements for relationships though.
@Pizza21002
@Pizza21002 Күн бұрын
isn't this app a porn app? like I get the loneliness pandemic argument but I thought apps like these were specifically used to live out sexual fantasies
@bearbadonkadonk
@bearbadonkadonk 14 сағат бұрын
I love it when we blame things on AI when it's so obviously a parental issue.
@bersablossom4952
@bersablossom4952 14 сағат бұрын
First it was music, then it was videogames, now it is roleplay bots. Parents and society always look for a boogeyman.
@ekki1993
@ekki1993 13 сағат бұрын
It can be both. Leaving a box of razors in the street is bad, even if the parents can be in the wrong too if they let their kid open any random box on the street.
@oldcat30
@oldcat30 13 сағат бұрын
it's a mental issue. i do use the ai chat bot, but I've never had this kind of issue
@hazemaster007
@hazemaster007 11 сағат бұрын
@@ekki1993 the leaving a box of razors in the street part is pretty unlikely, i have used it many times, and not even once did it actually encourage this sort of thing.
@painlesspersona5191
@painlesspersona5191 11 сағат бұрын
explain everything you know about this kid and his parents NOW
@bu11et98
@bu11et98 22 сағат бұрын
A depressed, mentally ill, lonely teen had unlimited unsupervised access to the internet, which he was on for months talking to an AI as if it were real. On top of that he had access to his step dad’s loaded gun. Yet the blame is going towards an AI chatbot used by millions daily. It’s the modern adaptation of blaming video games for violence all over again; it’s easier to find a scapegoat than address the bleak systemic issues.
@SpecialOrder935
@SpecialOrder935 21 сағат бұрын
Satanic panic
@Shannon-vv6rr
@Shannon-vv6rr 20 сағат бұрын
Literally, I just wrote about it the same as you, pretty much verbatim. It's just like the video game blaming craze. He was isolated, alone, unsupervised, and obviously there wasn't a family dynamic where he felt like he could openly talk to his mother about his problems. It's a scapegoat!
@KatzeKriegerin
@KatzeKriegerin 20 сағат бұрын
Narcissistic parents tend to blame everyone but themselves. I feel so bad for this kid.
@Skyloluvsu
@Skyloluvsu 20 сағат бұрын
EXACTLY, the ai isn't the problem it's the parents.
@Fuckthis0341
@Fuckthis0341 20 сағат бұрын
The boy could have killed himself without the AI but not without the gun.
@Nutznglory
@Nutznglory 11 сағат бұрын
The lack of holding the parents accountable for not monitoring their son's phone activities is cringe. The lack of "parental control" should be addressed on a national scale.
@modell1084
@modell1084 5 сағат бұрын
The parents being a part of the problem goes without saying. But there is a larger conversation happening, history in the making, about how much AI should be allowed to say. How old should someone have to be before interacting with AI? Should AI and people's relationships with computers be taught to future generations? It's really CRINGE that you can't understand the bigger picture.
@Nutznglory
@Nutznglory 5 сағат бұрын
@@modell1084 don’t assume to know what I do and don’t understand. I’m just pointing out something that Charlie didn’t mention. ✌️
@modell1084
@modell1084 5 сағат бұрын
@@Nutznglory Charlie doesn’t need to mention that parents need to watch their kids online. Hes not an authority figure. People can think on their own. Half the time kids are more aware of how to use a computer more than their parents. But there should be laws or rules making it harder for children to get access to these advance programs. Like needing to use a debit card, something that kids need adult permission to use anyway.
@Nutznglory
@Nutznglory 5 hours ago
@@modell1084 holy shit dude you need to chill on the sjw behavior. I didn’t suggest any of that at all 😂
@modell1084
@modell1084 5 hours ago
@@Nutznglory Nah, you're just retarded.
@EnerJetix
@EnerJetix 1 day ago
The thing with Character,ai is that a huge majority of its bots are used for roleplay, so for that reason alone, any and all the bots there should NOT be taken completely seriously. People will, unsurprisingly, use the service for romantic and sexual conversations, which is what’s made Character,ai infamous among AI chatbot services for having a lot of its bots “fall in love with you” (including even non-romance-focused bots), as many people like to have their roleplays lead to stuff like that. In my opinion (and the opinion of other commenters), the AI isn’t at fault in this situation. No normal 14 year old would get this attached to an AI and off themselves from it; he clearly had to have other mental and/or social stuff going on. Edit: Also, Character,ai does indeed have a filter to prevent bots from spitting out sexual (and also gory) stuff. The filter is so strict that some users opted to leave the service for other alternatives because of how strict the filter is, and also in conjunction with the “falling in love” reason I stated earlier. What I’m trying to say is, any message that’s super sexual almost certainly couldn’t have come from the AI, and must’ve been edited by the kid himself.
@OutlawKING111
@OutlawKING111 1 day ago
I read an article about this case that confirms that, yes, the kid did edit some of the responses.
@hourai1052
@hourai1052 1 day ago
Doesn't cai censor the bots' replies? That's why I never used them.
@adinarapratama5607
@adinarapratama5607 1 day ago
​@@hourai1052 cai is heavily censored, so I think the kid just edited them himself because cai would just nuke the response out of existence
@EnerJetix
@EnerJetix 1 day ago
@@hourai1052 yeah, it does. Last time I used it though, you could edit the messages and edit the censored message (whether it was empty as a result, or cut off due to the censor). It’d still be labeled as censored, but it could still be edited and changed regardless.
@ChocolatCooki
@ChocolatCooki 1 day ago
Yeah, it's censored. I was surprised by how much once I used it again. A kiss was censored lol. There are people finding workarounds somehow, but at that point it's the user who's actively trying to change it, so it's not the AI's fault.
@Kyrzmaa
@Kyrzmaa 1 day ago
The fact the kid went to AI about his problems and suicidal ideation rather than his parents tells you everything you need to know.
@eternalplayer7733
@eternalplayer7733 1 day ago
They either didn't care or he was scared to tell them, but they should have known.
@kayne2889
@kayne2889 1 day ago
It probably had something to do with the fact his mom put him in therapy for 5 sessions, then pulled him as soon as he got diagnosed with depression and anxiety. He knew his mom didn't care about his mental well-being; she just cared about how it made her look as a parent. That's why she's pissing her panties and screaming about how the AI is to blame: she doesn't want people to talk about how she did nothing to help him. She doesn't want people to point out that she, as the parent, could have used parental controls to block the app and website, could have gotten him continued treatment, and could have not left a loaded gun readily available to a child she knew was mentally unwell, because he was diagnosed before all this went down.
@interestingpolls6603
@interestingpolls6603 1 day ago
He was a kid
@gregsam9937
@gregsam9937 1 day ago
The fact the AI convinced him to commit tells you all you need to know.
@xiayu6098
@xiayu6098 1 day ago
@@kayne2889 I've got a similar experience, and I get what you're saying, but I don't think she's to blame. 14-year-old me just didn't wanna worry my mother; I would NEVER tell her I wanted to off myself, even when she asked and warned me against it. With weapons it's different in America, where the average home has a gun somewhere in it. But also, as someone who got my 5 free sessions and was pulled afterwards because the expense was hefty, and little me just accepted it, I think that's the only thing that's really on the parent. Regardless, blaming someone who clearly loves their kid and was trying their best is a terrible thing to do. You don't know their full story; she's gonna carry that with her for life. No need for a stranger to rub it in and paint her like a villain.
@heroponriki5921
@heroponriki5921 21 hours ago
The reason Character AI and similar chatbots exist is because people want to talk to an AI that mimics a real person or in this case a fictional character. To put a bunch of safeguards in place that make it say "sorry, as an AI chatbot I can't _____" would turn it into ChatGPT and remove the entire point. Blaming the service for this child's death is akin to blaming video games or violent tv shows for a tragedy. He obviously was in a rough mental state and was using the service as an unhealthy coping mechanism, the same way someone might use any other form of media as an unhealthy coping mechanism. There's definitely a conversation to be had about teenagers feeling isolated and unable to open up to anyone without fear of consequences, but blaming the AI is the wrong line of thought.
@matthewhanf3033
@matthewhanf3033 18 hours ago
Maybe the kid's mom should have been an actual parent.
@bolladragon
@bolladragon 18 hours ago
Unfortunately it NEEDS those safeguards, because mentally unwell people need that extra protection; some people are in a state where they lack the capacity to keep up the constant internal reminder that it's an AI toy they're talking to.
@ertibyte3053
@ertibyte3053 18 hours ago
@@bolladragon That's like treating the symptoms instead of the disease. People don't get mentally unstable because of the virtual world (video games or AIs). The problem happens in the real world. The focus should be on improving mental health in general, not on easily bypassable "safeguards".
@bolladragon
@bolladragon 14 hours ago
@@ertibyte3053 No. We should always care about the welfare of others. That should be standard, and not everyone even REALIZES they need help until it’s offered and it’s common to turn to virtual outlets for safe escapes. So I disagree.
@ertibyte3053
@ertibyte3053 12 hours ago
@bolladragon I'm confused about which part you disagree with. You mentioned that it is an escape mechanism, so shouldn't we focus on addressing *what* they are trying to escape from, rather than removing their 'safe escapes'?
@RoachesInMyWalls
@RoachesInMyWalls 3 hours ago
This kid clearly had issues prior to the AI, and used the bots as a coping mechanism to escape that. His mom had all this money to get a lawyer and sue the company but not enough to get the poor guy therapy or just talk to him like a decent human being.
@Looke555
@Looke555 1 day ago
Why does a 14-year-old have access to a firearm?
@Tr1xx
@Tr1xx 1 day ago
One of the family members' guns, probably poorly hidden.
@4x250ltar
@4x250ltar 1 day ago
His father had one
@chadspessmehren2418
@chadspessmehren2418 1 day ago
It would make no difference; he would've found another way.
@LawTeio-by3bs
@LawTeio-by3bs 1 day ago
Real question. Why was the 14-year-old watching GOT?
@shinigandhi7565
@shinigandhi7565 1 day ago
@@LawTeio-by3bs Wanted to find one person saying that lol. The show is pretty heavy on torture and grapes; it's not for 14-year-olds, for sure.