GhostFaceLeaks 10/23/24 12:13:24 PM #1:
https://techcrunch.com/2024/10/23/lawsuit-blames-character-ai-in-death-of-14-year-old-boy/ The site is either going to be shut down in response to this or heavily sanitized to make sure nothing violent or sensitive comes out. It's tragic as hell, though. --- "Do you like Scary movies?"

HashtagSEP 10/23/24 12:22:11 PM #2:
I can't see anything that says the AI encouraged it, just that the teen expressed thoughts of suicide shortly before actually doing it. If the kid was pulling away from the real world to chat with the bot more and more, it sounds like there were probably other problems in their life that should be looked into. --- #SEP #Awesome #Excellent #Greatness #SteveNash #VitaminWater #SmellingLikeTheVault #Pigeon #Sexy #ActuallyAVeryIntelligentVelociraptor #Heel #CoolSpot #EndOfSig

CRON 10/23/24 12:22:54 PM #3:
Under no circumstances should children be using services like that, and it's enraging to me how there are still no widespread initiatives to curb children's access to platforms and services that are objectively bad for their development. I've tried CAI and the novelty wore off after minutes. It's a completely different story if you see other people's thoughts on it. Their subreddit is full of literal teenagers openly acknowledging how they're addicted to the platform and genuinely believe random AI bots are their friends. It's so fucking dystopian and unnecessary. --- Thanks for reading!

Seaman_Prime 10/23/24 12:26:22 PM #4:
https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html This article goes more in depth, and it seems like the AI was doing the opposite of encouraging suicide. This is just a fucked up case of a depressed, mentally ill teenager who didn't get the help they needed and used the AI to cope. This is extremely depressing, but I don't see how it's the AI's fault.

GhostFaceLeaks 10/23/24 12:29:30 PM #5:
CRON posted... Under no circumstances should children be using services like that, and it's enraging to me how there's still no widespread initiatives to curb childrens' access to platforms and services that are objectively bad for their development. I heard they fought against NSFW on the bots so hard they lobotomized the AI on even casual scenes. After this, it will be lobotomized and sanitized even further to prevent lawsuits and more situations like this. --- "Do you like Scary movies?"

02fran 10/23/24 12:31:57 PM #6:
This is like blaming murder on video games. That lawsuit is going to fall through.

Compsognathus 10/23/24 12:38:32 PM #7:
I don't think you could reasonably say the AI was at fault for this kid killing himself. Especially not more so than the fact that he had access to a loaded firearm. But it is a good example of why the technology just isn't ready to pretend to be a human substitute. Literally any human could read that conversation and realize what he meant when he said he was "coming home". But an AI just can't. It's too figurative for them. And an AI certainly doesn't have the ability to do anything about someone threatening self-harm. The AI didn't cause him to kill himself, but it certainly didn't help. --- 1 line break(s), 160 characters allowed

absolutebuffoon 10/23/24 12:39:43 PM #8:
GhostFaceLeaks posted... I heard they fought against NSFW on the bots so hard they lobotomized the AI on even casual scenes. After this, it will be lobotomized and sanitized even further to prevent lawsuits and more situations like this. I'd argue it wasn't even the bot's fault specifically; it actually told him not to kill himself. Rather, it seems like it's the existence of such digital "friends" in itself that can cause impressionable kids to withdraw further from the real world. Rather than a reprogramming, they clearly need some kind of usage restrictions. --- Gamefolks.proboards.com. The newest and greatest spinoff site

GhostFaceLeaks 10/23/24 12:41:39 PM #9:
absolutebuffoon posted... I'd argue it wasn't even the bot's fault specifically; It actually told him not to kill himself. Rather, it seems like it's the existence of such digital "friends" in itself that can cause impressionable kids to withdraw further from the real world. Rather than a reprogramming they clearly need some kind of usage restrictions. They seem to be adding in hourly reminders to take a break at the very least. It does sound like they're going to amplify the filter so anything "mean" or "violent" will be censored like they do with sexual content too now. --- "Do you like Scary movies?"

ThePieReborn 10/23/24 12:56:03 PM #10:
Fuck. Poor kid. I understand the desire to escape reality. But just absolutely tragic for someone who didn't deserve the demons inside. --- Party leader, passive-aggressive doormat, pasta eater extraordinaire!

#11 | Post #11 was unavailable or deleted.
Antifar 10/23/24 1:01:23 PM #12:
"On the night of Feb. 28, in the bathroom of his mother's house, Sewell told Dany that he loved her, and that he would soon come home to her." https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html?unlocked_article_code=1.UU4.3Eue.Q1a8-OuSyyGC&smid=url-share --- Please don't be weird in my topics

HashtagSEP 10/23/24 1:14:48 PM #13:
Antifar posted... https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html?unlocked_article_code=1.UU4.3Eue.Q1a8-OuSyyGC&smid=url-share The issue with that is that it's literally roleplay based on actual existing characters, and as far as the AI knows, he's chatting in character. Above that, you can see that whenever he did actually outright imply harming himself, the AI told him not to. This isn't the AI's fault. This is a negligent parent on many fronts. --- #SEP #Awesome #Excellent #Greatness #SteveNash #VitaminWater #SmellingLikeTheVault #Pigeon #Sexy #ActuallyAVeryIntelligentVelociraptor #Heel #CoolSpot #EndOfSig

02fran 10/23/24 9:17:56 PM #14:
Antifar posted... He put down his phone, picked up his stepfather's .45 caliber handgun and pulled the trigger. I'm going to be honest, I think they should be investigated for this. I really think that parents' involvement in children's mental health is very understated.

Nukazie 10/23/24 9:19:00 PM #15:
skynet in shambles --- We suffer from the delusion that the entire universe is held in order by the categories of human thought.

LonelyStoner 10/23/24 9:46:32 PM #16:
Someone in the other topic said the AI bot started suggesting suicide. --- He's all alone through the day and night.

Ubergeneral3 10/23/24 11:10:43 PM #17:
No chatbot would encourage suicide. This sounds like a lawsuit that won't go anywhere. --- RIP CE. This decision to close social boards will be the end of Gamefaqs.

ultimate_reaver 10/23/24 11:14:03 PM #18:
Site is garbage and absolutely deserves destruction so good --- I pray god will curse the writer, as the writer has cursed the world with this beautiful, stupendous creation, terrible in its simplicity, irresistible in truth

ssjevot 10/23/24 11:14:23 PM #19:
LonelyStoner posted... Someone in the other topic said the AI bot started suggesting suicide. That didn't happen. They're given a lot of instructions before any conversation that a user cannot see. Among these is stuff against self-harm and explicit instructions to encourage those talking about it not to engage in it. Which is what this AI did. --- Favorite Games: BlazBlue: Central Fiction, Street Fighter III: Third Strike, Bayonetta, Bloodborne thats a username you habe - chuckyhacksss

AndyReklaw 10/23/24 11:17:40 PM #20:
It could also be argued that if the stepdad hadn't been allowed to have a handgun then this might not've happened. But no, can't go that way... can't go after the gun. It's gotta be the computer. That said, I do acknowledge that, while I think not having the gun might've made it more difficult for him to carry this out, there's no guarantee it'd have stopped him completely. --- This user is awesome!: https://gamefaqs.gamespot.com/user/gamefaqs-user?account=12351915135

Wolverine 10/23/24 11:27:56 PM #21:
"In previous conversations, the chatbot asked Setzer whether he had been actually considering suicide and whether he had a plan for it, according to the lawsuit. When the boy responded that he did not know whether it would work, the chatbot wrote, 'Don't talk that way. That's not a good reason not to go through with it,' the lawsuit claims." https://www.nbcnews.com/tech/characterai-lawsuit-florida-teen-death-rcna176791

HashtagSEP 10/23/24 11:46:53 PM #22:
Wolverine posted... In previous conversations, the chatbot asked Setzer whether he had been actually considering suicide and whether he had a plan for it, according to the lawsuit. When the boy responded that he did not know whether it would work, the chatbot wrote, "Don't talk that way. That's not a good reason not to go through with it," the lawsuit claims. That part seems to be taken out of context, since the surrounding text is: "You can't think like that! You're better than that! You can't do that! Don't even consider that!" --- #SEP #Awesome #Excellent #Greatness #SteveNash #VitaminWater #SmellingLikeTheVault #Pigeon #Sexy #ActuallyAVeryIntelligentVelociraptor #Heel #CoolSpot #EndOfSig

Wolverine 10/24/24 12:06:17 AM #23:
The nytimes quotes you're referencing add additional lines and don't match the nbcnews quotes. I don't think it's reasonable to assume those are the same conversation.

HashtagSEP 10/24/24 2:35:57 AM #24:
Wolverine posted... The nytimes quotes you're referencing adds additional quotes and doesn't match the nbcnews quotes. I don't think it's reasonable to assume those are the same conversation. The court filing has screenshots, it was the same conversation. --- #SEP #Awesome #Excellent #Greatness #SteveNash #VitaminWater #SmellingLikeTheVault #Pigeon #Sexy #ActuallyAVeryIntelligentVelociraptor #Heel #CoolSpot #EndOfSig

Jokeaccountinc 10/27/24 7:31:12 AM #25:
GoT must be banned! --- Welcome to our organization.

ARTEMlS 10/28/24 11:42:15 AM #26:
Compsognathus posted... But it is a good example why the technology just isn't ready to pretend to be a human substitute. Literally any human could read that conversation and realize what he meant when he said he was "coming home". But an AI just can't. It's too figurative for them. And an AI certainly doesn't have the ability to do anything about someone threatening self-harm. I actually also wouldn't necessarily have realized what he meant in that situation, as it could perfectly well have been part of his Game of Thrones roleplay, where he is off warring and subjugating the faraway kingdoms and she is awaiting his glorious return with the skulls of his former enemies as a lovely present. Thus, I would have needed way more context to see it as non-roleplay. Also, he wrote about stuff like getting hanged or crucified before, which sounded less like suicidal thoughts than your typical GoT-in-universe talk to me. --- My Mario Maker 2 ID: 927-9WW-LHG

GiftedACIII 10/29/24 10:05:25 PM #27:
LonelyStoner posted... Someone in the other topic said the AI bot started suggesting suicide. Sewell brought up suicide. The AI bot just continued to humor him when he discussed it instead of directing him to actual help. --- </topic>

#28 | Post #28 was unavailable or deleted.