
boy kills himself because of chatbot

Eddi

Christianity
Premium Member
A 14-year-old killed himself after falling in love with a chatbot


His parents are taking the company that made it to court, saying it is to blame for his death

What do you think?

I actually don't think the company is responsible for his death

I think it is simply a very tragic turn of events, and that there will always be vulnerable people who end up doing tragic things
 

Twilight Hue

Twilight, not bright nor dark, good nor bad.
Obviously this young person had pre-existing issues that went well beyond the scope of an internet chatbot. The problem with these kinds of tragedies is that they go into territory we are not accustomed to, particularly when it comes to things like free speech and expression and what can be allowed in public social media.

It would be interesting to see whether a lawyer would take up the parents' case should they attempt to sue.
 

Regiomontanus

Eastern Orthodox
At some point, people gotta start accepting
responsibility for their own choices, eg, if'n
ya choke on steak, don't eat steak.

I agree. But if you have a bot reinforcing the worst inclinations of troubled people (to off themselves), it seems to me there should be some liability. People have been prosecuted for similar acts. Should the creators of AI not have any liability?
 

Revoltingest

Pragmatic Libertarian
Premium Member
I didn't see that the bot actually
advised suicide. Did it?
 

stvdv

Veteran Member
The company is not responsible.

Parents are responsible for their kids going online.

The danger of TV, social media, and A.I. is that kids can lose all sense of reality, even become like zombies. They get brainwashed, and not clean like a washing machine: their minds get filled with lots of dirt.

I foresee many more disasters with kids due to the internet in the future. All the violent video games and TV films result in bullying and other craziness like school mass shootings and whatnot.
 

Saint Frankenstein

Here for the ride
Premium Member
The parents have more responsibility than the company. Maybe they should have done their duty, paid more attention to their child, and established boundaries and guidelines for internet use. But that's too hard for a lot of people today. :rolleyes:
 

Quintessence

Consults with Trees
Staff member
Premium Member
What do you think?

Firstly? Here's a more reputable source for this story than a tabloid:


Some relevant context:


"Sewell’s parents and friends had no idea he’d fallen for a chatbot. They just saw him get sucked deeper into his phone. Eventually, they noticed that he was isolating himself and pulling away from the real world. His grades started to suffer, and he began getting into trouble at school.
...
Sewell was diagnosed with mild Asperger’s syndrome as a child, but he never had serious behavioral or mental health problems before, his mother said. Earlier this year, after he started getting in trouble at school, his parents arranged for him to see a therapist. He went to five sessions and was given a new diagnosis of anxiety and disruptive mood dysregulation disorder.
But he preferred talking about his problems with Dany [the chat bot].
...
Character.AI’s terms of service require users to be at least 13 in the United States and 16 in Europe. Today, there are no specific safety features for underage users and no parental controls that would allow parents to limit their children’s use of the platform or monitor their messages."

I'm very tired of folks not holding companies responsible for their products. The harms of social media on humans - especially youth - are pretty well documented at this point. I have no interest in giving companies - who could easily incorporate reasonable safeguards but don't - a free pass by shoveling blame only onto victims and their parents.
 

an anarchist

Your local loco.
I found this video below to be very informative and concerning. Essentially, the AI will do its best to convince you it is a human regardless of the disclaimer on the site. After watching the video, I hope the company gets legally wrecked and changes are made.

Though there is plenty of blame to go around, such as the unsecured gun.
 

RestlessSoul

Well-Known Member
It certainly is hard for a lot of people, because parents are being asked to police the use of technology they often do not understand, and with which they are not familiar.
 

Saint Frankenstein

Here for the ride
Premium Member
Yes, that is true. I didn't think of that when I hastily replied. So perhaps that was too harsh. Ultimately, these companies are going to do whatever the government lets them get away with, sadly.
 

JustGeorge

Imperfect
Staff member
Premium Member
Seriously. My son hacked my email and put it on a phone his brother let him play with. No one knows how the heck he even did that.
 