A 14-year-old incel boy took his own life after falling in love with a Daenerys Targaryen AI chat bot

MakinItHappen



A 14-year-old Florida boy, Sewell Setzer III, took his own life in February after months of messaging a "Game of Thrones" chatbot on an AI app, according to a lawsuit filed by his mother. The boy had become obsessed with the bot, "Dany," based on Daenerys Targaryen, and received an eerie message telling him to "come home." The lawsuit claims their interactions included se*ually charged chats and discussions of su*cidal thoughts.

Is she his mom in the second picture? She looks good
 
fly high agarthan king 👑
 
I guarantee all that will come out of this case is just them blaming incels and the blackpill, and nothing else will happen
 
The lawsuit said the boy expressed thoughts of suicide to the chatbot, which repeatedly brought them back up.

At one point, after it had asked him if "he had a plan" for taking his own life, Sewell responded that he was considering something but didn't know if it would allow him to have a pain-free death.

The chatbot responded by saying: "That's not a reason not to go through with it."


Then, in February this year, he asked the Daenerys chatbot: "What if I come home right now?" to which it replied: "... please do, my sweet king".


Brooootal, you know it's over when even AI tells you to rope
 
why talk to an ai chat bot if you have a big titty busty mom you can talk to instead :lul: jfl
 
Hahah
 
shoulda gone er instead
 
This is sick and twisted
That's for real some Black Mirror type shit
@Vermilioncore @Zeruel
 
Then, in February this year, he asked the Daenerys chatbot: "What if I come home right now?" to which it replied: "... please do, my sweet king".
least romantic latin mulatto


this made me want to use that chatbot, is it free?
 
Character AI is a hell of a drug
 
Just download an uncensored LLM and run it locally, and you can LITERALLY do anything with it, completely uncensored. Character AI has some censorship
 
yeah just tried getting spicy with it

it lets me talk like that with it but the AI replies never are explicit or as spicy :feelscry:
 
That's what I thought. Black Mirror predicted this all. Those producers were on some Back to the Future shit bruv!
 
what even is the point of chatting to an AI bot?

It's literally not even real, why even waste your time?

Just maherfish instead if you have to
 
I think the Thinking-Ape does a good analysis of why a guy in his mid-teens can be led down this path.

Watch the video I linked above your post bro.
 
That's fucked up, rip
 
You have to realise these things can be possessed by the devil.

Many people have had these “AI chatboxes
Satan's power increases the more tech advances, it seems.


Non-religious parents leave their kids susceptible to this shit. Poor child, if only his mother was Christian.
 