Who are these people? This is ridiculous. :)

I guess with so many humans, there are bound to be a few people who have no ability to think for themselves and believe everything a chatbot writes in their web browser.

People even have romantic relationships with these things.

I don't agree with the argument that ChatGPT should "push back". The article has an example where a guy asked for tall bridges to jump from, and ChatGPT listed them, of course.

Are we expecting the LLM to act like a psychologist, evaluating whether the user's state of mind is healthy before answering questions?

Very slippery slope if you ask me.

  • TimewornTraveler@lemmy.dbzer0.com · 10 hours ago

    Hi, they're going to be in psychosis regardless of what LLMs do. LLMs aren't therapists and mustn't be treated as such. That goes for you too.