• CodenameDarlen@lemmy.world · ↑3 · 8 hours ago

    I wish I were this dumb. I’ve set up a local uncensored LLM and asked it for dozens of suicide methods, and yet I’m still alive…

  • jdnewmil@lemmy.ca · ↑16 ↓15 · 19 hours ago

    Is this a new twist on the old lemmings argument? I mean, Jimmy next door could have provided the same “advice”.

  • wischi@programming.dev · ↑10 ↓21 · edited · 17 hours ago

    To be fair, if you listen to bad advice from chatbots, you have probably already lost. I know there are vulnerable people for whom that’s easier said than done, but even before ChatGPT, the internet was already a completely toxic place where strangers who had never seen or known you would tell you to just kill yourself.

    So yes, there are a lot of real issues with LLMs, but people following clearly idiotic advice from a bot (one that’s also clearly marked as such) is a non-issue in my book.

    • azolus@slrpnk.net · ↑21 · edited · 14 hours ago

      To be fair, if you listen to bad advice from chatbots, you have probably already lost.

      Maybe we should punish corporations for claiming their chatbots can give good advice or have “PhD-level intelligence,” rather than blaming the individuals who fall for their BS and suffer for it?

      • greybeard@feddit.online · ↑5 · 8 hours ago

        No, let’s make it a personal responsibility issue, because that’s worked out so great for the plastics industry.