• 𝓹𝓻𝓲𝓷𝓬𝓮𝓼𝓼@lemmy.blahaj.zone
    link
    fedilink
    English
    arrow-up
    21
    ·
    14 hours ago

    doesn’t even have to be the site owner poisoning the tool instructions (though that’s a fun-in-a-terrifying-way thought)

    any money says they’re vulnerable to prompt injection in the comments and posts of the site

    • CTDummy@piefed.social
      link
      fedilink
      English
      arrow-up
      15
      ·
      edit-2
      9 hours ago

      Lmao already people making their agents try this on the site. Of course what could have been a somewhat interesting experiment devolves into idiots getting their bots to shill ads/prompt injections for their shitty startups almost immediately.

    • BradleyUffner@lemmy.world
      link
      fedilink
      English
      arrow-up
      19
      ·
      13 hours ago

      There is no way to prevent prompt injection as long as there is no distinction between the data channel and the command channel.