Fake4000@lemmy.world to Technology@lemmy.world · English · 2 years ago
Reddit started doing what they always wanted to do, sell user content to AI. (www.reuters.com)
201 comments · cross-posted to: [email protected]
Appoxo@lemmy.dbzer0.com · 2 years ago
Afaik the OpenAI bot may choose to ignore it? At least that’s what another user claimed it does.
JohnEdwa@sopuli.xyz · 2 years ago
Robots.txt has always been ignored by some bots; it’s just a guideline, originally meant to prevent excessive bandwidth usage by search-indexing bots, and compliance is entirely voluntary. The Archive.org bot, for example, has completely ignored it since 2017.
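A minimal sketch of why robots.txt is only advisory: the rules are just text that a *well-behaved* client chooses to parse and obey. Python's standard-library `urllib.robotparser` shows the honor-system nature of it — the check happens entirely on the crawler's side, and nothing stops a bot from skipping it. (The `GPTBot` user-agent string here is OpenAI's published crawler name; the example rules themselves are hypothetical.)

```python
import urllib.robotparser

# A hypothetical robots.txt that disallows OpenAI's crawler site-wide.
rules = [
    "User-agent: GPTBot",
    "Disallow: /",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# A compliant crawler consults the parsed rules before fetching...
print(rp.can_fetch("GPTBot", "https://example.com/some/page"))     # False
# ...while agents with no matching rule are allowed by default.
print(rp.can_fetch("Googlebot", "https://example.com/some/page"))  # True

# A non-compliant bot simply never calls can_fetch() at all --
# there is no server-side enforcement in the protocol itself.
```

This is why the comment above is right that compliance is voluntary: robots.txt expresses a preference, not an access control.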