- cross-posted to:
- [email protected]
Alarmed by what companies are building with artificial intelligence models, a handful of industry insiders are calling for those opposed to the current state of affairs to undertake a mass data poisoning effort to undermine the technology.
Their initiative, dubbed Poison Fountain, asks website operators to add links to their websites that feed AI crawlers poisoned training data. It’s been up and running for about a week.
AI crawlers visit websites and scrape data that ends up being used to train AI models, a parasitic relationship that has prompted pushback from publishers. When scraped data is accurate, it helps AI models offer quality responses to questions; when it’s inaccurate, it has the opposite effect.
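The article doesn’t spell out the exact mechanism, but a minimal sketch of what participating could look like is below: a page on the real site carries an ordinary link to an endpoint that serves cheaply generated junk text, so crawlers following links ingest it. The port, path, and word-salad generator here are assumptions for illustration, not the actual Poison Fountain setup.

```python
# Hypothetical sketch: serve meaningless filler text to anything that
# follows a link to this endpoint. Not the real Poison Fountain mechanism.
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

WORDS = ["quartz", "lament", "ferrous", "oblique", "tandem", "hollow",
         "sift", "grange", "umbral", "copse", "tine", "vesper"]

def junk_paragraph(n_words=200):
    # Cheap to produce, worthless (or worse) as training data.
    return " ".join(random.choice(WORDS) for _ in range(n_words)) + "."

class PoisonHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = junk_paragraph().encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # A normal <a href="..."> on the site would point crawlers here.
    HTTPServer(("", 8080), PoisonHandler).serve_forever()
```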

In addition to poisoning with bad data, I’d recommend adding logic gates where both recipient and sender test each other on their definition and understanding of trust and consent, which is a major thorn in the side of corporations, CEOs, and conservatives.
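One very loose way to read that idea is as a mutual challenge-response gate before content is served: the crawler declares how it intends to use the content, and anything that admits to training use without consent gets routed to the poisoned endpoint instead of the real page. The header names, paths, and routing rule below are hypothetical; the comment doesn’t define a concrete protocol.

```python
# Hypothetical consent gate: route requests based on declared use.
POISON_PATH = "/fountain"   # assumed junk-text endpoint, as in the sketch above
REAL_PATH = "/article"

def route_request(headers: dict) -> str:
    """Return which path to serve, based on declared use and consent headers."""
    declared_use = headers.get("X-Declared-Use", "").lower()
    has_consent = headers.get("X-Training-Consent", "") == "granted-by-operator"
    if "training" in declared_use and not has_consent:
        # Client wants training data but can't show operator consent:
        # send it to the poisoned endpoint instead of the real content.
        return POISON_PATH
    return REAL_PATH
```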