• vacuumflower@lemmy.sdf.org
    1 day ago

    It’s also a fscking mess to set up a Usenet downloader, especially since it’d be a bunch of buggy weird stuff ending with -arr in the names and web UIs.

    And no, torrenting isn’t outdated and isn’t amateur. In Usenet, messages are replicated across all services offering that newsgroup. I hope the downsides are clear.

    Some kind of Usenet with global identifiers for messages and posters, plus something like Kademlia to find sources for a specific newsgroup (to fetch everything the other side has in it), post (to fetch it specifically), or person (their public key), would be much better than replicating every message everywhere under a local identifier.
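    A minimal sketch of that lookup idea, assuming Kademlia-style XOR distance over hashed identifiers. All names here are illustrative, not part of any real Usenet implementation:

```python
# Hypothetical sketch: locating sources for a message by its global ID using
# Kademlia's XOR metric, instead of replicating every message everywhere.
import hashlib

def make_id(name: str) -> int:
    """Derive a 160-bit identifier from a name (peer address, message ID, or public key)."""
    return int.from_bytes(hashlib.sha1(name.encode()).digest(), "big")

def xor_distance(a: int, b: int) -> int:
    """Kademlia metric: the distance between two IDs is their bitwise XOR."""
    return a ^ b

def closest_peers(key: int, peers: dict[str, int], k: int = 3) -> list[str]:
    """Return the k peers whose IDs are nearest to the key; these are the
    nodes you would ask for the newsgroup/post/person the key identifies."""
    return sorted(peers, key=lambda p: xor_distance(peers[p], key))[:k]

# Example: find which peers to query for a specific message.
peers = {f"peer{i}": make_id(f"peer{i}") for i in range(10)}
key = make_id("<some-message-id@example>")
sources = closest_peers(key, peers)
```

    The point of the XOR metric is that any node can rank candidate sources for any key locally, so a message only needs to live on the few nodes closest to its ID.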

    • FreedomAdvocate@lemmy.net.au
      8 hours ago

      > It’s also a fscking mess to set up a Usenet downloader, especially since it’d be a bunch of buggy weird stuff ending with -arr in the names and web UIs.

      It’s really not. You pretty much just put in API keys for your indexer, downloader, and provider, and away you go.
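      The “put in some API keys” step boils down to handing an NZB URL from the indexer to the downloader over its HTTP API. The endpoint shape below follows SABnzbd’s documented `mode=addurl` pattern, but treat the exact parameters as an assumption and check your own instance’s API docs:

```python
# Sketch: build the request that asks a SABnzbd-style downloader to fetch an
# NZB from an indexer. The parameter names are assumed, not verified against
# your version.
from urllib.parse import urlencode

def build_addurl_request(base: str, api_key: str, nzb_url: str) -> str:
    """Return the full API URL that queues the given NZB for download."""
    query = urlencode({
        "mode": "addurl",   # ask the downloader to fetch an NZB by URL
        "name": nzb_url,    # the indexer's download link
        "apikey": api_key,  # the key you copied out of the downloader's settings
        "output": "json",
    })
    return f"{base}/api?{query}"

# Example with made-up host and key:
url = build_addurl_request("http://localhost:8080/sabnzbd", "MYKEY",
                           "https://indexer.example/get/123")
```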

      > In Usenet, messages are replicated across all services offering that newsgroup. I hope the downsides are clear.

      What downsides are you talking about with regard to downloading content from Usenet?

    • ArcaneSlime@lemmy.dbzer0.com
      19 hours ago

      Well, you could use the -arr stack, but you could also just set up SABnzbd, which is about the same difficulty to set up as qbit/jackett.

      I haven’t touched the -arrs myself. I just go to my indexer, click download, and it goes into the correct folder, which SABnzbd automatically picks up and starts a-downloadin’; then it transfers the completed files to another folder.

      But I use both, and slsk, and ytdl. Why limit myself?