• FreedomAdvocate@lemmy.net.au · 1 day ago

    It’s better because it basically has everything the torrent sites have, since the same groups upload to both, but it’s all done over SSL-encrypted connections, so your ISP can’t see what you’re downloading and you don’t need a VPN. You download directly from servers, so it’s much faster, and you don’t have to worry about seeder counts or seed anything yourself. There are many different providers you can sign up with and many different indexers to help you find what you’re looking for. It can also download parts of the same content from different sources and combine them into a whole.
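    To make that multi-source point concrete, here’s a minimal sketch of what a client does under the hood, assuming hypothetical provider hostnames and skipping the AUTHINFO login and yEnc decoding a real download needs. Every file on Usenet is split into segments, each segment is a normal NNTP article fetched over TLS, and any provider that carries the group can serve any segment:

        import socket
        import ssl

        # Hypothetical providers -- real setups look the same: a primary
        # account plus backup "block" accounts on other backbones.
        PROVIDERS = [
            ("news.primary.example", 563),  # 563 is the standard NNTPS port
            ("news.backup.example", 563),
        ]

        def fetch_segment(message_id: str) -> bytes:
            """Try each provider in turn until one has the article segment."""
            ctx = ssl.create_default_context()
            for host, port in PROVIDERS:
                try:
                    with socket.create_connection((host, port), timeout=10) as raw:
                        # TLS wrap: the ISP sees only an encrypted stream
                        with ctx.wrap_socket(raw, server_hostname=host) as conn:
                            f = conn.makefile("rwb")
                            f.readline()  # 200/201 greeting (real servers want AUTHINFO next)
                            f.write(b"BODY <" + message_id.encode() + b">\r\n")
                            f.flush()
                            if not f.readline().startswith(b"222"):
                                continue  # e.g. 430 "no such article": try the next provider
                            body = []
                            for line in f:
                                if line == b".\r\n":  # a lone dot ends the article
                                    break
                                body.append(line[1:] if line.startswith(b"..") else line)
                            return b"".join(body)
                except OSError:
                    continue  # connection trouble: fall through to the next provider
            raise LookupError("no provider had segment <%s>" % message_id)

    An actual client reads the segment list out of an .nzb file, runs these fetches in parallel, yEnc-decodes the bodies, and joins them back into the original file; that’s the “combine them to make a whole” part.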

    Once you’ve tried it, torrenting feels amateurish, insecure, and outdated. Ideally you set up both, which is what I’ve done, with qbittorrent running in a Docker container with a built-in VPN. Torrents are the “last resort” for when the content I want can’t be found on Usenet, which is very rare.
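    The “last resort” logic itself is simple enough to sketch. This assumes a Newznab-style indexer API (the de facto standard the -arr tools speak); the URL and key are placeholders:

        import json
        import urllib.parse
        import urllib.request

        INDEXER = "https://indexer.example/api"  # placeholder Newznab-style endpoint
        APIKEY = "changeme"                      # placeholder API key

        def usenet_results(query: str) -> list:
            """Search the indexer; an empty list means 'fall back to torrents'."""
            qs = urllib.parse.urlencode(
                {"t": "search", "q": query, "apikey": APIKEY, "o": "json"}
            )
            with urllib.request.urlopen(INDEXER + "?" + qs, timeout=15) as resp:
                items = json.load(resp).get("channel", {}).get("item", [])
            return items if isinstance(items, list) else [items]

        def grab(query: str) -> None:
            hits = usenet_results(query)
            if hits:
                print("found %d NZBs, sending the first to the downloader" % len(hits))
            else:
                print("nothing on Usenet, handing off to qbittorrent")  # the rare case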

    • pmk@lemmy.sdf.org · 23 hours ago

      Doesn’t this also mean the server can be a single point of failure? A torrent swarm is distributed, so it’s more resilient.

      • FreedomAdvocate@lemmy.net.au · 20 minutes ago

        It’s not like it’s just some server in some person’s house. They’re hosted by companies on server farms with guaranteed uptime figures, and most have 2000+ days of retention. I’ve been using it for 10+ years and have never had any server-failure issues.

    • vacuumflower@lemmy.sdf.org · 22 hours ago

      It’s also a fscking mess to set up a Usenet downloader, especially since it means a bunch of buggy, weird stuff with names ending in -arr, all with web UIs.

      And no, torrenting isn’t outdated and isn’t amateur. On Usenet, messages are replicated across every server that carries a given newsgroup. I hope the downsides are clear.

      Some kind of Usenet with global identifiers for messages and posters, plus something like Kademlia to find sources for a specific newsgroup (to fetch everything the other side has in it), post (to fetch just that post), or person (their public key), would be much better than replicating every message everywhere under a local identifier.
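      For what it’s worth, that keyspace is easy to sketch. Here’s a rough, hypothetical sketch assuming stock Kademlia’s 160-bit XOR metric; all names are made up:

          import hashlib

          def dht_key(kind: str, value: bytes) -> int:
              """Map a newsgroup, message, or poster into one shared 160-bit
              keyspace; the kind prefix keeps the namespaces from colliding."""
              return int.from_bytes(
                  hashlib.sha1(kind.encode() + b"\x00" + value).digest(), "big"
              )

          def xor_distance(a: int, b: int) -> int:
              # Kademlia's metric: nodes store/serve keys "close" to their own ID
              return a ^ b

          # The three lookups from the paragraph above:
          group_key = dht_key("group", b"alt.binaries.example")      # who carries this group
          post_key = dht_key("post", b"<msgid@example>")             # who can serve this one post
          person_key = dht_key("person", b"...public key bytes...")  # where a poster's key lives

      A node would then walk the DHT toward the peers whose IDs minimize xor_distance to the key and ask them for sources, instead of every server storing every message.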

      • FreedomAdvocate@lemmy.net.au · 11 minutes ago

        It’s also a fscking mess to set up a Usenet downloader, especially since it means a bunch of buggy, weird stuff with names ending in -arr, all with web UIs.

        It’s really not. You pretty much just need to put in some API keys for your indexer, downloader, and provider, and away you go.
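        To show how little wiring that is, here’s the downloader half as a hedged sketch. SABnzbd’s HTTP API does have an addurl mode for queueing an NZB by link, though the host, key, and URL below are placeholders:

            import urllib.parse
            import urllib.request

            SAB = "http://localhost:8080/sabnzbd/api"          # the downloader
            SAB_KEY = "sab-api-key"                            # from Config -> General in SABnzbd
            NZB_URL = "https://indexer.example/getnzb/abc123"  # link returned by an indexer search

            qs = urllib.parse.urlencode({
                "mode": "addurl",   # queue an NZB by URL
                "name": NZB_URL,
                "apikey": SAB_KEY,
                "output": "json",
            })
            with urllib.request.urlopen(SAB + "?" + qs, timeout=15) as resp:
                print(resp.read().decode())  # reports status and the queued job ids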

        On Usenet, messages are replicated across every server that carries a given newsgroup. I hope the downsides are clear.

        What downsides are you talking about when it comes to downloading content from Usenet?

      • ArcaneSlime@lemmy.dbzer0.com · 11 hours ago

        Well, you could use the -arr stack, but you could also just set up SABnzbd, which is about as hard to set up as qbit/jackett.

        I haven’t touched the -arrs myself. I just go to my indexer and click download; the .nzb lands in the right folder, SABnzbd automatically picks it up and starts a-downloadin’, and then it moves the completed files to another folder.
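        That hand-off works because SABnzbd polls a watched folder. A tiny sketch of the same flow, with both paths as stand-ins for whatever is set under Config -> Folders:

            import pathlib
            import shutil

            DOWNLOADS = pathlib.Path.home() / "Downloads"   # where the browser saves .nzb files
            WATCH_DIR = pathlib.Path("/srv/sabnzbd/watch")  # SABnzbd's "Watched Folder"

            # Any .nzb dropped here gets queued automatically; SABnzbd
            # removes the file once it has picked the job up.
            for nzb in DOWNLOADS.glob("*.nzb"):
                shutil.move(str(nzb), str(WATCH_DIR / nzb.name))
                print("queued", nzb.name)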

        But I use both, and slsk, and ytdl. Why limit myself?