I got into the self-hosting scene this year when I wanted to start up my own website, run on an old recycled ThinkPad. A lot of time was spent learning about ufw, reverse proxies, security header hardening, and fail2ban.

Despite all that, I still had a problem with bots knocking on my ports and spamming my logs. I tried some hackery getting fail2ban to read Caddy logs, but that didn't work for me. I nearly gave up and went with Cloudflare like half the internet does, but my stubbornness about open-source self-hosting, and the recent Cloudflare outages this year, encouraged me to try alternatives.

Coinciding with that, I kept seeing this thing in the places I frequent, like Codeberg. This is Anubis, a proxy-style firewall that forces the browser client to complete a proof-of-work check, plus some other clever tricks to stop bots from knocking. I got interested and started thinking about beefing up security.
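The proof-of-work idea itself is simple to sketch. This is an illustrative toy, not Anubis's actual challenge scheme (which is served to the browser as JavaScript): the server hands out a challenge string, and the client has to brute-force a nonce whose hash starts with enough zeroes. Verifying is one hash; solving costs many.

```python
import hashlib
import itertools

def solve_challenge(challenge: str, difficulty: int) -> int:
    """Brute-force a nonce so SHA-256(challenge + nonce) starts with
    `difficulty` zero hex digits. Cheap to verify, costly to solve."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """The server-side check: a single hash."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = solve_challenge("example-session-token", 3)
print(verify("example-session-token", nonce, 3))  # True
```

A real browser solves this in a blink; a scraper hammering thousands of pages pays the cost thousands of times.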

I’m here to tell you to try it if you have a public-facing site and want to break away from Cloudflare. It was VERY easy to install and configure with a Caddyfile on a Debian distro with systemctl. Within an hour it had filtered multiple bots, and so far the knocks seem to have slowed down.

https://anubis.techaro.lol/

My botspam woes have seemingly been seriously mitigated, if not completely eradicated. I’m very happy with tonight’s little security upgrade project, which took no more than an hour of my time to install and read through documentation. Current chain: Caddy reverse proxy -> Anubis -> services
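For reference, that chain can be sketched in a Caddyfile like this. The port is an assumption for illustration; Anubis's actual bind address and upstream target come from its own config file, so adjust to match yours:

```
example.com {
    # Caddy terminates TLS and hands every request to Anubis
    reverse_proxy localhost:8923
}

# Anubis (configured separately) listens on that port and forwards
# requests that pass its checks on to the real backend service.
```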

A good place to start for installation is here:

https://anubis.techaro.lol/docs/admin/native-install/

  • A_norny_mousse@feddit.org · 1 hour ago

    At the time of commenting, this post is 8h old. I read all the top comments, many of them critical of Anubis.

    I run a small website and don’t have problems with bots. Of course I know what a DDoS is; maybe that’s the only use case where something like Anubis would help, instead of the strictly server-side solution I deploy?

    I use CrowdSec (it seems to work with caddy btw). It took a little setting up, but it does the job.
    (I think it’s quite similar to fail2ban in what it does, plus community-updated blocklists)

    Am I missing something here? Why wouldn’t that be enough? Why do I need to heckle my visitors?

    Despite all that, I still had a problem with bots knocking on my ports and spamming my logs.

    By the time Anubis gets to work, the knocking already happened so I don’t really understand this argument.

    If the system is set up to reject a certain type of request, these are microsecond transactions that do no harm (DDoS excepted).

    • quick_snail@feddit.nl · 47 minutes ago

      With Varnish and Wazuh, I’ve never had a need for Anubis.

      My first recommendation for anyone struggling with bots is to fix their cache.
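      The "fix your cache" idea can be sketched as a minimal Varnish VCL. The backend host/port and TTL here are assumptions for illustration, not a recommended production config:

      ```
      vcl 4.1;

      backend default {
          .host = "127.0.0.1";
          .port = "8080";   # assumed app server port
      }

      sub vcl_backend_response {
          # Cache anonymous GET responses briefly, so repeat scraper
          # hits land on Varnish instead of the application.
          if (bereq.method == "GET" && !bereq.http.Cookie) {
              set beresp.ttl = 5m;
          }
      }
      ```

      With something like this in front, even aggressive scraping mostly costs the cache, not the app.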

    • SmokeyDope@piefed.social (OP) · 5 hours ago

      If CrowdSec works for you that’s great, but it’s also a corporate product whose premium subscription tier starts at $900/month; not exactly a pure self-hosted solution.

      I’m not a hypernerd; I’m still figuring all this out among the myriad possible solutions with different complexity and setup times. All the self-hosters in my internet circle started adopting Anubis, so I wanted to try it. Anubis was relatively plug-and-play, with prebuilt packages and great install documentation.

      Allow me to expand on the problem I was having. It wasn’t just a knock or two; I was getting 40 knocks every few seconds, with scrapers hitting every page and probing for paths that didn’t exist but would be exploit points on unsecured production VPS systems.

      On a computational level, the constant network activity of scrapers downloading webpages, zip files, and images pollutes traffic. Anubis stops this by trapping them on a landing page that transmits very little data from the server side. By trapping a bot on an Anubis page, which it hammers 40 times over a single open connection before giving up, it reduces overall network activity and data transferred (which is often metered and billed), as well as log volume.

      And this isn’t all or nothing. You don’t have to pester all your visitors, only those with sketchy clients. Anubis uses weighted rules that grade how legitimate a browser client looks. Most regular connections get through without triggering anything; weird connections get various grades of checks depending on how sketchy they are, and some checks don’t require proof of work or JavaScript at all.
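      As a concrete illustration, those grades are expressed as ordered rules in Anubis's bot policy file. This is a hedged sketch based on the general shape of that config; the names and regexes here are made up, so check the docs for the real defaults:

      ```yaml
      bots:
        # let well-behaved infrastructure paths through untouched
        - name: well-known
          path_regex: ^/\.well-known/.*$
          action: ALLOW
        # outright deny a known-bad scraper user agent
        - name: bad-scraper
          user_agent_regex: BadBot
          action: DENY
        # anything claiming to be a browser gets the proof-of-work check
        - name: generic-browser
          user_agent_regex: Mozilla
          action: CHALLENGE
      ```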

      On a psychological level, it gives me a bit of relief knowing the bots are getting properly sinkholed and that I’m punishing and wasting the compute of some asshole probing my system for exploits to expand their botnet. And a bit of pride knowing I did this myself on my own hardware, without having to cop out to a corporate product.

      It’s nice that people of different skill levels and philosophies have options to work with, and one tool can often complement another. Anubis worked for what I wanted: filtering out bots that waste network bandwidth and giving me peace of mind where before I had no protection, all while staying unnoticeable for most people, because I can configure it not to heckle every client every 5 minutes like some sites do.

      • A_norny_mousse@feddit.org · 1 hour ago

        If CrowdSec works for you that’s great, but it’s also a corporate product

        It’s also fully FLOSS with dozens of contributors (not to speak of the community-driven blocklists). If they make money with it, great.

        not exactly a pure self-hosted solution.

        Why? I host it, I run it. It’s even in Debian Stable repos, but I choose their own more up-to-date ones.

        All the self-hosters in my internet circle started adopting Anubis, so I wanted to try it. Anubis was relatively plug-and-play, with prebuilt packages

        Yeah…

        Allow me to expand on the problem I was having. It wasn’t just a knock or two; I was getting 40 knocks every few seconds, with scrapers hitting every page and probing for paths that didn’t exist but would be exploit points on unsecured production VPS systems.

        1. Again, a properly set up WAF will deal with this pronto
        2. You should not have exploit points in unsecured production systems, full stop.

        On a computational level, the constant network activity of scrapers downloading webpages, zip files, and images pollutes traffic. Anubis stops this by trapping them on a landing page that transmits very little data from the server side.

        1. And instead you leave the computations to your clients. Which becomes a problem on slow hardware.
        2. Again, with a properly set up WAF there’s no “traffic pollution” or “downloading of zip files”.

        Anubis uses weighted rules that grade how legitimate a browser client looks.

        And apart from the user agent and a few other responses, all of which are easily spoofed, this means “do some JavaScript stuff on the local client” (there’s a link to an article here somewhere that explains this well), which is much less trivial than you make it sound.

        Also, I use one of those less-than-legit, weird, non-regular browsers, and I am being punished by tools like this.