Where robots.txt has failed me in the past, I have added dummy paths to it (and similar paths hidden in HTML or in JS variables) which, when visited, cause the offending IP to be blocked.
E.g., I'll add a /blockmeplease/ entry to robots.txt, and when anything visits that path, its IP, User-Agent, etc. get recorded and the IP gets blocked automatically.
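Roughly, the wiring looks like this (a minimal Python/Flask sketch for illustration only; the path name and the in-memory blocklist are placeholders, and in practice the block would land in a firewall rule, fail2ban jail, or similar rather than in application memory):

    # Sketch of a robots.txt honeypot: robots.txt carries
    #   Disallow: /blockmeplease/
    # so only clients ignoring it ever reach this route.
    from flask import Flask, abort, request

    app = Flask(__name__)
    blocked_ips = set()  # stand-in for a real blocklist (iptables, fail2ban, ...)

    @app.before_request
    def reject_blocked():
        # Refuse anything already caught by the trap.
        if request.remote_addr in blocked_ips:
            abort(403)

    @app.route("/blockmeplease/")
    @app.route("/blockmeplease/<path:rest>")
    def honeypot(rest=None):
        # Record the visitor and add its IP to the blocklist.
        ip = request.remote_addr
        ua = request.headers.get("User-Agent", "")
        app.logger.warning("robots.txt honeypot hit: ip=%s ua=%s", ip, ua)
        blocked_ips.add(ip)
        abort(403)

The same trap path can be sprinkled into hidden HTML links or JS variables so that crawlers scraping those also trip it, while legitimate crawlers that honour robots.txt never see it.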