> how this would help stop scrapers
I think Anubis is built on some flawed assumptions:
- that most scrapers aren't headless browsers (see the sketch below)
- that they don't have access to millions of different IPs across the world from big/shady proxy companies
- that this can help with a real network-level DDoS
- that scrapers will give up if the requests become 'too expensive'
- that forcing every client to burn CPU on proof-of-work isn't itself contributing to warming the planet
I'm sure some older, less sophisticated bots that don't use headless browsers still exist, but with newer tooling, AI crawlers, etc., I don't think that describes the majority anymore.
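To make the first point concrete, here's a minimal sketch of what getting past Anubis looks like from a scraper's side, assuming a hypothetical protected URL. As I understand it, the challenge is ordinary client-side JavaScript (a SHA-256 proof-of-work), so a headless browser passes it simply by being a browser; no scraper-specific logic is needed:

```ts
import { chromium } from "playwright";

(async () => {
  // Headless by default; Anubis's challenge page runs like any other JS.
  const browser = await chromium.launch();
  const page = await browser.newPage();

  // Hypothetical Anubis-protected page.
  await page.goto("https://example.org/some-article");

  // The interstitial computes the proof-of-work in JS and reloads with
  // an auth cookie; waiting for the network to settle is enough.
  await page.waitForLoadState("networkidle", { timeout: 60_000 });

  console.log((await page.content()).length, "bytes scraped");
  await browser.close();
})();
```

The compute cost does land on the scraper, but at default difficulty it's on the order of seconds per session, and since the resulting cookie is reused across requests, it's noise next to what a crawler already spends rendering pages.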