80legs is a web crawling service that allows its users to create and run web crawls through its software-as-a-service platform.
80legs was created by Computational Crawling, a company in Houston, TX. The company launched the private beta of 80legs in April 2009 and publicly launched the service at the DEMOfall 09 conference. At the time of its public launch, 80legs offered customized web crawling and scraping services. It has since added subscription plans and other product offerings.
80legs is built on top of a distributed grid computing network. This grid consists of approximately 50,000 individual computers, distributed across the world, and uses bandwidth monitoring technology to prevent bandwidth cap overages.
80legs has been criticised by numerous site owners for its technology effectively acting as a distributed denial-of-service (DDoS) attack and for not obeying robots.txt. Because the average webmaster is unaware that 80legs exists, its crawler is typically blocked only after the damage is done: the server has already been overloaded, and the responsible crawler identified only after a time-consuming, in-depth analysis of the log files.
Some rule sets for ModSecurity (such as the one from Atomicorp) block all access to the web server from 80legs in order to prevent a DDoS, and WebKnight also blocks 80legs by default. Because the crawler is distributed across many machines, it cannot practically be blocked by IP address; the most reliable way to block 80legs is by its user agent, "008". Wrecksite blocks 80legs by default.
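As a rough illustration of user-agent-based blocking, the check below could be applied server-side (for example in a WSGI or proxy filter). It is a minimal sketch, not any product's actual rule: the function name and the exact token-matching regex are assumptions, motivated by the fact that a bare substring test on "008" would also match unrelated strings such as a year.

```python
import re

# Match "008" only as a standalone user-agent token or product name
# (e.g. "008" or "008/0.83"), not as part of a longer number.
_CRAWLER_008 = re.compile(r"(?:^|[\s;(])008(?:/|[\s;)]|$)")

def is_80legs(user_agent: str) -> bool:
    """Return True if the User-Agent header appears to be the 80legs crawler."""
    return bool(_CRAWLER_008.search(user_agent or ""))
```

A request whose `User-Agent` header matches could then be answered with an HTTP 403; equivalent logic is what ModSecurity-style rules express declaratively.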