Package org.supermind.crawl.http

Class Summary
Http Takes advantage of HTTP/1.1 features such as persistent connections and request pipelining to improve performance when multiple URLs must be fetched from the same host.
RobotRulesParser This class parses robots.txt files.
RobotRulesParser.RobotRuleSet This class holds the rules parsed from a robots.txt file and can test paths against them.
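
The Http class summary above mentions request pipelining: with HTTP/1.1, several requests can be written to one persistent connection before any response is read. A minimal sketch of what such a pipelined batch looks like on the wire (the class and method names here are illustrative, not the actual Http API) might be:

```java
// Illustrative sketch: build a pipelined batch of HTTP/1.1 GET requests
// for several paths on a single host. With pipelining, the whole batch is
// written to one persistent (keep-alive) connection before reading any
// response. Names are hypothetical, not the actual Http class API.
public class PipelineSketch {
    static String buildPipelinedRequests(String host, String... paths) {
        StringBuilder batch = new StringBuilder();
        for (String path : paths) {
            batch.append("GET ").append(path).append(" HTTP/1.1\r\n")
                 .append("Host: ").append(host).append("\r\n")
                 .append("Connection: keep-alive\r\n")
                 .append("\r\n"); // blank line ends each request's headers
        }
        return batch.toString();
    }

    public static void main(String[] args) {
        System.out.print(buildPipelinedRequests("example.com", "/a.html", "/b.html"));
    }
}
```

Responses then arrive in the same order the requests were sent, which is what lets a fetcher amortize connection setup across many URLs on one host.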

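To illustrate the kind of rule testing RobotRulesParser.RobotRuleSet performs, here is a simplified sketch (not the actual parser API) that collects Disallow prefixes from a robots.txt body and tests paths against them; it assumes a single user-agent group and ignores Allow lines:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of robots.txt rule matching, not the actual
// RobotRulesParser.RobotRuleSet API. Collects Disallow prefixes and
// tests paths against them: a path is allowed unless it starts with
// a disallowed prefix; an empty Disallow value disallows nothing.
public class RobotRulesSketch {
    private final List<String> disallowed = new ArrayList<>();

    void parse(String robotsTxt) {
        for (String line : robotsTxt.split("\n")) {
            int hash = line.indexOf('#');          // strip comments
            if (hash >= 0) line = line.substring(0, hash);
            line = line.trim();
            if (line.regionMatches(true, 0, "Disallow:", 0, 9)) {
                String prefix = line.substring(9).trim();
                if (!prefix.isEmpty()) disallowed.add(prefix);
            }
        }
    }

    boolean isAllowed(String path) {
        for (String prefix : disallowed) {
            if (path.startsWith(prefix)) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        RobotRulesSketch rules = new RobotRulesSketch();
        rules.parse("User-agent: *\nDisallow: /private/\nDisallow: /tmp\n");
        System.out.println(rules.isAllowed("/index.html"));     // allowed
        System.out.println(rules.isAllowed("/private/x.html")); // disallowed
    }
}
```

A real implementation must also group rules by User-agent so the crawler obeys the most specific section that matches its own agent string.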
Exception Summary
HttpError Thrown for HTTP error codes.
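
An exception of this kind typically just carries the offending status code so callers can react to it (retry on 5xx, skip on 404, and so on). A minimal sketch of such an exception (the name and accessor here are hypothetical, not the actual HttpError API) might be:

```java
// Illustrative sketch, not the actual HttpError class: an exception
// that records the HTTP status code which caused a fetch to fail.
public class HttpErrorSketch extends Exception {
    private final int code;

    public HttpErrorSketch(int code) {
        super("HTTP error code: " + code);
        this.code = code;
    }

    public int getCode() {
        return code;
    }

    public static void main(String[] args) {
        try {
            throw new HttpErrorSketch(404);
        } catch (HttpErrorSketch e) {
            System.out.println(e.getCode() + ": " + e.getMessage());
        }
    }
}
```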