The Gatekeeper: robots.txt
The Robots Exclusion Protocol, dating from 1994, defines "a method that allows Web site administrators to indicate to visiting robots which parts of their site should not be visited by the robot". It is only a de facto standard, but the crawlers sent out by major search engines do comply with it.
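As a minimal sketch of how a well-behaved crawler can honor these rules, Python's standard library ships `urllib.robotparser`, which parses a robots.txt file and answers whether a given user agent may fetch a given URL. The robots.txt content and the crawler name below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; a real crawler would fetch
# it from https://example.com/robots.txt instead.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The root page is not disallowed, so it may be crawled.
print(rp.can_fetch("MyCrawler", "https://example.com/"))           # True
# Anything under /private/ is off-limits to all user agents.
print(rp.can_fetch("MyCrawler", "https://example.com/private/x"))  # False
```

In practice you would call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` to load the live file, then check `can_fetch` before every request.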