.\" -*- mode: troff; coding: utf-8 -*- .\" Automatically generated by Pod::Man 5.01 (Pod::Simple 3.43) .\" .\" Standard preamble: .\" ======================================================================== .de Sp \" Vertical space (when we can't use .PP) .if t .sp .5v .if n .sp .. .de Vb \" Begin verbatim text .ft CW .nf .ne \\$1 .. .de Ve \" End verbatim text .ft R .fi .. .\" \*(C` and \*(C' are quotes in nroff, nothing in troff, for use with C<>. .ie n \{\ . ds C` "" . ds C' "" 'br\} .el\{\ . ds C` . ds C' 'br\} .\" .\" Escape single quotes in literal strings from groff's Unicode transform. .ie \n(.g .ds Aq \(aq .el .ds Aq ' .\" .\" If the F register is >0, we'll generate index entries on stderr for .\" titles (.TH), headers (.SH), subsections (.SS), items (.Ip), and index .\" entries marked with X<> in POD. Of course, you'll have to process the .\" output yourself in some meaningful fashion. .\" .\" Avoid warning from groff about undefined register 'F'. .de IX .. .nr rF 0 .if \n(.g .if rF .nr rF 1 .if (\n(rF:(\n(.g==0)) \{\ . if \nF \{\ . de IX . tm Index:\\$1\t\\n%\t"\\$2" .. . if !\nF==2 \{\ . nr % 0 . nr F 2 . \} . \} .\} .rr rF .\" ======================================================================== .\" .IX Title "WWW::RobotRules 3" .TH WWW::RobotRules 3 2023-07-25 "perl v5.38.0" "User Contributed Perl Documentation" .\" For nroff, turn off justification. Always turn off hyphenation; it makes .\" way too many mistakes in technical documents. .if n .ad l .nh .SH NAME WWW::RobotRules \- database of robots.txt\-derived permissions .SH SYNOPSIS .IX Header "SYNOPSIS" .Vb 2 \& use WWW::RobotRules; \& my $rules = WWW::RobotRules\->new(\*(AqMOMspider/1.0\*(Aq); \& \& use LWP::Simple qw(get); \& \& { \& my $url = "http://some.place/robots.txt"; \& my $robots_txt = get $url; \& $rules\->parse($url, $robots_txt) if defined $robots_txt; \& } \& \& { \& my $url = "http://some.other.place/robots.txt"; \& my $robots_txt = get $url; \& $rules\->parse($url, $robots_txt) if defined $robots_txt; \& } \& \& # Now we can check if a URL is valid for those servers \& # whose "robots.txt" files we\*(Aqve gotten and parsed: \& if($rules\->allowed($url)) { \& $c = get $url; \& ... \& } .Ve .SH DESCRIPTION .IX Header "DESCRIPTION" This module parses \fI/robots.txt\fR files as specified in "A Standard for Robot Exclusion", at Webmasters can use the \fI/robots.txt\fR file to forbid conforming robots from accessing parts of their web site. .PP The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited. The same WWW::RobotRules object can be used for one or more parsed \&\fI/robots.txt\fR files on any number of hosts. .PP The following methods are provided: .ie n .IP "$rules = WWW::RobotRules\->new($robot_name)" 4 .el .IP "\f(CW$rules\fR = WWW::RobotRules\->new($robot_name)" 4 .IX Item "$rules = WWW::RobotRules->new($robot_name)" This is the constructor for WWW::RobotRules objects. The first argument given to \fBnew()\fR is the name of the robot. .ie n .IP "$rules\->parse($robot_txt_url, $content, $fresh_until)" 4 .el .IP "\f(CW$rules\fR\->parse($robot_txt_url, \f(CW$content\fR, \f(CW$fresh_until\fR)" 4 .IX Item "$rules->parse($robot_txt_url, $content, $fresh_until)" The \fBparse()\fR method takes as arguments the URL that was used to retrieve the \fI/robots.txt\fR file, and the contents of the file. 
.ie n .IP $rules\->allowed($uri) 4
.el .IP \f(CW$rules\fR\->allowed($uri) 4
.IX Item "$rules->allowed($uri)"
Returns TRUE if this robot is allowed to retrieve this URL.
.ie n .IP $rules\->agent([$name]) 4
.el .IP \f(CW$rules\fR\->agent([$name]) 4
.IX Item "$rules->agent([$name])"
Get/set the agent name.  NOTE: Changing the agent name will clear the
robots.txt rules and expire times out of the cache.
.SH ROBOTS.TXT
.IX Header "ROBOTS.TXT"
The format and semantics of the "/robots.txt" file are as follows
(this is an edited abstract of
<http://www.robotstxt.org/wc/norobots.html>):
.PP
The file consists of one or more records separated by one or more
blank lines.  Each record contains lines of the form
.PP
.Vb 1
\&   <field\-name>: <value>
.Ve
.PP
The field name is case insensitive.  Text after the '#' character on a
line is ignored during parsing.  This is used for comments.  The
following field names can be used:
.IP User-Agent 3
.IX Item "User-Agent"
The value of this field is the name of the robot the record is
describing access policy for.  If more than one \fIUser-Agent\fR field is
present the record describes an identical access policy for more than
one robot.  At least one field needs to be present per record.  If the
value is '*', the record describes the default access policy for any
robot that has not matched any of the other records.
.Sp
The \fIUser-Agent\fR fields must occur before the \fIDisallow\fR fields.  If a
record contains a \fIUser-Agent\fR field after a \fIDisallow\fR field, that
constitutes a malformed record.  This parser will assume that a blank
line should have been placed before that \fIUser-Agent\fR field, and will
break the record into two.  All the fields before the \fIUser-Agent\fR
field will constitute a record, and the \fIUser-Agent\fR field will be the
first field in a new record.
.IP Disallow 3
.IX Item "Disallow"
The value of this field specifies a partial URL that is not to be
visited.  This can be a full path, or a partial path; any URL that
starts with this value will not be retrieved.
.PP
Unrecognized records are ignored.
.SH "ROBOTS.TXT EXAMPLES"
.IX Header "ROBOTS.TXT EXAMPLES"
The following example "/robots.txt" file specifies that no robots
should visit any URL starting with "/cyberworld/map/" or "/tmp/":
.PP
.Vb 3
\& User\-agent: *
\& Disallow: /cyberworld/map/ # This is an infinite virtual URL space
\& Disallow: /tmp/ # these will soon disappear
.Ve
.PP
This example "/robots.txt" file specifies that no robots should visit
any URL starting with "/cyberworld/map/", except the robot called
"cybermapper":
.PP
.Vb 2
\& User\-agent: *
\& Disallow: /cyberworld/map/ # This is an infinite virtual URL space
\&
\& # Cybermapper knows where to go.
\& User\-agent: cybermapper
\& Disallow:
.Ve
.PP
This example indicates that no robots should visit this site further:
.PP
.Vb 3
\& # go away
\& User\-agent: *
\& Disallow: /
.Ve
.PP
This is an example of a malformed robots.txt file.
.PP
.Vb 10
\& # robots.txt for ancientcastle.example.com
\& # I\*(Aqve locked myself away.
\& User\-agent: *
\& Disallow: /
\& # The castle is your home now, so you can go anywhere you like.
\& User\-agent: Belle
\& Disallow: /west\-wing/ # except the west wing!
\& # It\*(Aqs good to be the Prince...
\& User\-agent: Beast
\& Disallow:
.Ve
.PP
This file is missing the required blank lines between records.
However, the intention is clear.
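.PP
As a minimal sketch of how such a file interacts with the methods
described above (the host name \f(CWexample.com\fR is made up for
illustration), the "cybermapper" example can be handed to \fBparse()\fR
and queried with \fBallowed()\fR and \fBagent()\fR:
.PP
.Vb 10
\& use WWW::RobotRules;
\&
\& my $rules = WWW::RobotRules\->new(\*(AqMOMspider/1.0\*(Aq);
\&
\& my $robots_txt = join "\en",
\&     \*(AqUser\-agent: *\*(Aq,
\&     \*(AqDisallow: /cyberworld/map/\*(Aq,
\&     \*(Aq\*(Aq,
\&     \*(AqUser\-agent: cybermapper\*(Aq,
\&     \*(AqDisallow:\*(Aq;
\&
\& $rules\->parse(\*(Aqhttp://example.com/robots.txt\*(Aq, $robots_txt);
\&
\& # The default record applies to MOMspider, so this prints "denied".
\& print $rules\->allowed(\*(Aqhttp://example.com/cyberworld/map/x\*(Aq)
\&     ? "allowed\en" : "denied\en";
\&
\& # Changing the agent name clears the cached rules, so parse again.
\& $rules\->agent(\*(Aqcybermapper\*(Aq);
\& $rules\->parse(\*(Aqhttp://example.com/robots.txt\*(Aq, $robots_txt);
\&
\& # The empty Disallow for cybermapper permits everything: "allowed".
\& print $rules\->allowed(\*(Aqhttp://example.com/cyberworld/map/x\*(Aq)
\&     ? "allowed\en" : "denied\en";
.Ve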
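.PP
In the same spirit, the following sketch (again with a made-up URL)
shows what the malformed ancientcastle.example.com file above yields
once the parser has split the records as described under
\fIUser-Agent\fR:
.PP
.Vb 10
\& use WWW::RobotRules;
\&
\& # The ancientcastle.example.com file above, without its comments.
\& my $malformed = join "\en",
\&     \*(AqUser\-agent: *\*(Aq,
\&     \*(AqDisallow: /\*(Aq,
\&     \*(AqUser\-agent: Belle\*(Aq,
\&     \*(AqDisallow: /west\-wing/\*(Aq,
\&     \*(AqUser\-agent: Beast\*(Aq,
\&     \*(AqDisallow:\*(Aq;
\&
\& my $belle = WWW::RobotRules\->new(\*(AqBelle\*(Aq);
\& $belle\->parse(\*(Aqhttp://ancientcastle.example.com/robots.txt\*(Aq, $malformed);
\&
\& # The misplaced User\-agent lines start new records, so Belle is only
\& # kept out of /west\-wing/ and may fetch anything else.
\& print $belle\->allowed(\*(Aqhttp://ancientcastle.example.com/west\-wing/door\*(Aq)
\&     ? "allowed\en" : "denied\en";    # denied
\& print $belle\->allowed(\*(Aqhttp://ancientcastle.example.com/library/\*(Aq)
\&     ? "allowed\en" : "denied\en";    # allowed
.Ve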
.SH "SEE ALSO" .IX Header "SEE ALSO" LWP::RobotUA, WWW::RobotRules::AnyDBM_File .SH COPYRIGHT .IX Header "COPYRIGHT" .Vb 2 \& Copyright 1995\-2009, Gisle Aas \& Copyright 1995, Martijn Koster .Ve .PP This library is free software; you can redistribute it and/or modify it under the same terms as Perl itself.