HTTP: The Definitive Guide

Robotic HTTP

Robots are no different from any other HTTP client program. They too need to abide by the rules of the HTTP specification. A robot that makes HTTP requests and advertises itself as an HTTP/1.1 client needs to use the appropriate HTTP request headers.

Many robots try to implement the minimum amount of HTTP needed to request the content they seek. This can lead to problems; however, it's unlikely that this behavior will change anytime soon. As a result, many robots make HTTP/1.0 requests, because that protocol has few requirements.
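To make this concrete, here is a minimal sketch of the kind of bare-bones HTTP/1.0 request such a robot might send, using Python's standard socket module; the hostname and robot behavior are illustrative assumptions, not an implementation from the book.

```python
import socket

# A bare-bones HTTP/1.0 fetch of the sort a minimal robot might perform:
# one request per connection, no chunked transfer coding, no content
# negotiation. The Host header is not required by HTTP/1.0, but sending
# it keeps virtually hosted servers happy.
def minimal_fetch(host, path="/", port=80):
    request = (
        f"GET {path} HTTP/1.0\r\n"
        f"Host: {host}\r\n"
        "\r\n"
    )
    with socket.create_connection((host, port)) as sock:
        sock.sendall(request.encode("ascii"))
        response = b""
        # HTTP/1.0 servers close the connection when the response is done,
        # so read until recv() returns an empty byte string.
        while chunk := sock.recv(4096):
            response += chunk
    return response

# Example use (placeholder host):
# print(minimal_fetch("www.example.com")[:200])
```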

Identifying Request Headers

Despite the minimum amount of HTTP that robots tend to support, most do implement and send some identification headers—most notably, the User-Agent HTTP header. It's recommended that robot implementors send some basic header information to notify the site of the capabilities of the robot, the robot's identity, and where it originated.

This is useful information both for tracking down the owner of an errant crawler and for giving the server some information about what types of content the robot can handle. Some of the basic identifying headers that robot implementors are encouraged to implement are:


User-Agent
Tells the server the name of the robot making the request.

From
Provides the email address of the robot's user/administrator.[8]

Accept
Tells the server what media types are okay to send.[9] This can help ensure that the robot receives only content in which it's interested (text, images, etc.).

Referer
Provides the URL of the document that contained the currently requested URL.
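As an illustration, a robot built on Python's standard urllib could attach these identification headers as shown below; the robot name, contact address, and URLs are placeholders, not real services.

```python
import urllib.request

# Identification headers a polite robot might send. The robot name,
# contact address, and URLs below are assumed for illustration.
headers = {
    "User-Agent": "ExampleCrawler/1.0",              # name of the robot
    "From": "crawler-admin@example.com",             # email of the robot's administrator
    "Accept": "text/html, text/plain, image/*",      # media types the robot can handle
    "Referer": "http://www.example.com/index.html",  # page that contained the requested URL
}

request = urllib.request.Request("http://www.example.com/page.html", headers=headers)
with urllib.request.urlopen(request) as response:
    body = response.read()
```

Sending these headers costs the robot almost nothing, but gives the server administrator a way to identify and contact the robot's owner if it misbehaves.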
