<web> A proposal to prevent the havoc wreaked by many early web robots that retrieved documents too rapidly or retrieved documents with side effects (such as voting). The proposed standard for robot exclusion offers a solution to these problems in the form of a file called "robots.txt", placed in the document root of the website, that tells robots which parts of the site they should not retrieve.
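A minimal sketch of how a well-behaved robot honours such a file, using Python's standard `urllib.robotparser` module. The "robots.txt" contents and the paths checked here are hypothetical examples, not part of the standard itself:

```python
from urllib import robotparser

# Parse an example robots.txt as it might appear at a site's document root.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",        # rules apply to all robots
    "Disallow: /cgi-bin/",  # keep robots away from scripts with side effects
    "Crawl-delay: 10",      # ask robots to wait between requests
])

# A compliant robot checks each URL before fetching it.
rp.can_fetch("ExampleBot", "/cgi-bin/vote")  # False: excluded
rp.can_fetch("ExampleBot", "/index.html")    # True: allowed
rp.crawl_delay("ExampleBot")                 # 10 seconds between requests
```

In practice a robot would call `rp.set_url(".../robots.txt")` and `rp.read()` to fetch the file itself before crawling the site.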
W3C standard (http://w3.org/TR/html4/appendix/notes.html#h-B.4.1.1).
(2006-10-17)