Martín-Galán, Bonifacio; Hernández-Pérez, Tony; Rodríguez-Mateos, David; Peña-Gil, Daniel. Uso de robots.txt y sitemaps en la administración pública española. El profesional de la información, 2009, vol. 18, n. 6, pp. 625-630. [Journal article (Paginated)]
Full text: 05.pdf (published version), available under a Creative Commons Attribution Non-commercial Share Alike license.
English abstract
The use of robots.txt and sitemaps in the Spanish public administration. Robots.txt files and sitemaps are the main methods of regulating search engine crawler access to a web site's content. This article explains the importance of these files and analyzes the robots.txt files and sitemaps of more than 4,000 web sites belonging to the Spanish public administration, to determine how these files are used to optimize sites for crawlers.
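As a minimal illustration of the mechanism the article studies, the sketch below shows how a well-behaved crawler consults robots.txt rules before fetching pages, using Python's standard urllib.robotparser. The robots.txt content and the example.gob.es URLs are hypothetical, not drawn from the study's data.

```python
# Minimal sketch: a polite crawler checks robots.txt before fetching a URL.
# The robots.txt content and URLs are hypothetical examples, not study data.
from urllib.robotparser import RobotFileParser

EXAMPLE_ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Sitemap: https://www.example.gob.es/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(EXAMPLE_ROBOTS_TXT.splitlines())

# Each candidate URL is tested against the rules before it is downloaded.
print(parser.can_fetch("*", "https://www.example.gob.es/index.html"))  # True
print(parser.can_fetch("*", "https://www.example.gob.es/private/x"))   # False
```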
Spanish abstract
The importance of robots.txt files and sitemaps for web sites is explained. More than 4,000 Spanish public administration web sites are studied to analyze the use of robots.txt files and sitemaps as a means of optimizing sites for search engine crawlers (spiders).
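In the same illustrative spirit, the sketch below generates the kind of minimal sitemap file the study looks for, using Python's standard xml.etree.ElementTree. The URLs and dates are placeholders, not data from the article.

```python
# Minimal sketch of generating a sitemap.xml for crawlers, using only the
# Python standard library. URLs and dates are placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

urlset = ET.Element("urlset", xmlns=NS)
for loc, lastmod in [
    ("https://www.example.gob.es/", "2009-11-01"),
    ("https://www.example.gob.es/servicios.html", "2009-10-15"),
]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod  # optional freshness hint

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8",
                             xml_declaration=True)
```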
Item type: Journal article (Paginated)
Keywords: Robots; Crawlers; Sitemaps; Motores de búsqueda; Recuperación de información; Visibilidad; Sitios web; Search engines; Information retrieval; Visibility; Web sites
Subjects: H. Information sources, supports, channels > HQ. Web pages; I. Information treatment for information services > IC. Index languages, processes and schemes; L. Information technology and library technology > LS. Search engines
Depositing user: Ana Ribaguda
Date deposited: 06 Feb 2016 10:14
Last modified: 06 Feb 2016 10:14
URI: http://hdl.handle.net/10760/28919