Robots.txt handler


This plugin allows you to serve a robots.txt file from Trac. The contents of the wiki page named RobotsTxt are served as the robots.txt file. This is primarily useful to tracd users, since robots.txt has to be available at the root of the site.
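
A minimal sketch of the idea behind such a handler, using Trac's public component API (this is not the plugin's actual source, and the class name is made up):

from trac.core import Component, implements
from trac.web.api import IRequestHandler
from trac.wiki.model import WikiPage

class RobotsTxtSketch(Component):
    # Hypothetical sketch: serve the RobotsTxt wiki page as robots.txt.
    implements(IRequestHandler)

    # IRequestHandler methods
    def match_request(self, req):
        # Claim only the /robots.txt path of this Trac environment.
        return req.path_info == '/robots.txt'

    def process_request(self, req):
        # Use the text of the RobotsTxt wiki page as the response body.
        page = WikiPage(self.env, 'RobotsTxt')
        body = page.text if page.exists else ''
        req.send(body.encode('utf-8'), 'text/plain')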

Bugs/Feature Requests

Existing bugs and feature requests for RobotsTxtPlugin are here.

If you have any issues, create a new ticket.




Download

Download the zipped source from here.

There is also a version on PyPI.


Source

You can check out RobotsTxtPlugin from here using Subversion, or browse the source with Trac.


Installation

General instructions on installing Trac plugins can be found on the TracPlugins page.

To enable, add the following lines to your trac.ini file:

[components]
robotstxt.* = enabled
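
If you run tracd, restart it after enabling the component so the new handler is loaded; the contents of the RobotsTxt wiki page should then be served at /robots.txt under your Trac base URL.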

A typical RobotsTxt Wiki page will look as follows:

User-agent: *
Disallow: /browser
Disallow: /log
Disallow: /changeset
Disallow: /report
Disallow: /newticket
Disallow: /search
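
This example keeps crawlers out of Trac's dynamic, crawl-heavy views (the source browser, revision log, changesets, reports, the new-ticket form, and search) while leaving the wiki and individual tickets indexable.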

Recent Changes

14006 by rjollos on 2014-07-14 06:32:52
Encode content as utf-8. Patch by eseifert. Fixes #6379.
14005 by rjollos on 2014-07-14 06:28:57
Added 3-Clause BSD license text.
7204 by coderanger on 2009-11-30 09:08:25
0.11 version of RobotsTxt.


Author: coderanger
Maintainer: none (needsadoption)

Last modified on Jan 17, 2016, 4:28:35 PM