
Opened 17 years ago

Closed 17 years ago

Last modified 10 years ago

#5584 closed task (invalid)

how to apply robots.txt?

Reported by: gsmdib@… Owned by: Jonas Borgström
Priority: normal Milestone:
Component: general Version:
Severity: normal Keywords:
Cc: Branch:
Release Notes:
API Changes:
Internal Changes:

Description

This is probably not solely related to trac, but still, could you please advise how to make robots.txt available for a trac instance? Thanks.

Attachments (0)

Change History (11)

comment:1 by Noah Kantrowitz, 17 years ago

How are you running the Trac? If via Apache or Lighty, use their normal methods for serving files. If you are using tracd, you can look at the RobotsTxt plugin.

comment:2 by gsmdib@…, 17 years ago

Via apache. Could you please advise how to implement the 'normal method'? TIA.

comment:3 by sid, 17 years ago

Resolution: invalid
Status: new → closed

A simple web search will turn up plenty of documentation on how to do this, for example a search for 'apache robots.txt'.

This is not a Trac issue.

comment:4 by anonymous, 17 years ago

When hosting trac with apache and mod_python you have to explicitly unset any Python handlers for the robots.txt file:

<Location "/robots.txt">
    SetHandler None
</Location>

comment:5 by anonymous, 17 years ago

Also note that you have to put that entry (Location "/robots.txt"…) *after* the Python handler, otherwise it won't help.
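
To illustrate the ordering point above, here is a minimal sketch (the paths and the environment name are examples, not taken from a real installation). Apache applies `<Location>` sections in the order they appear in the configuration, so the `SetHandler None` block must come after the mod_python block to override it for `/robots.txt`:

```apache
# mod_python handler for a Trac instance mounted at the server root
# (example paths; adjust TracEnv to your own environment)
<Location "/">
    SetHandler mod_python
    PythonHandler trac.web.modpython_frontend
    PythonOption TracEnv /var/trac/myproject
</Location>

# Must appear *after* the block above, otherwise mod_python
# still intercepts requests for /robots.txt.
<Location "/robots.txt">
    SetHandler None
</Location>
```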

in reply to:  5 comment:6 by anonymous, 17 years ago

I put the robots.txt file in the htdocs/ directory. Trac 0.10.4 can find it there.

comment:7 by s.trac@…, 15 years ago

If like me you are having difficulties because trac resides in the root, i.e. trac.foo.com, and trac.foo.com/robots.txt gives "environment not found", then this may help if you are using apache:

RewriteEngine On
RewriteRule ^/robots.txt /htdocs/robots.txt [PT]
Alias /htdocs /where/ever/trac/is/htdocs

YMMV.

comment:8 by Sebastian Krysmanski <sebastian@…>, 15 years ago

Cc: sebastian@… added

Just a note: I've created a Python script that automatically creates a robots.txt file for multiple Trac environments. You can find it here. Although it's programmed to be used in conjunction with the Mass Trac Provider Project, you could easily adjust it to your own needs.

comment:9 by anonymous, 15 years ago

The above samples are overly complicated; just use this:

    Alias /robots.txt /var/www/trac-robots.txt

or wherever else you want to put your robots.txt file. The sample given earlier is good, but you can also use the following to simply block all robots from accessing everything in the context where you put the alias (VirtualHost, server, etc.):

User-agent: *
Disallow: /

See also #6124.

comment:10 by gregory@…, 15 years ago

For me a combination of

Alias /robots.txt /export/home/trac/base/htdocs/robots.txt
<LocationMatch "/robots.txt">
  SetHandler None
</LocationMatch>

worked.

Last edited 10 years ago by Ryan J Ollos

comment:12 by Sebastian Krysmanski <sebastian@…>, 10 years ago

Cc: sebastian@… removed

Note: See TracTickets for help on using tickets.