#478 closed defect (fixed)
Disallow web robots indexing of old Wiki info
| Reported by: | daniel | Owned by: | daniel |
|---|---|---|---|
| Priority: | normal | Milestone: | 0.8 |
| Component: | wiki system | Version: | 0.7 |
| Severity: | minor | Keywords: | noindex |
| Cc: | | Branch: | |
| Release Notes: | | | |
| API Changes: | | | |
| Internal Changes: | | | |
Description
Trac should send an HTML meta tag instructing search robots not to index old versions of pages or diffs:

```html
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
```
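A quick way to verify the behavior once implemented is to fetch an old revision and look for the tag. A minimal sketch in Python; the base URL and page name are placeholders, not from this ticket:

```python
# Fetch an old wiki revision and check for the robots meta tag.
# The base URL and page name below are hypothetical placeholders.
import urllib.request

url = 'http://localhost:8000/trac/wiki/WikiStart?version=1'
html = urllib.request.urlopen(url).read().decode('utf-8', 'replace')
print('NOINDEX' in html.upper())  # expect True for an old version
```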
Change History (9)
comment:1 by , 22 years ago
| Summary: | Disallow web robots to old Wiki info → Disallow web robots indexing of old Wiki info |
|---|---|
comment:2 by , 22 years ago
http://www.searchengineworld.com/cgi-bin/sim_spider.cgi
comment:4 by , 22 years ago
| Owner: | changed from to |
|---|---|
| Status: | new → assigned |
comment:6 by , 21 years ago
| Resolution: | fixed → |
|---|---|
| Status: | closed → reopened |
Reopening; the most recent version of source:trunk/trac/Wiki.py seems to have this backwards:
```python
if version:
    self.add_link('alternate',
                  '?version=%s&format=txt' % version, 'Plain Text',
                  'text/plain')
else:
    self.add_link('alternate', '?format=txt', 'Plain Text',
                  'text/plain')
    # Ask web spiders to not index old versions
    req.hdf['html.norobots'] = 1
```
This block sets the html.norobots flag when version isn't specified (i.e., for the most recent page), which isn't what you want, I think. Moving the html.norobots assignment into the `if version:` branch will fix this, as sketched below.
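A minimal sketch of the suggested fix, with the flag moved into the `if version:` branch (same code as above, rearranged):

```python
if version:
    self.add_link('alternate',
                  '?version=%s&format=txt' % version, 'Plain Text',
                  'text/plain')
    # Ask web spiders to not index old versions
    req.hdf['html.norobots'] = 1
else:
    self.add_link('alternate', '?format=txt', 'Plain Text',
                  'text/plain')
```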
comment:7 by , 21 years ago
| Resolution: | → fixed |
|---|---|
| Status: | reopened → closed |
Good catch. This regression has been fixed in [1313]. (Please file issues like this as new tickets in the future, though.)
comment:9 by , 15 years ago
| Keywords: | noindex added |
|---|---|