Limit search engine indexing of source browser
Recently, when Google started indexing my Trac-based site, I noticed it would also index all the various link permutations of the [Subversion] source browser (i.e. the `order` parameter). This of course can cause a lot of additional load on the server when it [the search engine] doesn't need all these different views of the same content. Really I want it to index things only once (for just each …).
Would it be possible (or does something already exist that I can't find) to have an option to generate minimal browsing links based on the client? Since the duplication comes from specific URL parameters, I can't just use a simple robots.txt solution.
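(For what it's worth, Googlebot does honor a non-standard wildcard extension to robots.txt that can match query strings, though other crawlers may ignore it, so it's only a partial workaround. A sketch, assuming the browser lives under `/browser`:)

```
User-agent: Googlebot
# Non-standard wildcard syntax; not guaranteed by the robots.txt spec.
Disallow: /browser/*?*order=
```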
If not already done, perhaps something modeled as:
- Have a core module/plugin that adds a role (e.g. `SEARCH_ENGINE`) based on the client's User-Agent header (e.g. `Googlebot/*`) or the client's DNS name. I would not be surprised if a generic module already existed that could do this step.
- In the source browser module, if the `SEARCH_ENGINE` role (or maybe a custom assigned name) was set, then it would omit all the extra "user friendly" links.
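The first step above could be sketched roughly like this (the pattern list and function name are hypothetical, not an existing Trac API):

```python
import re

# Hypothetical bot patterns; a real deployment would maintain a longer
# list, or additionally verify the client via reverse DNS lookup.
BOT_USER_AGENTS = [
    r"Googlebot",
    r"bingbot",
    r"Slurp",  # Yahoo's crawler
]
_BOT_RE = re.compile("|".join(BOT_USER_AGENTS), re.IGNORECASE)

def is_search_engine(user_agent):
    """Return True if the User-Agent header looks like a crawler,
    i.e. the request should be given the SEARCH_ENGINE role."""
    return bool(user_agent and _BOT_RE.search(user_agent))
```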
The same pruning could of course be done for any other modules that provide multiple views of the same content.
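The pruning itself could be as simple as dropping any link that differs only by a redundant "view" parameter when the role is set (again, a hypothetical sketch, not actual Trac code; the parameter names are assumptions):

```python
# Hypothetical: query parameters that only produce alternate views of
# the same content (sort order, reverse listing, annotation, etc.).
REDUNDANT_PARAMS = {"order", "desc", "annotate"}

def prune_links(links, client_is_search_engine):
    """Given candidate (label, url) pairs, drop the ones that differ
    only by a redundant view parameter when serving a crawler."""
    if not client_is_search_engine:
        return links
    kept = []
    for label, url in links:
        query = url.partition("?")[2]
        params = {p.partition("=")[0] for p in query.split("&") if p}
        if params & REDUNDANT_PARAMS:
            continue  # skip alternate-view permutations of the same page
        kept.append((label, url))
    return kept
```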
Sorry if I'm just being picky and most users don't care about limiting search indexing.