id,summary,reporter,owner,description,type,status,priority,milestone,component,version,severity,resolution,keywords,cc,branch,changelog,apichanges,internalchanges
6124,Trac should ship with a default robots.txt file,freddie@…,Jonas Borgström,"It would be convenient if Trac shipped with a robots.txt file out of the box, designed to stop search engines from indexing every possible page/revision/log combination. Googlebot, for example, will on its first pass attempt to view and index every reachable page on the site, which, given Trac's GET-query URL scheme, means it can easily make 40,000+ requests while indexing a single site. To save administrators the hassle of, first, serving many thousands of mostly unnecessary bot requests and, second, formulating their own robots.txt file, it would be wise to ship one that prevents bots from fetching diffs and old source revisions (which are unlikely to make it into the index anyway).",enhancement,closed,normal,,general,,normal,wontfix,robots crawler robots.txt,ilias@…,,,,
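A minimal robots.txt along the lines the ticket requests might look like the sketch below. The Disallow paths assume Trac's default URL layout (/changeset, /log, /browser) and are illustrative only, not an official Trac recommendation; deployments mounted under a URL prefix would need the paths adjusted, and query-string rules like the rev= pattern are a non-standard wildcard extension honored by Googlebot but not by every crawler.

```
# Illustrative robots.txt for a Trac site -- a sketch, not shipped by Trac.
# Paths assume the default Trac URL layout; adjust for a deployment prefix.
User-agent: *
# Diffs between revisions
Disallow: /changeset/
# Revision logs
Disallow: /log/
# Old source revisions via the repository browser (wildcard query-string
# matching is a Googlebot extension, not part of the original standard)
Disallow: /browser/*?rev=
```

This keeps current pages indexable while cutting off the page/revision/log combinations that generate the bulk of the crawl traffic described in the ticket.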