
Changes between Version 109 and Version 110 of SpamFilter


Timestamp: Jan 20, 2015, 9:09:26 PM
Author: figaro
Comment: Cosmetic changes, link updates

= Trac Spam Filtering
[[PageOutline(2-3)]]

This plugin provides several ways to reject contributions that contain spam. It requires at least Trac release 1.0. The source code for version 0.12 and earlier is no longer updated, but is still available.

The spamfilter plugin has many options, but most of them are optional. Out of the box the plugin provides basic spam protection. The following steps may be helpful, in order of importance:
 * Train the Bayes database using the entries of the log to activate that filter and reach good performance. The Bayes filter requires spambayes.
 * Set up a !BadContent page containing regular expressions to filter.
 * Get API keys for Akismet, Mollom and/or HTTP:BL to use external services.
 * Activate the captcha rejection handler to improve user treatment; this requires reCAPTCHA access when that method is used.
 * Finetune the karma settings and parameters for your system, eg you may increase karma for well-trained Bayes filters or stop trusting registered users.
 * If necessary, get API keys for other services and activate them.

WebAdmin is used for configuration, monitoring, and training. For monitoring and training purposes, it optionally logs all activity to a table in the database. Upgrading the environment is necessary to install the database table required for this logging.

== How good is the filtering?

The spam filter will never be perfect. You need to check submissions to Trac and improve training or settings of the filter when necessary. But a well-trained setup will help you run a site even if it is actively spammed, ie thousands of spam attempts a day. Even large sites with completely anonymous edits are possible.

From time to time spam attacks will nevertheless succeed and manual work is required. Try removing successful spam as fast as possible: the longer it stays in the pages, the harder your work will get. Some spammers even monitor successful attempts and retry more intensely.

Spam should be removed completely, also from the page history. Trac has options to delete tickets as well as wiki page versions. If done early enough, this does not produce gaps in the page history. Spam can also be in uploaded files. Delete them!

Some spam bots edit a page twice, whereby the last change is harmless and the previous one contains the spam. Sometimes spam is added by humans, and while usually successful, humans are easily discouraged by fast deletion.

The Bayes filter, when properly trained, usually has the best detection rates and can be adapted quickly to new attacks by training the successful spam attempts. Akismet is a good second line of defense and it also uses adaptive algorithms. Training also helps the external service when a new type of attack begins. All other services are good at catching spam inserted through rather dumb methods, which is the majority.

A realistic goal is something like 1 spam slipping through for every 10,000 attempts. However, for a new type of spam wave, which happens once or twice a year, maybe 10-20 slip through at the start of the wave. False rejects should be in the order of one rejection per 1,000 or more successful submissions.

== Supported Internal Filtering Strategies

The individual strategies assign scores or "karma" to submitted content, and the total karma determines whether a submission is rejected or not.

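As a toy illustration of how the karma total decides acceptance (the filter names, point values and threshold below are made up and are not the plugin's actual defaults):
{{{
#!python
# Each filter strategy contributes positive or negative karma points.
scores = {"regex": -5, "akismet": -10, "session": 2}   # hypothetical values
total_karma = sum(scores.values())

REJECT_BELOW = 0   # hypothetical threshold
if total_karma < REJECT_BELOW:
    print("rejected as spam (karma %d)" % total_karma)
else:
    print("accepted (karma %d)" % total_karma)
}}}
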
=== Regular Expressions

The [source:plugins/1.0/spam-filter/tracspamfilter/filters/regex.py regex] filter reads a list of regular expressions from a wiki page named "BadContent", each regular expression being on a separate line inside the first code block on the page, using the [https://docs.python.org/2/library/re.html Python syntax] for regular expressions.

If any of those regular expressions matches the submitted content, the submission will be rejected.

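For illustration, the first code block of a !BadContent page could contain lines like the following; these are made-up example patterns, not a recommended rule set:
{{{
(?i)cheap\s+(viagra|cialis)
\bonline\s+casino\b
https?://[^\s]*\.spam-example\.com
}}}
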
=== Regular Expressions for IP

The [source:plugins/1.0/spam-filter/tracspamfilter/filters/ip_regex.py ip_regex] filter reads a list of regular expressions from a wiki page named "BadIP", each regular expression being on a separate line inside the first code block on the page, using the [https://docs.python.org/2/library/re.html Python syntax] for regular expressions.

If any of those regular expressions matches the submitter's IP, the submission will be rejected.

Regular expressions are much too powerful for the simple task of matching an IP or an IP range, but to keep things simple for users the design is the same as for the content-based regular expressions. You can specify full IPv4 addresses even though the dot has a special meaning in regular expressions, as the match will still work correctly. Only when matching partial addresses is more care needed.

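For example, a !BadIP page could contain entries like these (documentation addresses used as placeholders); note how the partial range escapes the dots and anchors the start of the address:
{{{
192.0.2.17
^198\.51\.100\.
}}}
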
=== IP Throttling

The [source:plugins/1.0/spam-filter/tracspamfilter/filters/ip_throttle.py ip_throttle] filter limits the number of posts per hour allowed from a single IP.

When this limit is exceeded, the filter starts giving submissions negative karma as specified by the `ip_throttle_karma` option.

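A minimal sketch of tuning this in [wiki:TracIni trac.ini]; the `ip_throttle_karma` option name comes from the filter above, while the `[spam-filter]` section name and the value shown are assumptions of this sketch:
{{{
#!ini
[spam-filter]
; hypothetical weight: points deducted once the hourly post limit is exceeded
ip_throttle_karma = 3
}}}
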
=== Captcha

Support for CAPTCHA-style "human" verification is integrated. Captcha usage is configured in the 'Captcha' administration page.

Currently the following captcha types are supported:
 * Simple text captcha: Spam robots can bypass these, so they are not recommended.
 * Image captcha
 * External reCAPTCHA service: To use the reCAPTCHA method, you'll need to sign up at [https://www.google.com/recaptcha/intro/index.html] and set the keys in the 'Captcha' administration page.
 * External !KeyCaptcha service: To use the !KeyCaptcha method, you'll need to sign up at [http://www.keycaptcha.com/] and set the user id and key in the 'Captcha' administration page. Note: requires JavaScript on the user side.
 * External AreYouAHuman service: To use the AreYouAHuman method, you'll need to sign up at [http://www.areyouahuman.com/] and set the keys in the 'Captcha' administration page. Note: requires JavaScript and Flash on the user side.

The captcha in spamfilter is a rejection system: a captcha is only displayed when a submission would otherwise be rejected as spam. In that case a successfully solved captcha can increase the score of a submission. If a submission has too many spam points, even a successfully solved captcha can't save it, eg when the score is 30 and a captcha only removes 20 points.

=== Bayes

The Bayes filter is a very powerful tool when trained and used properly:

 * When beginning, the filter needs a minimum of 25 entries for HAM (useful entries) and also for SPAM (advertising). Simply train every submission you get until these limits are reached.
 * The training is done in the Administration Menu "Spam Filtering / Monitoring". You have the following buttons:
  * ''Mark selected as Spam'' - Mark the entries as SPAM and train them in the Bayes database (not visible by default in newer versions)
  * ''Mark selected as Ham'' - Mark the entries as HAM and train them in the Bayes database (not visible by default in newer versions)
     
 * Rules for a well-trained database are:
  * Don't train the same content multiple times
  * HAM and SPAM count should be nearly equal; in reality you will have more SPAM, but a ratio of 1 to 5 should be the maximum
  * Start from scratch when results are poor
  * It is hard to get rid of training errors, so be careful
  * See the [http://spambayes.org/background.html SpamBayes pages] for more details.
 * Strategy for Trac usage:
  * Use the ''Delete selected as Spam'' and ''Delete selected as Ham'' buttons
  * Remove every strange entry using ''Delete selected'', eg SandBox stuff
  * Train every valid HAM entry, or the database will get unbalanced
  * Be sure to train every error: rejected user submissions as well as undetected SPAM
  * Train every SPAM entry with a score below 90%; at the beginning you may train everything below 100%
  * Delete SPAM entries with a high score; 100% in any case, and after the beginning phase everything above 90%
  * When in doubt whether an entry is SPAM or HAM, delete it
 * NOTE: When Akismet, Defensio, !BlogSpam or !StopForumSpam (with API key) are activated, training will also send the entries to these services.
 * If you append the parameter "num" with values between 5 and 150 to the monitoring page URL, {{{url.../admin/spamfilter/monitor?num=100}}}, you can show more entries, but don't train a very large dataset at once.

=== !TrapField

The [source:plugins/1.0/spam-filter/tracspamfilter/filters/trapfield.py TrapField] filter uses a hidden form field to check content for possible spam. If enabled, an additional benefit is usually better performance for some of the external services as well.

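The idea is the classic "honeypot" field: a field that is hidden from human visitors, so any submission that fills it in is almost certainly a bot. A generic sketch of such markup follows; the field name and styling are arbitrary and not the plugin's actual form field:
{{{
<!-- hidden from humans via CSS; bots that auto-fill every field reveal themselves -->
<input type="text" name="website" value="" style="display: none" autocomplete="off">
}}}
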
== Supported External Filtering Strategies

=== IP Blacklisting

The [source:plugins/1.0/spam-filter/tracspamfilter/filters/ip_blacklist.py ip_blacklist] filter uses the third-party Python library [http://www.dnspython.org/ dnspython] to make DNS requests to a configurable list of IP blacklist servers.

See [http://spamlinks.net/filter-dnsbl-lists.htm SpamLinks DNS Lists] for a list of DNS-based blacklists. A blacklist usable for this filter must return an IP for listed entries and no IP (NXDOMAIN) for unlisted entries.

'''NOTE''': The submitter's IP is sent to the configured servers.

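To illustrate what such a lookup does, here is a minimal, standalone sketch using dnspython directly; the blacklist zone is just an example and this is not the plugin's own code:
{{{
#!python
import dns.resolver

def is_listed(ip, zone="dnsbl.example.org"):
    """Return True if `ip` is listed on the given DNS blacklist zone."""
    # DNSBLs are queried with the IPv4 octets reversed, for example
    # 192.0.2.99 -> 99.2.0.192.dnsbl.example.org
    name = ".".join(reversed(ip.split("."))) + "." + zone
    try:
        dns.resolver.query(name, "A")    # any A record means "listed"
        return True
    except dns.resolver.NXDOMAIN:        # NXDOMAIN means "not listed"
        return False

print(is_listed("192.0.2.99"))
}}}
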
=== Akismet

The [source:plugins/1.0/spam-filter/tracspamfilter/filters/akismet.py Akismet] filter uses the [http://akismet.com/ Akismet web service] to check content for possible spam.

The use of this filter requires a [http://www.wordpress.com Wordpress] API key. The API key is configured in the 'External' administration page.

'''NOTE''': Submitted content is sent to Akismet servers. Don't use this in private environments.

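For reference, a stripped-down sketch of the kind of request the Akismet comment-check API expects might look like this; it uses the `requests` library, the key and field values are placeholders, and the actual filter sends more metadata:
{{{
#!python
import requests

API_KEY = "your-wordpress-api-key"           # placeholder
url = "https://%s.rest.akismet.com/1.1/comment-check" % API_KEY

payload = {
    "blog": "http://example.org/trac",       # URL of your Trac site (placeholder)
    "user_ip": "192.0.2.99",                  # submitter's IP
    "user_agent": "Mozilla/5.0",              # submitter's user agent
    "comment_content": "Buy cheap watches!",  # the submitted text
}
resp = requests.post(url, data=payload)
print(resp.text)   # "true" if Akismet considers it spam, "false" otherwise
}}}
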
=== Defensio

The [source:plugins/1.0/spam-filter/tracspamfilter/filters/defensio.py Defensio] filter uses the [http://defensio.com/ Defensio web service] to check content for possible spam.

The use of this filter requires an API key. The API key is configured in the 'External' administration page.

'''NOTE''': Submitted content is sent to Defensio servers. Don't use this in private environments.

'''Status''': This service seems to have a relatively poor detection ratio. It also appears to have been discontinued.

=== Mollom

The [source:plugins/1.0/spam-filter/tracspamfilter/filters/mollom.py Mollom] filter uses the [http://mollom.com/ Mollom web service] to check content for possible spam.

The use of this filter requires API keys. These API keys are configured in the 'External' administration page.

'''NOTE''': Submitted content is sent to Mollom servers. Don't use this in private environments.

=== !StopForumSpam

The [source:plugins/1.0/spam-filter/tracspamfilter/filters/stopforumspam.py StopForumSpam] filter uses the [http://stopforumspam.com/ StopForumSpam web service] to check content for possible spam. This service tests the IP, username and/or email address.

Training this filter requires an API key. The API key is configured in the 'External' administration page.

'''NOTE''': The submitted username and IP are sent to !StopForumSpam servers. Don't use this in private environments.

=== !LinkSleeve

The [source:plugins/1.0/spam-filter/tracspamfilter/filters/linksleeve.py LinkSleeve] filter uses the [http://linksleeve.org/ LinkSleeve web service] to check content for possible spam.

'''NOTE''': Submitted content is sent to !LinkSleeve servers. Don't use this in private environments.

=== !BlogSpam

The [source:plugins/1.0/spam-filter/tracspamfilter/filters/blogspam.py BlogSpam] filter uses the [http://blogspam.net/ BlogSpam web service] to check content for possible spam.

This service also includes DNS checks and services identical to the checks in this plugin. Be sure to set proper karma, or these checks will be counted twice. You can also disable individual checks in the preferences.

'''NOTE''': Submitted content is sent to !BlogSpam servers. Don't use this in private environments.

=== HTTP:BL

The [source:plugins/1.0/spam-filter/tracspamfilter/filters/httpbl.py HTTP:BL] filter uses the [http://www.projecthoneypot.org/httpbl.php Project HoneyPot HTTP:BL web service] to check content for possible spam.

The use of this filter requires an [http://www.projecthoneypot.org/httpbl_configure.php HTTP:BL] API key. The API key is configured in the 'External' administration page.

'''NOTE''': The submitter's IP is sent to HTTP:BL servers.

=== !BotScout

The [source:plugins/1.0/spam-filter/tracspamfilter/filters/botscout.py BotScout] filter uses the [http://botscout.com/ BotScout web service] to check content for possible spam. This service tests the IP, username and/or email address.

Using this filter requires an API key. The API key is configured in the 'External' administration page.

'''NOTE''': The submitted username and IP are sent to !BotScout servers. Don't use this in private environments.

=== FSpamList

The [source:plugins/1.0/spam-filter/tracspamfilter/filters/fspamlist.py FSpamList] filter uses the [http://www.fspamlist.com/ FSpamList web service] to check content for possible spam. This service tests the IP, username and/or email address.

Using this filter requires an API key. The API key is configured in the 'External' administration page.

'''NOTE''': The submitted username and IP are sent to FSpamList servers. Don't use this in private environments.

== Get the Plugin

See the [wiki:TracPlugins#Requirements Trac plugin requirements] for instructions on installing `setuptools`. `Setuptools` includes the `easy_install` application, which you can use to install the SpamFilter:
{{{
#!sh
easy_install TracSpamFilter
}}}

You can [source:plugins/1.0/spam-filter browse the source in Trac].

''[http://svn.edgewall.com/repos/trac/plugins/1.0/spam-filter/#egg=TracSpamFilter-dev This is a link for setuptools to find the SVN download]''.

== Enabling the Plugin

If you install the plugin globally, as described [wiki:TracPlugins#ForAllProjects here], you also need to enable it in the web administration or in [wiki:TracIni trac.ini]:
{{{
#!ini
[components]
tracspamfilter.* = enabled
}}}

 * All external services can be disabled in the 'External' section, either completely or only for training.

== Permissions

The Spamfilter adds several new permissions to Trac:

||=Permission=||=Functions=||
     
|| SPAM_ADMIN || Combination of all six ||

The permission SPAM_REPORT should probably not be assigned to unauthenticated users, or else there will be many false reports. At a minimum, the '/reportspam' URL should be excluded in the robots.txt file, for example as shown below.

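For example, a robots.txt exclusion for that URL could look like this; the path prefix depends on where your Trac environment is served:
{{{
User-agent: *
Disallow: /reportspam
}}}
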
== SpamFilter and !AccountManager

If the [[http://trac-hacks.org/wiki/AccountManagerPlugin|AccountManager]] plugin is used in version 0.4 or later, then registrations can be checked for spam as well. To do so, the entry **!RegistrationFilterAdapter** needs to be added to the key **register_check** in the section **account-manager** of the Trac configuration (see the sketch after this list).
There are several ways to do this:
* Add it first in the list: the filter then displays reject reasons in the spamfilter log.
* Add it last in the list: the accountmanager checks run first, and only if they all pass is the spamfilter called.
* Enable "account_replace_checks" to let the spamfilter perform the Accountmanager checks (not recommended).

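A minimal sketch of the first variant in [wiki:TracIni trac.ini]; the check names after the adapter are placeholders for whatever your !AccountManager setup already lists in `register_check`:
{{{
#!ini
[account-manager]
; RegistrationFilterAdapter placed first; the remaining checks are placeholders
register_check = RegistrationFilterAdapter, BasicCheck, EmailCheck
}}}
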
== Translation

Please help translate the plugin into your language: [https://www.transifex.com/projects/p/Trac_Plugin-L10N/resource/spamfilter/]

Top translations: Trac_Plugin-L10N » [http://www.transifex.net/projects/p/Trac_Plugin-L10N/resource/spamfilter/ spamfilter][[BR]]

Kindly provided by [[Image(https://ds0k0en9abmn1.cloudfront.net/static/charts/images/tx-logo-micro.png, link=http://www.transifex.net/, title=the open translation platform, valign=bottom)]]

== Further Reading
 * Historic information about !SpamFilter: [http://www.cmlenz.net/archives/2006/11/managing-trac-spam Managing Trac Spam]

== Known Issues
'''Attention''': dnspython v1.7 causes a massive slowdown of the Trac site.
[[TicketQuery(component=plugin/spamfilter,status=!closed)]]

== Requirements

 * The modules for IP blacklisting and HTTP:BL require [http://www.dnspython.org/ dnspython v1.8+]. Install "setuptools" as described in the [wiki:TracPlugins#Requirements Trac plugin requirements], then run `easy_install dnspython` to install the package.
 * The !ImageCaptcha requires [http://www.pythonware.com/products/pil/ python-imaging] to work.
 * Bayes filtering requires [http://spambayes.sourceforge.net/ spambayes].
 * The Mollom filter requires [https://github.com/simplegeo/python-oauth2 python-oauth2].

----