How to secure MoinMoin with HTTPS

Suppose you have enabled HTTPS access to secure editor logins on your MoinMoin server.

Now you probably have three problems:

  1. Making sure that users actually use HTTPS to log in.

  2. Preventing Google from indexing your website twice (once over HTTP and once over HTTPS), which would 1. lower your page rank and 2. produce duplicate search results.
  3. Redirecting ordinary visitors back to HTTP to reduce CPU usage (in case someone posted an HTTPS URL to your wiki somewhere on the internet).

The following examples are for Apache and require mod_rewrite to be enabled (with RewriteEngine On in the relevant virtual host or .htaccess).

Automatic redirection

Redirect users who click "Login" (or perform similar actions) to the https:// site.

RewriteCond %{HTTPS}            !on
RewriteCond %{QUERY_STRING}     ^action=(login|newaccount|userprefs)
RewriteRule ^.                  https://%{SERVER_NAME}%{REQUEST_URI}?%{QUERY_STRING} [L]

Optionally, redirect HTTPS users back to HTTP when: 1. they aren't logged in (no cookie), 2. the current page isn't a login-related page, and 3. the referring page doesn't contain "action=login" (so users can still read wrong-password error messages).

RewriteCond %{HTTPS}            on
RewriteCond %{HTTP_COOKIE}      ^$
RewriteCond %{QUERY_STRING}     !action=(login|newaccount|userprefs)
RewriteCond %{HTTP_REFERER}     !https://.*action=(login|newaccount|userprefs)
RewriteRule ^.                  http://%{SERVER_NAME}%{REQUEST_URI}?%{QUERY_STRING} [L]
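The combined logic of the two rule sets can be sketched in Python. This is a simplified, hypothetical helper for illustration only, not part of moin or Apache:

```python
# Simplified model of the two mod_rewrite rule sets above.
LOGIN_ACTIONS = ("login", "newaccount", "userprefs")

def redirect_scheme(is_https, query, cookie, referer):
    """Return the scheme to redirect to, or None to leave the request alone."""
    starts_with_login = query.startswith(
        tuple("action=" + a for a in LOGIN_ACTIONS))
    mentions_login = any("action=" + a in query for a in LOGIN_ACTIONS)
    referer_is_login = referer.startswith("https://") and any(
        "action=" + a in referer for a in LOGIN_ACTIONS)

    if not is_https and starts_with_login:
        return "https"   # force login-related requests onto TLS
    if is_https and not cookie and not mentions_login and not referer_is_login:
        return "http"    # anonymous reader: fall back to plain http
    return None          # otherwise, serve the request as-is
```

For example, redirect_scheme(False, "action=login", "", "") returns "https", while a logged-in user (non-empty cookie) stays on HTTPS.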

It is probably sensible to set cookie_secure = True in your MoinMoin configuration file (read more in HelpOnConfiguration).
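In wikiconfig.py this is a one-line setting. A sketch, assuming the usual Config class layout; the import and class name may differ in your moin version:

```python
# wikiconfig.py (excerpt) -- import and class name may differ in your setup
from MoinMoin.config import multiconfig

class Config(multiconfig.DefaultConfig):
    # Mark the session cookie Secure: browsers send it only over https
    cookie_secure = True
```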

Note: with cookie_secure = True the session cookie is only transmitted over HTTPS (so it is secure). If you visit an http URL of your wiki, moin won't receive the session cookie and won't know that you are logged in. There is an idea to implement an insecure WANT_HTTPS = True cookie, so that a user who holds a secure session cookie can be redirected automatically to the https URL, where the browser will transmit the session cookie.
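What cookie_secure does, in effect, is add the Secure attribute to the Set-Cookie header. A minimal Python illustration (MOIN_SESSION is moin's default session cookie name):

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["MOIN_SESSION"] = "abc123"
cookie["MOIN_SESSION"]["secure"] = True  # browsers only send it over https

print(cookie.output())
# → Set-Cookie: MOIN_SESSION=abc123; Secure
```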

Prevent robots from indexing the https wiki

To prevent search engines from indexing your wiki twice (and avoid duplicate results), you can serve a different robots.txt over HTTPS:

RewriteCond %{HTTPS}            on
RewriteRule ^/robots.txt$       /path/to/robots_https.txt [L]

The file /path/to/robots_https.txt would contain:

# robots.txt that prevents indexing of the wiki
# (for https only)
User-agent: *
Disallow: /
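You can sanity-check that these two rules block everything with Python's urllib.robotparser (the URL below is just a placeholder):

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
# Feed it the same two rules as robots_https.txt
parser.parse(["User-agent: *", "Disallow: /"])

print(parser.can_fetch("*", "https://example.org/FrontPage"))
# → False: every path is disallowed for every robot
```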


Keywords: ssl, duplicate, 443, google, yahoo, secure, crypted

MoinMoin: FranklinPiat/UsingHttps (last edited 2011-02-18 12:40:37 by ThomasWaldmann)