Description
Compared to 1.8, 1.9 is using an enormous amount of memory serving the same wikis.
Steps to reproduce
- run 1.9
- run 1.8 in the same configuration
- wait a few hours
- observe large memory usage of 1.9 compared to 1.8
Example
n/a
Component selection
- general
Details
MoinMoin Version | 1.9 hg as of 2009-12-16
OS and Version |
Python Version |
Server Setup |
Server Details |
Language you are using the wiki in (set in the browser/UserPreferences) |
Workaround
kill moin periodically before it eats too much memory -- I used to do this once a day, but that is no longer sufficient
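For example (hypothetical - the exact command depends on the server setup), a daily cron entry such as:
0 4 * * * /etc/init.d/apache2 restart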
Discussion
Doing some benchmarks for memory usage?
This could be a killer for my hosting setup. I use a shared hosting plan from http://www.webfaction.com, running my little private wiki farm with Apache and 1 or 2 mod_wsgi processes with some threads, and also restart after every 1'000 requests (thanks to maximum-requests). At the moment I am allowed to use around 120 MB of memory and am using around 50-110 MB (depending on the traffic). If MoinMoin 1.9 really needs 25% more memory, then I may start to have some bigger problems too (sure, I could just restart every 500 requests or decrease the threads / processes). But I would like to verify this somehow. My idea would go like this:
A simple text page:
ab -c 10 -n 1000 http://moinmo.in/HelpContents
A page with a macro that does some file I/O:
ab -c 10 -n 1000 http://moinmo.in/RecentChanges
Trying out the Xapian search:
ab -c 10 -n 1000 "http://moinmo.in/HelpContents?action=fullsearch&context=180&value=help&fullsearch=Text"
Sure, the number of requests should maybe be increased, depending on how long the server needs to serve this stuff, etc. Also, the test should be done locally and definitely not with a proxy in between... Even with a simple bash script we could request a mix of different pages to emulate real-life user behaviour a bit better...
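Since the memory script below is Python anyway, here is a rough Python 2 sketch of such a mixed-page load (instead of a bash script); the base URL, page list, and request count are only placeholders:

# load.py - request a small mix of pages in a loop to (very roughly) emulate
# real user behaviour; adjust BASE, PAGES, and the loop count to your setup.
import urllib2

BASE = "http://localhost:8080"
PAGES = ["/HelpContents", "/RecentChanges", "/TitleIndex",
         "/HelpContents?action=fullsearch&context=180&value=help&fullsearch=Text"]

def main():
    for i in range(100):                      # 100 rounds * 4 pages = 400 requests
        for page in PAGES:
            body = urllib2.urlopen(BASE + page).read()
            print "%3d %-70s %6d bytes" % (i, page, len(body))

if __name__ == '__main__':
    main()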
And use some script to see the memory usage before and after (or even run it via a cron job every minute):
""" mem.py - A script which calculates, formats, and displays a customer's memory usage. Change WsgiUserName to your needs. output: memory mb; process """ import os import sys from time import * CMD = "ps -U WsgiUserName -o rss,cmd" def main(): ausgabe = os.popen(CMD) usage = 0 pos = -1 datum = strftime("%c", localtime()) for zeile in ausgabe: pos = pos + 1 zeile = zeile.lstrip() zeile = zeile.rstrip("\n") zeile = zeile.partition(" ") # print "%s. %s (%s kb)" % (pos, zeile[2], zeile[0]) if zeile[2].find("peruser") < 0: try: usage = usage + int(zeile[0]) except ValueError: pass else: pos = pos -1 print "%s;%s;%s" % (datum, usage / 1024, pos) if __name__ == '__main__': main()
There is also a quite outdated page about MoinBenchmarks. Maybe we should define some defaults and test every beta/rc against them to see the difference.
Just some ideas... maybe the moin developers are already using some other method... -- MarcelHäfner 2009-12-16 11:41:34
Yes, please do some comparative memory consumption measurements to add more facts to this bug report. Make sure you use a comparable environment - maybe use apache/mod_wsgi for both 1.8.6 and 1.9.0 (one could also use wikiserver.py, but it is slightly different between 1.8 and 1.9, not sure whether that matters). For mod_wsgi, use 1 process only and unlimited requests for running the test. Maybe also do some GETs on TitleIndex. -- ThomasWaldmann 2009-12-16 15:29:21
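A minimal sketch of what such an apache/mod_wsgi test setup could look like - user, paths, and thread count are placeholders, the point is processes=1 and no maximum-requests option (i.e. unlimited requests per process):

WSGIDaemonProcess moin-test user=wsgiuser processes=1 threads=10
WSGIProcessGroup moin-test
WSGIScriptAlias / /srv/moin/server/moin.wsgi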
Same problem here with Moin+FastCGI+Lighttpd. Moin starts at about 20 MB of memory. After 5 hours, it has reached about 500 MB. Moving to forking and forcing flup to restart Moin processes after 5 requests seems to help. -- JeanPhilippeGuérard 2010-01-15 06:55:26
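For reference, the switch Jean-Philippe describes might look roughly like this in the moin.fcgi startup script - a sketch only, assuming flup's fcgi_fork server with its maxRequests option and the standard 1.9 make_application entry point:

from flup.server.fcgi_fork import WSGIServer      # forking instead of threaded flup server
from MoinMoin.web.serving import make_application

application = make_application(shared=True)
# let every child exit after a few requests so leaked memory is returned
# to the OS (5 is the value mentioned above)
WSGIServer(application, maxRequests=5).run()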
Plan
- Priority:
- Assigned to:
- Status: