Description
After upgrading to 1.5.5a and cleaning up the cache with the cachecleaner (or more accurately with the ScriptMarket/AllWikisHack), some pages don't show up in linkto: search results. After thorough investigation, it appears that some pages have an empty (or almost empty: containing just a \n) cache/pagelinks file. Those files were generated when I ran the first search, which produced thousands of messages like this one:
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/MoinMoin/Page.py", line 1529, in parsePageLinks
    page.send_page(request, content_only=1)
  File "/usr/local/lib/python2.4/site-packages/MoinMoin/Page.py", line 1235, in send_page
    start_line=pi_lines)
  File "/usr/local/lib/python2.4/site-packages/MoinMoin/Page.py", line 1314, in send_page_content
    self.format(parser)
  File "/usr/local/lib/python2.4/site-packages/MoinMoin/Page.py", line 1335, in format
    parser.format(self.formatter)
  File "/usr/local/lib/python2.4/site-packages/MoinMoin/parser/wiki.py", line 1017, in format
    self.processor_is_parser)
  File "/usr/local/lib/python2.4/site-packages/MoinMoin/formatter/base.py", line 317, in processor
    p.format(self)
  File "/usr/local/www/wiki/wikis/koumbitwiki/data/plugin/parser/sctable.py", line 567, in format
    wikiizer.format(formatter)
  File "/usr/local/lib/python2.4/site-packages/MoinMoin/parser/wiki.py", line 1102, in format
    formatted_line = self.scan(scan_re, line)
  File "/usr/local/lib/python2.4/site-packages/MoinMoin/parser/wiki.py", line 878, in scan
    result.append(self.replace(match))
  File "/usr/local/lib/python2.4/site-packages/MoinMoin/parser/wiki.py", line 907, in replace
    result.append(replace(hit))
  File "/usr/local/lib/python2.4/site-packages/MoinMoin/parser/wiki.py", line 354, in _word_repl
    return (self.formatter.pagelink(1, word, anchor=anchor) +
  File "/usr/local/lib/python2.4/site-packages/MoinMoin/formatter/pagelinks.py", line 15, in pagelink
    FormatterBase.pagelink(self, on, pagename, page, **kw)
  File "/usr/local/lib/python2.4/site-packages/MoinMoin/formatter/base.py", line 82, in pagelink
    pagename = self.request.normalizePagename(pagename)
  File "/usr/local/lib/python2.4/site-packages/MoinMoin/request.py", line 866, in normalizePagename
    if wikiutil.isGroupPage(self, page):
  File "/usr/local/lib/python2.4/site-packages/MoinMoin/wikiutil.py", line 616, in isGroupPage
    filter = re.compile(request.cfg.page_group_regex, re.UNICODE)
  File "/usr/local/lib/python2.4/sre.py", line 180, in compile
    return _compile(pattern, flags)
  File "/usr/local/lib/python2.4/sre.py", line 222, in _compile
    if not sre_compile.isstring(pattern):
RuntimeError: maximum recursion depth exceeded
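For reference, the broken cache files can be spotted directly on disk. The following is only a minimal sketch, assuming the usual MoinMoin 1.5 layout where each page keeps its link cache in data/pages/<PageName>/cache/pagelinks; the data directory is taken from the traceback above and may need adjusting for your setup:

{{{
# Sketch: list pages whose pagelinks cache is empty or whitespace-only.
# Assumes the MoinMoin 1.5 on-disk layout data/pages/<PageName>/cache/pagelinks.
import os
import glob

DATA_DIR = "/usr/local/www/wiki/wikis/koumbitwiki/data"   # adjust to your wiki

for path in glob.glob(os.path.join(DATA_DIR, "pages", "*", "cache", "pagelinks")):
    # an empty file (or one holding only a "\n") means the page's links were
    # never cached correctly, so the page is missing from linkto: results
    if not open(path).read().strip():
        print(path)
}}}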
Update:
Here are the last 300 lines of my Apache error log: backtrace300.log
-- TheAnarcat 2006-09-30 21:36:46
Those logs are not enough to see the recursion; please change your Page.py parsePageLinks method to use traceback.print_exc(200) and send us the resulting (longer) traceback. A (minimal) sample of a page causing this error would be nice, too.
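Roughly, the change being asked for looks like the sketch below. The wrapper name is hypothetical and the real try/except sits inside Page.py's parsePageLinks around its send_page() call, so treat this as an illustration of where traceback.print_exc(200) goes, not as an exact patch:

{{{
import traceback

def parse_page_links_with_debug(page, request):
    # hypothetical wrapper; in the real code this try/except surrounds the
    # send_page() call inside MoinMoin/Page.py's parsePageLinks
    try:
        page.send_page(request, content_only=1)
    except Exception:
        # print up to 200 stack frames instead of the short default, so the
        # recursion behind "maximum recursion depth exceeded" becomes visible
        traceback.print_exc(200)
}}}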
Steps to reproduce
- Clean the cache
- Run a search
Example
Details
MoinMoin Version | 1.5.5a
OS and Version | FreeBSD 6.1
Python Version | 2.4.2
Server Setup | Apache 2.0.59 + fastcgi
Server Details | -
Language you are using the wiki in (set in the browser/UserPreferences) | French
Workaround
Deleting the cache for the pages whose caches are corrupted makes the search include those pages properly again.
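That cleanup can be scripted along the same lines as the detection sketch above. This is again only a sketch under the same layout assumption; MoinMoin should rebuild the pagelinks cache the next time the page is rendered or searched:

{{{
# Sketch: remove empty/whitespace-only pagelinks cache files so they get
# regenerated.  Same MoinMoin 1.5 layout assumption as above.
import os
import glob

DATA_DIR = "/usr/local/www/wiki/wikis/koumbitwiki/data"   # adjust to your wiki

for path in glob.glob(os.path.join(DATA_DIR, "pages", "*", "cache", "pagelinks")):
    if not open(path).read().strip():
        # drop the corrupted (empty) link cache for this page
        print("removing %s" % path)
        os.remove(path)
}}}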
Discussion
Plan
- Priority:
- Assigned to:
- Status: fixed in the 1.5 and 1.6 branches