Description
Subpages of 4ct10n with names of actions cannot be created. Instead of a helpful error message, the user gets a traceback.
This is a regression from 1.5.
Steps to reproduce
Try to create the page 4ct10n/edit.
Component selection
- URL dispatching
Details
This wiki.
Workaround
Discussion
Change the URL scheme to /<handler>/<pagename>/.
Examples:
- http://moinmo.in/pages/FrontPage - instead of http://moinmo.in/FrontPage
- http://moinmo.in/edit/FrontPage - instead of http://moinmo.in/4ct10n/edit/FrontPage?action=edit&editor=foo
- http://moinmo.in/search/ - instead of http://moinmo.in/pagename?action=fullsearch
- http://moinmo.in/pages/ - can replace TitleIndex
- http://moinmo.in/admin/ - can replace the SysAdmin macro
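A minimal sketch of how such a scheme could be wired up, using Werkzeug's URL routing purely for illustration (the endpoints and rules below are assumptions, not MoinMoin's actual dispatcher):

    from werkzeug.routing import Map, Rule

    # One rule per handler; the page name becomes an explicit URL segment.
    url_map = Map([
        Rule('/pages/', endpoint='index'),             # would replace TitleIndex
        Rule('/pages/<path:pagename>', endpoint='show'),
        Rule('/edit/<path:pagename>', endpoint='edit'),
        Rule('/search/', endpoint='search'),
        Rule('/admin/', endpoint='admin'),              # would replace the SysAdmin macro
    ])

    urls = url_map.bind('moinmo.in')
    print(urls.match('/edit/FrontPage'))   # ('edit', {'pagename': 'FrontPage'})
    print(urls.match('/pages/FrontPage'))  # ('show', {'pagename': 'FrontPage'})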
Nir,
I guess everybody can find some action URL prefix that he doesn't need to create as a page (the one this wiki uses, for example, or simply 'action' for most wikis that don't talk about actions the way this one does).
Of course we can make some pretty error message for unknown actions instead of a traceback, but I guess we have more severe bugs to fix than pretty processing of manually created wrong URLs.
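For illustration only, such a graceful fallback could look like the following sketch (hypothetical function and names, not MoinMoin's actual action dispatcher):

    def dispatch(action_name, registered_actions, pagename):
        # Look up the requested action; return a readable error message
        # instead of letting an exception reach the user as a traceback.
        handler = registered_actions.get(action_name)
        if handler is None:
            return "Unknown action %r on page %r." % (action_name, pagename)
        return handler(pagename)

    actions = {'edit': lambda pagename: 'editing %s' % pagename}
    print(dispatch('edit', actions, 'FrontPage'))    # editing FrontPage
    print(dispatch('bogus', actions, 'FrontPage'))   # Unknown action 'bogus' on page 'FrontPage'.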
Using /pages for standard page display is IMHO not an option because it makes often-used URLs longer and less pretty than they are now. It would also break all links from external sites.
BTW, I used /<common_action_prefix>/<actionname> to make it pretty easy to exclude all actions in robots.txt (that's the whole point of doing this).
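For illustration, with a common action prefix like the 4ct10n one used here, a single prefix rule is enough (a sketch of such a robots.txt, assuming that prefix):

    User-agent: *
    Disallow: /4ct10n/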
-- ThomasWaldmann 2007-11-05 17:29:50
Did you try this rule? It works with the Google and Yahoo bots.
- Googlebot robots.txt docs: http://www.google.com/support/webmasters/bin/answer.py?answer=40367&ctx=sibling
- Yahoo bot docs: http://help.yahoo.com/l/us/yahoo/search/webcrawler/slurp-02.html
- Web robots and dynamic content issues: http://www.ghita.ro/article/23/web_robots_and_dynamic_content_issues.html
    User-agent: *
    Disallow: /*?action=*
Ah, interesting. Seems like those did some sane implementation. The problem is just that nobody updated the robots.txt standard (AFAIK), so you can't be sure that everybody is doing it like this (everybody who wants to follow the robots.txt standard). -- ThomasWaldmann 2007-11-06 16:30:03
Plan
- Priority:
- Assigned to:
- Status: no bug occurs; the namespace is simply unavailable