We don't make a wiki engine - we make a network of MoinMoin wikis. MoinMoin wikis can and should talk to each other, share pages, protect each other from spammers, and generally be friendly to each other.
In a perfect world, we would like a NetworkOfWikis which is the InterWiki, but if we want to make any progress, we must develop our own network first. We can do this quite easily, as we control both sides of the network. If we build the network in a smart way, it will be easy to join with other wiki engines later - but even if we don't, there are enough MoinMoin wikis to have a lot of fun.
-- NirSoffer 2004-08-25 11:52:28
How this could work
There are two possible architectures for implementing such a feature: client/server or peer-to-peer.
Peer to Peer
Wiki instances communicate directly with each other - like BitTorrent, which is also written in Python; maybe we can borrow some code.
- No central server is needed.
- Each wiki is both a client and a server, implementing the WikiPeerToPeerProtocol.
- The network load is spread across many wikis; no need for a central server or a fast connection.
- Cannot optimize as well as a central server in some cases.
To support different wiki engines, the engines will have to implement the WikiPeerToPeerProtocol.
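The WikiPeerToPeerProtocol is not specified yet, so the following is only a sketch of what a peer could look like, assuming XML-RPC as the transport (MoinMoin already has an XML-RPC interface). The class and method names (WikiPeer, notifyUpdate, announce) are made up for illustration:
{{{#!python
from xmlrpc.server import SimpleXMLRPCServer
import xmlrpc.client

class WikiPeer:
    """One wiki instance acting as both client and server."""

    def __init__(self, my_url, peer_urls):
        self.my_url = my_url
        self.peer_urls = peer_urls   # other wikis we know about

    # Server side: a method other peers call on us.
    def notifyUpdate(self, page_name, revision):
        """A peer tells us that a shared page has a new revision."""
        # A real wiki would decide here whether to pull the new revision.
        return True

    # Client side: we call the same method on other peers.
    def announce(self, page_name, revision):
        for url in self.peer_urls:
            proxy = xmlrpc.client.ServerProxy(url)
            proxy.notifyUpdate(page_name, revision)

def serve(peer, port):
    """Expose the peer's methods over XML-RPC (blocks forever)."""
    server = SimpleXMLRPCServer(("0.0.0.0", port), allow_none=True)
    server.register_instance(peer)
    server.serve_forever()
}}}
A real peer would of course have to authenticate its peers and decide which pages are shared; the point of the sketch is just that the client and server sides live in the same wiki process.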
Client/Server
In this scenario the wikis are the clients of one server holding the shared data.
The server will implement the WikiServerProtocol
The wikis will implement the WikiClientProtocol
- The wikis do not have to know each other.
- Nice for "first contact" scenarios (like the InterWiki map).
- Can also work for wikis which are not accessible from the public internet, if they pull their data from the server.
To support different wiki engines, either the server will be extended to work with different engines, or the engine will implement the WikiClientProtocol.
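A sketch of the pull side of such a WikiClientProtocol: assuming the shared data is published as a raw page over plain HTTP (the URL and local path below are only examples), a wiki behind a firewall can still refresh its copy with a simple outgoing request:
{{{#!python
import urllib.request

# Example URL only - the real location of the shared data is not decided.
MASTER_URL = "http://moinmaster.wikiwikiweb.de/BadContent?action=raw"

def pull_bad_content(url=MASTER_URL):
    """Fetch the raw text of the shared BadContent page."""
    with urllib.request.urlopen(url) as response:
        return response.read().decode("utf-8")

def update_local_copy(path="data/BadContent"):
    """Refresh the wiki's local copy. Only an outgoing HTTP request
    is needed, so this works for wikis behind a firewall too."""
    with open(path, "w", encoding="utf-8") as f:
        f.write(pull_bad_content())
}}}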
Possible services for a central server:
Central InterWiki map service
MultiLang service
AdoptedPages service
Mix
We already have a central server that we use to develop content: MoinMaster. Its load is very low - we just edit a page from time to time. But if all wikis download data from this server, it may need a faster connection. Also, the server could be attacked by bad guys.
We can use the central server to start the transaction; then, once each wiki has got the new data, it can become a server itself and serve all the other wikis.
For example, delivering a new BadContent version:
- MoinMaster notifies two wiki instances that it has a new BadContent.
- The wiki instances pull the new BadContent.
- Each wiki notifies another two (possibly random) wikis...
In a few iterations, the new BadContent file will be on every MoinMoin wiki, with hardly any load on the central server. If MoinMaster is killed, we can start the process from another server.
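A toy simulation of that fan-out, just to get a feeling for the numbers (the wiki names and the fanout of 2 are made up; checking that the data is valid is deliberately left out, see the note below):
{{{#!python
import random

def propagate(wikis, fanout=2):
    """Simulate the fan-out: each wiki that got the update notifies
    `fanout` wikis that do not have it yet, which then do the same."""
    updated = set()             # wikis that already pulled the new BadContent
    notifiers = ["MoinMaster"]  # MoinMaster starts the process
    rounds = 0
    while len(updated) < len(wikis):
        rounds += 1
        newly_updated = []
        for _ in notifiers:
            missing = [w for w in wikis if w not in updated]
            for peer in random.sample(missing, min(fanout, len(missing))):
                updated.add(peer)           # peer pulls the update...
                newly_updated.append(peer)  # ...and becomes a notifier itself
        notifiers = newly_updated
    return rounds

wikis = ["wiki%d" % i for i in range(1000)]
print(propagate(wikis), "rounds to reach 1000 wikis")
}}}
With a fanout of 2 the number of updated wikis roughly doubles each round, so about ten rounds cover a thousand wikis, and each wiki only has to serve two downloads.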
To make sure that all wikis really do get the information, and that the information they get is valid, you have to use a somewhat more complicated mechanism than the one described above... Nevertheless, this is an interesting idea.