One Big Wiki

Once wiki pages have been published to the permanent wiki, they are immutable and effectively reside in one huge global namespace.

Luckily, the current wiki farm file structure scales well to cover all wikis, with each domain represented by a folder.

The opportunity, then, is also to create a global index, or other useful indexes, on import. We could, and should, use a database for more complex queries or for Fluid Groups, but the following functionality alone adds considerable value:

This form of wiki is completely public. You are effectively writing with a global community of wiki authors. No database would be needed to find other authors who have written a page with the same name.
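One way to see why no database is needed: because published pages are immutable, a simple in-memory index from page name to domains can be built once on import and never invalidated. The data shapes below are assumptions for illustration.

```javascript
// Sketch: index page slugs to the domains that publish a page with
// that name. With immutable published pages, this can be built once
// on import; a same-name lookup is then just a Map access.
// (The { domain, slug } record shape is an assumption.)
function buildTitleIndex(pages) {
  const index = new Map();
  for (const { domain, slug } of pages) {
    if (!index.has(slug)) index.set(slug, []);
    index.get(slug).push(domain);
  }
  return index;
}

// Usage: which domains have written "one-big-wiki"?
const index = buildTitleIndex([
  { domain: 'a.wiki.org', slug: 'one-big-wiki' },
  { domain: 'b.wiki.org', slug: 'one-big-wiki' },
  { domain: 'b.wiki.org', slug: 'another-page' },
]);
// index.get('one-big-wiki') -> ['a.wiki.org', 'b.wiki.org']
```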

This option could be problematic at scale, or it could be very powerful. Either way, it would be only one of many interesting options that this architecture opens to the author.

Just as easy would be to create a namespace for authoring that includes only a subset of authors. Such a snapshot would not need to replicate the actual data; instead, it would provide an alternative set of link-lists rendered as wiki-page.json.
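The key point is that a snapshot holds links, not copies. A sketch, with hypothetical field names in the spirit of wiki-page.json:

```javascript
// Sketch: a "namespace snapshot" for a subset of authors. Only page
// references (domain + slug) are kept; the immutable page bodies stay
// where they were published, so no data is replicated.
// (Field names here are assumptions.)
function makeSnapshot(allPages, authors) {
  const wanted = new Set(authors);
  return allPages
    .filter(p => wanted.has(p.author))
    .map(({ domain, slug }) => ({ domain, slug })); // links only, no body
}

// Usage: a namespace containing only ward's pages
const snapshot = makeSnapshot(
  [
    { author: 'ward', domain: 'ward.wiki.org', slug: 'one-big-wiki', body: '...' },
    { author: 'someone-else', domain: 'other.wiki.org', slug: 'misc', body: '...' },
  ],
  ['ward']
);
// snapshot -> [{ domain: 'ward.wiki.org', slug: 'one-big-wiki' }]
```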

This works in much the same way as git: cloning, and providing access to multiple versions of the wiki, should be fast and require almost no extra disk space. To see this structure at its most powerful, see IPLD.

One Big Wiki is looking promising, not just for assets in IPFS, but as an effective server solution behind Dream Wiki.