I have a couple of use cases in mind where a file-based wiki engine like PeppermintyWiki should shine compared to other wiki engines. They are:
portability - in the sense that you can take the MD files you have in one PeppermintyWiki instance and copy/move them to another PeppermintyWiki instance
disaster recovery - when something goes wrong with your web hosting but you can still access all your MD files (or you have a backup), you just spin up another PeppermintyWiki instance and put all your MD files in there
external editing - edit your MD files outside of the web UI and then ask the wiki engine to pick all of these changes up. For example, in Enhancement: Git integration #192, @npnance wanted some automation so that MD files would be pushed to / pulled from a git repo
As far as I can see, PeppermintyWiki relies heavily on a number of extra .json files.
Some of them act as a cache (like idindex.json or statsindex.json). To my understanding, such files can be safely dropped without losing any useful info, as they are recreated by the wiki engine or from the CLI console.
Others provide additional data or logs, like <Page Name>.comments.json, which holds page comments, or recent-changes.json, which is kept updated as you update pages.
The pageindex.json is the most important file here: it holds info about which pages exist, their mapping to file names, and what tags and revisions they have.
In my experiments with the use cases above, it seems that for now it's enough to delete pageindex.json, so that it gets regenerated by the wiki engine based on the actual files found in the directory. This regeneration process even manages to pick up revisions, good stuff! But that delete means losing the tagging / dates / author information that was stored in that file for all the pages.
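To make concrete what gets lost, here is a purely illustrative PHP snippet showing the kind of per-page record pageindex.json keeps; the field names are my guesses for the sake of the example, not necessarily the engine's exact schema:

<?php
// Illustrative only: roughly the shape of a pageindex record (field names assumed).
$pageindex['Syntax Test'] = [
    'filename'     => 'syntax/test.md',   // path to the MD file on disk
    'tags'         => ['demo', 'syntax'], // metadata that is lost on regeneration
    'lastmodified' => 1700000000,         // unix timestamp of the last edit
    'lasteditor'   => 'admin',            // also lost on regeneration
    'size'         => 1234,               // file size in bytes
];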
Here I come to the essence of my feature request. I wish there was a better way of making PeppermintyWiki aware of external changes made to its MD files: some kind of update or reconcile process that can be triggered from the web UI and the CLI console. During that reconciliation, whatever inconsistencies are found between the real files on disk and pageindex.json would be automatically resolved by making the appropriate changes to pageindex.json:
if an MD file is no longer found, delete the corresponding JSON record
if a new MD file is found, create a JSON record for it, as happens now
if an MD file has a different size or timestamp compared to what is stored in pageindex.json, update the JSON record
It probably also makes sense to reflect all of that in recent-changes.json to stay consistent. A rough sketch of what such a reconcile pass could look like follows below.
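Here is a minimal standalone sketch of that reconcile pass, assuming a flat data directory and a pageindex record shape with filename / size / lastmodified fields; it is not PeppermintyWiki's actual code, and a real implementation would also need to handle subfolders, revisions, tags and recent-changes.json:

<?php
// Sketch of the reconcile pass described above (assumed record shape, not the real API).
$data_dir = __DIR__;
$index_file = "$data_dir/pageindex.json";
$pageindex = file_exists($index_file)
    ? json_decode(file_get_contents($index_file), true)
    : [];
if (!is_array($pageindex)) $pageindex = [];

// Rule 1: drop records whose MD file no longer exists on disk.
foreach ($pageindex as $pagename => $record) {
    if (!file_exists("$data_dir/" . $record['filename']))
        unset($pageindex[$pagename]);
}

// Rules 2 & 3: add records for new MD files, refresh size / timestamp for changed ones.
foreach (glob("$data_dir/*.md") as $filepath) {
    $filename = basename($filepath);
    $pagename = basename($filename, '.md');
    $size = filesize($filepath);
    $mtime = filemtime($filepath);
    if (!isset($pageindex[$pagename])) {
        $pageindex[$pagename] = [
            'filename' => $filename,
            'size' => $size,
            'lastmodified' => $mtime,
        ];
    } elseif ($pageindex[$pagename]['size'] !== $size
           || $pageindex[$pagename]['lastmodified'] !== $mtime) {
        $pageindex[$pagename]['size'] = $size;
        $pageindex[$pagename]['lastmodified'] = $mtime;
    }
}

file_put_contents($index_file, json_encode($pageindex, JSON_PRETTY_PRINT));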
I can probably blow some dust off my rusty PHP skills to make it happen, if you're willing to accept my pull request once it's done.
It should also fix a bug the current version has: when you remove the pageindex.json file and let PeppermintyWiki regenerate it, and some of your MD files are in subfolders (like ./syntax/test.md.r0), the generated filenames for revisions get corrupted because of inconsistent filename processing:
'filename' => 'ntax/test.md.r0',
instead of
'filename' => 'syntax/test.md.r0',
I have moved the filename processing code into a new normalize_filename() function in 05-functions.php, so the same processing is reused everywhere instead of the different bespoke code that was scattered around before.
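For reference, here is a rough sketch of the kind of helper I mean; it is not the actual code from the pull request, just an illustration of stripping the storage directory prefix in one place so the same relative filename is stored regardless of how the file was discovered (the $storage_dir parameter is my own addition to keep the example self-contained):

<?php
// Illustrative sketch only, not the PR's implementation of normalize_filename().
function normalize_filename(string $filepath, string $storage_dir): string {
    // Normalise separators and ensure exactly one trailing slash on the prefix,
    // so "./" vs absolute prefixes can't skew how many characters get cut off.
    $storage_dir = rtrim(str_replace('\\', '/', $storage_dir), '/') . '/';
    $filepath = str_replace('\\', '/', $filepath);
    if (strpos($filepath, $storage_dir) === 0)
        $filepath = substr($filepath, strlen($storage_dir));
    return ltrim($filepath, '/');
}

// e.g. normalize_filename('./wiki-data/syntax/test.md.r0', './wiki-data')
//      returns 'syntax/test.md.r0' rather than a truncated 'ntax/test.md.r0'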