This repository has been archived by the owner on Feb 14, 2018. It is now read-only.

Fork of page with same name as local page overwrites #342

Open
rynomad opened this issue Jan 25, 2013 · 17 comments

Comments

@rynomad
Contributor

rynomad commented Jan 25, 2013

Just discovered this while demoing SFW to a friend. I had a bunch of sites running on a farm, some linked to one another, and I was showing off the neighborhood flag icon on the Welcome Visitors page, then decided to show off forking... Of course this overwrote my Welcome Visitors page with another, and it was unrecoverable. Probably more user error than code error, but a thought for making such things idiotproof (henceforth 'nomad-proof') would be: check if the page exists locally => append 'merge' to the slug => open 'slug' and 'merge-slug' side by side, integrate manually. Thoughts?
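
A minimal sketch of that guard, assuming the default file-system page store; the `pages_dir` layout and helper name are illustrative, not the server's actual code:

```ruby
require 'json'

# Hypothetical guard: divert an incoming fork to "<slug>-merge" when a
# local page with the same slug already exists, so both versions can be
# opened side by side and merged by hand.
def save_forked_page(pages_dir, slug, incoming_page)
  target = File.join(pages_dir, slug)
  # Never clobber an existing local page and its journal.
  target = File.join(pages_dir, "#{slug}-merge") if File.exist?(target)
  File.write(target, JSON.pretty_generate(incoming_page))
  File.basename(target)  # caller can open this slug next to the original
end
```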

@sergueif

Yep, same situation. Demo to a friend: edit his page -> auto-fork -> overwrite my own copy with all history lost. +1

@WardCunningham
Owner

I'm sorry you've lost a page. Welcome Visitors does seem especially vulnerable to this mistake. Replacing the journal makes it especially harsh.

I have thought that the proper action would be to merge the journals from where they diverged, possibly from the beginning. There is probably some drag and drop shuffling of the journal that can resolve conflicts, if any.

I've also thought that an un-delete should be available in a number of situations. This is one. Maybe replaced pages go into a /data/deleted directory.
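
A minimal sketch of "merge the journals from where they diverged", assuming journals are arrays of action hashes as in the page JSON; conflicts would still need the manual shuffling described above:

```ruby
# Take the shared prefix of two journals, then append what each side
# added after the divergence point (local actions first, then remote).
def merge_journals(local, remote)
  shared = local.zip(remote).take_while { |a, b| a == b }.map(&:first)
  shared + local.drop(shared.length) + remote.drop(shared.length)
end
```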

@rynomad
Contributor Author

rynomad commented Jan 27, 2013

+1 to merging from the point of divergence. The idea seems to mirror the way git handles the issue, and when I'm describing SFW to my geekier friends, my in-a-nutshell description is generally "it's like git meets wiki." A data/deleted directory would be a good stopgap though, and handy to have in any case, à la "deleting me softly: so we never have to say sorry".

@rynomad
Contributor Author

rynomad commented Jan 28, 2013

Further consideration: what if two wikis have pages with the same name but about completely different content? Example:

I run a development wiki for my spinoff of SFW, and I have a page about Sinatra, giving new developers some background on working with the framework. I take a break to browse the growing federation and find that someone has a page about Frank Sinatra (let's just assume they left off 'Frank' in the name). I like Frank, and I happen to know a good YouTube video of one of his better performances, so I want to add that to this other page. If SFW automatically merges these pages, you get a (reversible) mess of content that was never supposed to mix, and no way of pushing that content to another federated wiki.

Potential solution: rather than auto-forking any and all edits to origin while logged in, require that one actually press the 'fork' button. In all other cases, save into local storage and treat the edit as if the user were not logged in... allowing one to be logged into one's own wiki while simultaneously making local edits and 'submitting changes' to other wikis. Of course, if I find another wiki's page about the Sinatra framework and want to incorporate that content, 'merge' should still be a function of the fork button. But it seems SFW should allow the user to explicitly define what they want to do.

I'd be happy to take a stab at this: any pointers as to where in the server/client-side code I should look first?

@WardCunningham
Owner

An interesting experiment would be to detect a damaging fork (by looking in the neighborhood sitemap) and then produce the merge as a new ghost page instead of completing the fork. This would allow you to play with merging techniques and have some confidence in them before you start writing back to the page store.

You might require that two pages have at least some actions in common before you try to merge.
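
A sketch of that relatedness test, assuming journal actions carry item ids as they do in the page JSON:

```ruby
# Two pages are considered related only if their journals share at
# least one action id; otherwise a merge would mix unrelated content
# that merely collides on a slug.
def related_pages?(local_page, remote_page)
  local_ids  = Array(local_page['journal']).map { |a| a['id'] }.compact
  remote_ids = Array(remote_page['journal']).map { |a| a['id'] }.compact
  !(local_ids & remote_ids).empty?
end
```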

This is an area where I've tried to let operational experience guide further development. To that end I make many sites and muddle through with the basic tools already in place. Merging has been on my todo list since the beginning. I can't wait to see how well it works.

@WardCunningham
Owner

An exotic tool might be an action sorter that would allow one to drag merged actions around and see the impact that these changes have on a result. Actions that fail to apply might show as red. Maybe you could drag one back and forth to find where in the journal it would apply. Crazy things are possible.

@WardCunningham
Owner

This issue points out the need to not casually throw information away. Specifically, the within-page journal is not enough history to avoid all losses. I am on the verge of implementing a deleted page cache. I mention @nrn and @harlantwood now because they have written server-side storage modules. Here are my thoughts as to how delete should work (copied from here):

- Trash is one deep. Up to server for where it is stored.
- Deleted pages are retrieved with slug suffixed with _old.
- Deleted pages are included in a sitemap. How?
- Deleted pages appear in future pages and twin links.
- Deleted pages show as ghosts that can be forked back into existence.
- Deleted pages have revisions suffixed _old_rev15.
- Rename will leave old page in trash.
- Dangling references will find old page in future page. Do we want to remember the new name?
- Rename to blank is like move to trash.
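
A minimal sketch of the one-deep trash described in the list above, assuming the default file-system store; the directory names are illustrative, not the server's actual layout:

```ruby
require 'fileutils'

# Move a deleted (or about-to-be-replaced) page into data/deleted,
# keyed by its slug with an "_old" suffix. Trash is one deep, so a
# second delete of the same slug overwrites the earlier trash entry.
def move_to_trash(data_dir, slug)
  pages   = File.join(data_dir, 'pages')
  deleted = File.join(data_dir, 'deleted')
  FileUtils.mkdir_p(deleted)
  src = File.join(pages, slug)
  return unless File.exist?(src)
  FileUtils.mv(src, File.join(deleted, "#{slug}_old"), force: true)
end
```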

@rynomad
Contributor Author

rynomad commented Jan 31, 2013

I've been putting some thought into this, and poking around the code, and have come to a few preliminary conclusions:

Coding for merge is far beyond my current skill level, though I fully intend to break your program a half dozen times trying to make it work anyway.

I really like the idea of merge as a ghost page... anything that lessens spontaneous user panic is a plus, and accidentally overwriting or making large-scale changes to your hand-crafted page with an inadvertent or misunderstood double-click is up there on my list of panic inducers.

Likewise, I've been creating many wikis on my laptop farm and doing some exploration and use simulations, and more and more I'm running into scenarios where 'Alice' is logged in to her wiki and wants to contribute something, fix a typo, or join the discussion on 'Bob's' wiki. The end result she's seeking is not to have that content on her wiki, but rather to enrich Bob's wiki... The Submit Changes mechanism seems impossible to access while she is logged in. Her options are either to log out, make and submit her changes, and log back in, or to fork Bob's page and manually inform Bob to check out her version / hope he's got the activity plugin pointed at her so he can fork it back. Neither of these is as smooth or as effective as the local editing + Submit Changes mechanism already in place. (Side note: imagine a cumulative 'review/submit changes' button... you spend an hour posting to 10 of your friends' wikis, and on one page you review all those changes and submit with one click!)

^^ A little long-winded, and delayed; apologies on both counts. I'm looking at this project with what might be an overly specific vision of a federation/syndicate I'd like to spawn, so take all this with a grain of salt. That said, as opposed to merge logic, calling the local-edit function while logged in sounds like something that might fit within my current progression of learn-by-breaking-things-until-they-work-better, so I think I'll give it a shot and report back on how I like it.

Last thing, in reference to undelete: if you append a number to the end of a page's title slug, it seems to become completely invisible to wiki internal linking. That might be a quick way to implement an undelete key, letting deleted pages hide in plain sight until we iron out a design for a more structured data store.

@WardCunningham
Owner

All good observations. I worry about the case where 'Alice' is a spammer. We all suffer a little because there are spammers in the world.

@harlantwood

I'm happy to help as needed with thinking about (or reviewing the implementation of) the way these features interface with the "stores" in the Ruby server. It looks to me like these features are mostly at a higher level, however; i.e. simple Store.put_text(...) and Store.get_page(...) (etc.) calls will probably work fine across all Stores. Feel free to ping me anytime; I'm happy to brainstorm architecture or review code as is helpful.

@rynomad
Contributor Author

rynomad commented Feb 5, 2013

Indeed. Spam is massively unpleasant. I wonder if we could implement some sort of "accept submission" config for wiki owners (a sketch follows the list), i.e.:
- allow all submissions (current default), or
- allow submissions only from other wiki owners, or
- allow submissions from a select few, or
- don't allow submissions.
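
A minimal sketch of how such a setting might gate submissions, assuming a hypothetical per-site config hash; none of these keys exist in the SFW server today:

```ruby
def accept_submission?(config, submitter)
  case config.fetch(:submissions, :all)
  when :all       then true                                     # current default
  when :owners    then submitter[:owns_a_wiki]                  # other wiki owners only
  when :whitelist then config.fetch(:allowed, []).include?(submitter[:id])
  when :none      then false
  end
end
```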

Alternatively, it would be interesting if there were some way to 'report spammers' within the federation, some sort of distributed blacklist... with each submission, the server weighs your report card against the owner's 'benefit of the doubt' setting to determine whether your submission will be accepted.

Oh, and world peace would be cool too.

@rynomad
Contributor Author

rynomad commented Feb 5, 2013

Also, and this may be a naive, inane, unthinkable idea, but in theory could we use git itself to handle these logistics? Fetch, clone, pull, merge, push (with permission). There seem to be a few rubygems that deal with git. What's more, what about using git repos as one of the available stores?
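
A hedged sketch of the git-as-store idea: page JSON lives in a working tree and every put becomes a commit, so git supplies history (and, in principle, merge). The method names mirror the Store.get_page / Store.put_text calls mentioned above; the class itself is hypothetical and not part of the SFW server or polystore:

```ruby
# Hypothetical git-backed store; shells out to the git CLI.
class GitStore
  def initialize(repo_dir)
    @repo = repo_dir
    system('git', 'init', '-q', @repo) unless File.directory?(File.join(@repo, '.git'))
  end

  def get_page(slug)
    path = File.join(@repo, slug)
    File.read(path) if File.exist?(path)
  end

  def put_text(slug, text)
    File.write(File.join(@repo, slug), text)
    system('git', '-C', @repo, 'add', slug)
    system('git', '-C', @repo, 'commit', '-q', '-m', "update #{slug}")
  end
end
```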

@harlantwood

Hi @rynomad... as it happens, I've just split out my Ruby code abstracting stores into a gem called polystore -- which includes an (ugly but working) GitHub store.

Of course, this only handles get and put operations, not all the git operations you list above -- notably not merge, which is where things get really interesting.

It's a great topic, and totally in line with my larger mission to bring the fork/diff/merge information ecosystem to all kinds of creative works.

@WardCunningham
Owner

The simple solution is to not accept submissions. SFW doesn't need them. If you want me to follow your wiki there are many ways to reach me. Send me a link.

SFW is an editor that saves its work in files. That makes it as compatible with git as any other editor.

@rynomad
Contributor Author

rynomad commented Feb 5, 2013

@harlantwood thanks for those repo links, very good brainfood. I also liked the 'enlightened structure' site; that link turned into quite a rabbit hole.

@WardCunningham fair enough, I suppose I did kinda get off on a tangent with an overly specific use-scenario, thanks for reeling me back in. And thanks for shedding light on git compatibility. Since you mention it, I finally have my SFW farm up and running, and I recall promising you a link to an 'obscure workarounds' fork:

http://nei.ghbor.net:2013/view/obscure-workarounds

^gotta use port 2013 since I'm hosting this behind a residential broadband connection that blocks port 80.

@WardCunningham
Owner

Really, your connectivity provider blocks port 80? Here is a video where I explain why "service" is the highest and most creative kind of internet access. It should be illegal to block port 80.

@rynomad
Contributor Author

rynomad commented Feb 5, 2013

Aaaaand, once again I give myself the bonehead award. http://nei.ghbor.net is now live on port 80. I'll spare you the song and dance, but suffice it to say the real issue wasn't the ISP, it was PEBKAC. Good video though, I agree wholeheartedly.
