With #18 fixed and the comments now being curlable, the next step is to make use of that curlability by becoming part of the Salmentions ecosystem – both receiving Salmentions from others and forwarding them.
This moves the endpoint closer to being able to take part in a SWAT0 dance.
- Fetch the WebMention-pinged target site if it isn't already in the database, store it in entries, and especially make sure to save its interaction targets.
- Fetch all interaction targets found that aren't already in the database and extract their WebMention endpoints. Save these as entries just like all other entries?
- Parse the comments included in the WebMention-pinged source.
- Fetch each found comment that isn't already in the database and save them as entries and mentions. If they aren't targeted at a registered site, then maybe mark them as "external" or something.
- Ensure that a too-deep comment tree doesn't break presentation – look at how e.g. Disqus does it – after X levels, don't indent the additional levels any further. This is a pure styling issue.
- Whenever all comments have been fetched, if any new ones were found, send a WebMention to each interaction target of the original WebMention-pinged target.
- Ensure that if an earlier post is updated to mention a post later in the chain, no infinite salmentioning will happen – this is probably more of a problem in the presentation part than in the pinging part. If the relation becomes circular, then any attempt to fetch a tree representation of that relation will be infinitely deep. A graph database would really make sense here – fetching an entire tree in PostgreSQL isn't really optimal, as one would need one JOIN per level (short of a recursive CTE).
- Support sending Salmentions for non-interactions – the plain mentions – without pinging hundreds of URLs. A tricky one.
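The circularity and indentation points above can be handled together when rendering the thread. The sketch below is only an illustration – the data shape (a flat list of mentions with `url` and `parentUrl`) and the names are assumptions, not the endpoint's actual schema. It walks the reply tree, breaks out of circular relations with a visited set, and caps the indent level after a fixed depth, the way e.g. Disqus does:

```javascript
// Beyond this depth, stop indenting further (pure styling guard)
const MAX_INDENT = 3;

function buildThread (mentions) {
  // Index mentions by their parent URL for cheap child lookups
  const byParent = new Map();
  for (const m of mentions) {
    const list = byParent.get(m.parentUrl) || [];
    list.push(m);
    byParent.set(m.parentUrl, list);
  }

  const visited = new Set();
  const rows = [];

  const walk = (parentUrl, depth) => {
    for (const m of byParent.get(parentUrl) || []) {
      if (visited.has(m.url)) { continue; } // circular relation – skip
      visited.add(m.url);
      rows.push({ url: m.url, indent: Math.min(depth, MAX_INDENT) });
      walk(m.url, depth + 1);
    }
  };

  walk(null, 0);
  return rows;
}
```

The visited set guarantees termination even if an earlier post is updated to mention a later one, while the depth cap keeps a deep thread presentable.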
So this more or less means that, rather than just fetching a single page – the source – per WebMention, the script would in the worst-case scenario have to fetch four different types of pages: the source, the target, all comments included in the source, and all targets of the target.
And when all that is done, it has to send WebMentions to all the WebMention endpoints of the target's targets.
So a source with a single comment, on a target with a single target, would mean fetching 4 pages and sending 1 WebMention, rather than just fetching 1 page as is done today. And 4 times as much metadata will be stored. That metadata could be used to better present reply context on e.g. the standalone embed pages, and could perhaps be used in other ways as well.
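The worst-case fan-out can be sketched as a small planning step. Everything here is illustrative – the shapes of `source` and `target` (each with a `url`, the source listing its comment URLs, the target listing its interaction targets) are made up for the example:

```javascript
// Enumerate the pages to fetch and the onward WebMentions to send
// for one incoming WebMention, in the worst case described above.
function planWork (source, target) {
  // Pages to fetch: the source, the target, every comment included
  // in the source, and every interaction target of the target.
  const fetches = [
    source.url,
    target.url,
    ...source.comments,
    ...target.targets
  ];

  // Onward Salmentions: one ping per interaction target of the target.
  const pings = target.targets.map(url => ({ source: target.url, target: url }));

  return { fetches, pings };
}
```

With a single comment and a single target this comes out at four fetches and one onward WebMention, matching the 4-pages/1-ping example above.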
A first full implementation of Salmentioning – fetching and sending – with tests, is now in master.
It still does some excessive fetching and pinging, it's possible for it to get stuck in an infinite loop, and there's not yet any way to opt a site in through the UI – but the basis is there and it works. Now it's just a matter of refining it.