Merge branch 'main' into feature/programatically-add-tags-to-posts
carbontwelve committed Jan 27, 2023
2 parents eb83767 + a3a1fe6 commit ad2c720
Showing 61 changed files with 425 additions and 49 deletions.
25 changes: 2 additions & 23 deletions .eleventy.js
Original file line number Diff line number Diff line change
@@ -96,29 +96,8 @@ module.exports = function (eleventyConfig) {
}).use(require("markdown-it-anchor"), {
permalink: false,
slugify: input => slugify(input),
}).use(function(md) {
// Recognize Mediawiki links ([[text]])
md.linkify.add("[[", {
validate: /^\s?([^\[\]\|\n\r]+)(\|[^\[\]\|\n\r]+)?\s?\]\]/,
normalize: match => {
const parts = match.raw.slice(2, -2).split("|");
const slug = slugify(parts[0].replace(/.(md|markdown)\s?$/i, "").trim());
const found = linkMapCache.get(slug);

if (!found) {
throw new Error(`Unable to find page linked by wikilink slug [${slug}]`)
}

match.text = parts.length === 2
? parts[1]
: found.title;

match.url = found.permalink.substring(0,1) === '/'
? found.permalink
: `/${found.permalink}`;
}
})
}).use(require("markdown-it-footnote"));
}).use(require('./utils/helpers/wikilinks'), linkMapCache)
.use(require("markdown-it-footnote"));

setupMarkdownIt(markdownIt);

Binary file added _assets/og-image/2022-week-24-in-review.jpg
Binary file added _assets/og-image/2022-week-25-in-review.jpg
Binary file added _assets/og-image/2022-week-26-in-review.jpg
Binary file added _assets/og-image/2022-week-27-in-review.jpg
Binary file added _assets/og-image/2022-week-30-in-review.jpg
Binary file added _assets/og-image/2022-week-31-in-review.jpg
Binary file added _assets/og-image/2022-week-32-in-review.jpg
Binary file added _assets/og-image/2022-week-35-in-review.jpg
Binary file added _assets/og-image/2022-week-36-in-review.jpg
Binary file added _assets/og-image/2022-week-37-in-review.jpg
Binary file added _assets/og-image/2023-week-1-in-review.jpg
Binary file added _assets/og-image/2023-week-2-in-review.jpg
Binary file added _assets/og-image/a-revival-of-sorts.jpg
Binary file added _assets/og-image/code-coverage-info.jpg
Binary file added _assets/og-image/cool-urls-dont-change.jpg
Binary file added _assets/og-image/defrag-like-its-1992.jpg
Binary file added _assets/og-image/draughts.jpg
Binary file added _assets/og-image/floppybox.jpg
Binary file added _assets/og-image/like-pie.jpg
Binary file added _assets/og-image/phpurls.jpg
Binary file added _assets/og-image/portcullis.jpg
Binary file added _assets/og-image/t-irc.jpg
Binary file added _assets/og-image/todoci.jpg
Binary file added _assets/og-image/toshiba-t5100.jpg
27 changes: 19 additions & 8 deletions _includes/layouts/post.njk
@@ -19,21 +19,32 @@
<a class="pill" href="/topic/{{ tag.slug }}">{{ tag.name }}</a>
{% endfor %}
<br/>
{% if readingTime.words > 0 %}
{{ readingTime.text }}.
{% endif %}
</p>

{% if growthStage === 'stub' %}
<section>
<blockquote>
<p>Note: This is a stub post to be filled out in the future. It has been created for the purpose of interlinking; feel free to check out what links here from the list below to find related pages.</p>
</blockquote>
</section>
{% endif %}

<section>
{{ content | safe }}
</section>

{% if backlinks.length > 0 %}
<nav>
<h3>Linking here</h3>
<ul>
{%- for link in backlinks %}
<li><a href="{{ link.url }}">{{ link.title }}</a></li>
{%- endfor %}
</ul>
</nav>
{% endif %}
</article>
{% endblock %}
21 changes: 21 additions & 0 deletions content/colophon/2023-01-27-stub-posts.md
@@ -0,0 +1,21 @@
---
title: "Adding stub posts"
tags: [Blogging]
growthStage: seedling
---

A while back, as part of converting this site into a #DigitalGarden, I added [[Wiki links|Wiki Links]] support to PhotoGabble's #11ty build. In doing so I also made it very easy to interlink between pages and, most importantly, to copy posts directly from Obsidian into Markdown files in this website's repository without having to change anything.

The only problem with this approach was that I now had wikilinks brought over from Obsidian with nothing to point to on PhotoGabble. [Evan Boehs's](https://boehs.org/) solution to this was to have Eleventy automatically create stub pages which act as placeholders until they are fleshed out, or else as waypoints for finding related posts.

Because I know that any automated system would likely result in me forgetting to fill in the stubs, I opted to have my Wiki Links solution break deployment on broken wiki links, so that I must either add the missing file as a stub or remove the wikilink.

## What is a Stub?

Within the context of this Digital Garden, stubs are akin to drafts. Whereas drafts are never made public until they are published, stubs are publicly available but not included in any of the published content lists: the archive, topic listings, etc. The only ways a visitor should find a stub post are by clicking a wiki link that points to one, or via the linking here list of a published post.

More often than not a stub page will have nothing of merit on it except its list of "linking here" pages, which acts as a "these are all related" list.
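The exclusion itself can be sketched as a simple collection filter. This is my own sketch, assuming posts expose `growthStage` through their front matter data; it is not necessarily how this site wires it up:

```javascript
// Hypothetical helper: keep only posts that are past the stub stage so that
// archive and topic listings never surface them.
const excludeStubs = (posts) =>
  posts.filter((post) => post.data.growthStage !== "stub");

// In .eleventy.js this could back a "published" collection:
// eleventyConfig.addCollection("published", (api) =>
//   excludeStubs(api.getFilteredByGlob("content/**/*.md")));
```

Stub pages themselves still get built and are reachable by URL; they simply never appear in any list built from the filtered collection.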

## Upcoming posts

A nice side effect of having stubs is that I can give each one a potential future publish date, making the [view all stubs](/stubs/) page a schedule of upcoming posts. With a bit of effort I could split this page into two lists, one upcoming and one overdue; however, for now I am happy with what is there.
6 changes: 6 additions & 0 deletions content/essays/2023-01-30-the-small-web.md
@@ -0,0 +1,6 @@
---
title: "The Small Web"
tags: ["Decentralisation", "Self Hosting"]
growthStage: stub
---

6 changes: 6 additions & 0 deletions content/essays/2023-02-27-self-hosting-is-the-new-old-web.md
@@ -0,0 +1,6 @@
---
title: "Self hosting is the new old web"
tags: ["Decentralisation", "Self Hosting"]
growthStage: stub
---

@@ -9,7 +9,7 @@ growthStage: evergreen

![Kareltima III Startup](/img/kareltima-iii-the-self-beating-machine-2.png "Kareltima III Startup")

I have spent a good four years hunting down this old home-brew dos game. The reason behind it taking so long is that I only knew the game by its executable name Karel3.exe rather than by its full name Kareltima III, I am not even entirely sure how I came about finding it in the first place although I do remember searching for home-brew back in 1998 and collecting a vast number of home made dos games to play on my old Toshiba laptop after having played as many of those available on DosBox that would play on my laptop (I would download them at school and then play at home, we didn't even get 56k for a while).
I have spent a good four years hunting down this old home-brew dos game. The reason behind it taking so long is that I only knew the game by its executable name Karel3.exe rather than by its full name Kareltima III, I am not even entirely sure how I came about finding it in the first place although I do remember searching for home-brew back in 1998 and collecting a vast number of home made dos games to play on my old [[toshiba-t5100 | Toshiba laptop]] after having played as many of those available on DosBox that would play on my laptop (I would download them at school and then play at home, we didn't even get 56k for a while).

I found my copy of the game on an old floppy disk which had once contained Omnis 7 (1994) that I had obtained from somewhere in 1998 and formatted for my own use, once I had the games full name I [googled](http://www.google.co.uk/search?q=Kareltima+III) it and found much to my surprise the game creators [personal website](http://www.bedroomlan.org/projects/kareltima) (moved from the [original homepage](http://www.bedroomlan.org/~alexios/coding_karel3.html)).

6 changes: 6 additions & 0 deletions content/noteworthy/2023-01-01-toshiba-t5100.md
@@ -0,0 +1,6 @@
---
title: "Toshiba T5100"
tags: ["DOS", "Retro Computing"]
growthStage: stub
---

35 changes: 35 additions & 0 deletions content/noteworthy/2023-01-23-defrag-like-its-1992.md
@@ -0,0 +1,35 @@
---
title: "Defrag like its 1992"
tags: ["DOS", "Retro Computing"]
growthStage: budding
---

> If you let it run to completion your web browser will run slightly faster for a while.
> [lorddimwit](https://lobste.rs/s/cuoth6/windows_95_defrag_simulator_makes_noise#c_9n4zts)

The first computer that I owned was a second-hand [[Toshiba T5100]] luggable, saved from recycling by a friend's father; having been securely disposed of, it came without its original 40MB hard drive and so everything I did was via 3.5" floppy disks.

This didn't stop me from using the [MS-DOS `defrag` command](https://www.computerhope.com/defrag.htm) and spending longer than reasonable watching it do its thing; I did, however, quickly learn that turning the computer off while it was working would corrupt the disk.

I had forgotten that memory until stumbling upon this [Hacker News post titled: Defrag like its 1992](https://news.ycombinator.com/item?id=29585654) way back in December of 2021[^1].

{% figure "/img/defrag-like-its-1992-1.png" "Fig 1. A close approximation to the real thing..." "An HTML reproduction of the Text Based UI (TUI) of the MS-DOS Defrag command. It's made up of a selection of ASCII box characters to show sections of the disk that are used/unused. Other characters are used to denote status of disk blocks: r for reading, W for writing, B for bad and X for unmovable." %}

[Defrag - By ShipLift LLC](https://defrag.shiplift.dev/), as pictured above (_fig 1_), gives me a nostalgic sense of satisfaction; it's akin to a kinetic sculpture or those flowing sand paintings: useless, yet nice to look at.

{% figure "/img/defrag-like-its-1992-2.png" "Fig 2. I can almost hear the rampant clicking of my old 2GB hard drive" "Similar text based UI to Fig 1. This looks much closer to the Defrag I remember using." %}

In the years since I first saw this I have found a handful of others: [J. Román ( **Manz** )](https://manz.dev/) created a more [faithful reproduction of the MS-DOS `defrag` command available on CodePen](https://codepen.io/manz/pen/MdErww) (_fig 2_). It gives me the same amount of joy as the version by Andrew LeTourneau and Conner McCall from ShipLift LLC, but gets extra points for being closer to the memory I have of running defrag on my Toshiba.

{% figure "/img/defrag-like-its-1992-3.png" "Fig 3. I have always found this applications icon to be very pretty uWu" "An HTML reproduction of the Windows 98 defragment drive interface." %}

The last computer I personally ran defrag on was running Windows 98. I found this five-year-old [Lobsters post titled Windows 95 Defrag Simulator (makes noise)](https://lobste.rs/s/cuoth6/windows_95_defrag_simulator_makes_noise); however, the domain is no longer active and has in the years since been squatted. Once again the [Wayback Machine](https://archive.org/web/) comes to our rescue with an [archived copy of their Windows 95 Defrag Simulator](https://web.archive.org/web/20170312133201/http://hultbergs.org/defrag/) (_fig 3_).

I'm unsure why, but this one doesn't render correctly on my computer. It does enough that you can see the simulation, even though it appears to be simulating erasing data rather than repositioning it.

Out of the three simulations I found, this one is the only one to have HDD activity noise. Albeit a recording on loop, it does add something; however, it's a little disappointing that it's not synthesized from simulated disk activity.

Late last year I created a partial simulation of the Windows 98 windowing system that utilised [98.css](https://jdan.github.io/98.css/) for that authentic Windows 98 aesthetic. It might make a nice tinker project to reproduce the Windows 98 defrag window using that[^2].

[^1]: This post had an open issue: [Issue #53: Defrag like its 1992](https://github.com/photogabble/website/issues/53) for over a year before I got round to actually writing it...
[^2]: It's only taken me two years to complete this post, so I expect two to five years from now I will have written my own simulation!
177 changes: 177 additions & 0 deletions content/noteworthy/2023-01-25-wiki-links.md
@@ -0,0 +1,177 @@
---
title: "Adding Wiki Links to 11ty"
tags: [Blogging]
aliases: [wiki-links]
growthStage: seedling
---

I use [Obsidian.md](https://obsidian.md/) to draft my posts before they are published on PhotoGabble. One feature of #Obsidian that I love is interlinking between notes and being able to see the connectivity graph of each note.

## Preface

[[My publishing workflow]] typically consists of creating new notes in Obsidian and dumping in a lot of research links, random thoughts and references to other notes via use of Obsidian's built-in [Wikilinks](https://en.wikipedia.org/wiki/Hyperlink#Wikis) function; specifically, any `[[bracketed]]` word is converted to a link.

A while back I brought that functionality over to PhotoGabble, and with it the concept of backlinks and the connectivity graph that I like from Obsidian[^1]. This means I can interlink any two pages and display "what else links here".

## Teaching 11ty Wikilinks

This solution borrows from [Evan Boehs's Digital Garden](https://boehs.org/), specifically the source of their [garden.11tydata.js](https://git.sr.ht/~boehs/site/tree/master/item/html/pages/garden/garden.11tydata.js). In order to parse content for backlinks and relevant page metadata, this solution is made up of two parts:

- A computed data function for calculating backlinks and filling a `Map()` with useful page metadata
- A [Markdown It](https://github.com/markdown-it/markdown-it) plugin for parsing wikilink syntax and replacing it with `<a>` tags based upon a lookup of the aforementioned `Map()`


### The Markdown It Wikilink plugin:

{% raw %}
```js
// Note: assumes a `slugify` helper is in scope, e.g.
// const slugify = require('@sindresorhus/slugify');
module.exports = function (md, linkMapCache) {
  // Recognize Mediawiki links ([[text]])
  md.linkify.add("[[", {
    validate: /^\s?([^\[\]\|\n\r]+)(\|[^\[\]\|\n\r]+)?\s?\]\]/,
    normalize: match => {
      const parts = match.raw.slice(2, -2).split("|");
      const slug = slugify(parts[0].replace(/\.(md|markdown)\s?$/i, "").trim());
      const found = linkMapCache.get(slug);

      if (!found) throw new Error(`Unable to find page linked by wikilink slug [${slug}]`);

      match.text = parts.length === 2
        ? parts[1]
        : found.title;

      match.url = found.permalink.substring(0, 1) === '/'
        ? found.permalink
        : `/${found.permalink}`;
    }
  });
};
```
{% endraw %}

The plugin accepts an instance of Markdown It (`md`) and a `Map` in the form of `linkMapCache`; the link map cache is filled by the computed data function and acts as a way of looking up the title and permalink for a given input.

If a link is valid but also contains a custom title, e.g. `[[ Page Title | Override ]]`, then the page will be looked up by `Page Title` and linked via `<a href="/page-slug">Override</a>`.
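That split between lookup key and display text can be sketched in isolation (a small standalone sketch mirroring the plugin's `normalize` handler; the function name is my own):

```javascript
// Split a raw wikilink into its lookup key and optional display text,
// in the same way the plugin's normalize step does.
const splitWikilink = (raw) => {
  const parts = raw.slice(2, -2).split("|").map((s) => s.trim());
  return { key: parts[0], text: parts[1] ?? parts[0] };
};

console.log(splitWikilink("[[ Page Title | Override ]]"));
// → { key: 'Page Title', text: 'Override' }
```

Without an override, e.g. `[[Wiki Links]]`, the key doubles as the display text until the page's real title is looked up.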

If a link isn't valid then, unlike Evan's solution, I throw an Error. Evan has a separate collection that builds virtual stub posts for these missing pages; I have chosen to be made aware of them instead, so that I can manually add a [[Adding stub posts | stub post]]. This is mostly because it forces me to follow my own workflow, whereas a fully automated system would leave a lot of stubs never finished.

### Eleventy Computed Backlinks Data:

With the wikilink syntax being parsed, the `linkMapCache` needs filling. To do this I needed 11ty to loop over my main posts collection and, for each post, identify what it wikilinks to as well as what wikilinks back to it. This can be done in two ways that I know of: as a map function run on the posts collection when it is added to 11ty, or as computed frontmatter.
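To illustrate what the wikilink-matching regex picks up, here is a small standalone check; note that the negative lookbehind skips Obsidian image embeds like `![[img]]`:

```javascript
// The same regex used by the computed data function below.
const wikilinkRegExp = /(?<!!)\[\[([^|]+?)(\|([\s\S]+?))?\]\]/g;

const sample = "See [[Wiki Links]] and [[Page Title|Override]], but not ![[embed.png]].";
console.log(sample.match(wikilinkRegExp));
// → [ '[[Wiki Links]]', '[[Page Title|Override]]' ]
```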



{% raw %}
```js
// This regex finds all wikilinks in a string
const wikilinkRegExp = /(?<!!)\[\[([^|]+?)(\|([\s\S]+?))?\]\]/g;

// Note: `slugify` and `linkMapCache` (a module-scoped Map shared with the
// Markdown It plugin above) are assumed to be in scope.

const parseWikilinks = (arr) => arr.map(link => {
  const parts = link.slice(2, -2).split("|");
  const slug = slugify(parts[0].replace(/\.(md|markdown)\s?$/i, "").trim());

  return {
    title: parts.length === 2 ? parts[1] : null,
    link,
    slug
  };
});

// This function gets passed each page via *.11tydata.js in order to
// fill that page's backlinks data array.
module.exports = (data) => {
  if (!data.collections.all || data.collections.all.length === 0) return [];
  const allPages = data.collections.all;
  const currentSlug = slugify(data.title);
  let backlinks = [];
  let currentSlugs = [currentSlug];

  // Populate our link map for use later in replacing wikilinks with
  // page permalinks.
  // Pages can list aliases in their front matter; if those exist we
  // should map them as well.
  linkMapCache.set(currentSlug, {
    permalink: data.permalink,
    title: data.title
  });

  if (data.aliases) {
    for (const alias of data.aliases) {
      const aliasSlug = slugify(alias);
      linkMapCache.set(aliasSlug, {
        permalink: data.permalink,
        title: alias
      });
      currentSlugs.push(aliasSlug);
    }
  }

  // Loop over all pages and build their outbound links if they
  // have not already been parsed. This is cached between reloads,
  // so restarting the dev server will be required to pick up changes.
  allPages.forEach(page => {
    if (!page.data.outboundLinks) {
      const pageContent = page.template.frontMatter.content;
      const outboundLinks = (pageContent.match(wikilinkRegExp) || []);
      page.data.outboundLinks = parseWikilinks(outboundLinks);
    }

    // If the page links to our current page, either by its title
    // or by one of its aliases, then add that page to our current
    // page's backlinks.
    if (page.data.outboundLinks.some(link => currentSlugs.includes(link.slug))) {
      backlinks.push({
        url: page.url,
        title: page.data.title
      });
    }
  });

  // The backlinks for the current page, set to the page data
  return backlinks;
};
```
{% endraw %}

This is then wired up from your `*.11tydata.js` files (assuming the computed data function above is exported as `backlinks`) like so:

```js
module.exports = {
eleventyComputed: {
backlinks: (data) => backlinks(data),
},
};
```

## Displaying the backlinks

Once done you will begin to see any wikilinks you use converted into links, and your page data will now have a `backlinks` property that you can display to visitors. To do so I use the following snippet:

{% raw %}
```nunjucks
{% if backlinks.length > 0 %}
<nav>
<h3>Linking here</h3>
<ul>
{%- for link in backlinks %}
<li><a href="{{ link.url }}">{{ link.title }}</a></li>
{%- endfor %}
</ul>
</nav>
{% endif %}
```
{% endraw %}

## The Post Relationship Graph

{% figure "/img/adding-wiki-links-to-11ty-1.png" "It's satisfying to see interconnected ideas" "A giant node graph showing hundreds of interconnected circles each one an article." %}

A lovely side effect of adding backlink support is that you now have a map of post relationships that could be exported to JSON and made available to the frontend for display as a pretty node graph, much like Obsidian does.
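As a starting sketch (the names and shape here are my own invention, not something this site ships), the export step could reduce each page's slug and parsed outbound links into a nodes-and-edges structure:

```javascript
// Turn pages (each with a slug, title, and parsed outboundLinks) into a
// nodes/edges structure suitable for a front-end graph library.
const toGraph = (pages) => ({
  nodes: pages.map((p) => ({ id: p.slug, title: p.title })),
  edges: pages.flatMap((p) =>
    (p.outboundLinks || []).map((l) => ({ source: p.slug, target: l.slug }))
  ),
});
```

Serialised with `JSON.stringify`, this could be written out as a data file for the front end to fetch; rendering it is then purely a client-side concern.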

I'll leave that as an exercise for the reader.

[^1]: Although as of writing this I have yet to surface the graph to the website in a way that can be seen.
