While more complex sites probably use (and should use) Algolia, it would be nice to ship something lightweight for smaller projects. I wrote something similar for Laradocgen, which actually worked pretty nicely.
Basically, when compiling the site, create a search index with the post descriptions, titles, and slugs, save it as a JSON file, then have a client-side search script pull in the data.
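To make the idea concrete, here is a minimal sketch of what the client-side half could look like. The index path (`searchIndex.json`), the field names (`title`, `description`, `slug`), and the naive substring matching are all placeholders I picked for illustration, not a settled design:

```ts
// Hypothetical shape of the compiled search index; the field names
// mirror the proposal above but are not a settled spec.
interface SearchEntry {
    title: string;
    description: string;
    slug: string;
}

// Fetch the pre-compiled JSON index. The path is a placeholder.
async function loadIndex(url = '/searchIndex.json'): Promise<SearchEntry[]> {
    const response = await fetch(url);
    return response.json();
}

// Naive case-insensitive substring search over the loaded index.
function search(index: SearchEntry[], term: string): SearchEntry[] {
    const needle = term.toLowerCase();
    return index.filter(
        (entry) =>
            entry.title.toLowerCase().includes(needle) ||
            entry.description.toLowerCase().includes(needle)
    );
}
```

Wiring `search()` up to a text input's `input` event would then give live, as-you-type results without any server round trips.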
Each file is 4.79 KB on average, though the size varies between files by design. Generating the search index for all 100 files took 5,444.42 ms. The resulting JSON file is 439 KB and contains 449,625 characters.
Here is a table showing how long various search queries take, using Chrome 101 on the same machine.
| term | result count | pages containing term | processing time (ms) |
| --- | --- | --- | --- |
| [space] | 63,152 | 100 | 45.1 |
| test | 44 | 38 | 4.0 |
| foo | 11 | 9 | 2.7 |
| lorem | 196 | 100 | 9.7 |
| lorem markdownum postes | 1 | 1 | 0.7 |
As you can see, more specific terms with fewer matches are much faster. Overall, searching is buttery smooth and feels like it happens in real time, even for a massive search index like this one, where the compiled site is almost 3 MB total with over 60 thousand words. In Safari on an iPhone 6s the processing times are around 3 times slower, and on a Chromebook around 2 times slower, but I cannot notice the difference.
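For anyone wanting to reproduce numbers like the ones in the table, `performance.now()` in the browser is enough. A minimal timing harness, reusing the hypothetical `loadIndex`/`search` helpers from the sketch above, could look like this (run as a module, or paste into the dev-tools console, which supports top-level await):

```ts
// Time a single query against the loaded index.
const index = await loadIndex();
const start = performance.now();
const results = search(index, 'lorem');
console.log(`${results.length} results in ${(performance.now() - start).toFixed(1)} ms`);
```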