
Add basic CSS caching to oranda #551

Merged (1 commit, Jul 27, 2023)

Conversation

@jamesmunns (Contributor) commented Jul 26, 2023

This PR adds a very naive caching implementation for fetching the pre-built CSS. It applies only to assets that would be fetched over the network.

For larger workspace projects, the blocking requests can add a significant amount of time:

Before:

✓ >o_o< SUCCESS: Your site builds are located in `public`.
oranda build  1.03s user 0.30s system 10% cpu 12.634 total

After:

✓ >o_o< SUCCESS: Your site builds are located in `public`.
oranda build  0.85s user 0.22s system 28% cpu 3.740 total

In the future, it may be desirable to cache this at the filesystem level (in `.cache/` or similar), or to handle it in a more general way, e.g. managed consistently by axoasset.

Debatable design decisions (a minimal sketch of the approach follows this list):

  • Using a Vec instead of a HashMap:
    • This is pretty much just because Vec::new() is const and HashMap::new() isn't
    • I don't think we're likely to be troubled by O(n) vs O(1) lookup, particularly compared to the cost of a network request
  • Using expect() on Mutex poisoning rather than bubbling up the error:
    • We literally do nothing but read and push with the mutex locked
    • If the mutex is poisoned, something has gone horrifically wrong
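
For illustration, here is a minimal sketch of what a cache along these lines could look like. The names (`CSS_CACHE`, `fetch_css`) and the blocking `reqwest` call are placeholders for this example, not the PR's actual code, and the `tracing::info!` messages mirror the extra verification logging mentioned further down.

```rust
// A minimal sketch of the approach described above, not oranda's actual code.
use std::sync::Mutex;

// Vec::new() and Mutex::new() are const, so the cache can be a plain static
// with no lazy initialization (unlike HashMap::new()).
static CSS_CACHE: Mutex<Vec<(String, String)>> = Mutex::new(Vec::new());

fn fetch_css(url: &str) -> Result<String, Box<dyn std::error::Error>> {
    // Linear scan over a handful of entries; the O(n) lookup is negligible
    // next to the network round trip it replaces.
    {
        let cache = CSS_CACHE.lock().expect("CSS cache mutex poisoned");
        if let Some((_, css)) = cache.iter().find(|(u, _)| u == url) {
            tracing::info!("Used cache");
            return Ok(css.clone());
        }
    }

    // Cache miss: fetch over the network, then remember the result.
    tracing::info!("Cache miss");
    let css = reqwest::blocking::get(url)?.text()?;
    CSS_CACHE
        .lock()
        .expect("CSS cache mutex poisoned")
        .push((url.to_string(), css.clone()));
    Ok(css)
}
```

Because the cache is keyed by URL and lives for the whole process, repeated builds within one invocation only hit the network once, which is what the `oranda dev` log later in this thread shows.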

Feel free to close this if y'all aren't interested, but 3-4 second iteration times are nicer for me than 12-15 second ones.

@Gankra (Member) left a comment

This looks like a solid approach, but I'll give liv a chance to look at it since she was planning to do this next week

@jamesmunns (Contributor, Author) commented:

Misc bonus I didn't realize while initially developing: this means that long-running `oranda dev` builds only make a single network request. Verified with some extra tracing:

oranda dev
↪ >o_o< INFO: Found 88 paths to watch, starting watch...
↪ >o_o< INFO: Workspace detected, gathering info...
↪ >o_o< INFO: Building 29 workspace member(s)...
MnemosProjectOverview ↪ >o_o< INFO: Cache miss
MnemosProjectOverview ↪ >o_o< INFO: Building components: mdbook
         kernel ↪ >o_o< INFO: Used cache
            abi ↪ >o_o< INFO: Used cache
           mstd ↪ >o_o< INFO: Used cache
       spitebuf ↪ >o_o< INFO: Used cache
          alloc ↪ >o_o< INFO: Used cache
         forth3 ↪ >o_o< INFO: Used cache
   sermux-proto ↪ >o_o< INFO: Used cache
    trace-proto ↪ >o_o< INFO: Used cache
        crowtty ↪ >o_o< INFO: Used cache
     dumbloader ↪ >o_o< INFO: Used cache
         f3repl ↪ >o_o< INFO: Used cache
 allwinner-core ↪ >o_o< INFO: Used cache
          beepy ↪ >o_o< INFO: Used cache
      melpomene ↪ >o_o< INFO: Used cache
         mnemos ↪ >o_o< INFO: Used cache
         forth3 ↪ >o_o< INFO: Used cache
     mnemos-abi ↪ >o_o< INFO: Used cache
   mnemos-alloc ↪ >o_o< INFO: Used cache
mnemos-trace-proto ↪ >o_o< INFO: Used cache
   sermux-proto ↪ >o_o< INFO: Used cache
       spitebuf ↪ >o_o< INFO: Used cache
     mnemos-std ↪ >o_o< INFO: Used cache
        crowtty ↪ >o_o< INFO: Used cache
     dumbloader ↪ >o_o< INFO: Used cache
         f3repl ↪ >o_o< INFO: Used cache
 mnemos-d1-core ↪ >o_o< INFO: Used cache
   mnemos-beepy ↪ >o_o< INFO: Used cache
      melpomene ↪ >o_o< INFO: Used cache
↪ >o_o< INFO: Building workspace index page...
↪ >o_o< INFO: Used cache
✓ >o_o< SUCCESS: Your site builds are located in `public`.
✓ >o_o< SUCCESS: Your project is available at: http://127.0.0.1:7979
↪ >o_o< INFO: Path(s) ["/Users/james/personal/mnemos/oranda-workspace.json"] changed, rebuilding...
↪ >o_o< INFO: Workspace detected, gathering info...
↪ >o_o< INFO: Building 29 workspace member(s)...
MnemosProjectOverview ↪ >o_o< INFO: Used cache
MnemosProjectOverview ↪ >o_o< INFO: Building components: mdbook
         kernel ↪ >o_o< INFO: Used cache
            abi ↪ >o_o< INFO: Used cache
           mstd ↪ >o_o< INFO: Used cache
       spitebuf ↪ >o_o< INFO: Used cache
          alloc ↪ >o_o< INFO: Used cache
         forth3 ↪ >o_o< INFO: Used cache
   sermux-proto ↪ >o_o< INFO: Used cache
    trace-proto ↪ >o_o< INFO: Used cache
        crowtty ↪ >o_o< INFO: Used cache
     dumbloader ↪ >o_o< INFO: Used cache
         f3repl ↪ >o_o< INFO: Used cache
 allwinner-core ↪ >o_o< INFO: Used cache
          beepy ↪ >o_o< INFO: Used cache
      melpomene ↪ >o_o< INFO: Used cache
         mnemos ↪ >o_o< INFO: Used cache
         forth3 ↪ >o_o< INFO: Used cache
     mnemos-abi ↪ >o_o< INFO: Used cache
   mnemos-alloc ↪ >o_o< INFO: Used cache
mnemos-trace-proto ↪ >o_o< INFO: Used cache
   sermux-proto ↪ >o_o< INFO: Used cache
       spitebuf ↪ >o_o< INFO: Used cache
     mnemos-std ↪ >o_o< INFO: Used cache
        crowtty ↪ >o_o< INFO: Used cache
     dumbloader ↪ >o_o< INFO: Used cache
         f3repl ↪ >o_o< INFO: Used cache
 mnemos-d1-core ↪ >o_o< INFO: Used cache
   mnemos-beepy ↪ >o_o< INFO: Used cache
      melpomene ↪ >o_o< INFO: Used cache
↪ >o_o< INFO: Building workspace index page...
↪ >o_o< INFO: Used cache
✓ >o_o< SUCCESS: Your site builds are located in `public`.

@shadows-withal (Contributor) left a comment

Thank you! This is a good start for sure

@shadows-withal merged commit ba96f63 into axodotdev:main on Jul 27, 2023
7 checks passed
@jamesmunns deleted the james/css-cache branch on July 27, 2023 at 13:03