Compute Ext using Rust



The main crate provides a library for computing Ext resolutions. It produces a binary that displays the result as an ASCII chart; this is mainly used for testing purposes. It also comes with a command-line interface for defining Steenrod modules, which may be used "in production".

At the moment, it also has an interface for computing Steenrod operations in Ext via the steenrod subcommand (cargo run --release steenrod), using the algorithm described in . The intention is to eventually expose this through ext-websocket once it is sufficiently presentable; the current algorithm can be very slow, and its running time cannot be easily estimated a priori.

There are a few further sub-crates.


ext-websocket

This is what you should use in general.

The ext-websocket crate uses the Rust code as a backend and relays the results of the computation to the JavaScript frontend via a websocket. It is intended to be run and used locally --- you don't want to expose a web interface that could heavily drain your server's resources, and relaying the results involves moving a lot of data. It is usually reasonable to run the backend on a server in the local network and the frontend on your own computer, but when the network is slow, running a browser on the server and using SSH X forwarding may be a better idea.

There is also a version that compiles all the Rust code to WebAssembly and runs everything in the browser. A live, cutting-edge version of this can be found at

Read the README file in ext-websocket/ for more details.


compressor

This is a utility for further compressing the history files produced by the previous interface (again, see the README in ext-websocket/ for more details). It is not very polished. To use it, save the file to be compressed as compressor/old.hist, and then run cargo run --release. The compressed file will be written to compressor/new.hist.

This program is multithreaded. To change the number of threads used, edit the NUM_THREAD variable in compressor/src/


bivec

This is a small crate that provides BiVec --- a variant of Vec indexed by an i32 whose starting index may be non-zero.
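To illustrate the idea, here is a minimal sketch of such a structure. This is not the crate's actual API --- the names and methods below are hypothetical; it only shows how an i32 starting index can be translated onto an ordinary Vec.

```rust
// Illustrative sketch (hypothetical API, not the bivec crate itself):
// a vector whose indices start at an arbitrary i32 `min_degree`
// instead of 0.
struct BiVec<T> {
    min_degree: i32,
    data: Vec<T>,
}

impl<T> BiVec<T> {
    fn new(min_degree: i32) -> Self {
        BiVec { min_degree, data: Vec::new() }
    }

    fn push(&mut self, x: T) {
        self.data.push(x);
    }

    // Translate the i32 index into the underlying Vec's usize index.
    fn get(&self, i: i32) -> &T {
        &self.data[(i - self.min_degree) as usize]
    }

    // Largest valid index, given how many entries have been pushed.
    fn max_degree(&self) -> i32 {
        self.min_degree + self.data.len() as i32 - 1
    }
}

fn main() {
    // Indices -2, -1, 0 hold the values 10, 20, 30.
    let mut v = BiVec::new(-2);
    v.push(10);
    v.push(20);
    v.push(30);
    assert_eq!(*v.get(-2), 10);
    assert_eq!(*v.get(0), 30);
    assert_eq!(v.max_degree(), 0);
}
```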


This is a small crate that provides OnceVec and OnceBiVec, wrappers around UnsafeCell&lt;Vec&gt; (or BiVec, respectively) that model a vector whose only means of modification is push. This models a partially computed infinite data structure: we think of pushing as finding out more about the data structure rather than genuinely mutating it.
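The interface idea can be sketched as follows. This is not the crate's actual implementation --- the real types use UnsafeCell internally; the safe stand-in below uses RefCell, and all names are hypothetical. The point is that push takes &self, since extending the vector is "learning more", not mutation.

```rust
use std::cell::RefCell;

// Illustrative sketch (hypothetical names): a push-only vector.
// The real OnceVec uses UnsafeCell internals; RefCell is a safe
// stand-in that demonstrates the same interface.
struct PushOnlyVec<T> {
    inner: RefCell<Vec<T>>,
}

impl<T: Copy> PushOnlyVec<T> {
    fn new() -> Self {
        PushOnlyVec { inner: RefCell::new(Vec::new()) }
    }

    // Push through a shared reference: conceptually we are not
    // mutating the (infinite) structure, only discovering more of it.
    fn push(&self, x: T) {
        self.inner.borrow_mut().push(x);
    }

    fn get(&self, i: usize) -> T {
        self.inner.borrow()[i]
    }

    fn len(&self) -> usize {
        self.inner.borrow().len()
    }
}

fn main() {
    let v = PushOnlyVec::new();
    v.push(1); // note: v is not declared mut
    v.push(2);
    assert_eq!(v.get(1), 2);
    assert_eq!(v.len(), 2);
}
```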


This contains some helper functions for a command-line interface.


saveload

This provides an interface for saving and loading resolutions and other data.


To compile the main crate, simply run

$ cargo build

This will automatically download and manage the dependencies, and the compiled binary can be found at target/debug/rust_ext.

By default, this resolves the sphere at p = 2 up to degree 30. See rust_ext --help for more configuration options.

One can also run the resolver directly via

$ cargo run

This will compile the code (if necessary) and then run the binary. Command line options can be passed with --, e.g. cargo run -- --help. In particular, cargo run -- module will start an interactive interface for defining a module.

To compile and run a properly optimized version, use

$ cargo build --release
$ cargo run --release

The compiled binaries can be found in target/release. The release binary is usually much faster, but compilation takes longer.

To run the tests, do

$ cargo test


To compile the code documentation, run

$ cargo doc --no-deps

To view the documentation, run

$ cargo doc --no-deps --open

As usual, the latter command triggers the former if needed. This can also be viewed on
