
Oxidation 2015 11 05

Lars Bergstrom edited this page Nov 10, 2015 · 3 revisions


Oxidation spreadsheet: public web version


  • Attendees: larsberg, edunham, glandium, rillian, acrichto, vgosu, brson

Linux distro support

We have a thread and some status info. What do we need for when a Rust component is on by default?

  • rillian: Is mach install OK as an alternative to distro support? I worry that we shouldn't put distro support on the path for developer convenience, but Rust is still changing very quickly; if we rely on distro-packaged versions, it'll slow down our ability to adopt useful features. The plan was to use tooltool to install the toolchain, do official builds with that, and let devs install whatever version they want. It needs to be easier for devs; ted wanted it in the MozillaBuild package that Windows devs use. We could also have a mach subcommand that installs the latest version someplace.
  • acrichto: We have nightly builds for all of the platforms that can be downloaded and used by developers.
  • rillian: 1) CI is behind a firewall, so it would need a proxy to get a build like that. 2) People on slow bandwidth would object to downloading 100MB. See how it goes?
  • glandium: Two things. The developer side - having Rust available for local dev for Firefox devs. The other side is having Firefox built by Linux distros for redistribution. Big difference. It's almost OK for the dev side if we have people use a nightly build. But Linux distros building Firefox will not be OK with that, which is an issue when we require Rust. For example, Debian. Started talking about it with them. We would need to have a Rust compiler in Debian stable. Once it's there, it won't be updated for quite a while. So if we get 1.5 there, it will be there for a LONG time. Makes things harder. We already have that problem with GCC - we have to use old versions of GCC so that people can redistribute it. Firefox devs can use a newer version, but for distros we have to support older versions. We can't rely on them updating it every six weeks or something.
  • rillian: Debian's policy on self-hosted compilers?
  • glandium: At least for Debian, you can't do that. But probably true for Fedora, too. You can't build a package without everything also being in packages. Just downloading would be bad. Probably true for most distros.
  • rillian: Fedora wants Firefox to switch to app packages. Maybe Debian needs to package parallel-installable versions of Rust?


jemalloc
  • rillian: Needs 1.5?
  • acrichto: In 1.4 stable. You can use the normal hooks to use your own jemalloc.
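The allocator hooks have evolved since this discussion; on modern stable Rust, the role the 1.4-era jemalloc hooks played (routing all heap allocations through an allocator of your choosing, e.g. mozjemalloc) is filled by `#[global_allocator]`. A minimal sketch, with the stock `System` allocator standing in for a real jemalloc wrapper:

```rust
// Sketch only (postdates this discussion): modern stable Rust's allocator
// override. `System` stands in for a custom GlobalAlloc implementation that
// would forward to mozjemalloc.
use std::alloc::System;

#[global_allocator]
static ALLOC: System = System;

fn main() {
    // Every heap allocation below goes through ALLOC.
    let v: Vec<u32> = (0..4).collect();
    assert_eq!(v.iter().sum::<u32>(), 6);
}
```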

Which version of Rust?

  • rillian: Been trying to stay with stable. Feelings? But we need better dep generation, and would have to wait 5 weeks.
  • acrichto: I'd recommend staying in the stable subset of features. But if you need bugfixes or feature changes from nightly for a better dev experience, it seems fine. Stable is much better, when possible.
  • brson: Once Rust is on by default, it should definitely be against stable.

More than one project

  • acrichto: If you build a static lib, you can only have one of those, b/c of the Rust static lib. If you link everything in manually, that's fine.
  • rillian: Need an extra step that pulls in all the deps and does a separate link pass, unifying the dependencies. So, libgeckorust.a or similar.
  • acrichto: The only hard part would be picking up the standard library and the crates they use. Also libcompiler-rt.a. Beyond that, it should work fine...
  • rillian: Does the compiler emit those? Similar to how it emits the Rust libraries?
  • acrichto: No, it's a very Rust-specific format.
  • brson: Could also have an auto-generated crate that links all the rust crates into one library...
  • acrichto: One that points at all the rust projects could link them all together. This project is about having more than one project; easiest is to have a single one.
  • glandium: Require cargo?
  • acrichto: Nope, can just call rustc.
  • rillian: Right now, I turn all my crates into modules, and then the Rust compiler will traverse & include the toplevel code.
  • brson: Scary :-)
  • acrichto: How do you link into firefox itself?
  • rillian: Make a staticlib from every crate & it just gets linked in. Mac is fine and can discard unused code. Linux & Windows are harder. (Actually, I think this works on Linux too. May depend on specific versions.)
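The "separate link pass" idea above can be sketched as a hypothetical umbrella crate (call it geckorust; all names here are invented). Its Cargo.toml would set `crate-type = ["staticlib"]` and list each in-tree Rust component as a dependency, and its lib.rs would re-export them so their exported symbols survive into a single libgeckorust.a:

```rust
// Hypothetical lib.rs for an umbrella "geckorust" staticlib crate.
// In Cargo.toml this crate would declare:
//   [lib] crate-type = ["staticlib"]   # produce libgeckorust.a for the linker
// plus path dependencies on each component, re-exported here, e.g.:
//   extern crate mp4parse;

/// C-callable entry point; Gecko's C++ side would declare this in a header.
#[no_mangle]
pub extern "C" fn geckorust_version() -> u32 {
    1
}
```

Only one crate in the build needs this staticlib treatment; everything else links into it as ordinary Rust libraries.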


Cargo
  • rillian: Traditionally, we've been hostile to external build systems. What needs to be there for Cargo to work?
  • brson: Not hitting the internet and a self-contained build is important. Also a requirement for debian & other package managers. Issue open to address this, but don't know what the final solution is. Also want crate dependencies on some server?
  • glandium: No dependencies on a server. Everything should be in-tree. No magic, no network, etc.
  • brson: Same thing distros want - package the app with all the crates dependencies, etc.
  • glandium: Different slightly. Servo would have http library as a separate package, but in Gecko, we don't need that - want just all the source trees for all the dependencies in-place.
  • brson: Yes. Debian says they want to put everything into /etc and then to point all those deps at the /etc folder.
  • rillian: Kinda want both. Want the build to work from in-tree, but also want Cargo to be able to tell me there are new deps available and import the source into the tree. So, a dev mode where you can hit the network to find the updates, etc. I mean, there's CARGO_HOME...
  • acrichto: Can always have deps that have paths, but that would be totally separate from Cargo.
  • brson: Good to hear this requirement - that Cargo could handle the updates to those dependencies.
  • brson: In packaging, we'd considered ignoring this problem since Debian was just going to rewrite all the Cargo.toml files to have paths deps instead. I'm not sure that punting on this problem is a good idea.
  • rillian: Can you hook into the pluggable Cargo backends?
  • acrichto: There's some flexibility to point to a local git repo, but it's still very finicky.
  • acrichto: Can you explain why Gecko vendors all the code in-tree?
  • glandium: If you look at how things are built (like in debian), once the source is downloaded, build machines no longer have network connectivity.
  • acrichto: Doesn't generate-source-tarball have an opportunity to hit & handle a source tarball?
  • vgosu: Also, everything that we use to build Firefox is in-tree to ensure it has been reviewed. Otherwise, you can get a backdoor introduced via...
  • glandium: We really want to know exactly what we're building.
  • brson: We crypto sign our code, etc., but I understand why you want to have it all in-tree.
  • acrichto: I understand this requirement, and it really does sound like "online updates, offline builds, in-tree repo" is going to be needed.
  • larsberg: As a Servo dev, I'd be very interested in whether we could also use this to edit our stuff :-)
  • glandium: That said, don't need to block on this, if we can get the rustc stuff working. Or can even just hack up cargo toml files, even if there's no tooling.
  • acrichto: Luckily, cargo is standalone from the compiler. Cargo is both forward and backwards-compatible with rustc, if that also helps for any decisions related to rustc compiler versions.
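The "online updates, offline builds, in-tree repo" requirement described above is essentially what Cargo's source replacement (later packaged up as `cargo vendor`) provides. A sketch of the `.cargo/config` fragment, with the directory name invented:

```toml
# Sketch: redirect all crates.io lookups to an in-tree directory, so builds
# never touch the network. Updating the vendored copies is a separate,
# explicitly online step.
[source.crates-io]
replace-with = "vendored-sources"

[source.vendored-sources]
directory = "third_party/rust"  # in-tree, reviewable copies of every dep
```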

"how all those things fit together"

  • glandium: Mostly covered. My main question was how do we actually get this into Firefox? How do we get multiple things in Gecko, what does it all involve at the build level, etc. Just figuring out how we make that work. Maybe cargo is an answer - maybe something else?
  • brson: Sounds like there's a path forward for most of the problems we've chatted about. Could have Cargo have some feature that lets you locally-download all your dependencies into a local repo, install updates, etc. In the Gecko tree, the only custom code is a single rust crate that mentions all the deps you have. Then, there's some step where Cargo can bring them down and up. So, there would be one master crate and that's not part of, etc. - it's just the glue that holds everything together. One giant static lib.
  • glandium: Would that require some specific directory layout? One thing is that in Gecko, different teams will have different sub-parts in Rust. And they have organizations with separate directories, so Rust code would be in SM, gfx, net, etc...
  • rillian: Also, it would introduce a dependency on other components any time you want to build yours.
  • acrichto: The dependencies can point via relative paths to pick up a bunch of projects.
  • brson: Yes, for the projects that are in-tree, but for the things that are mirrored from, those would probably need to be in a single external-crates directory (or 3rdparty or w/e).
  • acrichto: I assume you already have a vendored code folder and could be there?
  • brson: I was thinking there would be no distinction, but with the scattered code, you need to be able to handle it.
  • rillian: Required for some of the split of glue code, 3rd party code, etc. Each has different review requirements.
  • brson: Still have that model for 3rd party stuff?
  • rillian: I don't care where they are, but definitely have different review requirements. Sometimes need private-patched versions of things.
  • larsberg: What about sequencing? If rust is built all at once, then we can't build C after Rust...
  • rillian: Probably mainly a problem with bindgen, but we can worry about it later.
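The layout concern above maps onto Cargo path dependencies: the top-level crate can pull components from wherever their owning team keeps them, so no single directory structure is imposed. A sketch with invented crate names and paths:

```toml
# Hypothetical top-level Cargo.toml fragment: relative path dependencies let
# Rust code live inside each team's existing directory (media, gfx, ...).
[dependencies]
mp4parse = { path = "../media/mp4parse" }
gfx_text = { path = "../gfx/rust-text" }
```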


Status
  • rillian: On OSX nightly, the MP4 parser is now ON! Maybe just debug builds, not release builds. Getting there!
  • rillian: My priorities are: jemalloc, something with multiple projects, and some build/releng work to get that happening on CI.
  • larsberg: nfroyd is also working on this stuff in Q4 and can help!