There are two public module rules: `js_library` and `js_binary`.
```python
load('@io_bazel_rules_js//js:def.bzl', 'js_binary', 'js_library')

js_library(
    name = 'lib',
    srcs = ['lib.js'],
)

js_binary(
    name = 'bin',
    srcs = ['main.js'],
    deps = [':lib'],
)
```
There is a WORKSPACE rule, `npm_install`, to install modules from NPM and, optionally, their type definitions. `npm_install` will also take a `sha256` argument to verify the package against what's published on NPM, as well as a `type_sha256` for the type definitions.
```python
load('@io_bazel_rules_js//js:def.bzl', 'npm_install')

npm_install('immutable', version='3.8.1', type_version='3.8.1')
```
The resulting library will be available as `@immutable//:lib`.
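For example, another library could depend on the installed module like so (a sketch; the `collections` target and its source file are made up for illustration):

```python
js_library(
    name = 'collections',
    srcs = ['collections.js'],
    # Depend on the module installed by npm_install above.
    deps = ['@immutable//:lib'],
)
```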
Because the rule will create your `BUILD` file for you, it needs to include all specified dependencies. Occasionally, a library will have some functionality you don't need that pulls in a large number of transitive dependencies. While unsafe, you can pass an `ignore_deps` list of strings (of the Bazel dot-style names), and they will not be included as dependencies. This li'l trick is to be used at your own risk.
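A minimal sketch, assuming a hypothetical package `huge-lib` with an optional `aws-sdk` dependency you don't need:

```python
npm_install(
    'huge-lib',
    version = '1.0.0',
    # Dot-style name for the transitive dependency to skip.
    # Use at your own risk: anything that actually needs it will break.
    ignore_deps = ['aws.sdk'],
)
```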
With `npm_install`, a module will be created with the source for that NPM project. For a simply named library (say, `react`), other modules are free to depend on a module named `@react//:lib`. However, the `-` character (and perhaps others) is not allowed in external names with Bazel, so it will be replaced with `.`. For example, `honk-di` would be required in a `BUILD` file as `@honk.di//:lib`.
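Putting both cases together, a `BUILD` file depending on a simple name and a mangled name might look like this (the `app` target itself is illustrative):

```python
js_library(
    name = 'app',
    srcs = ['app.js'],
    deps = [
        '@react//:lib',    # simple name, unchanged
        '@honk.di//:lib',  # honk-di, with '-' replaced by '.'
    ],
)
```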
These rules will declare dependencies, but they will not resolve them. For example, if you declare an `npm_install` rule for `@bar//:lib`, which depends on `@foo//:lib`, Bazel will fail to build, citing that it can't find `@foo//:lib`. You must determine a version and explicitly define it in the `WORKSPACE` file.
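A sketch of the fix in the `WORKSPACE` file (package names continue the `foo`/`bar` example above; the versions are made up):

```python
load('@io_bazel_rules_js//js:def.bzl', 'npm_install')

# The package you actually want.
npm_install('bar', version = '2.0.0')

# Its transitive dependency, which you must pin explicitly.
npm_install('foo', version = '1.2.3')
```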
When encountering such a resolution error, it's helpful to look at the file where the error occurred (namely, the generated `BUILD` file for `@bar`). That file will have comments for all of its dependencies and the versions its `package.json` provided. It's fair to say most will be semver ranges rather than specific versions, so it's up to you to find the right release.
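The generated `BUILD` file might look roughly like this (the comment format shown is illustrative, not the rule's exact output):

```python
js_library(
    name = 'lib',
    srcs = glob(['**/*.js']),
    deps = [
        # package.json declared: foo@^1.2.0
        '@foo//:lib',
    ],
)
```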
For external modules (installed with `npm_install`), import statements work the same as with Node and NPM: `honk-di` will be importable as `honk-di`, regardless of its mangled Bazel workspace name.
For internal modules, the following convention should be applied:

- If the file is part of the current target, import it with a relative path.
- If the file is part of another target, import it with a fully-qualified path from the workspace root. So, if working in `//lib/ui/actions` and you need a library from another target, spell out its complete path rather than a relative one.
Both presently work in nearly all cases, but the behavior is not guaranteed as these rules evolve.
`js_library` emits its "runtime" and "compile-time" definitions as jsar files. The "runtime" is the source code required to use this library in a running process (i.e., all the source code). The "compile-time" is just the files needed to link this library to another. This distinction only really matters for TypeScript, where `.d.ts` files are emitted, and those are the only files required to compile other libraries which depend on this one. The runtime is still needed to execute.
The metadata is as follows:
```python
struct(
    files = <runtime jsar + compile-time jsar>,
    jsar = <this library's runtime code>,
    cjsar = <this library's compile-time definitions>,
    runtime_deps = <transitive set of runtime dependencies>,
    compile_deps = <transitive set of compile-time dependencies>,
)
```
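As a sketch of how a consuming rule might read these fields (assuming the legacy struct-provider API; `_collect_impl` and its `deps` attribute are hypothetical):

```python
def _collect_impl(ctx):
    # Merge each direct dependency's runtime jsar with its transitive
    # runtime dependencies, using the fields from the struct above.
    runtime = depset(
        [dep.jsar for dep in ctx.attr.deps],
        transitive = [dep.runtime_deps for dep in ctx.attr.deps],
    )
    return struct(runtime_deps = runtime)
```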
A `js_binary` target will create a "fat" archive -- its local code, plus the code of all its transitive dependencies. It will also create a runner script which extracts these files to a local `./node_modules` and invokes the program with Node.
External dependencies created with `npm_install` use a behind-the-scenes jsar tool to directly create the tarfile containing the sources, yielding a working `js_library`. These targets will have all files included in both the runtime and compile-time jsars.
A word about Bazel configurations: the two primary ones are "target" and "host". If you run `bazel build ...`, Bazel builds "target" configurations -- i.e., artifacts for the platform where the code will ultimately run. However, if you need an artifact that will run locally, like a compiler or code generator, that artifact must be compiled for "host".
So, while building JS artifacts, you're creating many "target" artifacts. However, when running tests (or browserify, etc.), an odd thing happens:

- We want a `js_binary` that can resolve the Mocha dependencies as well as whatever libraries we're passing to it for testing.
- This rule, logically, should have an attribute for Mocha with `cfg = "host"`, as it'll be running locally.
- Bazel infers that the dependencies will also need to be compiled for "host", but everything on disk is compiled for "target".
- Bazel will now recompile all the `js_library` targets with a "host" configuration.

With `bazel test ...`, this would first compile all libraries in the target configuration, then build them all again for host (indicating it with `[for host]` in the build status).
We circumvent this by specifically indicating that we want "target" binaries in our higher-order binary rules. It looks wrong, but that's why.
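A sketch of what that looks like in a rule definition (the rule, its implementation, and the `@mocha//:bin` label are hypothetical names for illustration):

```python
mocha_test = rule(
    _mocha_test_impl,
    attrs = {
        # Mocha runs locally, which suggests cfg = "host" -- but we request
        # the "target" configuration so the jsars already built by
        # `bazel build ...` are reused instead of being rebuilt for host.
        '_mocha': attr.label(
            default = Label('@mocha//:bin'),
            executable = True,
            cfg = 'target',
        ),
        'deps': attr.label_list(),
    },
    test = True,
)
```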