Advanced topics

This section describes some details of dune for advanced users.

META file generation

Dune uses META files from the findlib library manager to interoperate with the rest of the ecosystem when installing libraries. It is able to generate them automatically. However, for the rare cases where you need a specific META file, or to ease the transition of a project to dune, you can write or generate a specific one.

In order to do that, write or set up a rule to generate a META.<package>.template file in the same directory as the <package>.opam file. Dune will generate a META.<package> file from the META.<package>.template file by replacing lines of the form # DUNE_GEN with the contents of the META file it would normally generate.

For instance, if you want to extend the META file generated by dune, you can write the following META.foo.template file:

# DUNE_GEN
blah = "..."
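
If the template itself has to be generated, a rule along the following lines could produce it. This is a minimal sketch; the emitted contents are just the placeholder from the example above:

(rule
 (targets META.foo.template)
 (action
  (with-stdout-to META.foo.template
   (progn
    (echo "# DUNE_GEN\n")
    (echo "blah = \"...\"\n")))))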

Findlib integration and limitations

Dune uses META files to support external libraries. However, it doesn't expose the full power of findlib to the user; in particular, it doesn't let the user specify predicates.

The reason for this limitation is that so far they haven't been needed, and adding full support for them would complicate things quite a lot. In particular, complex META files are often hand-written, and the various features they offer are only available once the package is installed, which goes against the core ideas dune is built on.

In practice, dune interprets META files assuming the following set of predicates:

  • mt: when a library can be used with or without threads, dune forces the threaded version
  • mt_posix: forces the use of POSIX threads rather than VM threads. VM threads are deprecated and are likely to go away soon
  • ppx_driver: when a library behaves differently depending on whether it is linked as part of a driver or meant to add a -ppx argument to the compiler, dune chooses the former behavior
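
As an illustration, consider a hypothetical foo package whose META file provides both plain and threaded archives; with the predicates above, dune always picks the mt variants:

archive(byte) = "foo.cma"
archive(byte,mt) = "foo_mt.cma"
archive(native) = "foo.cmxa"
archive(native,mt) = "foo_mt.cmxa"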

Dynamic loading of packages

Dune supports the findlib.dynload package from findlib, which allows packages and their dependencies to be loaded dynamically (using the OCaml Dynlink module). Giving an application the ability to load plugins thus only requires adding findlib.dynload to its set of library dependencies:

(library
  (name mytool)
  (public_name mytool)
  (modules ...)
)

(executable
  (name main)
  (public_name mytool)
  (libraries mytool findlib.dynload)
  (modules ...)
)

Then, in your application, you can call Fl_dynload.load_packages l, which loads the list l of packages. Each package is loaded only once, so trying to load a package that is already statically linked does nothing.
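
For example, to load two plugin packages by name (mytool-plugin-b is hypothetical here):

let () = Findlib.init ()
let () = Fl_dynload.load_packages ["mytool-plugin-a"; "mytool-plugin-b"]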

A plugin creator just needs to link against your library:

(library
  (name mytool_plugin_a)
  (public_name mytool-plugin-a)
  (libraries mytool)
)

By choosing a naming convention, for example that all plugins of mytool must start with mytool-plugin-, you can automatically load all the plugins installed for your tool by listing the existing packages:

let () = Findlib.init ()

let () =
  (* Keep only the installed packages that follow the mytool-plugin- naming
     convention ("mytool-plugin-" is 14 characters long). *)
  let pkgs = Fl_package_base.list_packages () in
  let pkgs =
    List.filter
      (fun pkg ->
        String.length pkg >= 14 && String.sub pkg 0 14 = "mytool-plugin-")
      pkgs
  in
  (* Load them together with their dependencies; packages that are already
     linked are skipped. *)
  Fl_dynload.load_packages pkgs

Cross Compilation

Dune allows for cross compilation by defining build contexts with multiple targets. Targets are specified by adding a targets field to the definition of a build context.

targets takes a list of target names. Each target can be either:

  • native which means using the native tools that can build binaries that run on the machine doing the build
  • the name of an alternative toolchain

Note that at the moment, there is no official support for cross-compilation in OCaml. Dune supports the opam-cross-x repositories from the ocaml-cross organization on GitHub, such as opam-cross-windows, opam-cross-android and opam-cross-ios.

In particular:

  • to build Windows binaries using opam-cross-windows, write windows in the list of targets
  • to build Android binaries using opam-cross-android, write android in the list of targets
  • to build iOS binaries using opam-cross-ios, write ios in the list of targets

For example, the following workspace file defines three different targets for the default build context:

(context (default (targets (native windows android))))

This configuration defines three build contexts:

  • default
  • default.windows
  • default.android

Note that the native target is always implicitly added when not present. However, when it is implicitly added, dune build @install will skip this context, i.e. default will only be used for building executables needed by the other contexts.

With such a setup, calling dune build @install will build all the packages three times.

Note that instead of writing a dune-workspace file, you can also use the -x command line option. Passing -x foo to dune without having a dune-workspace file is the same as writing the following dune-workspace file:

(context (default (targets (foo))))

If you have a dune-workspace file and pass a -x foo option, foo will be added as a target to all context stanzas.
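
For instance, given the following dune-workspace file, running dune with -x foo is equivalent to the configuration shown after it:

; dune-workspace
(context (default (targets (native windows))))

; equivalent configuration when -x foo is passed
(context (default (targets (native windows foo))))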

How does it work?

In such a setup, binaries that need to be built and executed as part of the build in the default.windows or default.android contexts will no longer be executed from those contexts. Instead, all the binaries that are executed will come from the default context. One consequence of this is that all preprocessing (ppx or otherwise) will be done using binaries built in the default context.

To clarify this with an example, let's assume that you have the following src/dune file:

(executable (name foo))
(rule (with-stdout-to blah (run ./foo.exe)))

When building _build/default/src/blah, dune will resolve ./foo.exe to _build/default/src/foo.exe as expected. However, for _build/default.windows/src/blah, dune will resolve ./foo.exe to _build/default/src/foo.exe as well.

Assuming that the right packages are installed or that your workspace has no external dependencies, dune will be able to cross-compile a given package without doing anything special.

Some packages might still have to be updated to support cross-compilation. For instance, if the foo.exe program in the previous example used Sys.os_type, it should instead take it as a command-line argument:

(rule (with-stdout-to blah (run ./foo.exe -os-type %{os_type})))
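
Here is a minimal sketch of how foo.ml could read this argument instead of consulting Sys.os_type; only the -os-type flag comes from the rule above, the rest is illustrative:

let () =
  (* Default to the host value so the program still works without the flag *)
  let os_type = ref Sys.os_type in
  Arg.parse
    [ ("-os-type", Arg.Set_string os_type, "OS type of the target system") ]
    (fun _ -> ())
    "foo";
  (* Use !os_type instead of Sys.os_type from here on *)
  print_endline !os_type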

Classical ppx

Classical ppx refers to running ppx rewriters using the -ppx compiler option, with a command line composed by findlib. Even though this is useful for running some (usually old) ppx's that don't support drivers, dune does not support preprocessing with ppx this way; however, a workaround exists using the ppxfind tool.

Profiling dune

If --trace-file FILE is passed, dune will write detailed data about its internal operations to FILE, such as the timing of the commands it runs.

The format is compatible with the Catapult trace viewer. In particular, these files can be loaded into Chromium's chrome://tracing. Note that the exact format is subject to change between versions.
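
For example, the following builds the project while recording a trace into trace.json (the file name is arbitrary), which can then be opened in chrome://tracing:

dune build --trace-file trace.json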

Implicit Transitive Deps

By default, dune allows transitive dependencies of dependencies to be used directly when compiling OCaml. However, this setting can be controlled per project: it can be disabled by adding (implicit_transitive_deps false) to the dune-project file.

Once this setting is added, all dependencies that are directly used by a library or an executable must be listed explicitly in its libraries field. We recommend that users experiment with this mode and report any problems. The goal is to make this the default mode eventually.
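
For instance, with two hypothetical libraries b and c where b already depends on c, a library using modules from both must now list both:

(library
 (name a)
 ; with (implicit_transitive_deps false), c must be listed explicitly even
 ; though it is already a dependency of b, because a uses it directly
 (libraries b c))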

Note that you must use threads.posix instead of threads when using this mode. This is not an important limitation, as threads.vm is deprecated anyway.

Name Mangling of Executables

Executables are made of compilation units whose names may collide with the compilation units of libraries. To avoid this possibility, dune prefixes these compilation unit names with Dune__exe__. This is entirely transparent to users, except when such executables are debugged, in which case the mangled names are visible in the debugger.

Starting from dune 1.11, the (wrapped_executables <bool>) option is available to turn name mangling for executables on or off on a per-project basis.

Starting from dune 2.0, dune mangles compilation units of executables by default. However, this can still be turned off using (wrapped_executables false).
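
For example, a project that wants to keep unmangled compilation unit names on dune 2.0 can put the following in its dune-project file:

(lang dune 2.0)
(wrapped_executables false)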

Explicit JS mode

By default, JavaScript targets are defined for every bytecode executable that dune knows about. This is not very precise and does not interact well with the @all alias (e.g., the @all alias will try to build the JS targets corresponding to every test stanza). In order to better control the compilation of JS targets, this behaviour can be turned off by using (explicit_js_mode) in the dune-project file.

When explicit JS mode is enabled, an explicit js mode needs to be added to the (modes ...) field of executables in order to trigger JS compilation. Explicit JS targets declared like this will be attached to the @all alias.
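
For example, with (explicit_js_mode) set in the dune-project file, an executable opts into JS compilation through its modes field (a minimal sketch):

(executable
 (name main)
 ; listing js here triggers JS compilation of this executable
 (modes exe js))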

Starting from dune 2.0, this new behaviour will be the default and JS compilation of binaries will need to be explicitly declared.

Dialects

A dialect is an alternative frontend to OCaml (such as ReasonML). It is described by a pair of file extensions, one corresponding to interfaces and one to implementations.

The extensions are unique among all dialects of a given project, so that a given extension can be mapped back to the corresponding dialect.

A dialect can use the standard OCaml syntax or it can specify an action to convert from a custom syntax to a binary OCaml abstract syntax tree.

Similarly, a dialect can specify a custom formatter to implement the @fmt alias, see formatting-main.

When not using a custom syntax or formatting action, a dialect is nothing but a way to specify custom file extensions for OCaml code.

Defining a dialect

A dialect can be defined by adding the following to the dune-project file:

(dialect
 (name <name>)
 (implementation
  (extension <string>)
  <optional fields>)
 (interface
  (extension <string>)
  <optional fields>))

<name> is the name of the dialect being defined. It must be unique in a given project.

(extension <string>) specifies the file extension used for this dialect, for interfaces and implementations respectively. The extension string must not contain any dots and must be unique in a given project.

<optional fields> are:

  • (preprocess <action>) is the action to run to produce a valid OCaml abstract syntax tree. It is expected to read the file given in the variable named input-file and output a binary abstract syntax tree on its standard output. See preprocessing-actions for more information.

    If the field is not present, it is assumed that the corresponding source code is already valid OCaml code and can be passed to the OCaml compiler as-is.

  • (format <action>) is the action to run to format source code for this dialect. The action is expected to read the file given in the variable named input-file and output the formatted source code on its standard output. See formatting-main for more information.

    If this field is not present and (preprocess <action>) is also absent (so that the dialect consists of valid OCaml code), the dialect will be formatted like any other OCaml code by default. Otherwise, no special formatting will be done.
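
As an illustration, here is a hypothetical dialect named mylang, where mylang-pp and mylang-fmt are made-up tools that print a binary OCaml AST and formatted source on their standard output, respectively:

(dialect
 (name mylang)
 (implementation
  (extension mlx)
  (preprocess (run mylang-pp %{input-file}))
  (format (run mylang-fmt %{input-file})))
 (interface
  (extension mlxi)
  (preprocess (run mylang-pp --intf %{input-file}))
  (format (run mylang-fmt --intf %{input-file}))))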