
Myia is a new differentiable programming language. It aims to support large scale high performance computations (e.g. linear algebra) and their gradients. The main application Myia aims to support is research in artificial intelligence, in particular deep learning algorithms.

  • Define a model using a subset of Python, which is compiled to Myia (interfaces in languages other than Python may follow). This subset is general-purpose and includes looping constructs and recursion; it excludes side effects and in-place operations.
  • Ask for the derivative of your model. Derivatives are fully supported for all control flow and all differentiable primitives.
  • Compile to efficient CPU and GPU code that optimizes use of your resources.
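To make the first point concrete, here is a small sketch of the kind of code the side-effect-free Python subset admits. These are ordinary Python functions (no Myia API is used), and the `power` and `model` names are illustrative, not part of Myia; the point is that loops and recursion are allowed while in-place mutation is not.

```python
def power(x, n):
    # General recursion is permitted in the subset.
    if n == 0:
        return 1.0
    return x * power(x, n - 1)

def model(w, x):
    # Looping constructs are permitted; note the absence of in-place
    # updates -- `acc` is rebound on each iteration, never mutated.
    acc = 0.0
    for i in range(3):
        acc = acc + w * power(x, i)
    return acc
```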

If you want to play with the current implementation, you can check out

A short document explaining some of Myia's inner workings is available here


Myia is currently under development and is not yet ready for use. We are optimistic about having an alpha version to play with around the start of 2020.

See Roadmap.


Development in artificial intelligence has been undergoing a boom in the past decade, chiefly due to the success of deep neural networks. The training of a neural network is a sort of differentiable program: one writes a program to compute the output and a cost, and then one computes the derivative of that cost with respect to the model's parameters to determine how they should be updated.
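The loop described above can be sketched in a few lines of dependency-free Python. The gradient here is written out by hand for a one-parameter linear model with a squared-error cost; a system like Myia derives this derivative automatically. All names are illustrative.

```python
def cost(w, x, y):
    # Forward pass: prediction followed by a squared-error cost.
    pred = w * x
    err = pred - y
    return err * err

def dcost_dw(w, x, y):
    # Hand-derived gradient: d/dw (w*x - y)^2 = 2 * (w*x - y) * x.
    return 2.0 * (w * x - y) * x

def step(w, x, y, lr=0.1):
    # One gradient-descent update of the model's parameter.
    return w - lr * dcost_dw(w, x, y)
```

A single `step` moves the parameter in the direction that reduces the cost, which is exactly the update a training loop repeats.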

Differentiation can be automated, but mainstream programming languages offer no support for this, hence the need for libraries or programming languages that can reliably support these applications.

The current leading solutions for deep learning fall in two camps:

Computation graph-based solutions such as TensorFlow, Theano and MXNet support automatic differentiation and are very well optimized, but they are not fully general, with only limited support for loops and none for general recursion. Thus models like recursive neural networks are tricky and awkward to write.

Operator overloading solutions such as PyTorch or Autograd use a dynamic approach to automatic differentiation, which makes them much more general, but they are tightly coupled to the Python language and cannot reap the benefits of an optimizing compiler. They also incur some overhead on every operation, which discourages composing small, cheap operations.
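A minimal dual-number sketch shows how the operator-overloading approach works: each arithmetic operator is overloaded to propagate a derivative alongside the value. This is a generic illustration of the technique, not code from PyTorch or Autograd; note that every operation goes through Python dispatch, which is the per-operation overhead mentioned above.

```python
class Dual:
    """A value paired with its derivative with respect to the input."""

    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (f + g)' = f' + g'.
        return Dual(self.value + other.value, self.deriv + other.deriv)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (f * g)' = f' * g + f * g'.
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

def grad(f, x):
    # Seed the input's derivative with 1.0 and read off the output's.
    return f(Dual(x, 1.0)).deriv
```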

Myia's solution is to define a strongly-typed, general-purpose intermediate representation with an IR-level automatic differentiation transformation, which can then be compiled and optimized for various targets, thereby getting the best of both leading approaches.



Implemented

  • Parser: Supports def, if, for, while, operators, function calls, class and methods (limited support).
  • Intermediate representation: Implemented, with an array of utilities.
  • Debug VM: Faithfully runs the IR.
  • VM: Works on the simplified/optimized IR.
  • Primitives: Scalar primitives work, as well as map, reduce, broadcasting, 2D convolutions, concat/split, and many other operations.
  • Type system: Types are inferred without the need for annotations. Shapes can also be inferred. Myia supports recursive ADTs (e.g. tree data structures).
  • Optimization: Pattern-based optimizations, inlining, constant propagation, common subexpression elimination, closure conversion.
  • Automatic differentiation: Implemented; second-order differentiation is not yet in working order.
  • GPU support: Using Relay or PyTorch.
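The recursive ADTs mentioned above can be pictured with a plain-Python stand-in: a tree type whose nodes contain further trees, consumed by a recursive function. The dataclass definitions and `tree_sum` below are illustrative only; in Myia itself the types would be inferred without annotations.

```python
from dataclasses import dataclass
from typing import Union

@dataclass
class Leaf:
    value: float

@dataclass
class Node:
    # A recursive ADT: each branch holds further trees.
    left: "Tree"
    right: "Tree"

Tree = Union[Leaf, Node]

def tree_sum(t: Tree) -> float:
    # Structural recursion over the ADT.
    if isinstance(t, Leaf):
        return t.value
    return tree_sum(t.left) + tree_sum(t.right)
```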

In development

  • Compiler optimization: The compiler currently needs to be optimized to reduce compile times.
  • Auto-monadization: We are working to support print statements and random number generation through an auto-monadization system that can automatically keep track of the IO or RNG state.

Next steps

  • Error messages: We need to make sure that every likely mistake leads to an understandable and traceable error diagnosis.

Near future

  • Serialization: Serializing optimized graphs will allow for greater performance across runs and greater portability across systems.
  • Debugger: The intent is to provide a step debugger for Myia. A working one existed for a previous version of the IR, so this should not pose a problem.
  • More Python syntax: break/continue.

After Beta

  • Even more Python syntax: Support for these features is not certain.
    • Augmented assignment (under restrictions)
    • yield and await
  • Support other languages: Which ones will depend on demand. A new language is also a possibility.



If you use Myia for a scientific paper, please cite the above paper or mention Myia in the acknowledgements. It would be great if you could also let us know about it.
