# What happens when?
Abstract terms embedded in Ergo source files are recognized and built at parse time.
There are no specific events associated with parsing.
Modules are constructed recursively at load time. An internal module cache speeds up this process.
Directives are executed at load time and, after that, the rest of the module is loaded.
This ensures that things like operation declarations are accessible from within the module that declares them.
When a module is loaded, it raises a `ModuleLoadedEvent`:

- The `tabling` library handles this event by transforming tabled predicates.
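The event hooks described throughout this page follow an ordinary publish/subscribe pattern; a minimal sketch, with hypothetical names (the real event and handler APIs are not shown here):

```python
# Minimal publish/subscribe sketch of the event hooks described on
# this page. Function and event names are illustrative only.
from collections import defaultdict

_handlers = defaultdict(list)

def on(event_type):
    """Decorator that registers a handler for an event type."""
    def register(fn):
        _handlers[event_type].append(fn)
        return fn
    return register

def raise_event(event_type, payload):
    """Invoke every handler registered for `event_type`."""
    for handler in _handlers[event_type]:
        handler(payload)

# e.g. a tabling library could hook module loading like this:
@on("ModuleLoadedEvent")
def transform_tabled_predicates(module):
    module["tabled_transformed"] = True
```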
Before compilation, clauses undergo a variety of miscellaneous transformations.
Specifically, after a Knowledge Base is constructed (and its Dependency Graph populated), it raises a `KnowledgeBaseCreatedEvent`:

- The `expansions` library handles this event by expanding all macros in all clauses.
- The `compiler` library handles this event by inlining all clauses that were marked to be inlined.
Additionally, when the VM compiles a query, it raises a `QuerySubmittedEvent`:

- The `expansions` library handles this event by expanding all macros in the query.
After all terms have been rewritten, predicates are ready to be compiled.
The first step is to build an Execution Graph that represents the control flow of the predicate being compiled, including branching from multiple definitions.
This process is mediated by the Dependency Graph that was populated when the Knowledge Base was created.
This is when the compile-time context of built-in calls is determined and captured.
After the graph is built, it can be optimized. This is when the bulk of static analysis happens.
Redundant branches are pruned, dead code is eliminated, constants are propagated and so on.
The result is a much smaller graph that's semantically equivalent to the initial graph.
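As a rough illustration of the kind of rewriting this pass performs, here is a toy constant-folding and dead-branch-pruning step over a tiny expression tree. This only shows the flavor of the optimization; Ergo's actual Execution Graph and its passes are far more involved:

```python
# Toy constant propagation + dead-branch pruning over nested tuples:
# ("add", a, b), ("if", cond, then, else), or plain int/bool constants.
# Purely illustrative; not Ergo's Execution Graph representation.

def optimize(node):
    if isinstance(node, (int, bool)):
        return node
    op = node[0]
    if op == "add":
        a, b = optimize(node[1]), optimize(node[2])
        if isinstance(a, int) and isinstance(b, int):
            return a + b                 # constant propagation
        return ("add", a, b)
    if op == "if":
        cond = optimize(node[1])
        if cond is True:
            return optimize(node[2])     # prune the dead else-branch
        if cond is False:
            return optimize(node[3])     # prune the dead then-branch
        return ("if", cond, optimize(node[2]), optimize(node[3]))
    return node
```

The result of each pass is a smaller tree that evaluates to the same value as the input, mirroring the "smaller but semantically equivalent graph" property described above.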
Finally, once the optimized graph is ready, it can be compiled down into `Op`s that target the ErgoVM.
This part is rather straightforward -- each node implements its own compilation logic.
The generated code will run directly on the VM at runtime.
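The "each node implements its own compilation logic" step can be pictured as every node type emitting its own instructions into a shared list. A hypothetical sketch (the `Op` names below are made up, not the real ErgoVM instruction set):

```python
# Sketch of per-node code generation: each node type implements its
# own `compile` method and appends VM instructions ("Ops") to a list.
# Node and Op names are illustrative, not ErgoVM's actual set.

class Const:
    def __init__(self, value):
        self.value = value
    def compile(self, ops):
        ops.append(("PUSH", self.value))

class Add:
    def __init__(self, left, right):
        self.left, self.right = left, right
    def compile(self, ops):
        # Compile operands first, then emit the combining Op.
        self.left.compile(ops)
        self.right.compile(ops)
        ops.append(("ADD",))

def compile_graph(root):
    """Walk the graph once, collecting the emitted Ops in order."""
    ops = []
    root.compile(ops)
    return ops
```

Dispatching on the node itself keeps the code generator simple: adding a new graph node means adding one class with one `compile` method, with no central switch to update.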
Compilation itself is driven by the same events. On a `KnowledgeBaseCreatedEvent`:

- The `compiler` library handles this event by compiling all predicates in the Knowledge Base.

And on a `QuerySubmittedEvent`:

- The `compiler` library handles this event by compiling the query.
Once the VM is started on a given query, it will execute the current branch and then backtrack as long as there are choice points left to try.
Its state is managed through various registers.
This is when the compile-time state of a goal is unified with the VM's current environment (via the arguments register).
This is when dynamic predicates are resolved and matched against the knowledge base (if they have static clauses, those are compiled as expected and the result is branched with a node that will resolve only the dynamic clauses).
This is when the custom unification logic of Abstract Terms is invoked as a result of a call to `unify/2` (if they weren't optimized away at an earlier step).
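The run-then-backtrack loop can be sketched with a recursive search where each matching clause is a choice point. This is heavily simplified (propositional goals only, no registers, no unification, no trail), and Python's generator stack stands in for the VM's choice-point machinery:

```python
# Heavily simplified backtracking driver: run the current branch to
# completion, then backtrack through the remaining choice points.
# Propositional only; the real VM manages state through registers.

def solve(goals, clauses):
    """Yield True once per solution of the goal list `goals`."""
    if not goals:
        yield True                      # current branch succeeded
        return
    goal, rest = goals[0], goals[1:]
    # Each matching clause body is a choice point; iterating the
    # next body after a sub-branch is exhausted *is* backtracking.
    for body in clauses.get(goal, []):
        yield from solve(body + rest, clauses)
```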
Some large enumerators may optimize solution generation in determinate, non-side-effecting paths by deferring the computation of a value until that value is actually accessed through an iterator. For example, the query `for(_I, 0, 1000000000000), J := _I * 2` will create a solution generator that computes `J := _I * 2` only when that specific solution is being yielded, and not a moment earlier.
This optimization is, in theory, transparent and shouldn't cause problems if the affected predicates are correctly marked as determinate and/or impure. Otherwise, it might cause some confusing behavior.
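The deferred computation described above behaves much like a lazy generator: the binding for `J` is computed only when a solution is actually pulled from the iterator. A Python analogy (not Ergo syntax, and not Ergo's actual enumerator implementation):

```python
# Python analogy for the lazy solution generator: the value of J is
# computed only when the consumer pulls that particular solution.

def solutions(upper):
    """Analogous to the query: for(_I, 0, upper), J := _I * 2"""
    for i in range(upper + 1):
        # The multiplication is deferred to iteration time: nothing
        # is computed for solutions that are never requested.
        yield {"_I": i, "J": i * 2}

gen = solutions(10**12)   # returns instantly: no values computed yet
first = next(gen)         # computes only the very first solution
```

Creating the generator over a trillion-element range is instantaneous; work happens strictly per solution yielded, which is why the optimization is transparent only when the affected predicates really are determinate and side-effect free.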