Bespoke encoding basics #85

Open

cwgoes opened this issue Sep 5, 2019 · 3 comments

@cwgoes (Collaborator) commented Sep 5, 2019

Consider a Core term f of type A ⊸ B.

In the interaction net encoding compiler path, assuming EAL-typability, we would encode this (if it is of the form \x -> t, for example) as:

[screenshot: the interaction-net encoding of \x -> t]

where φ is the recursive interaction net translation function (see the language reference).

In the bespoke encoding path, we instead create a new node type T and a rewrite rule R such that, when the primary port of T is connected to an application node applying it to an argument A, we erase T, connect an eraser to A, and connect whatever the application node's primary port was previously connected to to a new subgraph equal to the encoding of eval (f A).

eval (f A) can then be implemented by native evaluation semantics which do not utilize interaction nets. For example, if the Core term in question is a tail-recursive numerical computation, Juvix can compile it to a native loop (possibly using SIMD).
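
To make that concrete, here is a minimal Haskell sketch, assuming a hypothetical tail-recursive term sumTo (not taken from the Juvix codebase), of the kind of function whose bespoke rewrite rule could dispatch to a natively compiled loop instead of net reduction:

{-# LANGUAGE BangPatterns #-}

-- Hedged sketch: sumTo stands in for a tail-recursive numerical Core term.
-- A native backend can compile the strict accumulator loop below to a tight
-- machine loop; the bespoke node's rewrite rule would then simply invoke the
-- compiled code on the primitive argument.
sumTo :: Int -> Int
sumTo n = go 0 n
  where
    go :: Int -> Int -> Int
    go !acc 0 = acc
    go !acc k = go (acc + k) (k - 1)

-- What "eval (f A)" amounts to on the bespoke path for this particular term:
evalBespoke :: Int -> Int
evalBespoke = sumTo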

Furthermore, we can safely encode non-EAL-typable terms this way, such as the Ackermann function, and they can safely interact with the rest of the interaction net (which must have been an EAL-typable term, treating the bespoke-encoded subterm as opaque).

The decision of whether or not to take the bespoke path can be made for all subterms of this form according to some heuristic (or possibly exact cost calculation) in the compiler.
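
As an illustration only, here is a minimal sketch of what such a per-subterm decision might look like; CoreTerm, ealTypable, netCost, and nativeCost are placeholder names, not the Juvix API:

data CoreTerm = CoreTerm                  -- placeholder for the real Core AST
data Path = InteractionNet | Bespoke deriving Show

ealTypable :: CoreTerm -> Bool            -- stub; the real check comes from EAL inference
ealTypable _ = True

netCost, nativeCost :: CoreTerm -> Int    -- stubs for the heuristic or exact cost model
netCost    _ = 10
nativeCost _ = 3

choosePath :: CoreTerm -> Path
choosePath t
  | not (ealTypable t)       = Bespoke        -- non-EAL-typable subterms must go bespoke
  | nativeCost t < netCost t = Bespoke        -- cost comparison favours native evaluation
  | otherwise                = InteractionNet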

Where A and B are both types which are encoded as primitive nodes (e.g. integers), this is trivial.

Where A is a primitive type and B is a function of some arity whose arguments are all primitive-typed and which eventually returns a primitive type, this can be implemented as a sequence of node types T, T', T'', etc., which keep the curried arguments and evaluate once all arguments have been provided (or even once only some have been provided; there is a continuum of options here).
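
A minimal sketch of how such a curried sequence of node types could be modelled, using hypothetical types (PrimVal and BespokeNode are illustrative, not the actual node representation); each interaction stores one more primitive argument, and evaluation fires once the node is saturated:

data PrimVal = PrimInt Int | PrimBool Bool deriving Show

-- One node in the sequence T, T', T'', ...: the curried arguments collected
-- so far travel with the node as internal data.
data BespokeNode = BespokeNode
  { arity   :: Int                   -- number of primitive arguments expected in total
  , curried :: [PrimVal]             -- arguments already supplied
  , run     :: [PrimVal] -> PrimVal  -- the natively evaluated function body
  }

-- Interaction with an application node supplying one more primitive argument:
-- either we move to the next node in the sequence, or, once saturated, evaluate.
applyArg :: BespokeNode -> PrimVal -> Either BespokeNode PrimVal
applyArg node v
  | length args == arity node = Right (run node args)
  | otherwise                 = Left node { curried = args }
  where args = curried node ++ [v]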

Where A and/or arguments of B are non-primitive types (e.g. functions), this becomes more complex, since we must convert between AST and interaction-net form during reduction.

cwgoes added the research label Sep 5, 2019

@mariari (Contributor) commented Sep 5, 2019

In terms of deciding which functions should be bespoke, the analysis should be done in the front-end language, producing Core terms which carry syntactic marks.

I suggest the syntactic marks be quasiquote (`) and comma (,).

Lisps have used this for quite some time; variations include:

  • quasiquote and tilde (Clojure)
  • alpha and bullet (parallel execution on Connection Machines)
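
A minimal sketch of how such marks could survive into Core, with hypothetical constructors (Mark, Marked, and the rest of this toy AST are assumptions, not the actual Juvix Core definition); which of the two marks selects the bespoke path is a front-end convention:

data Mark = QuasiQuoted   -- the backtick mark
          | Unquoted      -- the comma mark
  deriving Show

-- Toy Core AST: front-end analysis attaches a Mark to the subterms it has
-- decided should (or should not) take the bespoke path.
data Core
  = Var String
  | Lam String Core
  | App Core Core
  | Marked Mark Core
  deriving Show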

@cwgoes (Collaborator, Author) commented Sep 5, 2019

Further thoughts on bespoke encoding of non-primitive types:

More generally, with our Core term f of type A ⊸ B, encoding f through the bespoke path would result in a set of new node types T_i with possibly curried internal data, and a set of rewrite rules R_i, the first i - 1 of which just deal with currying (again, there is a continuum of options here, but let's leave that aside for now); the last one is the interesting one, let it be R.

R must then cause, when connected to the primary port of an argument A (a rough sketch follows the list below):

  • Erasure of T_i (the prior node)
  • Connection of an eraser to A
  • Creation of a new subgraph φ (eval (f (read-back A))), where φ is the recursive interaction net encoding function (which might itself perform bespoke encoding, although we need to be concerned about runtime costs here), and read-back is the read-back function from nets to Core terms, run starting at A as the root node.
  • Connection of the primary port of this subgraph to whatever T_i was previously connected to.
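
A rough sketch of what R amounts to operationally; every name here (Net, Core, readBack, evalNative, encode) is a placeholder for the real structures and functions:

data Net  = Net    -- placeholder for an interaction-net subgraph
data Core = Core   -- placeholder for a Core term

readBack :: Net -> Core             -- read-back from nets to Core, rooted at A
readBack _ = Core

evalNative :: Core -> Core -> Core  -- native (non-net) evaluation of (f A)
evalNative _ _ = Core

encode :: Core -> Net               -- φ, the recursive encoding function
encode _ = Net

-- The subgraph that replaces T_i: encode the result of natively evaluating
-- f applied to the read-back of the argument net A.
ruleR :: Core -> Net -> Net
ruleR f a = encode (evalNative f (readBack a))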

I think this works - it follows all the interaction net laws and should preserve semantics - but there are oddities:

  • Read-back and (complex) encoding algorithms must be executed at runtime
  • Read-back must happen over a term A which may be in the process of parallel reduction

I'm not sure it will make sense in practice - perhaps for some small functions? In general, though, we have no idea of the size (and corresponding read-back cost) of A, and it might be dependent on the order of reduction.

@mariari (Contributor) commented Sep 11, 2019

As of the merging of pull request #94, we can bespoke-encode primitive types.

This works over a type called Primitive, which is currently either a Bool or an Int.
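
For reference, a hedged guess at the shape of that type (the actual definition lives in the Juvix codebase and may differ):

data Primitive = PInt Int    -- integer primitive
               | PBool Bool  -- boolean primitive
  deriving (Show, Eq)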

Functions are written as follows:

plus :: Primitive -> Primitive -> Maybe Primitive
plus (PInt i1) (PInt i2) = Just (PInt $ i1 + i2)
plus (PBool _) _         = Nothing
plus _         (PBool _) = Nothing

Currently the return type is a Maybe; however, this is not strictly needed, since by the time we compile Core, type errors should not occur at this level. Doing it this way allows us to make hand-made BOHM INets and have them handle type errors somewhat gracefully (this can be made more graceful if a Maybe is returned after every argument).

What needs to be fleshed out is the Core -> Primitive bespoke translation.

Augmenting the BOHM parser so that custom infix operators can have proper precedence and so that all hard-coded functions such as + can be removed (a generic sketch of an operator table follows below).

  • This also allows us to remove the special logic created to handle these functions.
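
As a generic sketch only: it is an assumption that the BOHM parser is (or would be) built on megaparsec and parser-combinators; the point is simply that operator precedence becomes a data table rather than hard-coded cases such as +:

import Control.Monad.Combinators.Expr (Operator (..), makeExprParser)
import Data.Void (Void)
import Text.Megaparsec
import Text.Megaparsec.Char (space1)
import qualified Text.Megaparsec.Char.Lexer as L

type Parser = Parsec Void String

data Expr = Lit Integer | BinOp String Expr Expr deriving Show

-- Whitespace and comment consumer.
sc :: Parser ()
sc = L.space space1 (L.skipLineComment "--") (L.skipBlockComment "{-" "-}")

symbol :: String -> Parser String
symbol = L.symbol sc

term :: Parser Expr
term = Lit <$> L.lexeme sc L.decimal

-- A custom infix operator is just an entry in the table; its precedence is
-- given by the row it sits in (rows run from highest to lowest precedence).
binary :: String -> Operator Parser Expr
binary name = InfixL (BinOp name <$ symbol name)

table :: [[Operator Parser Expr]]
table = [ [binary "*", binary "/"]
        , [binary "+", binary "-"] ]

expr :: Parser Expr
expr = makeExprParser term table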