This repository has been archived by the owner. It is now read-only.

New-style macro APIs and a prototype implementation for Scala 2.12.2 #1

Merged
merged 93 commits on Jul 23, 2017

Conversation

8 participants
@xeno-by
Collaborator

xeno-by commented Jun 4, 2017

Def macros and macro annotations have become an integral part of the Scala 2.x ecosystem. Well-known libraries like Play, Sbt, Scala.js, ScalaTest, Shapeless, Slick, Spark, Spire and others use macros to achieve previously unreachable standards of terseness and type safety.

Unfortunately, Scala macros have also gained notoriety as an arcane and brittle technology. The most common criticisms of Scala macros concern their subpar tool support and an overcomplicated metaprogramming API based on compiler internals. Even five years after their introduction, macros still can't expand in IntelliJ, leading to a proliferation of spurious red squiggles - sometimes in pretty simple code. As a result, the language committee has decided to retire the current macro system in Scala 3.x.

During the last couple of years, we've been working on a new macro system that will support both Scala 2.x and Scala 3.x. New-style macros are based on a platform-independent metaprogramming API designed to be easy to use and easy to support in multiple implementations of the language.

Status

This repository contains a technology preview of the new macro system that features:

  • Scalameta-based syntactic and semantic APIs that cross-compile against Scala 2.12, 2.13 and Dotty for both JVM and JS. The corresponding library is quite slim, weighing in at less than 500 KB.
  • A prototype implementation of the new macro engine for Scala 2.12.2 that supports macro annotations and def macros.
  • Examples of new-style macros, including the @main annotation and a new-style materializer.

Roadmap

Our first order of business is to get a Dotty implementation up and running. Next, we will be publishing documentation, including detailed descriptions of the APIs and example projects to get started. Follow our issue tracker for more information.

Credits

Over the years, many contributors influenced the design of Scala macros. Check out Eugene Burmako's dissertation for more information.

The latest iteration of this project is the result of collaboration between:

  • Eugene Burmako, who led the development of new-style macros based on Scalameta, implemented the new macro APIs and prototype support for Scala 2.x.
  • Fengyun Liu, who experimented with the extractor-based approach to macro APIs, convincingly demonstrated its practical benefits and implemented prototype support for Scala 3.x in liufengyun/gestalt. This was a groundbreaking contribution that allowed us to deliver the new macro system much faster than initially planned.
  • Olafur Pall Geirsson, who worked on the converter-based approach together with Eugene, and who exposed and highlighted its shortcomings.

xeno-by added some commits May 20, 2017

create the build infrastructure
Our new sbt build and drone configuration are heavily inspired
by scalameta 1.x.

However, this infrastructure only provides very basic functionality:
  * Using Java 1.8
  * Cross-building against 2.10.x, 2.11.x, 2.12.x and Dotty
  * Support for both JVM and JS backends (JS not yet available for Dotty)
  * Automatic calculation of pre-release versions for Scalameta
  * Support for validation of GitHub pull requests
  * Support for Bintray publishing

More features will be added as necessary.
create the core infrastructure
This commit populates core with essential infrastructure.
First, it reintroduces things that have been useful to us
in scalameta 1.x, namely prettyprinters (.syntax, .structure)
and classifiers (.is[T], .isNot[T]). Secondly, it introduces
programmatic access to versions.

Prettyprinting retained the same API, but underwent a complete redesign
of the underlying implementation:
  * Show[T] has been removed, because we didn't really use the generality
    that it offered (only .syntax and .structure ended up being popular).
  * Prettyprinting combinators from Show have been replaced by a dumb
    imperative Prettyprinter class that encapsulates a StringBuilder.
    Denys is telling me that this works extremely well performance-wise
    in Scala Native + combinators were really tricky to use/maintain
    at times, so I'm willing to give this a try.
  * I'm planning to have all our data structures provide prettyprinting
    functionality, so I've made it as easy as possible to do that.
    There's now a trait called Pretty that enrolls its descendants into
    Syntax and Structure as long as they implement `def render`.
    You can check out the new Version class in this commit to see
    how this works.
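
The design described above can be sketched as follows. This is a minimal illustrative model, not the actual sources: the names `Prettyprinter`, `Pretty`, `.syntax` and `.structure` follow the commit message, but the bodies are assumptions, and the two rendering flavors are collapsed into one for brevity.

```scala
// A dumb imperative prettyprinter encapsulating a StringBuilder,
// replacing the old Show[T] combinators.
class Prettyprinter {
  private val sb = new StringBuilder
  def append(s: String): Prettyprinter = { sb ++= s; this }
  override def toString: String = sb.toString
}

// Data structures implement `def render` once and get enrolled into
// both prettyprinting flavors.
trait Pretty {
  def render(p: Prettyprinter): Unit
  def syntax: String = { val p = new Prettyprinter; render(p); p.toString }
  def structure: String = syntax // simplified: the real impl renders a different shape
}

// A Version class in the spirit of the one mentioned in the commit.
final case class Version(major: Int, minor: Int, patch: Int) extends Pretty {
  def render(p: Prettyprinter): Unit = p.append(s"$major.$minor.$patch")
}
```

With this arrangement, `Version(2, 12, 2).syntax` yields `"2.12.2"` without any typeclass machinery or higher-kinded types.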

Classifiers remained as is. They are working well API-wise, and I don't
expect to implement them for anything apart from trees so I didn't bother
making classifiers easier on the implementation side.
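
The classifier API retained from scalameta 1.x can be sketched like this (an illustrative model of the `.is[T]`/`.isNot[T]` pattern; the `Classifier` typeclass name matches scalameta, the toy tree hierarchy is invented):

```scala
object classifiers {
  // Evidence that values of type T can be classified as C.
  trait Classifier[T, C] { def apply(x: T): Boolean }

  // Enrollment: any value with a classifier in scope gets .is[C] / .isNot[C].
  implicit class XtensionClassifiable[T](val x: T) extends AnyVal {
    def is[C](implicit c: Classifier[T, C]): Boolean = c(x)
    def isNot[C](implicit c: Classifier[T, C]): Boolean = !c(x)
  }
}

import classifiers._

// A tiny tree hierarchy to classify.
sealed trait Tree
final case class Name(value: String) extends Tree
final case class Lit(value: Int) extends Tree

object Tree {
  implicit val nameClassifier: Classifier[Tree, Name] =
    new Classifier[Tree, Name] { def apply(x: Tree) = x.isInstanceOf[Name] }
}
```

Usage: `(Name("x"): Tree).is[Name]` is `true`, and `(Lit(1): Tree).isNot[Name]` is `true`.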

Versioning is going to be a big deal for scalameta, since the project
aims to become the foundation for the official macro system of the language.
As a result, I've added a way to programmatically obtain important
metadata about our libraries. This commit introduces `scalaVersion`
and `coreVersion` into the `scala.meta.config` package. Future commits
may transfer more metadata from our build to executable code.
format everything with scalafmt
I was kinda wary of enabling scalafmt for the whole project
in scalameta 1.x, because large parts of our codebase were written
before scalafmt existed.

However with the new beginnings in this project, we can start
well-formatted and then make sure things stay well-formatted
as we're reviving more and more scalameta 1.x modules.
set align.openParenCallSite to false
This looks more to my liking.
initial syntactic API
Inspired by liufengyun/gestalt, we express our core API in
scala.reflect-like fashion. This has tangible benefits for implementing
semantic macros in the short run.
first stab at scala.macros
In a crucial UX improvement over scala.reflect.macros, we implement
the scala.macros.Universe cake in the scala.macros package object.

In the previous commit, we modelled all companions using defs, not vals,
and this proves to be critical, making it possible to swap universe
implementations on the fly (see the sources of the package object).

In comparison with liufengyun/gestalt, in order to implement Universe,
we only need to implement one method. We also need just one asInstanceOf
to trick the scalac/dotc typechecker.
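
The trick of modelling companions as defs, so that the universe can be swapped on the fly, can be sketched like this (illustrative names, not the actual scala.macros sources; the single `asInstanceOf` mirrors the one the commit message mentions):

```scala
// The platform-independent cake: everything a macro author touches is a
// member of Universe. Companions are defs, not vals.
trait Universe {
  type Term
  trait TermCompanion { def apply(name: String): Term }
  def Term: TermCompanion
}

// A facade playing the role of the scala.macros package object. Because
// Term is a def on both sides, every access delegates to whichever
// universe is currently installed.
object facade extends Universe {
  private var current: Universe = _
  def install(u: Universe): Unit = current = u
  type Term = Any // the facade's erased view of terms
  def Term: TermCompanion = current.Term.asInstanceOf[TermCompanion]
}

// A toy universe whose terms are plain strings.
object StringUniverse extends Universe {
  type Term = String
  object Term extends TermCompanion { def apply(name: String) = "term:" + name }
}
```

After `facade.install(StringUniverse)`, calling `facade.Term("x")` dispatches to the installed universe. The cast is safe at runtime because inner traits erase to a single class, which is exactly the kind of typechecker trick the commit alludes to.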
first new-style macro compiles
It's manually desugared into old-style macros, which don't work in Dotty,
and it can't run yet, but that's huge progress for today.
compatibility check for expanding new-style macros
`@inline` has been renamed to `@inlineMetadata`, and it now carries
coreVersion and engineVersion.

Before a new-style macro is allowed to expand, the versions are compared
with the corresponding versions available in the compiler plugin,
and an error is reported if there are incompatibilities.
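
The compatibility gate can be modelled like this. Everything here is a hypothetical sketch: the class names, the `Either`-based signature and the "same major version" policy are assumptions, not the plugin's actual code; only the idea of comparing `coreVersion` and `engineVersion` before expansion comes from the commit.

```scala
// The versions a macro was compiled against, as carried by @inlineMetadata.
final case class Version(major: Int, minor: Int, patch: Int) {
  override def toString = s"$major.$minor.$patch"
}

final case class InlineMetadata(coreVersion: Version, engineVersion: Version)

class MacroEngine(coreVersion: Version, engineVersion: Version) {
  // Illustrative policy: compatible iff the major versions agree.
  private def compatible(wanted: Version, have: Version): Boolean =
    wanted.major == have.major

  // The check performed before a new-style macro is allowed to expand.
  def checkBeforeExpansion(meta: InlineMetadata): Either[String, Unit] =
    if (!compatible(meta.coreVersion, coreVersion))
      Left(s"macro compiled against core ${meta.coreVersion}, plugin provides $coreVersion")
    else if (!compatible(meta.engineVersion, engineVersion))
      Left(s"macro requires engine ${meta.engineVersion}, plugin provides $engineVersion")
    else Right(())
}
```

A mismatched major version produces a `Left` with a diagnostic instead of a crash deep inside expansion.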
separate abiDef from implDef
Sometimes we may want implDef to be called in a particular way.

For instance, old-style macros trim stacktraces of macro-generated
exceptions by looking for a method whose name ends with
`macroExpandWithRuntime` (I remember being criticized for this particular
way of arranging the macro engine, and now I understand why I was wrong).

Thus, for optimal user experience on platforms that desugar new-style
macros to old-style macros, we really want implDef to be named in this
idiosyncratic way.

However, we also want the inline module to have a method with
a well-defined name that can be called in platform-independent fashion.
That's why we introduce the notion of abiDef.
first steps at implementing Universe
This commit implements abstract types in Universe. In order to achieve
that, I had to apply a massive refactoring.

Here are the most important changes:
  * Public scala.meta.XXX packages are now aliased as scala.macros.XXX,
    which makes it possible for clients of profiles/macros not to use
    names starting with scala.meta.
  * The scala.meta.dialects package has been reorganized to work similarly
    to other scala.meta.XXX packages.
  * Universe.abstracts is now of an abstract type Abstracts,
    which makes it possible to keep abstracts of all slices in the same
    value, which in turn greatly simplifies delegation.
  * Universe.companions has been split into parts that can be extended
    in descendants on any level of the super cake.
  * Abstract types declared in Trees.scala are now fully overridable.
    Previously only the top-level abstract types were overridable.
  * Expansion is now part of the Universe cake and hence virtualized.
    Previously it wasn't, which was a mistake since Expansion depends
    on Tree that is part of the cake.
update to the latest version of Dotty
It turns out that in the last several days Dotty cross-compilation
was turned off in my scripts. When I tried to compile the tests now,
I got blocked by lampepfl/dotty#2551.

Unfortunately, updating to the latest version of Dotty didn't fix
the problem. Let's hope this gets resolved asap.
more pattern matching fixes
At the moment, we don't provide classtags for our abstract types,
and, as a result, pattern matches like `case Term.This(_) => `
are going to produce unchecked warnings.
Name.Anonymous can't be erased to Name
It needs to be at least Term.Name with Type.Name.
unbreak the macro ABI
Unfortunately, our recent code reorganization broke the macro ABI.
The most difficult problem is scala/bug#10339,
which I've barely worked around.
first successful macro expansion
I spent about a day making myself comfortable with Trees and Abstracts
before starting to implement the methods necessary for the `@main`
annotation that I presented at ScalaDays NYC 2016.

During that day, I refactored Trees to contain only type aliases
and moved custom AST classes to `object customTrees` inside Abstracts.
This makes Trees crystal clear and also obviates the need for Companions,
because now the custom case classes don't have to implement Companion
APIs directly.

I also went ahead and added c.Template (c here stands for customTrees),
because it is quite hard to encode the notion of Template via g.Template.
Some time ago, Fengyun suggested that Template should be removed from
the public API, so maybe it indeed should.

I also took extreme care to preserve attributes when converting to/from
custom trees. Namely, all custom trees inherit all attrs from their
global analogues, but global trees only inherit positions from custom
trees (otherwise, we run into problems with the mix of untyped and typed
trees).

A lot of time went into designing naming conventions. Unlike with
converters, we don't have a clear-cut distinction between m.Tree and
g.Tree, so it wasn't easy. After some back and forth, I settled on
calling global-specific (or potentially global-specific) things with
names that start with g, and left other identifiers unchanged (i.e.
no m prefix).

After all the guidelines were established, hacking the `@main` macro
was actually very enjoyable. Unlike with converters, I only had to
implement a very small portion of ???'s. This is definitely a very strong
point of the extractor-based approach to macros.
initial semantic API
Extends the cake with a semantic slice that features the usual structure.
This design tries to combine the best features of the scalameta semantic
API ca. 2015, the scalameta semantic API ca. 2017 and liufengyun/gestalt.

This commit introduces Symbol and Denotation, which are both opaque
entities fully defined by a concrete implementation. A Symbol is a GUID
of a member (where uniqueness is defined within the limits of a given
classpath). A Denotation is a Symbol augmented with
a prefix, which enables it to compute symbol signatures.
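
A toy model of these two concepts, purely for illustration (the real `Symbol` and `Denotation` are opaque and implementation-defined; the `resultType` logic below is an invented stand-in for "computing a signature as seen from a prefix"):

```scala
// A Symbol is just a global ID of a member.
final case class Symbol(id: String) // e.g. "scala.List#head()."

sealed trait Type
final case class TypeRef(name: String, targs: List[Type] = Nil) extends Type

// A Denotation pairs a symbol with a prefix, so the same symbol can
// yield different signatures under different prefixes: List#head seen
// from List[Int] returns Int, not the raw type parameter A.
final case class Denotation(prefix: Type, sym: Symbol) {
  def resultType(raw: String): Type = prefix match {
    case TypeRef(_, targ :: _) if raw == "A" => targ
    case _                                   => TypeRef(raw)
  }
}
```

For example, `Denotation(TypeRef("scala.List", List(TypeRef("Int"))), Symbol("scala.List#head().")).resultType("A")` evaluates to `TypeRef("Int")`.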

Types from syntactic API are also used to represent types in the
semantic API. Fengyun was insisting that we shouldn't conflate these
two concepts, but I'd like to use this pull request as an extended
illustration of the idea that I tried to convey. Hopefully, this and
the follow-up commits will provide arguments in favor of this approach.

Semantic operations are implemented in an unusual fashion. There are a
lot of methods that overlap between refs, members, symbols and denots,
because e.g. it is reasonable to expect isFinal to be callable on all of
the above. As a result, quite some design effort went into efficiently
sharing these operations without copy/paste. I'm pretty happy with the
end result.

One blatant issue with the current data model is that it allows performing
semantic operations on unattributed trees. There is a solution for that
that Fengyun has implemented in liufengyun/gestalt (separation of
typed and untyped trees), and it works really well. Therefore, we don't
consider this issue to be a serious problem, and we'll get to fixing it
once more urgent tasks are dealt with.
tighter integration between syntactic and semantic API
This commit plugs a huge usability hole by allowing metaprogrammers
to create types manually, exploiting the unification of type trees
and types.

First, we add functionality to look up symbols in the underlying
symbol table via `Symbol.apply(String)` which is inspired by the
recent iteration of the semantic API in scalameta 1.x. Secondly,
we add new constructors to Term.Name and Type.Name.
These constructors go from symbols to names that wrap them.

When we combine these new facilities with the standard APIs to create
type trees, we end up with a very potent mix that lets metaprogrammers
throw together non-trivial types, including type applications,
type projections, etc.

I believe this is a very elegant way to approach semantic APIs,
because it obviates the need for maintaining a separate hierarchy
of extractors and helpers for types.

For example, if we separated type trees and types, we would have to add
methods like `termRef` and `typeRef` to construct simple types and then
methods like `appliedTo`, `project`, etc to construct more complex types.
Then our users would say e.g. `Type.typeRef("scala.List").appliedTo(tpe)`

In contrast, with the approach introduced in this commit, our users say
`Type.Apply(Type.Name(Symbol("scala.List#")), List(tpe))`, or if we
decide to reuse AST-based helpers, `Type.Name(Symbol(...)).appliedTo(tpe)`.
This doesn't require any additional APIs, apart from the three newly
introduced ones, and those APIs are anyway core for semantic API
and are usable in other scenarios, e.g. via `Symbol(...).isXXX` or
for safe construction of typed ASTs.
introduce Mirror
Even though we went back to the reflect-like scheme of organizing things,
we shouldn't regress to the complete savagery of semantic APIs powered
by the global state of the cake.

This commit also includes a reorganization of expansions, because
I realized that there's no need to have any method in Expansion
if it lives inside the cake. This saves us an additional layer of
abstraction.

I've also applied aggressive abbreviation, replacing all reasonable
occurrences of `symbol` with just `sym`. In the world of `tpe` and `denot`
this seems logical. Let's see how people like it.
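
The idea of threading semantic capability through an implicit Mirror rather than cake-global state can be sketched as follows (member names are illustrative, not the actual scala.macros API):

```scala
// A Symbol stands in for whatever opaque handle the engine provides.
final case class Symbol(id: String)

// The Mirror is the capability: without one in implicit scope, no
// semantic operation can run.
trait Mirror {
  def isFinal(sym: Symbol): Boolean
}

object semantic {
  implicit class XtensionSymbol(val sym: Symbol) extends AnyVal {
    // The capability travels in the implicit parameter, not in cake state.
    def isFinal(implicit m: Mirror): Boolean = m.isFinal(sym)
  }
}
```

Inside a macro body, the compiler would supply the implicit `Mirror`; in a test one can install a stub, e.g. `implicit val m: Mirror = new Mirror { def isFinal(sym: Symbol) = sym.id.endsWith(".") }`.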
@adriaanm


adriaanm commented Jun 5, 2017

Agreed! We're currently at an off site meeting. Will review when back home and recovered from jet lag :)

@DavidDudson

Looks like a good start to me.

I'm especially interested in how the changes to pretty printing work out.

It looks a lot cleaner than the Expanders/Namers tangle.

+ def expandee(implicit e: Expansion): Term
+ def abort(pos: Position, msg: String)(implicit e: Expansion): Nothing
+ def error(pos: Position, msg: String)(implicit e: Expansion): Unit
+ def warning(pos: Position, msg: String)(implicit e: Expansion): Unit


+
+private[macros] trait Api {
+ private def ensureVersion(key: String, value: String): Version = {
+ def fail = sys.error(s"fatal error reading BuildInfo: $key $value is not a valid version")


@DavidDudson

DavidDudson Jun 5, 2017

Do we do this with the current paradise impl? This looks incredibly useful, and I presume it would solve some issues we see in Gitter.



@xeno-by

xeno-by Jun 5, 2017

Collaborator

I think this would totally make sense. @olafurpg wdyt sbt-wise?


+ }
+}
+
+object Dialect extends InternalDialect {


@DavidDudson

DavidDudson Jun 5, 2017

Are we having an all dialect? I know it's been mentioned a few times surrounding paradise.



@xeno-by

xeno-by Jun 5, 2017

Collaborator

I'm not 100% sure what dialects should look like tbh. Would you as a macro author be interested in inspecting all these flags?

If I recall correctly, a unicorn dialect was brought up in connection with parsing and quasiquotes requiring a dialect, and the default dialect not being good enough. In this design, all of this is irrelevant: 1) parsing is n/a, 2) quasiquotes don't require a dialect (they use a unicorn internally), 3) a default dialect doesn't exist anymore (Dialect.current is not an implicit def).

What do you think about such an arrangement? In particular, would you be okay without having a way to parse stuff in macros?



@DavidDudson

DavidDudson Jun 6, 2017

IIRC the main push for unicorn dialects was for tools such as scalafmt and scalafix. @olafurpg probably has a more concrete stance on this. I do not mind tbh, parsing in macros is a dirty hack 😄


@@ -0,0 +1,26 @@
+package scala.macros.internal
+
+trait Evidence1


@DavidDudson

DavidDudson Jun 5, 2017

What are these for? Even being internal, they could still use some documentation IMO.



@xeno-by

xeno-by Jun 5, 2017

Collaborator

To disambiguate overloads that have the same java erasure. Good point!



@DavidDudson

DavidDudson Jun 6, 2017

Ah I've done a similar thing before, it makes a lot of sense.


+package prettyprinters
+
+@scala.annotation.implicitNotFound("don't know how to prettyprint structure of ${T}")
+trait Structure[T] {


@DavidDudson

DavidDudson Jun 5, 2017

Interesting design, looks better than the current system. I need to experiment with it more.

It looks like the main advantage is that you can mix rendering styles.



@xeno-by

xeno-by Jun 5, 2017

Collaborator

For me, the motivations were: 1) no higher-kinded types, 2) better performance.


@xeno-by xeno-by referenced this pull request in scalameta/scalameta Jun 5, 2017

Merged

Massive cleanup of trees #907

@liufengyun


liufengyun Jun 6, 2017

Collaborator

This is a good start, and can serve as a basis for evolution.

Regarding review, I think only two things matter (not that they matter much for the 1st PR of a project):

  • Compiler API -- contract with compilers
  • Programmer API -- api for macro authors
    • extension methods
    • helper methods
    • helper constructors/extractors

Everything else is detail that doesn't matter much. This PR is about 7000 LOC, but the APIs in total should be about 600 LOC. Maybe you can make it easier to review, @xeno-by, and give some pointers?

Some initial feedback below:

Separation of User-API from Compiler-Contract

It would be good to separate them completely, as their designs follow different principles and have different audiences. Otherwise, the code will become messy and difficult to manage well.

The separation will make it very easy to figure out what assumptions we make on the compiler, and will ease compiler implementation.

Flags

The current code defines flags instead of using queries, which goes against the design goal of portability.

https://github.com/scalamacros/scalamacros/pull/1/files#diff-b65270e10a64f0602e086402370a5341R17
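
The flags-vs-queries distinction can be illustrated with a sketch (illustrative code, not from the PR): a flag-based API fixes a bit encoding that every compiler must reproduce, while a query-based API only asks each compiler to answer predicates.

```scala
// Flag-based: the portable API fixes a bit layout that every compiler
// must encode identically.
object Flags {
  val Final: Long = 1L << 5
  val Case: Long  = 1L << 11
}
final case class FlaggedSym(flags: Long) {
  def isFinal: Boolean = (flags & Flags.Final) != 0
}

// Query-based: each compiler answers predicates however it represents
// them internally, so no encoding leaks into the portable API.
final case class Symbol(id: String)
trait SemanticQueries {
  def isFinal(sym: Symbol): Boolean
  def isCase(sym: Symbol): Boolean
}
```

With queries, Dotty and scalac can keep entirely different internal flag representations behind the same interface.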

Mirrors

I'm not sure what mirrors are, or why this concept is necessary for the compiler API or the user API?

type Mirror >: Null <: AnyRef

https://github.com/scalamacros/scalamacros/pull/1/files#diff-2f0948cddfb40bd8754be7e238c24c04R9


@olafurpg


olafurpg Jun 6, 2017

Member

First of all, this is super exciting! Great work @xeno-by

I agree with @liufengyun that this PR is impractical for review. The build infrastructure diffs are noise that hide away the beautiful tiny core. I would propose the next steps to be

  • merge this PR
  • focus on getting the Dotty pipeline working in the test suite; that's where the rubber meets the road
  • discuss design/details in issue tracker, like with #6

Some general questions:

  • Do we need to cross-build to Scala.js? Macros only have a compile-time footprint, so there won't be linking errors unless some runtime code calls the scala.macros API.
  • it's not clear that there is an implicit Mirror inside the scope of the meta block, which makes IDEs like IntelliJ fail to pick up the extension methods from the Semantic API. Even if IDEs can add special support for the synthetic implicit mirror, I don't think we should rely on it, since it
    1. taxes all tooling developers; Jetbrains may have the resources to immediately accommodate these quirks, but volunteer-based projects like ENSIME don't
    2. slows adoption by early adopters like myself, who aren't half as productive without IDE support
  • in the serialize example, the T type parameter is used as a value, T.vals, inside the meta block. I think it would be more intuitive to use it in type position, like tpe[T].vals.

If I can propose an alternative syntax, I'd go with something like

inline def serialize[T] = meta { implicit mirror =>
  tpe[T].vals.filter(...)
  ...
  q"foo(1)"
}

// expose some dummy
def meta(mirror: Mirror => Tree): Nothing = ???

Separation of User-API from Compiler-Contract

Do you propose to separate the abstract interface and engine implementation into individual repos, @liufengyun? I cloned the repo locally and experimented with the new macros; it was incredibly productive to have everything in one place. At least while this project is rapidly evolving and experimentation is happening across different engines, I propose to keep everything in the same repo. Once the abstract interfaces are set in stone, it's fine to split it up.

Flags

@liufengyun Good point. Those should probably be moved to the scalac engine. Having a working Dotty pipeline will make it easier to find issues like this.


@liufengyun


liufengyun Jun 6, 2017

Collaborator

Separation of User-API from Compiler-Contract

Do you propose to separate the abstract interface and engine implementation into individual repos @liufengyun ? I cloned the repo locally and experimented with the new macros, it was incredibly productive to have everything in one place. At least while this project is rapidly evolving and experimentation is happening across different engines, I propose to keep everything in the same repo. Once the abstract interfaces are set in stone, it's fine to split it up.

@olafurpg I don't mean separate engine implementation into different repos. I mean in the following code, some are implemented, some are not:

https://github.com/scalamacros/scalamacros/pull/1/files#diff-b65270e10a64f0602e086402370a5341R50

In fact, the unimplemented methods are the APIs/contracts for the compiler, while the implemented ones are user APIs.

As a compiler implementer, I'd like to see all compiler contracts concentrated in one place, not mixed with user APIs.

In Gestalt, we also initially mixed them together. Later we separated them and reaped huge benefits.


@olafurpg


olafurpg Jun 6, 2017

Member

However, the user APIs are a bit mixed across several traits; for example, why is it Types.XtensionType instead of Semantic.XtensionSemanticType? It's a bit hard to explore the semantic API because of the heavy usage of extension methods combined with the lack of IDE intelligence inside meta blocks.


@liufengyun


liufengyun Jun 6, 2017

Collaborator

Thanks for the pointer @olafurpg. As the compiler contracts/API are in fact the most important part of the macro system, I think it's worth putting them together in a separate package & directory. From a code maintenance point of view, if there are no changes to the core, we know that the assumptions on the compiler haven't changed. It will also make it much easier for compiler people to review the code and check the contracts.

I remember that scala.meta removed modifiers from trees, but it seems they have come back as constructors/extractors. Is that intentional?

 +    def ModAnnot: ModAnnotCompanion
 +    def ModPrivate: ModPrivateCompanion
 +    def ModProtected: ModProtectedCompanion
 +    def ModImplicit: ModImplicitCompanion
 +    def ModFinal: ModFinalCompanion
 +    def ModSealed: ModSealedCompanion
 +    def ModOverride: ModOverrideCompanion
 +    def ModCase: ModCaseCompanion
 +    def ModAbstract: ModAbstractCompanion
 +    def ModCovariant: ModCovariantCompanion
 +    def ModContravariant: ModContravariantCompanion
 +    def ModLazy: ModLazyCompanion
 +    def ModValParam: ModValParamCompanion
 +    def ModVarParam: ModVarParamCompanion
 +    def ModInline: ModInlineCompanion

I also see that there's a constructor for Template. The extractor-based approach prefers macro-structures over micro-structures, as I've documented here: liufengyun/gestalt#73 and in the paper. This point may be debatable.

Another merit of the extractor-based approach is that it lets us be lazy about which constructors and extractors to provide. Not every syntactic element needs an extractor, e.g. for comprehensions, type trees, or patterns. It's completely fine to delay providing them until a concrete use case surfaces.

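[Editor's note] The extractor-based style described above can be sketched roughly as follows. This is an illustrative toy, not the API from this PR: the names (Trees, TermApplyCompanion, StringEngine) and the string-based engine are invented for the example. The point is that each tree shape is exposed only through an abstract companion with apply/unapply, so an engine can provide constructors and extractors lazily, one shape at a time.

```scala
// Toy sketch of an extractor-based tree API (names invented for
// illustration; this is not the API proposed in this PR).
object ExtractorStyleDemo {
  trait Trees {
    type Term
    // Each tree shape is exposed only via an abstract companion with
    // apply/unapply, so an engine can add shapes lazily, one at a time.
    trait TermApplyCompanion {
      def apply(fun: Term, args: List[Term]): Term
      def unapply(tree: Term): Option[(Term, List[Term])]
    }
    def TermApply: TermApplyCompanion
  }

  // A trivial "engine" that represents terms as plain strings.
  object StringEngine extends Trees {
    type Term = String
    object TermApply extends TermApplyCompanion {
      def apply(fun: Term, args: List[Term]): Term =
        s"$fun(${args.mkString(", ")})"
      def unapply(tree: Term): Option[(Term, List[Term])] = {
        val open = tree.indexOf('(')
        if (open < 0 || !tree.endsWith(")")) None
        else Some((tree.take(open),
                   tree.substring(open + 1, tree.length - 1).split(", ").toList))
      }
    }
  }
}
```

With this shape, an engine that never needs a given extractor simply doesn't implement its companion, which matches the "delay until a use case surfaces" point.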


xeno-by Jun 6, 2017

Collaborator

Thank you, everyone, for your feedback! I'm preparing for my talk tomorrow and catching up at work, so I'll get to replying on Thursday or Friday.


+ implicit def int: Serialize[Int] = Serialize { x => x.toString }
+ implicit def string: Serialize[String] = Serialize { x => "\"" + x + "\"" }
+
+ inline implicit def materialize[T]: Serialize[T] = meta {


xeno-by Jul 5, 2017

Collaborator

We should change the syntax for this to implicit def materialize[T]: Serialize[T] = macro { ... }. See the discussion here: https://gitter.im/scalamacros/scalamacros?at=595d65f176a757f808deb35e.

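[Editor's note] For context, the following hand-written instance shows the kind of code a materializer for the Serialize typeclass in the hunk above would generate. Only the int/string instances come from the diff; the Serialize trait and factory shapes are inferred from them, and Point plus the "generated" instance are invented for illustration.

```scala
// Sketch of what a Serialize materializer might generate; the trait and
// factory shapes are inferred from the diff above, and Point plus the
// generated instance are invented for illustration.
object SerializeDemo {
  trait Serialize[T] { def apply(x: T): String }
  object Serialize {
    def apply[T](f: T => String): Serialize[T] =
      new Serialize[T] { def apply(x: T): String = f(x) }
  }

  implicit val int: Serialize[Int]       = Serialize(_.toString)
  implicit val string: Serialize[String] = Serialize(x => "\"" + x + "\"")

  case class Point(x: Int, y: Int)

  // What `materialize[Point]` could plausibly expand to, written by hand:
  implicit val point: Serialize[Point] = Serialize { p =>
    val sx = implicitly[Serialize[Int]].apply(p.x)
    val sy = implicitly[Serialize[Int]].apply(p.y)
    s"""{"x": $sx, "y": $sy}"""
  }
}
```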

+import scala.macros._
+
+class main extends MacroAnnotation {
+ inline def apply(defn: Any): Any = meta {


xeno-by Jul 5, 2017

Collaborator

We can do better than Any => Any. See the discussion about this here: https://github.com/scalamacros/scalamacros/issues/6.


project/versions.scala
+ lazy val Scala210 = readScalaVersionFromDroneYml("2.10.x")
+ lazy val Scala211 = readScalaVersionFromDroneYml("2.11.x")
+ lazy val Scala212 = readScalaVersionFromDroneYml("2.12.x")
+ lazy val Scala213 = readScalaVersionFromDroneYml("2.13.x")


xeno-by Jul 5, 2017

Collaborator

Given the amount of effort that it takes to cross-compile sources and tests even now, I'm thinking of supporting only 2.13 in the Scala 2.x series.

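[Editor's note] The body of the readScalaVersionFromDroneYml helper isn't shown in this view of the diff; below is a sketch of what it might do. The .drone.yml layout, the series-to-prefix matching, and the error handling are all assumptions; the pure core is factored out over the file's contents so it can be exercised without touching the filesystem.

```scala
// Hypothetical sketch of readScalaVersionFromDroneYml: find the full
// version (e.g. "2.12.2") for a series (e.g. "2.12.x") in .drone.yml.
// The YAML layout and file name are assumptions for illustration.
object Versions {
  import java.util.regex.Pattern

  // Pure core: scan the file's text for a version in the given series.
  def readScalaVersion(droneYmlContents: String, series: String): String = {
    val prefix  = series.stripSuffix("x")            // "2.12.x" -> "2.12."
    val pattern = (Pattern.quote(prefix) + """\d+""").r
    pattern.findFirstIn(droneYmlContents)
      .getOrElse(sys.error(s"no Scala version found for series $series"))
  }

  def readScalaVersionFromDroneYml(series: String): String = {
    val source = scala.io.Source.fromFile(".drone.yml")
    try readScalaVersion(source.mkString, series)
    finally source.close()
  }
}
```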

xeno-by Jul 11, 2017

Collaborator

@lrytz @retronym @SethTisue @szeiger I'd appreciate your feedback!



lrytz Jul 20, 2017

I read the Gestalt paper to get started; it sounds very good. For reviewing, this PR is indeed overwhelming. Maybe we can organize a tech-talk hangout where you take us through the implementation?


ShaneDelmore Jul 20, 2017

Great idea @lrytz I would love to attend a hangout discussing this.


xeno-by Jul 20, 2017

Collaborator

@lrytz @ShaneDelmore See the invite to a private Gitter room. Let's agree on a date/time.


xeno-by added some commits Jul 22, 2017

remove Dialect
Since our quasiquotes don't require a dialect, I don't think we should
keep it. If we need dialects in the future, we should be able to easily
re-add them later.
stop crosscompilation to JS
As @olafurpg has pointed out, we don't need this if we don't plan
to have macro APIs called at runtime.
stop using Drone
In scalameta, we're now using Travis, because Scala Platform CI has
proven to be unreliable. This commit removes Drone support, and future
commits will set up Travis.
stop crosscompilation to 2.10 and 2.11
The maintenance burden is real. We'll need community help with this.
remove AbsolutePath
A dedicated io package is appropriate in Scalameta, but I highly doubt
that we need it in macros.
replace inline/meta with def/macro
As the first step for the new macro system, we want to get just the
macro-related functionality, and for that we don't have to implement the
full power of inline and meta.

Since we don't need to implement inline and meta separately right away,
inline/meta becomes just a fancy way of indicating a new-style macro,
and this syntax starts looking a bit excessive.

Instead of introducing two new keywords and bringing our users the pain
of migrating code that uses these keywords as identifiers, we propose
that for the time being we use the old def/macro syntax with the
block-shaped right-hand side as the syntax for new-style macros.
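[Editor's note] Side by side, the change amounts to the following. Both spellings appear elsewhere in this thread; this is pseudocode, as neither form is valid Scala 2.12 without the macro plugin:

```scala
// Before: two new soft keywords, inline and meta
inline implicit def materialize[T]: Serialize[T] = meta { /* expansion */ }

// After: the familiar def/macro keywords, with a block-shaped RHS
implicit def materialize[T]: Serialize[T] = macro { /* expansion */ }
```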

@xeno-by xeno-by changed the title from New-style macro APIs and a prototype implementation for Scala 2.11.11 to New-style macro APIs and a prototype implementation for Scala 2.12.2 Jul 23, 2017

xeno-by Jul 23, 2017

Collaborator

Updated the API to be in sync with Scalameta 2.0.0-M1, limited the supported Scala versions to 2.12.x, 2.13.x and 0.2.x, and changed the new macro syntax from inline/meta to def/macro. We're ready to get started.


@xeno-by xeno-by merged commit f03bbf3 into scalacenter:master Jul 23, 2017

xeno-by Jul 23, 2017

Collaborator

I've just published the current state of master to our bintray repository. See an example of using the newly published compiler plugin at https://github.com/scalamacros/sbt-example-newstyle.


milessabin Jul 26, 2017

The link to the new-style materializer example in the description at the top is broken ... would you mind updating it?


Collaborator

xeno-by commented Jul 26, 2017

@dgouyette dgouyette referenced this pull request in scala-hamsters/hamsters Oct 15, 2017

Closed

Implement Show typeclass #34
