PROPOSAL - Migrate Scalameta Macros #6

Closed

diesalbla opened this issue Jul 22, 2018 · 46 comments

@diesalbla commented Jul 22, 2018

The macro annotations in cats-tagless are implemented using the macro system of scalameta.

However, development of macros and macro annotations in Scalameta has been discontinued, as explained here.

With these lessons learned, we decided to retire our efforts to build a macro system on top of Scalameta.

The current support for Scalameta macros may not work in future versions of the Scala compiler.

On the other hand, the Scala compiler has recently incorporated the macro-paradise compiler plugin, which enables support for macro annotations based on scala-reflect, into its main codebase.

The goal of this issue is to migrate the existing macros to scala.reflect-based macro annotations.
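For reference, a scala.reflect macro annotation has roughly the following shape; this is a minimal sketch with a stub body (the annotation name mirrors the existing ones, the implementation is illustrative only, not the project's actual code):

import scala.annotation.StaticAnnotation
import scala.language.experimental.macros
import scala.reflect.macros.whitebox

// Requires the macro-paradise compiler plugin on 2.12, or -Ymacro-annotations on 2.13.
class autoFunctorK extends StaticAnnotation {
  def macroTransform(annottees: Any*): Any = macro autoFunctorKMacro.impl
}

object autoFunctorKMacro {
  def impl(c: whitebox.Context)(annottees: c.Tree*): c.Tree = {
    import c.universe._
    annottees.toList match {
      // Keep the annotated algebra and emit a companion; a real implementation
      // would generate the FunctorK instance inside it.
      case (alg: ClassDef) :: Nil =>
        val companion = q"object ${alg.name.toTermName} {}"
        q"..${List(alg, companion)}"
      // If a companion already exists, the instance would be merged into it.
      case (alg: ClassDef) :: (companion: ModuleDef) :: Nil =>
        q"..${List(alg, companion)}"
      case _ =>
        c.abort(c.enclosingPosition, "@autoFunctorK must annotate a trait or class")
    }
  }
}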

@Odomontois commented Aug 16, 2018

Totally agree.
For that very reason I temporarily dropped mainecoon from my project until scalameta paradise got 2.12.6 support.
This should not happen again!

@milessabin (Member)

I recommend that you look very hard at non-macro solutions where at all possible, or alternatively depend on existing libraries rather than introducing new macros of any form ... all of this will change in Scala 3 and adding new boat anchors is not a good idea at this point.

@SemanticBeeng commented Oct 1, 2018

@milessabin: in the context of the "Roadmap towards non-experimental macros" post (https://www.scala-lang.org/blog/2017/10/09/scalamacros.html), is your advice mostly about scala.reflect-based macros?

Any guidance on how Scalameta-based macros fit in? According to that decision document, this is the adopted way to create macros.

I understand "look very hard at non-macro solutions where at all possible", but for code generation macros are a good solution, and Scalameta seems to make this productive and sustainable.

Please advise further.

@milessabin (Member)

@SemanticBeeng it's about macros in general. The only thing that's certain at this point is that scala.reflect macros are not viable in the long term. It's still unclear what will replace them.

@SemanticBeeng commented Oct 1, 2018

Understood.
Which leaves me with scalameta in mid term, until something better comes along.
I have no knowledge of other, better ways to do program transformation/term rewriting and code generation in Scala, which is the context of my request for guidance above.

The best long-term option I know of is Martin's vision here, which combines Scala LMS and macros (makes sense to me, for what it is worth).
Principled Meta Programming for Scala
https://gist.github.com/odersky/f91362f6d9c58cc1db53f3f443311140 (http://archive.is/13LG7)

Sorry if this extrapolation is "unsafe", but given that cats-tagless replaces mainecoon, this is the best place to mention it, afaik.

Very interested to learn better as things evolve. Will keep an eye.

UPDATE: About scalagen on https://daviddudson.github.io/

"scalameta/paradise library was deprecated, and scalamacros/paradise is not ide or user friendly"

UPDATE: Related to cats scalameta/scalagen#22

@olafurpg commented Oct 1, 2018

Which leaves me with scalameta in mid term, until something better comes along.

I highly recommend migrating to scala-reflect macro annotations since Scalameta macro annotations are not a viable solution even in the mid term (or short term for that matter). Excluding dependency upgrades and a few rare bug fixes, Scalameta macro annotations have not seen any development since late 2016. There will be no Scalameta paradise release for 2.13.

The Slinky project migrated to scala-reflect macro annotations and reported "drastically improved error messages" https://medium.com/@shadaj/introducing-slinky-0-5-0-21deb2546d65#0692

IntelliJ has improved documentation for writing custom "injectors" that enable IDE support for scala-reflect macro annotations: https://github.com/JetBrains/intellij-scala/wiki/Library-Injector

@kubukoz (Member) commented Feb 18, 2019

The link @olafurpg posted is obsolete now, here's the updated one: https://github.com/JetBrains/intellij-scala/wiki/Library-Extensions

@dispalt commented Feb 27, 2019

I somewhat attempted to do this but ran into a macro issue that I can't figure out. I implemented finalAlg and autoFunctorK.

https://github.com/dispalt/tagless-redux

The macro issue is that stacking multiple macro annotations with no companion object present somehow results in a missing companion class. I am sure it can be solved; I am just running out of steam on how to do it.

@kailuowang (Contributor)

@dispalt, thank you for the report. It's helpful to learn that.

@dispalt commented Feb 27, 2019

And I am not saying it can't be done; I am sure someone with more experience with macros could figure it out. But it's a decent start.

@kailuowang (Contributor)

It looks like we need some scala macro expertise to tackle this, which I lack (hence the choice of Scalameta in the first place, which I regret). The easiest way is probably for me to partner with a macro guru on this - I am going to try to recruit one.
https://twitter.com/kailuowang/status/1102564142031011840

@fommil commented Mar 4, 2019

I was (emphasis on past tense) planning on writing something equivalent to FunctorK in scalaz, with a compiler plugin to generate the boilerplate. If anybody wants to take a look at that approach, which I guess is along the lines that @milessabin is proposing (I assume compiler plugins are still going to be available in Scala 3) then https://github.com/scalaz/scalaz-deriving/blob/master/deriving-plugin/src/main/scala/scalaz/plugins/deriving/AnnotationPlugin.scala is the place to start.

I created a shell project, now decommissioned, at https://gitlab.com/fommil/attic/tree/master/scalaz-free which should save some time on the sbt setup side. The scope of this project was much larger than just FunctorK: I also wanted to generate the Free ADT encoding, as well as liftIO, liftM, liftF, liftAp etc. directly. I'm pretty sure that just the FunctorK would be superior // ping to @rabbitonweb, who was interested at one point.

Also, if anybody wants to write an autoFunctorK (or rather, for FFunctor) for Haskell, that'd be pretty nice 😆

@joroKr21 (Member) commented Mar 4, 2019

@fommil That's a noble cause, but what is there to guarantee that compiler plugin APIs will remain compatible from Scala 2 to Scala 3? 😄

I would propose the following migration path (which can be taken in many small steps by different contributors):

  1. Migrate all typeclass derivations to blackbox macros, one by one (leaving macro annotations to generate only one-liners);
  2. Review all annotations which generate new types (are there any?) and figure out how to deal with them: can we drop them, are whitebox macros sufficient, or do we need full macro annotation power?
  3. After 1. and 2. are done, it should be straightforward to switch to good old scala.reflect macro annotations.

Here is a starter blackbox macro for FunctorK (I don't know if it covers all use cases): https://gist.github.com/joroKr21/d3363eb76ef7aac8455740392d6f55f1

@fommil commented Mar 4, 2019

@joroKr21 I don't think the code would survive, but surely the principles would. I definitely agree with using blackbox as much as possible. This is also what is encouraged in the annotation plugin.

An immediate benefit of an annotation plugin is that a) metals / scala-ide will support it and b) there is a clear path from there to writing an IntelliJ plugin (examples provided). These points become invalidated if paradise is maintained to support the presentation compiler.

@kailuowang (Contributor)

Thanks @joroKr21 and @fommil

@joroKr21, there are no new types generated. I think what you have with blackbox is where we want to be. cats-tagless is a bit more complex due to the need to support all kinds of different shapes of algebra type signatures, e.g. Alg[F[_], A, B], or trait Alg[F[_]] { type A }, and method types.
This test is a more complete list of types and methods it supports for FunctorK derivation.
https://github.com/typelevel/cats-tagless/blob/master/tests/src/test/scala/cats/tagless/tests/autoFunctorKTests.scala#L175
I think my usage of Scalameta in this project is quite basic, but I just lack the scala macro experience to translate the logic/concepts into scala.reflect macros.
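For illustration, in the simplest shape (a single F[_] parameter and no extra type members) the code the macro has to generate is essentially the instance one would write by hand; a minimal sketch with a made-up KVStore algebra:

import cats.~>
import cats.tagless.FunctorK

// A made-up algebra with a single effect parameter.
trait KVStore[F[_]] {
  def get(key: String): F[Option[String]]
  def put(key: String, value: String): F[Unit]
}

object KVStore {
  // What autoFunctorK has to generate: wrap every F[_]-returning method
  // with the supplied natural transformation.
  implicit val functorK: FunctorK[KVStore] = new FunctorK[KVStore] {
    def mapK[F[_], G[_]](af: KVStore[F])(fk: F ~> G): KVStore[G] =
      new KVStore[G] {
        def get(key: String): G[Option[String]] = fk(af.get(key))
        def put(key: String, value: String): G[Unit] = fk(af.put(key, value))
      }
  }
}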

I think blackbox macros would probably suffice. I can see two strategies:

  1. Translate the existing Scalameta code into scala.reflect blackbox macros by implementing the same variables and logic with the scala macro API; we can start with autoFunctorK. This will require translating some of the utils shared with the other derivations.
  2. Start from scratch with scala.reflect macros on autoFunctorK (a better candidate might be autoInvariantK, since it's more general) and try to pass all the existing tests, then refactor and reuse the code to implement the other derivations.

Which one would you prefer?

@joroKr21 (Member) commented Mar 4, 2019

e.g. Alg[F[_], A, B], or trait Alg[F[_]] { type A }, and method types

That's ok, but how does the annotation decide which type parameter (F[_]) to choose? I guess the answer to this question means that it won't be possible to simply generate implicit val functorK: FunctorK[Alg] = FunctorK.derive.

Regarding your suggestions - I'm not sure before I look deeper in the code, but I think I prefer starting from scratch and making one typeclass work.

@kailuowang (Contributor)

The annotation doesn't decide which F[_]; it's based on a convention rule: if there is only one type constructor type parameter, use that; if there are multiple, use the last one.
Starting from scratch sounds good to me. I am not sure the original code is super readable either.
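A small sketch of that convention, with made-up algebras:

// Only one type constructor parameter: F is the effect being mapped.
trait Alg1[F[_], A] {
  def read(a: A): F[String]
}

// Several type constructor parameters: by the convention above,
// the last one (G here) is treated as the effect being mapped.
trait Alg2[F[_], G[_], A] {
  def convert(a: A): G[F[A]]
}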

@joroKr21 (Member) commented Mar 5, 2019

Dependent types are the trickiest because they require symbols to be defined, which we don't have yet before typechecking the definition :/

@kailuowang (Contributor)

@joroKr21 maybe for phase one we first drop support for dependent types?

@joroKr21 (Member) commented Mar 6, 2019

Let me try a few ideas first

@dispalt commented Mar 6, 2019

You can check out a couple of semi-auto-style versions I did here: https://github.com/dispalt/tagless-redux/blob/master/tests/src/test/scala/cats/tagless/tests/DerivationTests.scala

@joroKr21 (Member) commented Mar 6, 2019

Managed to make it work for dependent types (but not yet refined types). WIP here: #8

It would be nice to document the necessary and sufficient conditions to generate instances.

For example, for FunctorK I assume that F can only appear as a type constructor in the return type of methods, but technically it might be possible to piggy-back on existing FunctorK instances (e.g. Stream[F, Foo])?

https://typelevel.org/cats-tagless/typeclasses.html gives a rough overview, but it doesn't mention the "lower-kinded" versions (Functor, FlatMap, etc.)

@kailuowang (Contributor)

@joroKr21 yeah, I meant to add that documentation but never got to it.
Piggy-backing on existing FunctorK instances is not supported. It was asked for by some users before, and I can see it being useful, but I suspect the implementation would be pretty complex. I propose we tackle it in a future release.

@kailuowang (Contributor)

For now, the rule of thumb for determining whether an algebra can have autoFunctorK is whether you can easily write the FunctorK instance by hand.

@fommil commented Mar 6, 2019

for something to be able to have a FunctorK, the F[_] must only appear in covariant position in all of its fields.

FYI the equivalent Haskell boilerplate looks something like

instance FFunctor UserApi where
  ffmap nt (UserApi f1 f2 f3) = UserApi (nt f1) (nt ... f2) (nt ... f3)

where ... is an operator from https://hackage.haskell.org/package/universum-1.5.0/docs/Universum-VarArg.html that may have a scala equivalent.

Otherwise, you can fall back to manual arity with something like

instance FFunctor UserApi where
  ffmap nt (UserApi f1 f2 f3) = UserApi (nt f1) (nt . f2) (nt .: f3)

using http://hackage.haskell.org/package/composition-1.0.2.1/docs/Data-Composition.html

Of course you still need to special-case fields that don't have any parameters. You should be able to do that with only a syntactic analysis, although I think you might be inside the blackbox macro at that point.

(see local capabilities in mtl for context)

I should imagine that creating the equivalent Scala boilerplate would be sufficient, and then just let the compiler fail normally if an instance isn't possible.

I was never able to work out how to intercept typer errors to provide a custom message, using either plugins or macros... but it'd be good to be able to provide some information if it fails to typecheck.
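A Scala counterpart of that boilerplate, sketched for a hypothetical UserApi algebra with methods of different arities (including a zero-parameter one), could be:

import cats.~>
import cats.tagless.FunctorK

// Hypothetical algebra mirroring the UserApi record above.
trait UserApi[F[_]] {
  def all: F[List[String]]                    // zero parameters
  def byId(id: Long): F[Option[String]]       // one parameter
  def rename(id: Long, name: String): F[Unit] // two parameters
}

object UserApi {
  implicit val functorK: FunctorK[UserApi] = new FunctorK[UserApi] {
    def mapK[F[_], G[_]](af: UserApi[F])(nt: F ~> G): UserApi[G] =
      new UserApi[G] {
        // Arity is handled per method rather than with a variadic composition
        // operator; the zero-parameter member needs no special case beyond
        // simply taking no arguments.
        def all: G[List[String]] = nt(af.all)
        def byId(id: Long): G[Option[String]] = nt(af.byId(id))
        def rename(id: Long, name: String): G[Unit] = nt(af.rename(id, name))
      }
  }
}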

@dispalt commented Mar 7, 2019

Hey guys I pushed an intellij integration that's very simple. You can check out the example here, https://github.com/dispalt/tagless-redux-ijext

It's rudimentary but it's pretty neat. Basically it supplies the fake methods depending on the presence of an annotation. It uses the extensions mechanism, which works with a recent version of IntelliJ.

So if you clone that repo and import it into IntelliJ, it will say it detected some extensions and ask whether you would like to load them; if you hit yes, it should work.

@joroKr21 (Member) commented Mar 7, 2019

@fommil That's very neat. I've seen the "dots" before, but I had completely forgotten about them. However, I don't know of any port to Scala (maybe an opportunity for a pet project?).

@dispalt Thanks, the plugin should work regardless of how the instances are generated, right? Also, it looks like you already have a working implementation based on scala.reflect. Is it a port of the code in this repo? Maybe that's a better migration path than rewriting from scratch 🤔

@dispalt commented Mar 7, 2019

I did port finalAlg and autoFunctorK to paradise, then ported them to a semi-auto style. I have not yet ported them to paradise + blackbox. But the code is crap and I really only care about FunctorK personally.

Yes, it works based on fully-qualified annotation names, so it should drop in with a new annotation easily. Also, the IntelliJ code isn't great, I just wanted to play around; it's not a bad framework, there's just too much stuff to learn, so I am coding by braille.

@fommil commented Mar 7, 2019

btw if you use @deriving(FunctorK) for this (which might work with only config changes, which could be provided with the cats-tagless jar, so out of the box for users) then you'll get IntelliJ support for free.

The more I think about it the more I'm convinced it should work... there might be a couple of branches to add to the plugin code to handle the HKT but the semiauto boilerplate is basically identical to the normal-kinded version.

@kailuowang (Contributor) commented Mar 15, 2019

It turned out that switching to the old scala.reflect macro annotations is still a bit beyond my close-to-zero scala macro skills. What I can do easily, though, is create a new module that only includes the blackbox macro "semi-auto" derivation style, get rid of the Scalameta dependency, and release that. @joroKr21 what do you think?
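Assuming the new module exposes the blackbox derivations through something like a Derive object (the name is assumed here), the "semi-auto" style would look roughly like:

import cats.tagless.{Derive, FunctorK}

trait KVStore[F[_]] {
  def get(key: String): F[Option[String]]
}

object KVStore {
  // Semi-auto style: the instance is requested explicitly; no macro
  // annotation and no Scalameta paradise plugin involved.
  implicit val functorK: FunctorK[KVStore] = Derive.functorK[KVStore]
}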

@joroKr21 (Member) commented Mar 15, 2019

Yes, I think it will still take some time to migrate, because there are also a lot of methods generated on the companion objects which would have to be rewritten. And it would have to be a big rewrite; it can't be done piecemeal. Maybe we can cut some less-used features, but I'm not sure yet 🤔

About releasing the macros for "semi-auto" lovers, do we need a separate module? Would the Scalameta dependency in itself be a problem? Or maybe just not enabling the compiler plugin is good enough. Ideally Scalameta should be a compile-time-only dependency (i.e. provided), but I don't know whether that works in practice.
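In sbt, marking it provided for users of the legacy macro annotations would look roughly like the following (coordinates and versions are placeholders, not necessarily what the project uses):

// Both of these would be added by users of the legacy macro annotations;
// the "semi-auto" blackbox macros would need neither.
libraryDependencies += "org.scalameta" %% "scalameta" % "<version>" % Provided
addCompilerPlugin("org.scalameta" % "paradise" % "<version>" cross CrossVersion.full)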

I guess I'm also not sure whether you mean a temporary module, until the migration is done, or have it in general?

We also shouldn't forget to document the "semi-auto" derivation 😺

TLDR: I think it makes sense to release the blackbox macros before we're done with the rest of the migration, because some users would anyway use only that. But I'm not sure why we would need a separate module.

@joroKr21 (Member) commented Mar 16, 2019

I guess one reason to release a separate module (and keep it) would be to make it lighter-weight for people who want to use "semi-auto" only or scalaz-deriving. I didn't have time to try out the latter yet.

@kailuowang (Contributor)

@joroKr21 making the Scalameta dependency provided and having users add it together with the compiler plugin is a good idea. Yet another benefit of having a separate module is the ability to release it for Scala 2.13. Since we are going to migrate the rest of the functionality gradually, it might be easier to have the new code in a new module and not touch the legacy module at all. We can rename the old one to legacy-macro.

We have two options for this reorganization regarding the inter-module dependency:

  1. The legacy module depends on the new module. This way we can keep the tests unchanged.
  2. Restore the legacy module to the old code before your last 2 PR merges, so that the 2 modules are independent of each other. But then we also need to copy and modify the tests for the new module.

I am fine either way. I can also do it in 2 phases: do option 1 first and then option 2. Wdyt?

@joroKr21 (Member)

Okay I'm sold on releasing a separate module 👍

Option 1. is less work so I would go for that. We can cut this dependency later on as well.

@joroKr21 (Member)

Option 1. is less work so I would go for that. We can cut this dependency later on as well.

On the other hand, I'm not sure what the migration path would look like anymore. I was planning to just replace the existing code with scala.reflect code (is that possible?)

@kailuowang (Contributor)

@joroKr21 you should still be able to do that if, for any code you want to replace, you write the new code in the new module and change the usages in the legacy module to use this new code instead, right? Unless you need a dependency the other way as well, i.e. new code depending on legacy code?

@joroKr21 (Member)

Unless you need a dependency the other way as well, i.e. new code depending on legacy code?

That wouldn't work anyway. I guess it doesn't really matter if we replace old code or just write new code based on the old one 😄

@dispalt commented Mar 19, 2019

Option 1. is less work so I would go for that. We can cut this dependency later on as well.

On the other hand, I'm not sure what the migration path would look like anymore. I was planning to just replace the existing code with scala.reflect code (is that possible?)

I might have missed this but how are you all thinking about autoDerive and fullyRefined? I think that will change the amount of work for the annotations.

I think the simplest thing is to just ditch the meta annotations and do the reflect version; it's not like there are really any changes that need to happen to the meta version.

@joroKr21 (Member)

@dispalt we were talking about releasing a separate module without macro annotations.

@dispalt commented Mar 19, 2019

Yeah definitely, I just wasn't sure if the plan was to then do the macro annotations with or without auto-derive. I was thinking of maybe trying to do the macro annotations using the semi-auto macros, and was wondering what the thinking is on autoDerive, etc.

@kailuowang (Contributor)

@dispalt I think we can create scala.reflect macro annotations little by little. If we can add macro annotations that simply call the semi-auto macros, that would be great. The next step could be the instance-summoning apply methods. Personally I think autoDerive (using an implicit instance and FunctionK) is lower priority.
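A sketch of those two pieces, with an assumed KVStore algebra (the annotation would only have to add them to the companion):

import cats.tagless.{Derive, FunctorK}

trait KVStore[F[_]] {
  def get(key: String): F[Option[String]]
}

object KVStore {
  // Instance-summoning apply, so users can write KVStore[F].get(...):
  def apply[F[_]](implicit instance: KVStore[F]): KVStore[F] = instance

  // The annotation delegating to the semi-auto blackbox macro:
  implicit val functorK: FunctorK[KVStore] = Derive.functorK[KVStore]
}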

@dispalt commented Mar 20, 2019

Yeah, I agree. Are you going to rearrange the code into a new module or keep it as is? I'll probably wait till that's done before doing anything.

@kailuowang (Contributor)

The new module is ready. We can start porting annotations there one by one.

@joroKr21 (Member)

@fommil, I tried scalaz-deriving, but unfortunately it doesn't work out of the box. It looks like it's missing some kind checks, and it thinks it should require implicit def functorK[F[_]: FunctorK]: FunctorK[Alg[F]] instead of implicit val functorK: FunctorK[Alg]. You can check out this branch to see: https://github.com/joroKr21/cats-tagless/tree/scalaz-deriving. The most difficult part was actually getting deriving.conf on the classpath.

@fommil commented Mar 25, 2019

@joroKr21 cool, that is pretty much what I expected. It should be relatively easy to fix.

BTW you're supposed to put the deriving.conf in the classpath of your main not tests (then you don't need the resources hack).

The easiest way to get this working would be to add a higher kinded example to https://github.com/scalaz/scalaz-deriving/tree/master/deriving-plugin/src/test/scala
and then try to find out what is breaking in the plugin itself: https://github.com/scalaz/scalaz-deriving/blob/master/deriving-plugin/src/main/scala/scalaz/plugins/deriving/DerivingPlugin.scala. It might be necessary to introduce a @derivingK or some such with the subtly different behaviour, or a hack like "if the typeclass ends in a capital K then behave differently".

If that all checks out ok, then the bug might be in the blackbox macro ... https://github.com/scalaz/scalaz-deriving/blob/master/deriving-macro/src/main/scala/scalaz/macros/DerivingMacros.scala. You might need to add some flag to indicate kindedness or something, and then you can switch on it in that file. Good luck!

@joroKr21 (Member)

The migration itself was done in #45. See also #37.

See #59 and #60 for follow-up work.
