
[WIP] inserted record can be returned from query #1383

Closed
wants to merge 7 commits into from

Conversation


@rolandjohann rolandjohann commented Mar 13, 2019

Fixes #572

Problem

When inserting records, only one column can be returned via the RETURNING clause. There are several use cases for database-generated values besides IDs, for example timestamps.

Solution

Added returningRecord to QueryDsl including all relevant macro implementations.
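A minimal sketch of the intended usage, with stub types standing in for the quill DSL (ActionReturning and Insert here are simplified stand-ins, not the real signatures):

```scala
case class Person(id: Long, name: String)

// simplified stand-ins for quill's DSL types
class ActionReturning[E, R](val returns: String)

class Insert[E] {
  // existing API: return a single column, e.g. .returning(_.id)
  def returning[R](f: E => R): ActionReturning[E, R] =
    new ActionReturning[E, R]("single column")
  // proposed addition: return the whole inserted record (RETURNING * on Postgres)
  def returningRecord: ActionReturning[E, E] =
    new ActionReturning[E, E]("whole record")
}

object ReturningRecordSketch {
  val insert = new Insert[Person]
  val single = insert.returning(_.id).returns
  val record = insert.returningRecord.returns
}
```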

Notes

The current implementation works with PostgreSQL; it has not been tested with other DBs. This is my first contact with macros, so please review carefully.

Unit tests and README update will be done in the next few days.

Checklist

  • Unit test all changes
  • Update README.md if applicable
  • Add [WIP] to the pull request title if it's work in progress
  • Squash commits that aren't meaningful changes
  • Run sbt scalariformFormat test:scalariformFormat to make sure that the source files are formatted

@getquill/maintainers

@rolandjohann
Author

rolandjohann commented Mar 13, 2019

MySQL doesn't support INSERT INTO .. RETURNING, but it is capable of returning the last inserted ID. Cassandra has no support for that either. I didn't check the other DBs.

The question now is whether we should restrict this feature to Postgres only, and how to implement that.

EDIT: On MySQL, query[_].returning(_.name) where name is not the id leads to a runtime exception, so maybe we can leave the returning clause in quill-core

@deusaquilus
Collaborator

Thanks for the awesome work @rolandjohann. Most databases actually don't support record returns at all. I think Postgres is the exception rather than the norm, so we have to add this case to each DB test. Since returning is not supported for anything other than the ID in most databases, and quill does not intrinsically know which column is the ID, checking whether a column is the right one is not possible. Checking whether it's a whole record and failing appropriately, however, should be.

What I suggest is that you add a Capabilities trait that Parsing expects as a self-type (and that the database contexts implement) which has some simple key/value variables that indicate what each DB can and cannot do (e.g. trait Capabilities { def insertReturn[T <: InsertReturnCapability]: T }, with InsertReturnCapability.ReturnWholeRecord vs ReturnSingleField). Then we can do c.fail in Parsing, which will break during the compile phase when things like a whole-record return are done for databases that do not support it.

Also, instead of having a separate .returningRecord call, could we just have .returning(r => r) and detect that the input and output of .returning is the same in the parser?

@rolandjohann
Author

rolandjohann commented Mar 13, 2019

@deusaquilus great idea defining capabilities, I'll try to implement that.

Regarding the API, I'm not sure either how to define it. In the first version the signature required the type explicitly: .returningRecord[T]. The idea behind that is a little more complex:
Case-class representations of "prototype" entities, which don't exist in the DB yet and for which the DB adds several generated columns (id, created_at, etc.), are slightly misconceived in most (if not all) ORM-like libs/frameworks: the case classes require members to be defined that will be generated by the DB and are thus ignored at insert query submission - or those members are optional, which prevents using them as a public-facing domain model, too.

case class Person(id: Long, name: String, age: Int, createdAt: DateTime)

val personToCreate = Person(
  id = 0L,                   // ignored by query, generated by the DB
  name = "Bob",
  age = 123,
  createdAt = new DateTime() // ignored by query, generated by the DB
)
query[Person].insert(lift(personToCreate))

My current concept involves prototypes of entities, so I can share/unify domain models even outside the context of DAO/Repository and can use them at other layers (HTTP via REST for example):

case class Person(id: Long, name: String, age: Int, createdAt: DateTime)
case class PersonPrototype(name: String, age: Int)

With this DB facing domain model we can implement insert query by prototype and specify return type explicitly to complete entity:

class PersonRepository {
  def create(personPrototype: PersonPrototype): Future[Person] = {
    implicit val m = schemaMeta[PersonPrototype]("person")
    query[PersonPrototype].insert(lift(personPrototype)).returningRecord[Person]
  }
}

Currently I see two possibilities:

  1. io.getquill.dsl.QueryDsl.Insert#returning(r => r), e.g. io.getquill.dsl.QueryDsl.Insert#returning(identity) => this is explicit, but IMHO a little too verbose
  2. overloading io.getquill.dsl.QueryDsl.Insert#returning so we can return a column or an entity:
def returning[R](f: E => R): ActionReturning[E, R]
def returning[R]: ActionReturning[E, R]

This offers flexibility, but adds verbosity because we must specify the return type explicitly even if it is the same as that of query[T].
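The verbosity trade-off can be reproduced with self-contained stub types (hypothetical names, not quill's actual DSL): once returning is overloaded, the parameterless variant needs an explicit type argument, and lambda-parameter inference for the one-argument variant degrades as well.

```scala
case class Employee(id: Long, name: String)

// stub types to show the overload trade-off
class Action[E, R](val kind: String)

class InsertDsl[E] {
  def returning[R](f: E => R): Action[E, R] = new Action("column")
  def returning[R]: Action[E, R] = new Action("record")
}

object OverloadSketch {
  val ins = new InsertDsl[Employee]
  // the parameterless overload requires the type argument explicitly
  val record: Action[Employee, Employee] = ins.returning[Employee]
  // once overloaded, the lambda parameter type must be annotated;
  // ins.returning(_.id) no longer infers ("missing parameter type")
  val column: Action[Employee, Long] = ins.returning((e: Employee) => e.id)
}
```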

What are your thoughts on this?

Collaborator

@deusaquilus deusaquilus left a comment

@rolandjohann Actually, I really like the idea of forcing the user to specify the return type. I think just having insert(foo).returning is very unclear in the first place. What is being returned? Is it the id? Is it the number of records inserted? There's no good reason to assume one over another. Admittedly, returningRecord clears up this confusion a little bit, but the meaning of "record" isn't entirely transparent either. Let's go with the "overloading" option.

@@ -698,6 +698,8 @@ trait Parsing {
Delete(astParser(query))
case q"$action.returning[$r](($alias) => $body)" =>
Returning(astParser(action), identParser(alias), astParser(body))
case q"$action.returningRecord" =>
Collaborator

How about something like this (or some simplified variation of this)?

case q"$action.returning[$r](($alias) => $body)" =>
  val tpe =
    body match {
      case Ident(TermName(n)) if (n == alias.name.toString) => Some(body.tpe)
      case _ => None
    }
  val argTpe =
    alias match {
      case q"$mods val $pat: $tpt = $expr" => Some(tpt.tpe)
      case _ => None
    }

  (tpe, argTpe) match {
    case (Some(a), Some(b)) if (a =:= b) =>
      ReturningRecord(astParser(action))
    case _ =>
      Returning(astParser(action), identParser(alias), astParser(body))
  }

I'd rather not introduce additional constructs to the DSL if it's technically not needed.

Author

What do you mean by the "overloading" option? If I understand the suggested snippet correctly, it checks whether the lambda passed to returning is of type T => T. With that API, users aren't able to specify the return type.

What about introducing .returningAs[R] and not touching the parsing of .returning at all? Introducing an additional API seems to be less complex to implement and more flexible for the user.

Collaborator

@rolandjohann So I think .returning(e => e) should work, otherwise we'll get WTFs from users. It's an expectations thing. It's totally reasonable for a user to ask "So wait, why is this API for whole records, and this one for single fields? That's a totally arbitrary distinction."

Also, are you saying that overloading this method now forces all calls to .returning to specify the generic parameter, including the regular use-case .returning(a => a.id)? Is that the case? That would definitely call for introducing returningAs!

Why can't we do something like this (see the bottom for the addition)?

case q"$action.returning[$r](($alias) => $body)" =>
  val tpe =
    body match {
      case Ident(TermName(n)) if (n == alias.name.toString) => Some(body.tpe)
      case _ => None
    }
  val argTpe =
    alias match {
      case q"$mods val $pat: $tpt = $expr" => Some(tpt.tpe)
      case _ => None
    }

  (tpe, argTpe) match {
    case (Some(a), Some(b)) if (a =:= b) =>
      ReturningRecord(astParser(action))
    case _ =>
      Returning(astParser(action), identParser(alias), astParser(body))
  }

// Option A (if returning can be overloaded without verbosity increase)
//case q"$action.returning[$r](...$paramss)" if (paramss.length == 0) =>

// Option B (if returningAs needs to be introduced)
case q"$action.returningAs[$r](...$paramss)" =>
  ReturningRecord(astParser(action))

If there's something I missed here or if this is not as simple as I think, let me know.

Author

Sorry, just recognized that I missed a point you mentioned earlier.

We can go with overloading; for this we must introduce the additional signature in QueryDsl. Just using the old one and omitting the lambda which normally defines the column to return yields a function type:

val q1: (Person => Person) => ctx.ActionReturning[Person, Person] = query[Person].insert(lift(Person(0L, "Roland", 29))).returning[Person]
val q2: ctx.ActionReturning[Person, Long] = query[Person].insert(lift(Person(0L, "Roland", 29))).returning(_.id)
scala> quote(query[Person].insert(lift(Person(0L, "Roland", 29))).returning[Person])
<console>:18: error: missing argument list for method returning in trait Insert
Unapplied methods are only converted to functions when a function type is expected.
You can make this conversion explicit by writing `returning _` or `returning(_)` instead of `returning`.
       quote(query[Person].insert(lift(Person(0L, "Roland", 29))).returning[Person])

Collaborator

Yeah. That's what I was thinking.

Author

And we must specify the type when calling it to get the entity, too.

.returning[Person]
.returning(_.id)

@@ -205,6 +205,7 @@ class MirrorIdiom extends Idiom {
case Insert(query, assignments) => stmt"${query.token}.insert(${assignments.token})"
case Delete(query) => stmt"${query.token}.delete"
case Returning(query, alias, body) => stmt"${query.token}.returning((${alias.token}) => ${body.token})"
case ReturningRecord(query) => stmt"${query.token}.returningRecord"
Author

do we need to transport the type here as well?

Collaborator

Yes

@rolandjohann
Author

What I suggest is that you add a Capabilities trait that Parsing expects as a self-type (and that the database contexts implement) which has some simple key/value variables that indicate what each DB can and cannot do (e.g. trait Capabilities { def insertReturn[T <: InsertReturnCapability]: T }, with InsertReturnCapability.ReturnWholeRecord vs ReturnSingleField). Then we can do c.fail in Parsing, which will break during the compile phase when things like a whole-record return are done for databases that do not support it.

I tried a naive approach: my first thought was to define the additional method only for Postgres contexts with an implicit class, so users can't use this feature on non-Postgres contexts and get errors at compile time. But the PoC implementation crashed at compile time:

Tree 'ctx.InsertOps[ctx.Insert[io.phenetic.quill.Person]]((ctx.unquote[(ctx.EntityQuery[io.phenetic.quill.Person], io.phenetic.quill.Person) => ctx.Insert[io.phenetic.quill.Person]](personInsertMeta.expand).apply((ctx.unquote[ctx.EntityQuery[io.phenetic.quill.Person]]({
[error]   final class $anon extends AnyRef with ctx.SchemaMeta[io.phenetic.quill.Person] {
[error]     def <init>(): <$anon: ctx.SchemaMeta[io.phenetic.quill.Person]> = {
[error]       $anon.super.<init>();
[error]       ()
[error]     };
[error]     private[this] val _entity: ctx.Quoted[ctx.EntityQuery[io.phenetic.quill.Person]]{def quoted: io.getquill.ast.Entity; def ast: io.getquill.ast.Entity; def id782076222(): Unit; val liftings: Object} = {
[error]       final class $anon extends AnyRef with ctx.Quoted[ctx.EntityQuery[io.phenetic.quill.Person]] {
[error]         def <init>(): <$anon: ctx.Quoted[ctx.EntityQuery[io.phenetic.quill.Person]]> = {
[error]           $anon.super.<init>();
[error]           ()
[error]         };
[error]         import scala.language.reflectiveCalls;
[error]         scala.collection.immutable.Nil.asInstanceOf[AnyRef{def size: Unit}].size;
[error]         @io.getquill.quotation.QuotedAst(io.getquill.ast.Entity.apply("Person", scala.collection.immutable.Nil)) def quoted: io.getquill.ast.Entity = $anon.this.ast;
[error]         override def ast: io.getquill.ast.Entity = io.getquill.ast.Entity.apply("Person", scala.collection.immutable.Nil);
[error]         def id782076222(): Unit = ();
[error]         private[this] val liftings: Object = new scala.AnyRef();
[error]         <stable> <accessor> def liftings: Object = $anon.this.liftings
[error]       };
[error]       new $anon()
[error]     };
[error]     def entity: ctx.Quoted[ctx.EntityQuery[io.phenetic.quill.Person]]{def quoted: io.getquill.ast.Entity; def ast: io.getquill.ast.Entity; def id782076222(): Unit; val liftings: Object} = $anon.this._entity
[error]   };
[error]   new $anon()
[error] }.entity): ctx.EntityQuery[io.phenetic.quill.Person]), (ctx.liftCaseClass[io.phenetic.quill.Person](Person.apply(0L, "Roland", 29)): io.phenetic.quill.Person)): ctx.Insert[io.phenetic.quill.Person]))' can't be parsed to 'Ast'
[error]     val insertQ = quote {
[error]                         ^

As I'm not familiar with macros, I can't tell whether that approach can even work.

@deusaquilus
Collaborator

Yeah, defining this kind of functionality outside of the macro parser is very difficult.

@rolandjohann
Author

rolandjohann commented Mar 17, 2019

@deusaquilus Need your help/assistance again. The DB contexts aren't directly within the same class hierarchy as Parsing. The "highest" superclasses of Parsing are ActionMacro and QueryMacro, which aren't in the class hierarchy of Context and its implementations but are invoked from Context via macro.

Things I tried to get InsertReturnCapability into Parsing:

  1. pass the insert return capability via an annotation from an abstract method defined at Context
class ParserMacroSettings(insertReturn: InsertReturnCapability) extends StaticAnnotation

trait Capabilities {
  def insertReturn: InsertReturnCapability
}

trait Context[Idiom <: io.getquill.idiom.Idiom, Naming <: NamingStrategy] extends Capabilities {
  // for testing purposes to prevent having to implement that at all classes extending Context
  override def insertReturn: InsertReturnCapability = ReturnSingleField

  @ParserMacroSettings(insertReturn) def run[T](quoted: Quoted[ActionReturning[_, T]]): Result[RunActionReturningResult[T]] = macro ActionMacro.runActionReturning[T]
}

Now in the macro I have the AST of the annotation body with

c.macroApplication
      .children
      .head
      .symbol
      .annotations
      .find(_.tree.tpe <:< typeOf[ParserMacroSettings])
      .map(annotation => showRaw(annotation.tree.children.tail.head))

which obviously is just Select(This(TypeName("Context")), TermName("insertReturn")), but we need the return value of that expression.

  2. Evaluating code:
    c.eval(c.Expr[InsertReturnCapability](c.untypecheck(q"${c.prefix}.insertReturn"))), which leads to
java.lang.IllegalArgumentException: Could not find proxy for val ctx: io.phenetic.quill.MyDatabaseContext.type in List(value ctx, method main, object Application, package quill, package phenetic, package io, package <root>) (currentOwner= method wrapper )

So basically I'm stuck here

@rolandjohann
Author

@deusaquilus I suggest finalizing this particular feature and doing the restriction to Postgres afterwards. In that step the MySQL capability can be handled, too: IMHO there the API should be restricted to returning the ID, and it could even provide an API that points to that fact: returningId.

@deusaquilus
Collaborator

deusaquilus commented Mar 20, 2019 via email

@rolandjohann
Author

Hi @deusaquilus thanks for the information

@deusaquilus
Collaborator

Sorry but I can't do this tonight. Banged my head on the codegen commit (#1396) and need to collapse now. Will have a look at this the coming day/evening.

@deusaquilus
Collaborator

deusaquilus commented Mar 31, 2019

Okay. The capabilities have to be read at compile time as opposed to run time, so you cannot encode them into a value; you need to encode them into a type. You can then read that back out using .returnType on a method whose return type you have set to that type:

sealed trait InsertReturnCapability
trait ReturnSingleField extends InsertReturnCapability
trait ReturnMultipleField extends InsertReturnCapability
case object ReturnSingleField extends ReturnSingleField
case object ReturnMultipleField extends ReturnMultipleField

trait Context[Idiom <: io.getquill.idiom.Idiom, Naming <: NamingStrategy] {
  type InsertReturnCapabilityType <: InsertReturnCapability
  def insertReturnCapability: InsertReturnCapabilityType
}

// Then in a particular sub-context you do:
trait PostgresJdbcContext[Idiom <: io.getquill.idiom.Idiom, Naming <: NamingStrategy] {
  override type InsertReturnCapabilityType = ReturnMultipleField
  override def insertReturnCapability: ReturnMultipleField = ReturnMultipleField
}

Then in your macro code do this to check the actual return type that the context has.

val capabilityType =
  c.prefix.tree.tpe.members
    .filter(_.isMethod)
    .filter(_.name == TermName("insertReturnCapability"))
    .head
    .asMethod
    .returnType

// check it like this:
if (capabilityType =:= typeOf[ReturnMultipleField])
  doOneThing()
else
  doAnotherThing()

In the above code I didn't abstract this out into a Capabilities trait, but you should definitely try to do that:

trait ReturnMultipleCapability {
  override type InsertReturnCapabilityType = ReturnMultipleField
  override def insertReturnCapability: ReturnMultipleField
}
// and the same for ReturnSingleCapability
// Then you tack them both onto Context

This compile-time stuff is a little mind-bending sometimes. Thanks for the good work!
Can you try this out?
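The same trick can be exercised as a runnable analogue with runtime reflection (the macro side would read c.prefix.tree.tpe instead of typeOf; PostgresLikeContext and the helper names are illustrative, not quill's):

```scala
import scala.reflect.runtime.universe._

object CapabilityCheck {
  sealed trait InsertReturnCapability
  trait ReturnSingleField extends InsertReturnCapability
  trait ReturnMultipleField extends InsertReturnCapability

  trait Context { def insertReturnCapability: InsertReturnCapability }
  trait PostgresLikeContext extends Context {
    // the narrowed return type carries the capability at the type level
    override def insertReturnCapability: ReturnMultipleField = ???
  }

  // runtime-reflection mirror of the macro-side .returnType trick
  def capabilityOf[C: TypeTag]: Type =
    typeOf[C].members
      .filter(_.isMethod)
      .filter(_.name == TermName("insertReturnCapability"))
      .head.asMethod.returnType

  def canReturnRecord[C: TypeTag]: Boolean =
    capabilityOf[C] =:= typeOf[ReturnMultipleField]
}
```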

@rolandjohann
Author

Thanks for sharing. I'll try that within the next few days

@deusaquilus
Collaborator

deusaquilus commented Apr 7, 2019

@rolandjohann Any luck? I’m going to be away on vacation until next week so my connectivity will be limited

@rolandjohann
Author

rolandjohann commented Apr 7, 2019 via email

@rolandjohann
Author

rolandjohann commented Apr 7, 2019

@deusaquilus got it working, but I'm not happy with the current situation: MirrorContexts can be used with the Postgres idiom, but the capability is bound to Context, so we can't use this feature when mirroring.

Have a nice time on vacation!

@rolandjohann
Author

As far as I can see we can use Idiom to check for the returning capability - but that seems far more complicated to implement

@deusaquilus
Collaborator

deusaquilus commented Apr 12, 2019

Shoot! You're right. I know this should be possible, but I don't want to make you wait any longer. Please PR the change without the capabilities checking and file an issue that we need a capabilities-checking feature (in the future) in the ___Dialect specifically, to prevent other databases from producing statements like ... returning * that they can't actually execute. Please add a note to the documentation that this feature is only supported with Postgres and will cause an invalid query with other databases. I will approve the PR and try to work on capabilities in the coming weeks.

@deusaquilus
Collaborator

deusaquilus commented Apr 16, 2019

Okay. Here's how you do it from Dialect

sealed trait InsertReturnCapability
trait ReturnSingleField extends InsertReturnCapability
trait ReturnMultipleField extends InsertReturnCapability

trait Capabilities {
  type ReturnAfterInsert <: InsertReturnCapability
}
trait CanReturnRecordAfterInsert extends Capabilities {
   override type ReturnAfterInsert = ReturnMultipleField
}

trait PostgresDialect
  extends SqlIdiom
  with QuestionMarkBindVariables
  with ConcatSupport
  with CanReturnRecordAfterInsert /* Add this last part here */ {
  ...
}

Here's how you then check it in the macro:

// c.prefix.tree.tpe.typeArgs(0) should be the Dialect e.g. PostgresDialect
c.prefix.tree.tpe.typeArgs(0).members.find {
  case ts: TypeSymbol if ts.asType.typeSignature =:= typeOf[ReturnMultipleField] => true
  case _ => false
}

I haven't tried it outside of my mock example yet, but I'm going to. Want to give this a shot?
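A runnable sketch of the dialect-side encoding: with the capability expressed as a marker mixin, the member search above reduces to a plain subtype test (PostgresLikeDialect and MysqlLikeDialect are illustrative stand-ins, not quill's dialects):

```scala
import scala.reflect.runtime.universe._

object DialectCapability {
  sealed trait InsertReturnCapability
  trait ReturnSingleField extends InsertReturnCapability
  trait ReturnMultipleField extends InsertReturnCapability

  trait Capabilities { type ReturnAfterInsert <: InsertReturnCapability }
  trait CanReturnRecordAfterInsert extends Capabilities {
    override type ReturnAfterInsert = ReturnMultipleField
  }

  // illustrative dialects, not quill's real ones
  trait SqlIdiomLike
  trait PostgresLikeDialect extends SqlIdiomLike with CanReturnRecordAfterInsert
  trait MysqlLikeDialect extends SqlIdiomLike

  // the macro obtains the dialect from c.prefix.tree.tpe.typeArgs(0);
  // here it arrives as a type parameter instead
  def canReturnRecord[D: TypeTag]: Boolean =
    typeOf[D] <:< typeOf[CanReturnRecordAfterInsert]
}
```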

@rolandjohann
Author

@deusaquilus thanks for sharing the solution, I'll try that (hopefully) at the weekend. Time is scarce currently.

@deusaquilus
Collaborator

@rolandjohann How is it going? Did this approach work?

@rolandjohann
Author

@deusaquilus sorry for the delay. Yes, your approach works perfectly.

But a quick look at the CI tests shows plenty of failing tests because of a non-provided type when returning a single field. I tested the change as well by setting up a clean project with a snapshot dependency on locally published quill libs, and there I have no problems at all. Even IntelliJ, which is very restrictive in hinting Scala code, doesn't complain about non-provided types. Can it be that this is caused by some specific compiler flag?

BTW: If I remember correctly you asked for type specification when overloading...

@deusaquilus deusaquilus mentioned this pull request May 31, 2019
@deusaquilus
Collaborator

@rolandjohann It looks like we need to do returningRecord after all. I opened #1455 to test this out. Aside from that though, is the work complete? I don't see anything with RETURNING in JdbcContext.

@rolandjohann
Author

@deusaquilus as we tied the capability to io.getquill.PostgresDialect, you won't find it in JdbcContext. The check for the capability in the parser is Option-based, so I didn't define the capability elsewhere. Besides the naming (and the tests), this feature works. I tested it with AsyncMysqlContext, AsyncPostgresContext and MirrorContext as well.

There I didn't face the issues that made the tests crash. Do you know the root cause of this?

@deusaquilus
Collaborator

deusaquilus commented Jun 20, 2019

@rolandjohann Are you sure there's nothing missing here? I just tried the following with PostgresJdbcContext (the ProductJdbcSpec test under io.getquill.context.jdbc.postgres).

val prd = Product(0L, "test1", 1L)
val insertedProduct = testContext.run {
  product.insert(_.sku -> lift(prd.sku), _.description -> lift(prd.description)).returning[Product]
}
val returnedProduct = testContext.run(productById(lift(insertedProduct.id))).head
returnedProduct mustEqual insertedProduct

... and I got:

Bad value for type long : test1
org.postgresql.util.PSQLException: Bad value for type long : test1
	at org.postgresql.jdbc.PgResultSet.toLong(PgResultSet.java:2873)

This does not produce a query that has a returning * clause:

Information:(43, 45) INSERT INTO Product (sku,description) VALUES (?, ?)
      val insertedProduct = testContext.run {

@deusaquilus deusaquilus mentioned this pull request Jun 21, 2019
@deusaquilus
Collaborator

@rolandjohann I got this to compile and the tests to work, but it doesn't work with quill-jdbc. Could you please have a look at my comment above?

@deusaquilus
Collaborator

Okay. I managed to compile and test this change but I cannot release it because there are a couple of problems and open questions.

@rolandjohann Taking this functionality and bolting on the Query decoder is a brilliant first step, but there are a couple of other things that have to be done before the feature is complete.
Firstly, this needs to be more thoroughly tested. There are no quill-jdbc tests that do returning[T], yet this is the most essential thing to do. If you look at my code above, the problem is that doing prepare(conn.prepareStatement(sql, Statement.RETURN_GENERATED_KEYS)) or prepare(conn.prepareStatement(sql, Array("*"))) is highly problematic, because you don't have control over the order of the columns that come out of the returned ResultSet. In my tuple example above, the decoders are the following:

new Product(
  implicitly[Decoder[Long]](longDecoder).apply(0, row),
  implicitly[Decoder[String]](stringDecoder).apply(1, row),
  implicitly[Decoder[Long]](longDecoder).apply(2, row)
)

Yet because the description field comes first in the query, the error Bad value for type long : test1 will occur, as I mentioned above.

In order to resolve this issue, we need to answer an essential question. The .returning API was only ever designed to return one column, which is then passed into JDBC as indicated by the original Returning AST element, which has a single property AST field. The second we want to extend this into more kinds of returns, there is a serious problem, because the API returning(f: E => R) gives you the false impression that R can be arbitrary, whereas the reality is far from that. In order to extend this API we essentially have two options for returning *.

  1. Make prepare(conn.prepareStatement, ...) return all columns for the Insert[E] row-type and splice f into the final output for the result-set extractor. This is good because it would enable users to specify a nearly arbitrary f, e.g. doing something like .returning(f => (fun(f.a), f.b + 1)) would be possible. The drawback is that we are forced to return every single column of E, which for large objects may be a serious problem. Also, using this approach, the decoder that needs to be used in the result-set extractor is not the decoder of R but rather the decoder of E, and the returningExtractor in ActionMacro needs to be changed accordingly.

  2. Allow only a subset of things that f can be. I think a sane subset would be: a single field f => f.id, a tuple of fields f => (f.a, f.b), and a case class of fields f => C(f.a, f.b). This would mean that additional things need to happen in Parsing, which would parse the single-field/tuple/case-class, figure out which fields .returning actually needs, and then put these into the ReturningRecord AST (or possibly the Returning AST, depending on how we want to do it).

@rolandjohann @fwbrasil and @getquill/maintainers. I'd love to hear your thoughts on how this should be done.

@deusaquilus
Collaborator

It's interesting to note that actually, the reason why my query fails above is that the columns arrive in the result-set in the wrong order and break in the decoders. If we were to do decoding based on column name as opposed to position, this problem would not happen in the first place and this implementation would actually work!
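The order sensitivity can be simulated without a database. A hypothetical row type (a stand-in for java.sql.ResultSet, not quill code) shows why positional decoding breaks when the columns arrive in table order, while name-based lookup does not:

```scala
import scala.collection.immutable.ListMap
import scala.util.Try

object DecodeDemo {
  case class Product(id: Long, description: String, sku: Long)

  // hypothetical stand-in for a JDBC ResultSet row
  final case class Row(cols: ListMap[String, Any]) {
    def byIndex(i: Int): Any = cols.values.toIndexedSeq(i) // positional
    def byName(n: String): Any = cols(n)                   // by column name
  }

  // RETURNING * delivers columns in table order, which need not match
  // the order the decoders expect: (id, description, sku)
  val tableOrderRow = Row(ListMap("description" -> "test1", "id" -> 1L, "sku" -> 42L))

  // positional decode pinned to (id, description, sku); on the row above it
  // hits the analogue of "Bad value for type long : test1"
  def positional(r: Row): Try[Product] = Try(
    Product(
      r.byIndex(0).asInstanceOf[Long],
      r.byIndex(1).asInstanceOf[String],
      r.byIndex(2).asInstanceOf[Long]))

  // name-based decode is immune to column order
  def byName(r: Row): Product =
    Product(
      r.byName("id").asInstanceOf[Long],
      r.byName("description").asInstanceOf[String],
      r.byName("sku").asInstanceOf[Long])
}
```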

@deusaquilus
Collaborator

deusaquilus commented Jun 23, 2019

I have a possible solution #3 which I like the most:
3. Take the f in returning(f: E => R) and run it through the parser so that you get some f_AST. Then attach it as Map(Entity[E], f_AST) (just be sure to substitute the original Ident for the E in your insert query with whatever exists in f_AST using BetaReduction). Then run this through the normalizer to get SqlQuery(SqlNormalize(Map(Entity[E], f_AST))), which should give you a FlattenSqlQuery. Then take this fsq: FlattenSqlQuery's select and run each element through the tokenizer (you should be able to just do fsq.select.map(_.token)). This will give you the list of actual strings you need to put into the columnArray in prepare(conn.prepareStatement(sql, columnArray)). That way, you should be able to do things like .returning(r => (r.a + 1, r.b)) and even .returning(r => (someUDF(r.a), r.b)), and it will splice into the query as RETURNING someUDF(r.a), r.b

@deusaquilus
Collaborator

Scratch that. Here's what I think it needs to look like:

object ExpandReturning {

  def apply(returning: Returning, idiom: SqlIdiom)(implicit naming: NamingStrategy): List[String] = {
    val Returning(_, alias, property) = returning

    // Ident("j"), Tuple(List(Property(Ident("j"), "name"), BinaryOperation(Property(Ident("j"), "age"), +, Constant(1))))
    // => Tuple(List(Constant("name"), BinaryOperation(Ident("age"), +, Constant(1))))
    val dePropertized =
      Transform(property) {
        case Property(`alias`, rest) => Ident(rest)
      }

    // Tuple(List(Constant("name"), BinaryOperation(Ident("age"), +, Constant(1))))
    // => List(Constant("name"), BinaryOperation(Ident("age"), +, Constant(1)))
    val deTuplified = dePropertized match {
      case Tuple(values) => values
      case CaseClass(values) => values.map(_._2)
      case other => List(other)
    }

    deTuplified
      .map(field => idiom.defaultTokenizer.token(field))
      .map(_.toString)
  }
}
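Downstream of ExpandReturning, the token strings still have to reach the driver. A hedged sketch (the helper name is mine, not the PR's): for Postgres the expanded column expressions can simply be spliced into a RETURNING clause, or alternatively passed to JDBC's prepareStatement(sql, columnNames) overload.

```scala
object ReturningSql {
  // splice the expanded column expressions into a Postgres RETURNING clause
  def withReturning(sql: String, columns: List[String]): String =
    if (columns.isEmpty) sql
    else s"$sql RETURNING ${columns.mkString(", ")}"
}
```

For example, withReturning("INSERT INTO person (name) VALUES (?)", List("id", "created_at")) yields INSERT INTO person (name) VALUES (?) RETURNING id, created_at.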

@deusaquilus
Collaborator

I'm continuing this work in #1489. Bunch of stuff left to do but progress is being made.

@rolandjohann
Author

rolandjohann commented Jun 29, 2019

@deusaquilus sorry for the disruption on this implementation. Unfortunately I can't contribute to this feature as much as I would like to.

My experiments and implementation used INSERT INTO ... RETURNING *, assuming the query uses .returning[Foo] with Foo having fields in exactly the same order as the table definition.

With your suggested solution, will it still be possible to have an API like .returning[Foo], so that we don't have to implement row deserialization manually as .returning(r => Foo(r.id, r.name, ...))?

@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.

@phenetic phenetic closed this by deleting the head repository Jan 7, 2023
Successfully merging this pull request may close these issues.

UPDATE ... RETURNING *