Codegen : output several files instead of one big Tables.scala #906

Closed
olivergg opened this Issue Jul 12, 2014 · 30 comments


I think it would be great if there was a way to output the generated classes into several files (one per class) instead of having one big Tables.scala file.

nafg (Contributor) commented Jul 16, 2014

+1
On Jul 12, 2014 2:31 PM, "olivergg" notifications@github.com wrote:

I think it would be great if there was a way to output the generated
classes into several files (one per class) instead of having one big
Tables.scala file.

Reply to this email directly or view it on GitHub
#906.


olivergg commented Sep 8, 2014

Well, it seems more complicated than I originally thought, because of the lazy vals and implicit dependencies between all the tables.

But it should just be a question of rewriting the code method in AbstractSourceCodeGenerator.scala

    "import scala.slick.model.ForeignKeyAction\n" +
    ( if(tables.exists(_.hlistEnabled)){
        "import scala.slick.collection.heterogenous._\n"+
        "import scala.slick.collection.heterogenous.syntax._\n"
      } else ""
    ) +
    ( if(tables.exists(_.PlainSqlMapper.enabled)){
        "// NOTE: GetResult mappers for plain SQL are only generated for tables where Slick knows how to map the types of all columns.\n"+
        "import scala.slick.jdbc.{GetResult => GR}\n"
      } else ""
    ) +
    "\n/** DDL for all tables. Call .create to execute. */\nlazy val ddl = " + tables.map(_.TableValue.name + ".ddl").mkString(" ++ ") +
    "\n\n" +
    tables.map(_.code.mkString("\n")).mkString("\n\n")

into something like

    tables.map { table =>
      val imports =
        "import scala.slick.model.ForeignKeyAction\n" +
        ( if(table.hlistEnabled){
            "import scala.slick.collection.heterogenous._\n"+
            "import scala.slick.collection.heterogenous.syntax._\n"
          } else ""
        ) +
        ( if(table.PlainSqlMapper.enabled){
            "// NOTE: GetResult mappers for plain SQL are only generated for tables where Slick knows how to map the types of all columns.\n"+
            "import scala.slick.jdbc.{GetResult => GR}\n"
          } else ""
        )
      // prepend the imports so they are not silently discarded
      (table.TableValue.name + "Table.scala", imports + table.code.mkString("\n"))
    }.toMap

and then iterate over the resulting map to output several files.

The main problem is that all the lazy vals (the TableQuery values that are, I guess, the main purpose of the generated code) should be placed in an equivalent of the existing Tables.scala.

For example, assuming the following basic schema (in a postgres database)

          ADDRESS
---------+-------------------+---------------
 id      | integer           | non NULL
 street  | character varying | 
 city    | character varying | 

and

           PERSON
------------+-------------------+---------------
 id         | integer           | non NULL
 name       | character varying | 
 age        | integer           | 
 address_id | integer           | 

We would have something like this generated in a com.xxx package:

AddressTable.scala

object AddressTable {
 /** Entity class storing rows of table Address
   *  @param id Database column id DBType(int4), PrimaryKey
   *  @param street Database column street DBType(varchar), Length(2147483647,true), Default(None)
   *  @param city Database column city DBType(varchar), Length(2147483647,true), Default(None) */
  case class AddressRow(id: Int, street: Option[String] = None, city: Option[String] = None)
  /** GetResult implicit for fetching AddressRow objects using plain SQL queries */
  implicit def GetResultAddressRow(implicit e0: GR[Int], e1: GR[Option[String]]): GR[AddressRow] = GR{
    prs => import prs._
    AddressRow.tupled((<<[Int], <<?[String], <<?[String]))
  }
  /** Table description of table address. Objects of this class serve as prototypes for rows in queries. */
  class Address(_tableTag: Tag) extends Table[AddressRow](_tableTag, "address") {
    def * = (id, street, city) <> (AddressRow.tupled, AddressRow.unapply)
    /** Maps whole row to an option. Useful for outer joins. */
    def ? = (id.?, street, city).shaped.<>({r=>import r._; _1.map(_=> AddressRow.tupled((_1.get, _2, _3)))}, (_:Any) =>  throw new Exception("Inserting into ? projection not supported."))

    /** Database column id DBType(int4), PrimaryKey */
    val id: Column[Int] = column[Int]("id", O.PrimaryKey)
    /** Database column street DBType(varchar), Length(2147483647,true), Default(None) */
    val street: Column[Option[String]] = column[Option[String]]("street", O.Length(2147483647,varying=true), O.Default(None))
    /** Database column city DBType(varchar), Length(2147483647,true), Default(None) */
    val city: Column[Option[String]] = column[Option[String]]("city", O.Length(2147483647,varying=true), O.Default(None))
  }
}

PersonTable.scala

object PersonTable {
/** Entity class storing rows of table Person
   *  @param id Database column id DBType(int4), PrimaryKey
   *  @param name Database column name DBType(varchar), Length(2147483647,true), Default(None)
   *  @param age Database column age DBType(int4), Default(None)
   *  @param addressId Database column address_id DBType(int4), Default(None) */
  case class PersonRow(id: Int, name: Option[String] = None, age: Option[Int] = None, addressId: Option[Int] = None)
  /** GetResult implicit for fetching PersonRow objects using plain SQL queries */
  implicit def GetResultPersonRow(implicit e0: GR[Int], e1: GR[Option[String]], e2: GR[Option[Int]]): GR[PersonRow] = GR{
    prs => import prs._
    PersonRow.tupled((<<[Int], <<?[String], <<?[Int], <<?[Int]))
  }
  /** Table description of table person. Objects of this class serve as prototypes for rows in queries. */
  class Person(_tableTag: Tag) extends Table[PersonRow](_tableTag, "person") {
    def * = (id, name, age, addressId) <> (PersonRow.tupled, PersonRow.unapply)
    /** Maps whole row to an option. Useful for outer joins. */
    def ? = (id.?, name, age, addressId).shaped.<>({r=>import r._; _1.map(_=> PersonRow.tupled((_1.get, _2, _3, _4)))}, (_:Any) =>  throw new Exception("Inserting into ? projection not supported."))

    /** Database column id DBType(int4), PrimaryKey */
    val id: Column[Int] = column[Int]("id", O.PrimaryKey)
    /** Database column name DBType(varchar), Length(2147483647,true), Default(None) */
    val name: Column[Option[String]] = column[Option[String]]("name", O.Length(2147483647,varying=true), O.Default(None))
    /** Database column age DBType(int4), Default(None) */
    val age: Column[Option[Int]] = column[Option[Int]]("age", O.Default(None))
    /** Database column address_id DBType(int4), Default(None) */
    val addressId: Column[Option[Int]] = column[Option[Int]]("address_id", O.Default(None))

    /** Foreign key referencing Address (database name person_address_id_fkey) */
    lazy val addressFk = foreignKey("person_address_id_fkey", addressId, Address)(r => r.id, onUpdate=ForeignKeyAction.NoAction, onDelete=ForeignKeyAction.NoAction)
  }
}

and the Tables.scala file :

package com.xxx
// AUTO-GENERATED Slick data model
/** Stand-alone Slick data model for immediate use */
object Tables extends {
  val profile = scala.slick.driver.PostgresDriver
} with Tables

/** Slick data model trait for extension, choice of backend or usage in the cake pattern. (Make sure to initialize this late.) */
trait Tables {
  val profile: scala.slick.driver.JdbcProfile
  import profile.simple._
  import scala.slick.model.ForeignKeyAction
  // NOTE: GetResult mappers for plain SQL are only generated for tables where Slick knows how to map the types of all columns.
  import scala.slick.jdbc.{GetResult => GR}

  /** DDL for all tables. Call .create to execute. */
  lazy val ddl = Address.ddl ++ Person.ddl

  /** Collection-like TableQuery object for table Address */
  lazy val Address = new TableQuery(tag => new Address(tag))

    /** Collection-like TableQuery object for table Person */
  lazy val Person = new TableQuery(tag => new Person(tag))
}

I have yet to find a way to make this work, especially the cyclic dependencies: each XXXTable depends on the Tables.scala file (to reference any other lazy val from it), and Tables.scala depends on all the XXXTables.


cvogt (Member) commented Sep 8, 2014
One way to keep things similar to how they are, but split them into separate files and resolve the cyclic dependencies, is self types:

trait AddressTable{
  self: Tables =>
  ...
}
trait PersonTable{
  self: Tables =>
  ...
}
trait Tables extends AddressTable with PersonTable
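
As a plain-Scala illustration (Slick types stubbed out with a simple case class, all names hypothetical), the self-type pattern lets per-table traits in separate files reference each other's lazy vals, while only Tables mixes them together:

```scala
// Stand-in for slick.lifted.TableQuery, just to keep the sketch self-contained.
case class TableQuery(tableName: String)

// AddressTable.scala
trait AddressTable { self: Tables =>
  lazy val Address = TableQuery("address")
}

// PersonTable.scala
trait PersonTable { self: Tables =>
  lazy val Person = TableQuery("person")
  // Address is visible here because the self-type guarantees this trait
  // is only ever mixed into Tables, which also extends AddressTable.
  lazy val personAddressFk = s"fk: ${Person.tableName} -> ${Address.tableName}"
}

// Tables.scala stitches the per-file traits back together.
trait Tables extends AddressTable with PersonTable
object Tables extends Tables
```

The trade-off, discussed later in the thread, is that every per-table trait depends on the whole Tables trait.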

@cvogt cvogt added this to the 2.2.0 milestone Sep 8, 2014

@cvogt cvogt self-assigned this Sep 8, 2014

olivergg commented Sep 8, 2014

I did not know about self types; they seem an elegant way to solve this. Thanks.


cvogt (Member) commented Sep 8, 2014

Willing to contribute a patch for 2.2 :)?


olivergg commented Sep 8, 2014

Yep, sure. For now, thanks to your solution, I've managed to write the final desired output files by hand (and they compile).
The only remaining things to do are:

  • create a new method in AbstractSourceCodeGenerator that writes to a Map[String,String] instead of one big String (a new method called codePerFile, for example).
  • change OutputHelpers as well, adding a new method that writes the multiple files to an output folder.
  • clean up and prepare a tmp branch.

I'll let you know here about my progress.
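
A minimal sketch of those two methods (the names codePerFile and writeToFolder come from the list above, but the table model is stubbed here, so this is an assumption about the eventual shape rather than the actual generator API):

```scala
import java.nio.charset.StandardCharsets
import java.nio.file.{Files, Paths}

object SplitOutput {
  // Stand-in for the generator's table abstraction.
  case class Table(name: String, code: String)

  // Would live in AbstractSourceCodeGenerator: one (file name -> source) entry per table.
  def codePerFile(tables: Seq[Table]): Map[String, String] =
    tables.map(t => (t.name + "Table.scala", t.code)).toMap

  // Would live in OutputHelpers: dump the map into an output folder.
  def writeToFolder(folder: String, files: Map[String, String]): Unit = {
    Files.createDirectories(Paths.get(folder))
    files.foreach { case (fileName, content) =>
      Files.write(Paths.get(folder, fileName), content.getBytes(StandardCharsets.UTF_8))
    }
  }
}
```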


olivergg commented Sep 10, 2014

@cvogt: I have a working patch in a local branch of my own. How should I proceed? Thanks.

I'm also trying to figure out how to write a JUnit test for this use case. I still don't quite understand how the codegen unit tests work.


cvogt (Member) commented Sep 10, 2014

Can you fork the project, push the branch to your fork, then open a PR against our repo here? CodeGeneratorTest.scala contains customized code generators; you can add your own. GeneratedCodeTest.scala runs tests against generated code.

Run the tests using test-only -- *GeneratedCodeTest*


olivergg commented Sep 10, 2014

I've just forked the project and pushed my local branch. But shouldn't I write a proper JUnit test before creating a PR? The commit is olivergg@bf35cdc


cvogt (Member) commented Sep 10, 2014

A PR is also for code review. We need a test before it gets merged. Any changes you push to the same branch after opening the PR will show up in the PR as well, since a PR is synced with the branch it is opened from. So feel free to open a PR now; it makes it easier to track comments on it.

Are you clear about how to write the test?


olivergg commented Sep 10, 2014

Done. I'll leave my comments there from now on.


mohittt8 commented
+1

satendrakumar (Contributor) commented Dec 23, 2014

+1


olivergg commented Dec 26, 2014

BTW, the PR is there: #987
I don't have time right now to finish the empty new unit test (it's not much work, though).

For those wondering whether it would improve incremental compilation speed (as I first hoped): it is more complicated than that. The implementation was easily done using self types (thanks to @cvogt), but it solves the circular-dependency problem by making each trait depend on (basically) every other trait, so one cannot leverage incremental compilation. This can become problematic with hundreds of tables (around 60 seconds of recompilation time for 143 tables, last time I tested it).


nafg (Contributor) commented Dec 26, 2014

You could write some algorithm to generate the dependency graph...
Or you could break it into 3 layers: case classes, abstract declarations, and implementations...

On Fri, Dec 26, 2014, 6:55 AM olivergg notifications@github.com wrote:

BTW, the PR is there #987 #987
I don't have time right now to finish the empty new unit test (it's not
much work though).

For those wondering whether it would improve incremental compilation speed
(as I've first hoped), it is more complicated than that....the
implementation has been easily done using self types (thanks to @cvogt
https://github.com/cvogt), but it solves the circular dependencies
problem by making each trait depends on every other traits (basically)...so
one cannot leverage incremental compilation. This can become problematic
with hundred of tables (around 60 sec of recompilation time for 143 tables
last time I've tested it).

Reply to this email directly or view it on GitHub
#906 (comment).
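
The dependency-graph idea above could start from a topological sort over the foreign-key edges. A rough sketch (plain strings stand in for table names; nothing here is from the actual codegen API), where a None result flags a cycle, i.e. tables that could not be split into independently compilable files:

```scala
object DepGraph {
  // deps: table -> tables it references via foreign keys.
  // Returns the tables in an order where each table comes after the ones it
  // references, or None if there is a cycle.
  def topoSort(deps: Map[String, Set[String]]): Option[List[String]] = {
    def visit(n: String, seen: Set[String], done: List[String]): Option[List[String]] =
      if (done.contains(n)) Some(done)          // already emitted
      else if (seen(n)) None                    // cycle: these tables cannot be split apart
      else deps.getOrElse(n, Set.empty[String])
        .foldLeft(Option(done)) { case (acc, d) => acc.flatMap(visit(d, seen + n, _)) }
        .map(_ :+ n)                            // emit n after its dependencies

    deps.keys.foldLeft(Option(List.empty[String])) {
      case (acc, n) => acc.flatMap(visit(n, Set.empty, _))
    }
  }
}
```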


olivergg commented Jan 3, 2015

@nafg yep, that was my first idea, but I really don't know how to do that given the current code generation implementation. What about the 3-layer split? Can you give me an example with the Category+Post tables?
BTW, I also think one could improve the documentation about contributing to the codegen project. Writing a unit test is not that difficult, but it took me a while to get it working:

1) clone the slick project somewhere and switch to the branch of https://github.com/slick/slick/pull/987 
2) sbt #run sbt 
3) project testkit #switch to the testkit project
4) testOnly scala.slick.test.codegen.GeneratedCodeTest
5) The generated files are in target/scala-2.XX/src_managed/test/slick-codegen/scala/slick/test/codegen/generated/multiple

We can then see that PostsTable depends on the lazy val Categories defined in CategoriesTable.

Instead of using self:CGMultipleTables =>, we could use the proper number of imports for the required tables.

First, one has to make this work on a basic example of the generated code, and then (the hard part) comes generating that code ;)
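
A hypothetical sketch (plain Scala, table values stubbed as strings) of that import-based alternative: each table object imports only the tables it actually references, so a change to one table would only invalidate its true dependents instead of every trait:

```scala
// CategoriesTable.scala
object CategoriesTable {
  lazy val Categories = "categories" // stand-in for a TableQuery value
}

// PostsTable.scala
object PostsTable {
  import CategoriesTable.Categories // only the tables Posts really needs
  lazy val Posts = "posts"
  lazy val categoryFk = s"fk: $Posts -> $Categories"
}
```

The hard part, as noted above, is generating exactly the right set of imports per table from the foreign-key model.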


@szeiger szeiger modified the milestones: 3.1.0, 3.0.0 Jan 29, 2015

tculshaw commented Mar 8, 2015

Hi,

I've been mucking about with the ofbiz schema in postgresql. It's got a terrifying 833 tables in it, and it's a highly normalised schema.

The Tables.scala generated is over 49,000 lines long. Compiler blows up no matter how many gig I throw at it in SBT_OPTS.

So I'm watching this issue with interest. It's gonna be a terrifying compile anyway, even split into 833 files with many dependencies via foreign keys.

I'm happy to supply the pg_dump output if you want something hideous to test with. If you can get this working, then you've probably solved most problems anyone is gonna come up against ;)


olivergg commented Mar 8, 2015

@tculshaw Are you using the fix from #978?

  "Compiler blows up no matter how many gig I throw at it in SBT_OPTS."

Have you waited long enough? Is there any exception?
Regarding the code generation, one could argue that the generated code could be put in a separate library and compiled once, to avoid recompilation issues.
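
That separate-library idea could look like the following build.sbt fragment (hypothetical project names, shown only as a sketch): the generated sources live in their own subproject, so they compile once and normal development only recompiles the main project.

```scala
// build.sbt (sketch): generated code isolated in its own subproject
lazy val generated = (project in file("generated"))
  .settings(
    // the codegen task would write its output under generated/src/main/scala
    libraryDependencies += "com.typesafe.slick" %% "slick" % "2.1.0"
  )

lazy val root = (project in file("."))
  .dependsOn(generated) // recompiled only when the generated code changes
```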


tculshaw commented Mar 8, 2015

I wait for around 30 minutes and then it blows up with an OutOfMemory exception.

Happy to wait an hour and if I can get a good compile run I will certainly jar it up.

My SBT_OPTS are

export SBT_OPTS="-Xmx2G -XX:+UseConcMarkSweepGC -XX:+CMSClassUnloadingEnabled -XX:PermSize=512M -XX:MaxPermSize=2G -Xss512M"

And the full log...

[info] Compiling 3 Scala sources to /home/tony/workspace/ofbiz-scala/target/scala-2.11/classes...
[error] java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
[error] Use 'last' for the full log.

last
[debug] Running task... Cancelable: false, check cycles: false
[debug]
[debug] Initial source changes:
[debug] removed:Set()
[debug] added: Set(/home/tony/workspace/ofbiz-scala/src/main/scala/com/mentation/ofbiz/Example.scala, /home/tony/workspace/ofbiz-scala/src/main/scala/com/mentation/ofbiz/Tables.scala, /home/tony/workspace/ofbiz-scala/src/main/scala/com/mentation/ofbiz/Domain.scala)
[debug] modified: Set()
[debug] Removed products: Set()
[debug] Modified external sources: Set()
[debug] Modified binary dependencies: Set()
[debug] Initial directly invalidated sources: Set(/home/tony/workspace/ofbiz-scala/src/main/scala/com/mentation/ofbiz/Example.scala, /home/tony/workspace/ofbiz-scala/src/main/scala/com/mentation/ofbiz/Tables.scala, /home/tony/workspace/ofbiz-scala/src/main/scala/com/mentation/ofbiz/Domain.scala)
[debug]
[debug] Sources indirectly invalidated by:
[debug] product: Set()
[debug] binary dep: Set()
[debug] external source: Set()
[debug] All initially invalidated sources: Set(/home/tony/workspace/ofbiz-scala/src/main/scala/com/mentation/ofbiz/Example.scala, /home/tony/workspace/ofbiz-scala/src/main/scala/com/mentation/ofbiz/Tables.scala, /home/tony/workspace/ofbiz-scala/src/main/scala/com/mentation/ofbiz/Domain.scala)
[debug] Recompiling all 3 sources: invalidated sources (3) exceeded 50.0% of all sources
[info] Compiling 3 Scala sources to /home/tony/workspace/ofbiz-scala/target/scala-2.11/classes...
[debug] Getting compiler-interface from component compiler for Scala 2.11.4
[debug] Other repositories:
[debug] Default repositories:
[debug] FileRepository(local,FileConfiguration(true,None),Patterns(ivyPatterns=List(${ivy.home}/local/[organisation]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/artifact.[ext]), artifactPatterns=List(${ivy.home}/local/[organisation]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/artifact.[ext]), isMavenCompatible=false))
[debug] Getting compiler-interface from component compiler for Scala 2.11.4
[debug] Other repositories:
[debug] Default repositories:
[debug] FileRepository(local,FileConfiguration(true,None),Patterns(ivyPatterns=List(${ivy.home}/local/[organisation]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/artifact.[ext]), artifactPatterns=List(${ivy.home}/local/[organisation]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/artifact.[ext]), isMavenCompatible=false))
[debug] Running cached compiler 85ae75c, interfacing (CompilerInterface) with Scala compiler version 2.11.4
[debug] Calling Scala compiler with arguments (CompilerInterface):
[debug] -bootclasspath
[debug] /usr/lib64/jvm/java-1.7.0-openjdk-1.7.0/jre/lib/resources.jar:/usr/lib64/jvm/java-1.7.0-openjdk-1.7.0/jre/lib/rt.jar:/usr/lib64/jvm/java-1.7.0-openjdk-1.7.0/jre/lib/sunrsasign.jar:/usr/lib64/jvm/java-1.7.0-openjdk-1.7.0/jre/lib/jsse.jar:/usr/lib64/jvm/java-1.7.0-openjdk-1.7.0/jre/lib/jce.jar:/usr/lib64/jvm/java-1.7.0-openjdk-1.7.0/jre/lib/charsets.jar:/usr/lib64/jvm/java-1.7.0-openjdk-1.7.0/jre/lib/netx.jar:/usr/lib64/jvm/java-1.7.0-openjdk-1.7.0/jre/lib/plugin.jar:/usr/lib64/jvm/java-1.7.0-openjdk-1.7.0/jre/lib/rhino.jar:/usr/lib64/jvm/java-1.7.0-openjdk-1.7.0/jre/lib/jfr.jar:/usr/lib64/jvm/java-1.7.0-openjdk-1.7.0/jre/classes:/home/tony/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.4.jar
[debug] -classpath
[debug] /home/tony/workspace/ofbiz-scala/target/scala-2.11/classes:/home/tony/.ivy2/cache/com.typesafe.slick/slick_2.11/bundles/slick_2.11-3.0.0-RC1.jar:/home/tony/.ivy2/cache/org.slf4j/slf4j-api/jars/slf4j-api-1.6.4.jar:/home/tony/.ivy2/cache/com.typesafe/config/bundles/config-1.2.1.jar:/home/tony/.ivy2/cache/org.reactivestreams/reactive-streams/jars/reactive-streams-1.0.0.RC3.jar:/home/tony/.ivy2/cache/com.typesafe.slick/slick-codegen_2.11/jars/slick-codegen_2.11-3.0.0-RC1.jar:/home/tony/.ivy2/cache/org.slf4j/slf4j-nop/jars/slf4j-nop-1.6.4.jar:/home/tony/.ivy2/cache/org.postgresql/postgresql/jars/postgresql-9.3-1103-jdbc41.jar:/home/tony/.ivy2/cache/org.scala-lang/scala-reflect/jars/scala-reflect-2.11.4.jar
java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:188)
at sbt.ConcurrentRestrictions$$anon$4.take(ConcurrentRestrictions.scala:196)
at sbt.Execute.next$1(Execute.scala:88)
at sbt.Execute.processAll(Execute.scala:91)
at sbt.Execute.runKeep(Execute.scala:69)
at sbt.EvaluateTask$.liftedTree1$1(EvaluateTask.scala:183)
at sbt.EvaluateTask$.sbt$EvaluateTask$$run$1(EvaluateTask.scala:183)
at sbt.EvaluateTask$.runTask(EvaluateTask.scala:198)
at sbt.Aggregation$$anonfun$3.apply(Aggregation.scala:67)
at sbt.Aggregation$$anonfun$3.apply(Aggregation.scala:65)
at sbt.EvaluateTask$.withStreams(EvaluateTask.scala:153)
at sbt.Aggregation$.timedRun(Aggregation.scala:65)
at sbt.Aggregation$.runTasks(Aggregation.scala:74)
at sbt.Aggregation$$anonfun$applyTasks$1.apply(Aggregation.scala:34)
at sbt.Aggregation$$anonfun$applyTasks$1.apply(Aggregation.scala:33)
at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:62)
at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:62)
at sbt.Command$.process(Command.scala:95)
at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:87)
at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:87)
at sbt.State$$anon$1.process(State.scala:176)
at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:87)
at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:87)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
at sbt.MainLoop$.next(MainLoop.scala:87)
at sbt.MainLoop$.run(MainLoop.scala:80)
at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:69)
at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:66)
at sbt.Using.apply(Using.scala:25)
at sbt.MainLoop$.runWithNewLog(MainLoop.scala:66)
at sbt.MainLoop$.runAndClearLast(MainLoop.scala:49)
at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:33)
at sbt.MainLoop$.runLogged(MainLoop.scala:25)
at sbt.xMain.run(Main.scala:26)
at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:109)
at xsbt.boot.Launch$.withContextLoader(Launch.scala:129)
at xsbt.boot.Launch$.run(Launch.scala:109)
at xsbt.boot.Launch$$anonfun$apply$1.apply(Launch.scala:36)
at xsbt.boot.Launch$.launch(Launch.scala:117)
at xsbt.boot.Launch$.apply(Launch.scala:19)
at xsbt.boot.Boot$.runImpl(Boot.scala:44)
at xsbt.boot.Boot$.main(Boot.scala:20)
at xsbt.boot.Boot.main(Boot.scala)
Caused by: java.lang.OutOfMemoryError: Java heap space
at java.util.jar.Manifest$FastInputStream.(Manifest.java:332)
at java.util.jar.Manifest$FastInputStream.(Manifest.java:327)
at java.util.jar.Manifest.read(Manifest.java:195)
at java.util.jar.Manifest.(Manifest.java:69)
at java.util.jar.JarFile.getManifestFromReference(JarFile.java:185)
at java.util.jar.JarFile.getManifest(JarFile.java:166)
at sun.misc.URLClassPath$JarLoader$2.getManifest(URLClassPath.java:779)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:416)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at scala.tools.nsc.typechecker.Namers$Namer.typeErrorHandler(Namers.scala:111)
at scala.tools.nsc.typechecker.Namers$Namer.typeSig(Namers.scala:1539)
at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1$$anonfun$apply$1.apply$mcV$sp(Namers.scala:778)
at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1$$anonfun$apply$1.apply(Namers.scala:777)
at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1$$anonfun$apply$1.apply(Namers.scala:777)
at scala.tools.nsc.typechecker.Namers$Namer.scala$tools$nsc$typechecker$Namers$Namer$$logAndValidate(Namers.scala:1565)
at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1.apply(Namers.scala:777)
at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1.apply(Namers.scala:769)
at scala.tools.nsc.typechecker.Namers$$anon$1.completeImpl(Namers.scala:1681)
at scala.tools.nsc.typechecker.Namers$LockingTypeCompleter$class.complete(Namers.scala:1689)
at scala.tools.nsc.typechecker.Namers$$anon$1.complete(Namers.scala:1679)
at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1481)
at scala.reflect.internal.Symbols$Symbol.initialize(Symbols.scala:1628)
at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:4911)
at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5295)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5322)
at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5269)
[error] java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space


olivergg commented Mar 9, 2015

@tculshaw Have you tried with Java 8?


tculshaw commented Mar 9, 2015

Good suggestion, but similar result:(

Java 8 gives...
java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: GC overhead limit exceeded

with .bashrc setting...
export SBT_OPTS="-Xmx2G -XX:+UseConcMarkSweepGC -XX:+CMSClassUnloadingEnabled -XX:PermSize=512M -XX:MaxPermSize=2G -Xss512M"

Detail being...
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
at scala.reflect.internal.Types$Type.substSym(Types.scala:715)
at scala.reflect.internal.Symbols$class.createFromClonedSymbolsAtOwner(Symbols.scala:3624)
at scala.reflect.internal.SymbolTable.createFromClonedSymbolsAtOwner(SymbolTable.scala:16)
at scala.reflect.internal.Types$ExistentialType.cloneInfo(Types.scala:2682)
at scala.reflect.internal.Symbols$Symbol.cloneSymbol(Symbols.scala:1935)
at scala.reflect.internal.Symbols$Symbol.cloneSymbol(Symbols.scala:1930)
at scala.reflect.internal.Symbols$Symbol.cloneSymbol(Symbols.scala:1928)
at scala.reflect.internal.Symbols$Symbol.cloneSymbol(Symbols.scala:1924)
at scala.reflect.internal.Symbols$$anonfun$cloneSymbols$1.apply(Symbols.scala:3600)
at scala.reflect.internal.Symbols$$anonfun$cloneSymbols$1.apply(Symbols.scala:3600)
at scala.reflect.internal.util.Collections$class.mapList(Collections.scala:56)
at scala.reflect.internal.SymbolTable.mapList(SymbolTable.scala:16)
at scala.reflect.internal.Symbols$class.deriveSymbols(Symbols.scala:3539)
at scala.reflect.internal.SymbolTable.deriveSymbols(SymbolTable.scala:16)
at scala.reflect.internal.Symbols$class.cloneSymbols(Symbols.scala:3600)
at scala.reflect.internal.SymbolTable.cloneSymbols(SymbolTable.scala:16)
at scala.reflect.internal.Symbols$class.cloneSymbolsAndModify(Symbols.scala:3611)
at scala.reflect.internal.SymbolTable.cloneSymbolsAndModify(SymbolTable.scala:16)
at scala.reflect.internal.tpe.TypeMaps$TypeMap.mapOver(TypeMaps.scala:245)
at scala.reflect.internal.tpe.TypeMaps$TypeMap.mapOver(TypeMaps.scala:128)
at scala.reflect.internal.tpe.TypeMaps$SubstMap.apply(TypeMaps.scala:700)
at scala.reflect.internal.Types$Type.subst(Types.scala:705)
at scala.reflect.internal.Types$Type.instantiateTypeParams(Types.scala:470)
at scala.reflect.internal.Symbols$class.deriveTypeWithWildcards(Symbols.scala:3595)
at scala.reflect.internal.SymbolTable.deriveTypeWithWildcards(SymbolTable.scala:16)
at scala.tools.nsc.typechecker.Implicits$class.scala$tools$nsc$typechecker$Implicits$$depoly(Implicits.scala:154)
at scala.tools.nsc.typechecker.Implicits$ImplicitSearch.scala$tools$nsc$typechecker$Implicits$ImplicitSearch$$matchesPt(Implicits.scala:484)
at scala.tools.nsc.typechecker.Implicits$ImplicitSearch$ImplicitComputation.survives(Implicits.scala:818)
at scala.tools.nsc.typechecker.Implicits$ImplicitSearch$ImplicitComputation$$anonfun$19$$anonfun$20.apply(Implicits.scala:872)
at scala.tools.nsc.typechecker.Implicits$ImplicitSearch$ImplicitComputation$$anonfun$19$$anonfun$20.apply(Implicits.scala:872)
at scala.collection.TraversableLike$$anonfun$filterImpl$1.apply(TraversableLike.scala:259)
at scala.collection.immutable.List.foreach(List.scala:381)


tculshaw commented Mar 9, 2015

Another thought: could the primary key types be causing the codegen problems? Some primary keys are string-based, and some are composite primary keys.


olivergg commented Mar 28, 2015

@tculshaw How much physical memory do you have? Could you try increasing the max heap size (-Xmx)? If you use Java 8, you can also drop the "-XX:PermSize=512M -XX:MaxPermSize=2G" flags.

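As a concrete (untested) example of that advice for Java 8, where the PermGen flags no longer apply:

```shell
# Hypothetical Java 8 SBT_OPTS: PermGen was removed in Java 8, so the
# -XX:PermSize / -XX:MaxPermSize flags are ignored there; raise -Xmx instead.
# The 4G heap value is illustrative and should fit the available physical RAM.
export SBT_OPTS="-Xmx4G -Xss512M -XX:+UseConcMarkSweepGC"
```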

@szeiger szeiger modified the milestones: 3.1.0, 3.2.0 Sep 24, 2015

virusdave (Contributor) commented Feb 9, 2016

Any progress on this issue?


tculshaw commented Feb 9, 2016

I haven't pursued this - sorry!

I'm using slick successfully on a number of smaller schemas though.

Tony Culshaw MA Cantab
Director
Mentation Limited - Custom Software Development

On 10 February 2016 at 09:15, virusdave notifications@github.com wrote:

Any progress on this issue?


Reply to this email directly or view it on GitHub
#906 (comment).


@cvogt cvogt added the effort: easy label Feb 9, 2016

cvogt (Member) commented Feb 9, 2016

If there are any students willing to work on this, we can offer to do this (and more) through the Google Summer of Code program, which funds students with 5K to work on FOSS projects over the summer.


@cvogt cvogt removed their assignment Feb 9, 2016

sumiet commented Jan 13, 2017

Any updates on this issue?


@szeiger szeiger added this to the 3.2.1 milestone Jan 24, 2017

@szeiger szeiger removed this from the 3.2.0 milestone Jan 24, 2017

pettyjamesm commented Sep 22, 2017

Would an improvement here be the ability to specify which schemas, tables, and columns are a part of the generated code?

I appreciate that at some point the internals of the scala compiler become the limiting factor, but I suppose I'm wondering how often this many tables are actually used by logic written in slick.

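One way to approximate the "only generate some tables" idea today is to filter the model before handing it to the generator (in Slick 3.x, `JdbcProfile.createModel` accepts an optional table list, and `slick.model.Model.tables` can be filtered directly). A stdlib-only sketch of the whitelist logic, with an invented `SimpleModel` standing in for the real model type:

```scala
// Stand-in for slick.model.Model; only table names matter for this sketch.
final case class SimpleModel(tables: Seq[String])

// Keep only whitelisted tables. With the real API this would be roughly
// model.copy(tables = model.tables.filter(t => included(t.name.table))).
def filterTables(model: SimpleModel, included: Set[String]): SimpleModel =
  model.copy(tables = model.tables.filter(included))

val full = SimpleModel(Seq("users", "orders", "audit_log"))
val slim = filterTables(full, Set("users", "orders"))
// slim.tables == Seq("users", "orders")
```

The resulting smaller model can then be passed to a `SourceCodeGenerator`, so only the whitelisted tables appear in the generated output.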

virusdave (Contributor) commented Sep 22, 2017

@hvesalai hvesalai modified the milestones: 3.2.1, Future Feb 28, 2018

@hvesalai hvesalai removed the effort: easy label Mar 7, 2018

@hvesalai hvesalai modified the milestones: Feature ideas, 3.3 Mar 26, 2018

hvesalai added a commit that referenced this issue Mar 28, 2018

Merge pull request #1785 from Asamsig/codegen/multiplefilesoutput
Codegen output to multiple files (one per table) #906

@hvesalai hvesalai closed this Mar 28, 2018

hvesalai (Member) commented Mar 28, 2018

Solved by #1785

