---
layout: manual
title: Schema Definition
---

Defining a Schema

Scala classes are mapped to tables via instances of org.squeryl.Table[T],
which are grouped in an org.squeryl.Schema singleton.

The org.squeryl.Schema trait can create the schema in the database or dump its definition to a file.
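
A minimal sketch of such a schema; the Author and Book classes, the Library object and all names in it are hypothetical and used only for illustration:

{% highlight scala %}
import org.squeryl.Schema
import org.squeryl.PrimitiveTypeMode._

// hypothetical entity classes
class Author(val id: Long, val firstName: String, val lastName: String)
class Book(val id: Long, val title: String, val yearPublished: Int, val authorId: Long)

object Library extends Schema {
  val authors = table[Author]        // table name defaults to the class name
  val books   = table[Book]("book")  // explicit table name

  // inside a transaction, Library.create creates the tables;
  // Library.printDdl(println(_)) dumps the DDL
}
{% endhighlight %}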

Mapping fields to Columns

Squeryl applies the principle of Convention over Configuration:
the class to table and field to column correspondence is determined
by name equivalence. It is possible to override a field’s column name
with the org.squeryl.annotations.Column annotation, and the class’s
table name with the org.squeryl.Schema.table[T](tableName:String)
method, as illustrated in the previous example.

The Column annotation can also be used to redefine the default length
of String/varchar columns (and of other types as well, although this
should rarely be necessary).
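
A sketch of both kinds of override, with a hypothetical Author class; the column names, table name and length below are illustrative assumptions:

{% highlight scala %}
import org.squeryl.annotations.Column

class Author(
  val id: Long,
  @Column("first_name")        // overrides the column name
  val firstName: String,
  @Column(length = 1024)       // overrides the default varchar length
  val biography: String)

// and in the Schema singleton, an explicit table name:
//   val authors = table[Author]("author")
{% endhighlight %}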

Nullable columns are mapped with Option[] fields

The default (and strongly recommended) way of mapping nullable columns
to fields is with the Option[] type. If you use Squeryl to create
(or generate) your schema, all columns get a not null constraint,
except those mapped from Option[] fields, which are nullable.

  • Important: if a class has an Option[] field, it becomes mandatory
    to implement a zero argument constructor that initializes Option[] fields
    with Some() instances (like the Book class in the example above; see also
    the sketch below). Failing to do so will cause an exception to be thrown
    when the table is instantiated. The reason is that the type erasure
    imposed by the JVM prevents reflecting on the Option[] type parameter.
    This constraint could be relaxed in a future version by a compiler plugin
    that would supply Squeryl with the erased type information.
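
A minimal sketch of such a class, extending the hypothetical Book class from the first example with a nullable description column:

{% highlight scala %}
class Book(
  val id: Long,
  val title: String,
  val yearPublished: Int,
  val description: Option[String]) { // maps to a nullable column

  // zero argument constructor: every Option[] field is initialized with a
  // Some() so that Squeryl can recover the type parameter lost to erasure
  def this() = this(0, "", 0, Some(""))
}
{% endhighlight %}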

Choosing between primitive and custom types

You will have to decide whether your table objects will be mapped with primitive types (Int, Long, Date, String, etc.) or custom types. It’s a question of tradeoffs:

Primitive types

The main motivations for using primitive types are performance and simplicity.
If a query returns N rows of objects with M fields, primitive types
will cause the garbage collector to handle N objects, while the same
query using custom types will cause the creation of N * M objects.

To use primitive types, simply import org.squeryl.PrimitiveTypeMode._
in the scope where database objects and queries are defined:
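
For instance, a sketch reusing the hypothetical Library schema from the first example:

{% highlight scala %}
import org.squeryl.PrimitiveTypeMode._

// with the import in scope, primitive fields take part in the query DSL directly
def authorsNamed(name: String) =
  from(Library.authors)(a =>
    where(a.lastName === name)
    select(a)
  )
{% endhighlight %}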

that’s all there is to it.

  • Important: in PrimitiveTypeMode there can be ambiguities between numeric operators

When using org.squeryl.PrimitiveTypeMode, the compiler will treat an expression like the
one in the next example as a Boolean. The .~ function is needed to tell the compiler that the
left side is a TypedExpressionNode[Int], which causes the whole expression to become a
LogicalBoolean, which is what the where clause takes:
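
A sketch of the kind of expression concerned, under the same hypothetical schema as above:

{% highlight scala %}
// assumes import org.squeryl.PrimitiveTypeMode._ and the hypothetical
// Library schema sketched earlier; without .~, "b.yearPublished > 1980"
// resolves to Int's > operator and yields a plain Scala Boolean
def booksPublishedAfter1980 =
  from(Library.books)(b =>
    where(b.yearPublished.~ > 1980)
    select(b)
  )
{% endhighlight %}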

It is also needed in the following case:
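
One such case, sketched under the same assumptions (the update/set clause and field names below are illustrative), is building a numeric SQL expression from a primitive field:

{% highlight scala %}
// here too, .~ turns the primitive field into an expression node, so the
// addition becomes part of the generated SQL instead of a plain Int sum
update(Library.books)(b =>
  where(b.id === 1L)
  set(b.yearPublished := b.yearPublished.~ + 1)
)
{% endhighlight %}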

This is only required when using PrimitiveTypeMode; with custom types
there is no ambiguity, since custom types are not (AnyVal) numerics.

A compiler plugin could eventually insert .~ on all expression nodes.
An alternative would be to use symbols other than the standard math
operators, avoiding the ambiguity altogether.
Opinions on this choice/tradeoff would be appreciated.

Custom types

One motivation for using custom wrapper types is to allow fields
to carry metadata along with validation, as in the next example.

Custom field types must extend one of the subtypes of CustomType in the
org.squeryl.customtypes package, and org.squeryl.customtypes.CustomTypesMode._
must be imported into the scope where statements are defined.
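
A sketch under these rules; the wrapper types, the validation rules and the Patient class are hypothetical:

{% highlight scala %}
import org.squeryl.customtypes.CustomTypesMode._
import org.squeryl.customtypes.{IntField, LongField, StringField}

// hypothetical wrapper types carrying validation
class Age(v: Int) extends IntField(v) {
  assert(v > 0, "age must be positive, got " + v)
}

class FirstName(v: String) extends StringField(v) {
  assert(v.trim.nonEmpty, "first name must not be blank")
}

// an entity mapped entirely with custom types
class Patient(val id: LongField, val firstName: FirstName, val age: Age)
{% endhighlight %}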