
Huge changes, mostly to add prepared statement/bound variable support

This is a huge commit.  I generally prefer to commit in smaller chunks,
but this is a major feature that will have a large effect on the
future of Sequel, and I didn't want to commit before I knew that
the code was flexible enough to work on multiple database types, and
that it didn't break existing code.

This commit adds support for prepared statements and bound variables.
Included is a prepared_statements.rdoc file; review that to get an
idea of usage.  This has been tested on PostgreSQL, MySQL, and
SQLite, both with the native drivers and with the JDBC drivers.  For
other databases, it emulates support using interpolation.

Along with the prepared statement support comes complete, but not
necessarily good, documentation for the PostgreSQL, MySQL, SQLite,
and JDBC adapters.

There were numerous minor changes made as well:

* MSSQL should be better supported on JDBC, though I haven't tested
  it.
* Statement execution on JDBC and SQLite was refactored to reduce
  code duplication.
* JDBC support for inserting records was refactored to reduce code
  duplication.
* Dataset got private execute and execute_dui methods that send
  the SQL to the database.  The adapters that had special database
  #execute methods had similar changes to their datasets.
* Mysql::Result#convert_type is now a private method.
* Mysql::Result#each_array was removed, probably a leftover from the
  old arrays with keys code.
* All databases now have a @transactions instance variable set on
  initialization, saving code inside #transaction.
* Native support for prepared statements when using PostgreSQL can be
  determined by seeing if SEQUEL_POSTGRES_USES_PG is defined and
  true.
* Postgres::Adapter#connected was removed.
* #serial_primary_key_options was removed from the MySQL and SQLite
  adapters, since it was the same as the default.
* Postgres::Database#locks was refactored.
* Use block_given? and yield instead of block[] in some places.
* Database#log_info takes an additional args argument, used for
  logging bound variables.
* The InvalidExpression, InvalidFilter, InvalidJoinType, and
  WorkerStop exceptions were removed.
* Using Model[true] and Model[false] no longer results in an error.
  This was probably more helpful with the ParseTree expression
  syntax (which used == for equality), but probably isn't useful now.
* Using a full outer join on MySQL raises a generic Error instead of
  InvalidJoinType.
1 parent 0b3d9fd commit 78a683d97fb6a58cf8ed83777d097b49e8130056 @jeremyevans committed
104 doc/prepared_statements.rdoc
@@ -0,0 +1,104 @@
+= Prepared Statements and Bound Variables
+
+Starting with version 2.4.0, Sequel has support for prepared statements and
+bound variables. No matter which database you are using, the Sequel prepared
+statement/bound variable API remains exactly the same. There is native support
+for prepared statements/bound variables on the following databases:
+
+* PostgreSQL (using the pg driver, requires type specifiers)
+* MySQL (prepared statements only, as the ruby mysql driver doesn't support
+ bound variables)
+* SQLite (a new native prepared statement is used for each call, though)
+* JDBC (using the postgresql, mysql, or sqlite databases, and possibly others)
+
+Support on other databases is emulated via the usual string interpolation.
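As a rough illustration of the emulation approach (this is a hypothetical sketch, not Sequel's actual implementation — the `interpolate` helper and its naive quoting are invented for demonstration), a `:$name` placeholder can be swapped for a literalized value before the SQL is sent to the database:

```ruby
# Hypothetical sketch of placeholder emulation. Sequel's real code
# literalizes values through the dataset's own literal method; the
# quoting here is deliberately naive.
def interpolate(sql, args)
  sql.gsub(/:\$(\w+)/) do
    value = args[$1.to_sym]
    case value
    when String then "'#{value.gsub("'", "''")}'"  # naive quoting for the sketch
    when nil    then 'NULL'
    else value.to_s
    end
  end
end

interpolate("SELECT * FROM items WHERE name = :$name", :name=>'Jim')
# => "SELECT * FROM items WHERE name = 'Jim'"
```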
+
+== Placeholders
+
+Generally, when using prepared statements (and certainly when using bound
+variables), you need to put placeholders in your SQL to indicate where you
+want your bound arguments to appear. Database support and syntax vary
+significantly for placeholders (e.g. :name, $1, ?). Sequel abstracts all of
+that and allows you to specify placeholders by using the :$name format for
+placeholders, e.g.:
+
+ ds = DB[:items].filter(:name=>:$name)
+
+== Bound Variables
+
+Using bound variables for this query is simple:
+
+ ds.call(:select, :name=>'Jim')
+
+This will do the equivalent of selecting records that have the name 'Jim'. It
+returns all records, and can take a block that is passed to Dataset#all.
+
+Deleting or returning the first record works similarly:
+
+ ds.call(:first, :name=>'Jim') # First record with name 'Jim'
+ ds.call(:delete, :name=>'Jim') # Delete records with name 'Jim'
+
+For inserting/updating records, you should also specify a value hash, which
+may itself contain placeholders:
+
+ # Insert record with 'Jim', note that the previous filter is ignored
+ ds.call(:insert, {:name=>'Jim'}, :name=>:$name)
+ # Change name to 'Bob' for all records with name of 'Jim'
+ ds.call(:update, {:name=>'Jim', :new_name=>'Bob'}, :name=>:$new_name)
+
+== Prepared Statements
+
+Prepared statement support is similar to bound variable support, but you
+use Dataset#prepare with a name, and Dataset#call later with the values:
+
+ ds = DB[:items].filter(:name=>:$name)
+ ps = ds.prepare(:select, :select_by_name)
+ ps.call(:name=>'Jim')
+ DB.call(:select_by_name, :name=>'Jim') # same as above
+
+The Dataset#prepare method returns a prepared statement, and also stores a
+copy of the prepared statement in the database for later use. For insert
+and update queries, the hash to insert/update is passed to prepare:
+
+ ps1 = DB[:items].prepare(:insert, :insert_with_name, :name=>:$name)
+ ps1.call(:name=>'Jim')
+ DB.call(:insert_with_name, :name=>'Jim') # same as above
+ ds = DB[:items].filter(:name=>:$name)
+ ps2 = ds.prepare(:update, :update_name, :name=>:$new_name)
+ ps2.call(:name=>'Jim', :new_name=>'Bob')
+ DB.call(:update_name, :name=>'Jim', :new_name=>'Bob') # same as above
+
+== Database support
+
+=== PostgreSQL
+
+If you are using the ruby-postgres or postgres-pr driver, PostgreSQL uses the
+default emulated support. If you are using ruby-pg, there is native support,
+but it requires type specifiers most of the time. This is easy if you have
+direct control over the SQL string, but since Sequel abstracts that, the types
+have to be specified another way. This is done by adding a __* suffix to the
+placeholder symbol (e.g. :$name__text, which will be compiled to "$1::text"
+in the SQL). Prepared statements are always server side.
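To illustrate that compilation (again a hypothetical helper, not Sequel's internal API), a `__type` suffixed placeholder maps to a numbered, casted PostgreSQL parameter:

```ruby
# Hypothetical sketch: turn a :$name__type placeholder symbol into
# the numbered PostgreSQL parameter it would compile to.
def pg_placeholder(sym, position)
  name, type = sym.to_s.sub(/\A\$/, '').split('__')
  type ? "$#{position}::#{type}" : "$#{position}"
end

pg_placeholder(:$name__text, 1) # => "$1::text"
pg_placeholder(:$id, 2)         # => "$2"
```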
+
+=== SQLite
+
+SQLite handles bound variables and prepared statements identically, since
+a new native prepared statement is created and executed for each call.
+
+=== MySQL
+
+The MySQL ruby driver does not support bound variables, so the bound
+variable methods fall back to string interpolation. Prepared
+statements are executed server side.
+
+=== JDBC
+
+JDBC supports both prepared statements and bound variables. Whether these
+are server side or client side depends on the JDBC driver. For PostgreSQL
+over JDBC, you can add the prepareThreshold=N parameter to the connection
+string, which will use a server side prepared statement after N calls to
+the prepared statement.
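For example (the host and database names below are placeholders, not part of the original text), a connection string with that threshold set might look like:

```ruby
# Example connection string only; dbhost and mydb are placeholders.
DB = Sequel.connect("jdbc:postgresql://dbhost/mydb?prepareThreshold=5")
```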
+
+=== All Others
+
+Support is emulated using interpolation.
275 lib/sequel_core/adapters/jdbc.rb
@@ -1,9 +1,38 @@
require 'java'
module Sequel
+ # Houses Sequel's JDBC support when running on JRuby.
+ # Support for individual database types is done using sub adapters.
+ # PostgreSQL, MySQL, SQLite, and MSSQL all have relatively good support,
+ # close to the level supported by the native adapters.
+ # PostgreSQL, MySQL, and SQLite can load the necessary support using
+ # the jdbc-* gem, if it is installed, though they will also work if you
+ # have the correct .jar in your CLASSPATH. Oracle and MSSQL should
+ # load the necessary support if you have the .jar in your CLASSPATH.
+ # For all other databases, the Java class should be loaded manually
+ # before calling Sequel.connect.
+ #
+ # Note that when using a JDBC adapter, the best way to use Sequel
+ # is via Sequel.connect, NOT Sequel.jdbc. Use the JDBC connection
+ # string when connecting, which will be in a different format than
+ # the native connection string. The connection string should start
+ # with 'jdbc:'. For PostgreSQL, use 'jdbc:postgresql:', and for
+ # SQLite you do not need 2 preceding slashes for the database name
+ # (use no preceding slashes for a relative path, and one preceding
+ # slash for an absolute path).
module JDBC
- module JavaLang; include_package 'java.lang'; end
- module JavaSQL; include_package 'java.sql'; end
+ # Make accessing the java.lang hierarchy more ruby friendly.
+ module JavaLang
+ include_package 'java.lang'
+ end
+
+ # Make accessing the java.sql hierarchy more ruby friendly.
+ module JavaSQL
+ include_package 'java.sql'
+ end
+
+ # Contains procs keyed on sub adapter type that extend the
+ # given database object so it supports the correct database type.
DATABASE_SETUP = {:postgresql=>proc do |db|
require 'sequel_core/adapters/jdbc/postgresql'
db.extend(Sequel::JDBC::Postgres::DatabaseMethods)
@@ -23,9 +52,15 @@ module JavaSQL; include_package 'java.sql'; end
org.sqlite.JDBC
end,
:oracle=>proc{oracle.jdbc.driver.OracleDriver},
- :sqlserver=>proc{com.microsoft.sqlserver.jdbc.SQLServerDriver}
+ :sqlserver=>proc do |db|
+ require 'sequel_core/adapters/shared/mssql'
+ db.extend(Sequel::MSSQL::DatabaseMethods)
+ com.microsoft.sqlserver.jdbc.SQLServerDriver
+ end
}
+ # Allow loading the necessary JDBC support via a gem, which
+ # works for PostgreSQL, MySQL, and SQLite.
def self.load_gem(name)
begin
require "jdbc/#{name}"
@@ -34,12 +69,18 @@ def self.load_gem(name)
end
end
+ # JDBC Databases offer a fairly uniform interface that does not change
+ # much based on the sub adapter.
class Database < Sequel::Database
set_adapter_scheme :jdbc
# The type of database we are connecting to
attr_reader :database_type
+ # Call the DATABASE_SETUP proc directly after initialization,
+ # so the object always uses sub adapter specific code. Also,
+ # raise an error immediately if the connection doesn't have a
+ # uri, since JDBC requires one.
def initialize(opts)
super(opts)
raise(Error, "No connection string specified") unless uri
@@ -48,67 +89,100 @@ def initialize(opts)
end
end
+ # Connect to the database using JavaSQL::DriverManager.getConnection.
def connect
setup_connection(JavaSQL::DriverManager.getConnection(uri))
end
+ # Return instances of JDBC::Dataset with the given opts.
def dataset(opts = nil)
JDBC::Dataset.new(self, opts)
end
+ # Close all adapter connections
def disconnect
@pool.disconnect {|c| c.close}
end
- def execute(sql)
- log_info(sql)
- @pool.hold do |conn|
- stmt = conn.createStatement
- begin
- yield stmt.executeQuery(sql)
- rescue NativeException, JavaSQL::SQLException => e
- raise Error, e.message
- ensure
- stmt.close
- end
- end
+ # Execute the given SQL, which should be a SELECT statement
+ # or something else that returns rows.
+ def execute(sql, &block)
+ _execute(sql, :type=>:select, &block)
end
+ # Execute the given DDL SQL, which should not return any
+ # values or rows.
def execute_ddl(sql)
- log_info(sql)
- @pool.hold do |conn|
- stmt = conn.createStatement
- begin
- stmt.execute(sql)
- rescue NativeException, JavaSQL::SQLException => e
- raise Error, e.message
- ensure
- stmt.close
- end
- end
+ _execute(sql, :type=>:ddl)
end
+ # Execute the given DELETE, UPDATE, or INSERT SQL, returning
+ # the number of rows affected.
def execute_dui(sql)
- log_info(sql)
- @pool.hold do |conn|
- stmt = conn.createStatement
+ _execute(sql, :type=>:dui)
+ end
+
+ # Execute the given INSERT SQL, returning the last inserted
+ # row id.
+ def execute_insert(sql)
+ _execute(sql, :type=>:insert)
+ end
+
+ # Execute the prepared statement. If the provided name is a
+ # dataset, use that as the prepared statement, otherwise use
+ # it as a key to look it up in the prepared_statements hash.
+ # If the connection we are using has already prepared an identical
+ # statement, use that statement instead of creating another.
+ # Otherwise, prepare a new statement for the connection, bind the
+ # variables, and execute it.
+ def execute_prepared_statement(name, args=[], opts={})
+ if Dataset === name
+ ps = name
+ name = ps.prepared_statement_name
+ else
+ ps = prepared_statements[name]
+ end
+ sql = ps.prepared_sql
+ synchronize do |conn|
+ if name and cps = conn.prepared_statements[name] and cps[0] == sql
+ cps = cps[1]
+ else
+ if cps
+ log_info("Closing #{name}")
+ cps[1].close
+ end
+ log_info("Preparing#{" #{name}:" if name} #{sql}")
+ cps = conn.prepareStatement(sql)
+ conn.prepared_statements[name] = [sql, cps] if name
+ end
+ i = 0
+ args.each{|arg| set_ps_arg(cps, arg, i+=1)}
+ log_info("Executing#{" #{name}" if name}", args)
begin
- stmt.executeUpdate(sql)
+ case opts[:type]
+ when :select
+ yield cps.executeQuery
+ when :ddl
+ cps.execute
+ when :insert
+ cps.executeUpdate
+ last_insert_id(conn, opts)
+ else
+ cps.executeUpdate
+ end
rescue NativeException, JavaSQL::SQLException => e
raise Error, e.message
ensure
- stmt.close
+ cps.close unless name
end
end
end
- def setup_connection(conn)
- conn
- end
-
+ # Default transaction method that should work on most JDBC
+ # databases. Does not use the JDBC transaction methods, uses
+ # SQL BEGIN/ROLLBACK/COMMIT statements instead.
def transaction
- @pool.hold do |conn|
- @transactions ||= []
+ synchronize do |conn|
return yield(conn) if @transactions.include?(Thread.current)
stmt = conn.createStatement
begin
@@ -131,6 +205,10 @@ def transaction
end
end
+ # The uri for this connection. You can specify the uri
+ # using the :uri, :url, or :database options. You don't
+ # need to worry about this if you use Sequel.connect
+ # with the JDBC connection strings.
def uri
ur = @opts[:uri] || @opts[:url] || @opts[:database]
ur =~ /^\Ajdbc:/ ? ur : "jdbc:#{ur}"
@@ -139,26 +217,104 @@ def uri
private
+ # Execute the SQL. Use the :type option to see which JDBC method
+ # to use.
+ def _execute(sql, opts)
+ log_info(sql)
+ synchronize do |conn|
+ stmt = conn.createStatement
+ begin
+ case opts[:type]
+ when :select
+ yield stmt.executeQuery(sql)
+ when :ddl
+ stmt.execute(sql)
+ when :insert
+ stmt.executeUpdate(sql)
+ last_insert_id(conn, opts)
+ else
+ stmt.executeUpdate(sql)
+ end
+ rescue NativeException, JavaSQL::SQLException => e
+ raise Error, e.message
+ ensure
+ stmt.close
+ end
+ end
+ end
+
+ # By default, there is no support for determining the last inserted
+ # id, so return nil. This method should be overridden in
+ # sub adapters.
+ def last_insert_id(conn, opts)
+ nil
+ end
+
+ # Java being java, you need to specify the type of each argument
+ # for the prepared statement, and bind it individually. This
+ # guesses which JDBC method to use, and hopefully JRuby will convert
+ # things properly for us.
+ def set_ps_arg(cps, arg, i)
+ case arg
+ when Integer
+ cps.setInt(i, arg)
+ when String
+ cps.setString(i, arg)
+ when Date
+ cps.setDate(i, arg)
+ when Time, DateTime, Java::JavaSql::Timestamp
+ cps.setTimestamp(i, arg)
+ when Float
+ cps.setDouble(i, arg)
+ when nil
+ cps.setNull(i, JavaSQL::Types::NULL)
+ end
+ end
+
+ # Add a prepared_statements accessor to the connection,
+ # and set it to an empty hash. This is used to store
+ # adapter specific prepared statements.
+ def setup_connection(conn)
+ conn.meta_eval{attr_accessor :prepared_statements}
+ conn.prepared_statements = {}
+ conn
+ end
+
+ # The JDBC adapter should not need the pool to convert exceptions.
def connection_pool_default_options
super.merge(:pool_convert_exceptions=>false)
end
end
class Dataset < Sequel::Dataset
- def literal(v)
- case v
- when Time
- literal(v.iso8601)
- when Date, DateTime, Java::JavaSql::Timestamp
- literal(v.to_s)
- else
- super
+ # Use JDBC PreparedStatements instead of emulated ones. Statements
+ # created using #prepare are cached at the connection level to allow
+ # reuse. This also supports bind variables by using unnamed
+ # prepared statements created using #call.
+ module PreparedStatementMethods
+ include Sequel::Dataset::UnnumberedArgumentMapper
+
+ private
+
+ # Execute the prepared SQL using the stored type and
+ # arguments derived from the hash passed to call.
+ def execute(sql, &block)
+ @db.execute_prepared_statement(self, bind_arguments, :type=>sql_query_type, &block)
end
+ alias execute_dui execute
+ alias execute_insert execute
end
-
+
+ # Create an unnamed prepared statement and call it. Allows the
+ # use of bind variables.
+ def call(type, hash, values=nil, &block)
+ prepare(type, nil, values).call(hash, &block)
+ end
+
+ # Fetch rows from the database and yield them as hashes.
def fetch_rows(sql, &block)
@db.synchronize do
- @db.execute(sql) do |result|
+ execute(sql) do |result|
# get column names
meta = result.getMetaData
column_count = meta.getColumnCount
@@ -175,11 +331,36 @@ def fetch_rows(sql, &block)
end
self
end
+
+ # Use the ISO values for dates and times.
+ def literal(v)
+ case v
+ when Time
+ literal(v.iso8601)
+ when Date, DateTime, Java::JavaSql::Timestamp
+ literal(v.to_s)
+ else
+ super
+ end
+ end
+
+ # Create a named prepared statement that is stored in the
+ # database (and connection) for reuse.
+ def prepare(type, name, values=nil)
+ ps = to_prepared_statement(type, values)
+ ps.extend(PreparedStatementMethods)
+ if name
+ ps.prepared_statement_name = name
+ db.prepared_statements[name] = ps
+ end
+ ps
+ end
end
end
end
class Java::JavaSQL::Timestamp
+ # Add a usec method in order to emulate Time values.
def usec
getNanos/1000
end
56 lib/sequel_core/adapters/jdbc/mysql.rb
@@ -2,52 +2,60 @@
module Sequel
module JDBC
+ # Database and Dataset instance methods for MySQL specific
+ # support via JDBC.
module MySQL
+ # Database instance methods for MySQL databases accessed via JDBC.
module DatabaseMethods
include Sequel::MySQL::DatabaseMethods
+ # Return an instance of Sequel::JDBC::MySQL::Dataset with the given opts.
def dataset(opts=nil)
Sequel::JDBC::MySQL::Dataset.new(self, opts)
end
- def execute_insert(sql)
- begin
- log_info(sql)
- @pool.hold do |conn|
- stmt = conn.createStatement
- begin
- stmt.executeUpdate(sql)
- rs = stmt.executeQuery('SELECT LAST_INSERT_ID()')
- rs.next
- rs.getInt(1)
- rescue NativeException, JavaSQL::SQLException => e
- raise Error, e.message
- ensure
- stmt.close
- end
- end
- rescue NativeException, JavaSQL::SQLException => e
- raise Error, "#{sql}\r\n#{e.message}"
- end
- end
-
private
+ # The database name for this database. We need to parse it out
+ # of the connection string, since JDBC does no parsing of the
+ # given connection string by default.
def database_name
u = URI.parse(uri.sub(/\Ajdbc:/, ''))
(m = /\/(.*)/.match(u.path)) && m[1]
end
+
+ # Get the last inserted id using LAST_INSERT_ID().
+ def last_insert_id(conn, opts={})
+ stmt = conn.createStatement
+ begin
+ rs = stmt.executeQuery('SELECT LAST_INSERT_ID()')
+ rs.next
+ rs.getInt(1)
+ ensure
+ stmt.close
+ end
+ end
end
-
+
+ # Dataset class for MySQL datasets accessed via JDBC.
class Dataset < JDBC::Dataset
include Sequel::MySQL::DatasetMethods
+ # Use execute_insert to execute the insert_sql.
def insert(*values)
- @db.execute_insert(insert_sql(*values))
+ execute_insert(insert_sql(*values))
end
+ # Use execute_insert to execute the replace_sql.
def replace(*args)
- @db.execute_insert(replace_sql(*args))
+ execute_insert(replace_sql(*args))
+ end
+
+ private
+
+ # Call execute_insert on the database.
+ def execute_insert(sql)
+ @db.execute_insert(sql)
end
end
end
48 lib/sequel_core/adapters/jdbc/postgresql.rb
@@ -4,10 +4,16 @@ module Sequel
Postgres::CONVERTED_EXCEPTIONS << NativeException
module JDBC
+ # Adapter, Database, and Dataset support for accessing a PostgreSQL
+ # database via JDBC.
module Postgres
+ # Methods to add to the JDBC adapter/connection to allow it to work
+ # with the shared PostgreSQL code.
module AdapterMethods
include Sequel::Postgres::AdapterMethods
+ # Give the JDBC adapter a direct execute method, which creates
+ # a statement with the given sql and executes it.
def execute(sql, method=:execute)
method = :executeQuery if block_given?
stmt = createStatement
@@ -21,6 +27,9 @@ def execute(sql, method=:execute)
end
end
+ private
+
+ # JDBC specific method of getting specific values from a result set.
def result_set_values(r, *vals)
return if r.nil?
r.next
@@ -34,22 +43,54 @@ def result_set_values(r, *vals)
end
end
+ # Methods to add to Database instances that access PostgreSQL via
+ # JDBC.
module DatabaseMethods
include Sequel::Postgres::DatabaseMethods
+ # Return an instance of Sequel::JDBC::Postgres::Dataset with the given opts.
def dataset(opts=nil)
Sequel::JDBC::Postgres::Dataset.new(self, opts)
end
+ # Run the INSERT sql on the database and return the primary key
+ # for the record.
+ def execute_insert(sql, table, values)
+ _execute(sql, :type=>:insert, :table=>table, :values=>values)
+ end
+
+ private
+
+ # Extend the adapter with the JDBC PostgreSQL AdapterMethods
def setup_connection(conn)
+ conn = super(conn)
conn.extend(Sequel::JDBC::Postgres::AdapterMethods)
conn
end
+
+ # Call insert_result with the table and values specified in the opts.
+ def last_insert_id(conn, opts)
+ insert_result(conn, opts[:table], opts[:values])
+ end
end
+ # Dataset subclass used for datasets that connect to PostgreSQL via JDBC.
class Dataset < JDBC::Dataset
include Sequel::Postgres::DatasetMethods
+ # Methods to support JDBC PostgreSQL prepared statements
+ module PreparedStatementMethods
+ private
+
+ # Add the table and values to the opts hash so they can later
+ # be pulled out by DatabaseMethods#last_insert_id.
+ def execute_insert(sql, table, values)
+ @db.execute_prepared_statement(self, bind_arguments, :type=>:insert, :table=>table, :values=>values)
+ end
+ end
+
+ # Convert Java::JavaSql::Timestamps and handle SQL::Blobs correctly.
def literal(v)
case v
when SQL::Blob
@@ -60,6 +101,13 @@ def literal(v)
super
end
end
+
+ # Extend the prepared statement created with PreparedStatementMethods.
+ def prepare(type, name, values=nil)
+ ps = super
+ ps.extend(PreparedStatementMethods)
+ ps
+ end
end
end
end
36 lib/sequel_core/adapters/jdbc/sqlite.rb
@@ -2,43 +2,39 @@
module Sequel
module JDBC
+ # Database and Dataset support for SQLite databases accessed via JDBC.
module SQLite
+ # Instance methods for SQLite Database objects accessed via JDBC.
module DatabaseMethods
include Sequel::SQLite::DatabaseMethods
+ # Return a Sequel::JDBC::SQLite::Dataset object with the given opts.
def dataset(opts=nil)
Sequel::JDBC::SQLite::Dataset.new(self, opts)
end
- def execute_insert(sql)
+ private
+
+ # Use last_insert_rowid() to get the last inserted id.
+ def last_insert_id(conn, opts={})
+ stmt = conn.createStatement
begin
- log_info(sql)
- @pool.hold do |conn|
- stmt = conn.createStatement
- begin
- stmt.executeUpdate(sql)
- rs = stmt.executeQuery('SELECT last_insert_rowid()')
- rs.next
- rs.getInt(1)
- rescue NativeException, JavaSQL::SQLException => e
- raise Error, e.message
- ensure
- stmt.close
- end
- end
- rescue NativeException, JavaSQL::SQLException => e
- raise Error, "#{sql}\r\n#{e.message}"
+ rs = stmt.executeQuery('SELECT last_insert_rowid()')
+ rs.next
+ rs.getInt(1)
+ ensure
+ stmt.close
end
end
- private
-
+ # Default to a single connection for a memory database.
def connection_pool_default_options
o = super
uri == 'jdbc:sqlite::memory:' ? o.merge(:max_connections=>1) : o
end
end
-
+
+ # Dataset class for SQLite datasets accessed via JDBC.
class Dataset < JDBC::Dataset
include Sequel::SQLite::DatasetMethods
end
202 lib/sequel_core/adapters/mysql.rb
@@ -1,8 +1,10 @@
require 'mysql'
require 'sequel_core/adapters/shared/mysql'
-# Monkey patch Mysql::Result to yield hashes with symbol keys
+# Add methods to get columns, yield hashes with symbol keys, and do
+# type conversion.
class Mysql::Result
+ # Mapping of type numbers to conversion methods.
MYSQL_TYPES = {
0 => :to_d, # MYSQL_TYPE_DECIMAL
1 => :to_i, # MYSQL_TYPE_TINY
@@ -32,21 +34,8 @@ class Mysql::Result
# 254 => :to_s, # MYSQL_TYPE_STRING
# 255 => :to_s # MYSQL_TYPE_GEOMETRY
}
-
- def convert_type(v, type)
- if v
- if type == 1 && Sequel.convert_tinyint_to_bool
- # We special case tinyint here to avoid adding
- # a method to an ancestor of Fixnum
- v.to_i == 0 ? false : true
- else
- (t = MYSQL_TYPES[type]) ? v.send(t) : v
- end
- else
- nil
- end
- end
-
+
+ # Return an array of column name symbols for this result set.
def columns(with_table = nil)
unless @columns
@column_types = []
@@ -58,18 +47,7 @@ def columns(with_table = nil)
@columns
end
- def each_array(with_table = nil)
- c = columns
- while row = fetch_row
- c.each_with_index do |f, i|
- if (t = MYSQL_TYPES[@column_types[i]]) && (v = row[i])
- row[i] = v.send(t)
- end
- end
- yield row
- end
- end
-
+ # Yield a hash with symbol keys and type converted values.
def sequel_each_hash(with_table = nil)
c = columns
while row = fetch_row
@@ -79,15 +57,40 @@ def sequel_each_hash(with_table = nil)
end
end
+ private
+
+ # Convert the type of v using the method in MYSQL_TYPES[type].
+ def convert_type(v, type)
+ if v
+ if type == 1 && Sequel.convert_tinyint_to_bool
+ # We special case tinyint here to avoid adding
+ # a method to an ancestor of Fixnum
+ v.to_i == 0 ? false : true
+ else
+ (t = MYSQL_TYPES[type]) ? v.send(t) : v
+ end
+ else
+ nil
+ end
+ end
+
end
module Sequel
- module MySQL
+ # Module for holding all MySQL-related classes and modules for Sequel.
+ module MySQL
+ # Database class for MySQL databases used with Sequel.
class Database < Sequel::Database
include Sequel::MySQL::DatabaseMethods
set_adapter_scheme :mysql
-
+
+ # Connect to the database. In addition to the usual database options,
+ # the following options have effect:
+ #
+ # * :encoding, :charset - Set all the related character sets for this
+ # connection (connection, client, database, server, and results).
+ # * :socket - Use a unix socket file instead of connecting via TCP/IP.
def connect
conn = Mysql.init
conn.options(Mysql::OPT_LOCAL_INFILE, "client")
@@ -110,55 +113,104 @@ def connect
conn.query("set character_set_server = '#{encoding}'")
conn.query("set character_set_results = '#{encoding}'")
end
+ conn.meta_eval{attr_accessor :prepared_statements}
+ conn.prepared_statements = {}
conn.reconnect = true
conn
end
+ # Returns an instance of Sequel::MySQL::Dataset with the given options.
def dataset(opts = nil)
MySQL::Dataset.new(self, opts)
end
-
+
+ # Closes all database connections.
def disconnect
@pool.disconnect {|c| c.close}
end
-
- def execute(sql, &block)
+
+ # Executes the given SQL using an available connection, yielding the
+ # connection if the block is given.
+ def execute(sql)
begin
log_info(sql)
- @pool.hold do |conn|
+ synchronize do |conn|
conn.query(sql)
- block[conn] if block
+ yield conn if block_given?
end
rescue Mysql::Error => e
raise Error.new(e.message)
end
end
-
- def execute_select(sql, &block)
+
+ # Executes a prepared statement on an available connection. If the
+ # prepared statement already exists for the connection and has the same
+ # SQL, reuse it, otherwise, prepare the new statement. Because of the
+ # usual MySQL stupidity, we are forced to name arguments via separate
+ # SET queries. Use @sequel_arg_N (for N starting at 1) for these
+ # arguments.
+ def execute_prepared_statement(ps_name, args, opts={})
+ ps = prepared_statements[ps_name]
+ sql = ps.prepared_sql
+ synchronize do |conn|
+ unless conn.prepared_statements[ps_name] == sql
+ conn.prepared_statements[ps_name] = sql
+ s = "PREPARE #{ps_name} FROM '#{::Mysql.quote(sql)}'"
+ log_info(s)
+ conn.query(s)
+ end
+ i = 0
+ args.each do |arg|
+ s = "SET @sequel_arg_#{i+=1} = #{literal(arg)}"
+ log_info(s)
+ conn.query(s)
+ end
+ s = "EXECUTE #{ps_name}#{" USING #{(1..i).map{|j| "@sequel_arg_#{j}"}.join(', ')}" unless i == 0}"
+ log_info(s)
+ conn.query(s)
+ if opts[:select]
+ r = conn.use_result
+ begin
+ yield r
+ ensure
+ r.free
+ end
+ else
+ yield conn if block_given?
+ end
+ end
+ end
+
+ # Execute the given SQL, yielding the result set. Until the block
+ # given returns, no queries can use this connection. For that
+ # reason, if you need to use nested queries in MySQL, you must
+ # get all records at once for the outer queries (e.g.
+ # DB[:i].all{|i| DB[:j].each{|j|}}).
+ def execute_select(sql)
execute(sql) do |c|
r = c.use_result
begin
- block[r]
+ yield r
ensure
r.free
end
end
end
+ # Return the version of the MySQL server to which we are connecting.
def server_version
@server_version ||= (synchronize{|conn| conn.server_version if conn.respond_to?(:server_version)} || super)
end
+ # Return an array of symbols specifying table names in the current database.
def tables
- @pool.hold do |conn|
- conn.list_tables.map {|t| t.to_sym}
- end
+ synchronize{|conn| conn.list_tables.map {|t| t.to_sym}}
end
+ # Support single level transactions on MySQL.
def transaction
- @pool.hold do |conn|
- @transactions ||= []
- return yield(conn) if @transactions.include? Thread.current
+ synchronize do |conn|
+ return yield(conn) if @transactions.include?(Thread.current)
log_info(SQL_BEGIN)
conn.query(SQL_BEGIN)
begin
@@ -180,34 +232,60 @@ def transaction
private
+ # MySQL doesn't need the connection pool to convert exceptions.
def connection_pool_default_options
super.merge(:pool_convert_exceptions=>false)
end
+ # The database name when using the native adapter is always stored in
+ # the :database option.
def database_name
@opts[:database]
end
end
-
+
+ # Dataset class for MySQL datasets accessed via the native driver.
class Dataset < Sequel::Dataset
include Sequel::MySQL::DatasetMethods
-
+
+ # Methods for MySQL prepared statements using the native driver.
+ module PreparedStatementMethods
+ include Sequel::Dataset::UnnumberedArgumentMapper
+
+ # Execute the prepared statement with the bind arguments instead of
+ # the given SQL, yielding the connection to the block.
+ def execute(sql, &block)
+ @db.execute_prepared_statement(prepared_statement_name, bind_arguments, &block)
+ end
+ alias execute_dui execute
+
+ # Execute the prepared statement with the bind arguments instead of
+ # the given SQL, yielding the rows to the block.
+ def execute_select(sql, &block)
+ @db.execute_prepared_statement(prepared_statement_name, bind_arguments, :select=>true, &block)
+ end
+ end
+
+ # Delete rows matching this dataset
def delete(opts = nil)
- @db.execute(delete_sql(opts)) {|c| c.affected_rows}
+ execute(delete_sql(opts)){|c| c.affected_rows}
end
-
+
+ # Yield all rows matching this dataset
def fetch_rows(sql)
- @db.execute_select(sql) do |r|
+ execute_select(sql) do |r|
@columns = r.columns
r.sequel_each_hash {|row| yield row}
end
self
end
-
+
+ # Insert a new value into this dataset
def insert(*values)
- @db.execute(insert_sql(*values)) {|c| c.insert_id}
+ execute(insert_sql(*values)){|c| c.insert_id}
end
+ # Handle correct quoting of strings using ::Mysql.quote.
def literal(v)
case v
when LiteralString
@@ -219,12 +297,30 @@ def literal(v)
end
end
+ # Store the given type of prepared statement in the associated database
+ # with the given name.
+ def prepare(type, name, values=nil)
+ ps = to_prepared_statement(type, values)
+ ps.extend(PreparedStatementMethods)
+ ps.prepared_statement_name = name
+ db.prepared_statements[name] = ps
+ end
+
+ # Replace (update or insert) the matching row.
def replace(*args)
- @db.execute(replace_sql(*args)) {|c| c.insert_id}
+ execute(replace_sql(*args)){|c| c.insert_id}
end
+ # Update the matching rows.
def update(*args)
- @db.execute(update_sql(*args)) {|c| c.affected_rows}
+ execute(update_sql(*args)){|c| c.affected_rows}
+ end
+
+ private
+
+ # Run execute_select with the given SQL against the associated database.
+ def execute_select(sql, &block)
+ @db.execute_select(sql, &block)
end
end
end
7 lib/sequel_core/adapters/oracle.rb
@@ -5,12 +5,6 @@ module Oracle
class Database < Sequel::Database
set_adapter_scheme :oracle
- # AUTO_INCREMENT = 'IDENTITY(1,1)'.freeze
- #
- # def auto_increment_sql
- # AUTO_INCREMENT
- # end
-
def connect
if @opts[:database]
dbname = @opts[:host] ? \
@@ -51,7 +45,6 @@ def table_exists?(name)
def transaction
@pool.hold do |conn|
- @transactions ||= []
if @transactions.include? Thread.current
return yield(conn)
end
View
242 lib/sequel_core/adapters/postgres.rb
@@ -2,16 +2,23 @@
begin
require 'pg'
+ SEQUEL_POSTGRES_USES_PG = true
rescue LoadError => e
- begin
- require 'postgres'
+ SEQUEL_POSTGRES_USES_PG = false
+ begin
+ require 'postgres'
+ # Attempt to get uniform behavior for the PGconn object regardless of
+ # whether pg, postgres, or postgres-pr is used.
class PGconn
unless method_defined?(:escape_string)
if self.respond_to?(:escape)
+ # If there is no escape_string instance method, but there is an
+ # escape class method, use that instead.
def escape_string(str)
self.class.escape(str)
end
else
+ # Raise an error if no valid string escaping method can be found.
def escape_string(obj)
raise Sequel::Error, "string escaping not supported with this postgres driver. Try using ruby-pg, ruby-postgres, or postgres-pr."
end
@@ -19,6 +26,8 @@ def escape_string(obj)
end
unless method_defined?(:escape_bytea)
if self.respond_to?(:escape_bytea)
+ # If there is no escape_bytea instance method, but there is an
+ # escape_bytea class method, use that instead.
def escape_bytea(obj)
self.class.escape_bytea(obj)
end
@@ -27,14 +36,20 @@ def escape_bytea(obj)
require 'postgres-pr/typeconv/conv'
require 'postgres-pr/typeconv/bytea'
extend Postgres::Conversion
+ # If we are using postgres-pr, use the encode_bytea method from
+ # that.
def escape_bytea(obj)
self.class.encode_bytea(obj)
end
metaalias :unescape_bytea, :decode_bytea
rescue
+ # If no valid bytea escaping method can be found, create one that
+ # raises an error
def escape_bytea(obj)
raise Sequel::Error, "bytea escaping not supported with this postgres driver. Try using ruby-pg, ruby-postgres, or postgres-pr."
end
+ # If no valid bytea unescaping method can be found, create one that
+ # raises an error
def self.unescape_bytea(obj)
raise Sequel::Error, "bytea unescaping not supported with this postgres driver. Try using ruby-pg, ruby-postgres, or postgres-pr."
end
@@ -56,8 +71,12 @@ class PGresult
end
module Sequel
+ # Top level module for holding all PostgreSQL-related modules and classes
+ # for Sequel.
module Postgres
CONVERTED_EXCEPTIONS << PGError
+
+ # Hash with integer keys and proc values for converting PostgreSQL types.
PG_TYPES = {
16 => lambda{ |s| Postgres.string_to_bool(s) }, # boolean
17 => lambda{ |s| Adapter.unescape_bytea(s).to_blob }, # bytea
@@ -78,6 +97,7 @@ module Postgres
1700 => lambda{ |s| s.to_d }, # numeric
}
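The lookup itself is simple: a column's type OID indexes into the hash to select a conversion proc for the raw string value. A minimal standalone sketch (with simplified stand-in procs; the real table uses e.g. BigDecimal for numeric):

```ruby
# Simplified stand-in for PG_TYPES: type OID => conversion proc.
types = {
  16 => lambda { |s| s == 't' }, # boolean: PostgreSQL sends 't'/'f'
  23 => lambda { |s| s.to_i },   # int4
}

# Converting raw string values from a result set:
types[23].call("42") # => 42
types[16].call("t")  # => true
types[16].call("f")  # => false
```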
+ # Module method for converting a PostgreSQL string to a boolean value.
def self.string_to_bool(s)
if(s.blank?)
nil
@@ -87,34 +107,43 @@ def self.string_to_bool(s)
false
end
end
-
+
+ # PGconn subclass for connection specific methods used with the
+ # pg, postgres, or postgres-pr driver.
class Adapter < ::PGconn
include Sequel::Postgres::AdapterMethods
self.translate_results = false if respond_to?(:translate_results=)
- def connected?
- status == Adapter::CONNECTION_OK
- end
-
- def execute(sql, &block)
+ # Execute the given SQL with this connection. If a block is given,
+ # yield the results, otherwise, return the number of changed rows.
+ def execute(sql, *args)
q = nil
begin
- q = exec(sql)
+ q = exec(sql, *args)
rescue PGError => e
- unless connected?
- reset
- q = exec(sql)
- else
- raise e
- end
+ raise if status == Adapter::CONNECTION_OK
+ reset
+ q = exec(sql, *args)
end
begin
- block ? block[q] : q.cmd_tuples
+ block_given? ? yield(q) : q.cmd_tuples
ensure
q.clear
end
end
+ if SEQUEL_POSTGRES_USES_PG
+ # Hash of prepared statements for this connection. Keys are
+ # string names of the server side prepared statement, and values
+ # are SQL strings.
+ def prepared_statements
+ @prepared_statements ||= {}
+ end
+ end
+
+ private
+
+ # Return the requested values for the given row.
def result_set_values(r, *vals)
return if r.nil? || (r.ntuples == 0)
case vals.length
@@ -125,12 +154,17 @@ def result_set_values(r, *vals)
end
end
end
-
+
+ # Database class for PostgreSQL databases used with Sequel and the
+ # pg, postgres, or postgres-pr driver.
class Database < Sequel::Database
include Sequel::Postgres::DatabaseMethods
set_adapter_scheme :postgres
-
+
+ # Connects to the database. In addition to the standard database
+ # options, using the :encoding or :charset option changes the
+ # client encoding for the connection.
def connect
conn = Adapter.connect(
@opts[:host] || 'localhost',
@@ -146,38 +180,82 @@ def connect
conn
end
+ # Return instance of Sequel::Postgres::Dataset with the given options.
def dataset(opts = nil)
Postgres::Dataset.new(self, opts)
end
+ # Disconnect all active connections.
def disconnect
@pool.disconnect {|c| c.finish}
end
-
- def execute(sql, &block)
+
+ # Execute the given SQL with the given args on an available connection.
+ def execute(sql, *args, &block)
begin
- log_info(sql)
- @pool.hold {|conn| conn.execute(sql, &block)}
+ log_info(sql, *args)
+ synchronize{|conn| conn.execute(sql, *args, &block)}
rescue => e
log_info(e.message)
raise convert_pgerror(e)
end
end
+ # Execute the prepared statement with the given name on an available
+ # connection, using the given args. If the connection has not prepared
+ # a statement with the given name yet, prepare it. If the connection
+ # has prepared a statement with the same name and different SQL,
+ # deallocate that statement first and then prepare this statement.
+ # If a block is given, yield the result, otherwise, return the number
+ # of rows changed. If the :insert option is passed, return the value
+ # of the primary key for the last inserted row.
+ def execute_prepared_statement(name, args, opts={})
+ ps = prepared_statements[name]
+ sql = ps.prepared_sql
+ ps_name = name.to_s
+ synchronize do |conn|
+ unless conn.prepared_statements[ps_name] == sql
+ if conn.prepared_statements.include?(ps_name)
+ s = "DEALLOCATE #{ps_name}"
+ log_info(s)
+ conn.execute(s) unless conn.prepared_statements[ps_name] == sql
+ end
+ conn.prepared_statements[ps_name] = sql
+ log_info("PREPARE #{ps_name} AS #{sql}")
+ conn.prepare(ps_name, sql)
+ end
+ log_info("EXECUTE #{ps_name}", args)
+ q = conn.exec_prepared(ps_name, args)
+ if opts[:insert]
+ insert_result(conn, *opts[:insert])
+ else
+ begin
+ block_given? ? yield(q) : q.cmd_tuples
+ ensure
+ q.clear
+ end
+ end
+ end
+ end
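The prepare-or-deallocate decision above can be sketched independently of the driver, with a plain hash and log array standing in for the connection's statement cache and the logger (both hypothetical):

```ruby
# Re-prepare only when the cached SQL for a name differs; if an older
# statement with the same name exists, deallocate it first.
def ensure_prepared(cache, log, name, sql)
  return if cache[name] == sql
  log << "DEALLOCATE #{name}" if cache.include?(name)
  cache[name] = sql
  log << "PREPARE #{name} AS #{sql}"
end

cache, log = {}, []
ensure_prepared(cache, log, "ps1", "SELECT 1")
ensure_prepared(cache, log, "ps1", "SELECT 1") # cache hit: no work
ensure_prepared(cache, log, "ps1", "SELECT 2") # same name, new SQL
log
# => ["PREPARE ps1 AS SELECT 1", "DEALLOCATE ps1", "PREPARE ps1 AS SELECT 2"]
```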
+
private
- # PostgreSQL doesn't need the pool to convert exceptions, either.
+ # PostgreSQL doesn't need the connection pool to convert exceptions.
def connection_pool_default_options
super.merge(:pool_convert_exceptions=>false)
end
end
-
+
+ # Dataset class for PostgreSQL datasets that use the pg, postgres, or
+ # postgres-pr driver.
class Dataset < Sequel::Dataset
include Sequel::Postgres::DatasetMethods
-
- def fetch_rows(sql, &block)
+
+ # Yield all rows returned by executing the given SQL and converting
+ # the types.
+ def fetch_rows(sql)
@columns = []
- @db.execute(sql) do |res|
+ execute(sql) do |res|
(0...res.ntuples).each do |recnum|
converted_rec = {}
(0...res.nfields).each do |fieldnum|
@@ -193,6 +271,116 @@ def fetch_rows(sql, &block)
end
end
end
+
+ if SEQUEL_POSTGRES_USES_PG
+
+ PREPARED_ARG_PLACEHOLDER = '$'.lit.freeze
+
+ # PostgreSQL specific argument mapper used for mapping the named
+ # argument hash to an array with numbered arguments. Only used with
+ # the pg driver.
+ module ArgumentMapper
+ include Sequel::Dataset::ArgumentMapper
+
+ protected
+
+ # Return an array of strings built from the hash values, inserting
+ # each at its numbered position in the array.
+ def map_to_prepared_args(hash)
+ array = []
+ @prepared_args.each{|k,v| array[v] = hash[k].to_s}
+ array
+ end
+
+ private
+
+ # PostgreSQL usually requires type information for each of the
+ # arguments to a prepared statement. Handle this by allowing the
+ # named argument to have a __* suffix, with the * being the type.
+ # In the generated SQL, cast the bound argument to that type to
+ # eliminate ambiguity (and prevent PostgreSQL from raising an exception).
+ def prepared_arg(k)
+ y, type = k.to_s.split("__")
+ "#{prepared_arg_placeholder}#{@prepared_args[y.to_sym]}#{"::#{type}" if type}".lit
+ end
+
+ # If the named argument has already been used, return the position in
+ # the output array that it is mapped to. Otherwise, map it to the
+ # next position in the array.
+ def prepared_args_hash
+ max_prepared_arg = 0
+ Hash.new do |h,k|
+ h[k] = max_prepared_arg
+ max_prepared_arg += 1
+ end
+ end
+ end
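The numbering hash and the argument mapping can be exercised on their own. This sketch mirrors the code above in plain Ruby; the stored slot numbers are the 0-based array positions that the driver later binds to PostgreSQL's numbered placeholders:

```ruby
# Default-proc hash: each new key is stored at the next free position,
# and later lookups reuse the stored slot.
max_prepared_arg = 0
numbering = Hash.new do |h, k|
  h[k] = max_prepared_arg
  max_prepared_arg += 1
end

numbering[:name]; numbering[:id] # touch in order of first appearance
numbering[:name]                 # later lookups reuse the stored slot

# map_to_prepared_args equivalent: place each hash value at its slot.
array = []
{:id => 5, :name => "a"}.each { |k, v| array[numbering[k]] = v.to_s }
array # => ["a", "5"]

# The __type suffix parsing used by prepared_arg:
name, type = :n__integer.to_s.split("__") # name "n", type "integer"
```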
+
+ # Allow use of bind arguments for PostgreSQL using the pg driver.
+ module BindArgumentMethods
+ include ArgumentMapper
+
+ private
+
+ # Execute the given SQL with the stored bind arguments.
+ def execute(sql, &block)
+ @db.execute(sql, bind_arguments, &block)
+ end
+ alias execute_dui execute
+
+ # Execute the given SQL with the stored bind arguments, returning
+ # the primary key value for the inserted row.
+ def execute_insert(sql, table, values)
+ @db.execute_insert(sql, table, values, bind_arguments)
+ end
+ end
+
+ # Allow use of server side prepared statements for PostgreSQL using the
+ # pg driver.
+ module PreparedStatementMethods
+ include ArgumentMapper
+
+ private
+
+ # Execute the stored prepared statement with the stored bind
+ # arguments, ignoring the SQL given.
+ def execute(sql, &block)
+ @db.execute_prepared_statement(prepared_statement_name, bind_arguments, &block)
+ end
+ alias execute_dui execute
+
+ # Execute the stored prepared statement with the stored bind
+ # arguments, ignoring the SQL given and returning the primary key
+ # value for the last inserted row.
+ def execute_insert(sql, table, values)
+ @db.execute_prepared_statement(prepared_statement_name, bind_arguments, :insert=>[table, values])
+ end
+ end
+
+ # Execute the given type of statement with the hash of values.
+ def call(type, hash, values=nil, &block)
+ ps = to_prepared_statement(type, values)
+ ps.extend(BindArgumentMethods)
+ ps.call(hash, &block)
+ end
+
+ # Prepare the given type of statement with the given name, and store
+ # it in the database to be called later.
+ def prepare(type, name, values=nil)
+ ps = to_prepared_statement(type, values)
+ ps.extend(PreparedStatementMethods)
+ ps.prepared_statement_name = name
+ db.prepared_statements[name] = ps
+ end
+
+ private
+
+ # PostgreSQL uses $N for placeholders instead of ?, so use a $
+ # as the placeholder.
+ def prepared_arg_placeholder
+ PREPARED_ARG_PLACEHOLDER
+ end
+ end
end
end
end
60 lib/sequel_core/adapters/shared/mysql.rb
@@ -1,5 +1,7 @@
module Sequel
module MySQL
+ # Methods shared by Database instances that connect to MySQL,
+ # currently supported by the native and JDBC adapters.
module DatabaseMethods
AUTO_INCREMENT = 'AUTO_INCREMENT'.freeze
NOT_NULL = Sequel::Schema::SQL::NOT_NULL
@@ -12,6 +14,8 @@ module DatabaseMethods
UNIQUE = Sequel::Schema::SQL::UNIQUE
UNSIGNED = Sequel::Schema::SQL::UNSIGNED
+ # Use MySQL specific syntax for rename column, set column type, and
+ # drop index cases.
def alter_table_sql(table, op)
type = type_literal(op[:type])
type << '(255)' if type == 'varchar'
@@ -27,10 +31,12 @@ def alter_table_sql(table, op)
end
end
+ # Use MySQL specific AUTO_INCREMENT text.
def auto_increment_sql
AUTO_INCREMENT
end
+ # Handle MySQL specific column syntax (not sure why).
def column_definition_sql(column)
if column[:type] == :check
return constraint_definition_sql(column)
@@ -53,7 +59,8 @@ def column_definition_sql(column)
end
sql
end
-
+
+ # Handle MySQL specific index SQL syntax
def index_definition_sql(table_name, index)
index_name = index[:name] || default_index_name(table_name, index[:columns])
unique = "UNIQUE " if index[:unique]
@@ -69,16 +76,14 @@ def index_definition_sql(table_name, index)
end
end
- def serial_primary_key_options
- {:primary_key => true, :type => :integer, :auto_increment => true}
- end
-
+ # Get the version of the MySQL server, used for determining capabilities.
def server_version
m = /(\d+)\.(\d+)\.(\d+)/.match(get(:version[]))
@server_version ||= (m[1].to_i * 10000) + (m[2].to_i * 100) + m[3].to_i
end
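The packing scheme above turns each version component into a fixed decimal field, so version ordering matches plain integer comparison (e.g. 5.0.51 packs to 50051, which is less than 50112 for 5.1.12). Applied to a literal version string:

```ruby
# Pack "major.minor.patch" into a single comparable integer.
m = /(\d+)\.(\d+)\.(\d+)/.match("5.0.51")
version = (m[1].to_i * 10000) + (m[2].to_i * 100) + m[3].to_i
version # => 50051
```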
- # Changes the database in use by issuing a USE statement.
+ # Changes the database in use by issuing a USE statement. I would be
+ # very careful if I used this.
def use(db_name)
disconnect
@opts[:database] = db_name if self << "USE #{db_name}"
@@ -88,12 +93,17 @@ def use(db_name)
private
+ # Always quote identifiers for the schema parser dataset.
def schema_ds_dataset
ds = schema_utility_dataset.clone
ds.quote_identifiers = true
ds
end
+ # Allow other databases' schemas to be queried using the :database
+ # option. Allow all databases' schemas to be queried by setting
+ # the :database option to nil. If the :database option is not specified,
+ # use the currently connected database.
def schema_ds_filter(table_name, opts)
filt = super
# Restrict it to the given or current database, unless specifically requesting :database = nil
@@ -101,16 +111,20 @@ def schema_ds_filter(table_name, opts)
filt
end
+ # MySQL doesn't support table catalogs, so just join on schema and table name.
def schema_ds_join(table_name, opts)
[:information_schema__columns, {:table_schema => :table_schema, :table_name => :table_name}, :c]
end
end
+ # Dataset methods shared by datasets that use MySQL databases.
module DatasetMethods
BOOL_TRUE = '1'.freeze
BOOL_FALSE = '0'.freeze
COMMA_SEPARATOR = ', '.freeze
+ # MySQL specific syntax for LIKE/REGEXP searches, as well as
+ # string concatenation.
def complex_expression_sql(op, args)
case op
when :~, :'!~', :'~*', :'!~*', :LIKE, :'NOT LIKE', :ILIKE, :'NOT ILIKE'
@@ -141,6 +155,7 @@ def delete_sql(opts = nil)
sql
end
+ # MySQL specific full text search syntax.
def full_text_search(cols, terms, opts = {})
mode = opts[:boolean] ? " IN BOOLEAN MODE" : ""
s = if Array === terms
@@ -160,34 +175,22 @@ def having(*cond, &block)
@opts[:having] = {}
x = filter(*cond, &block)
end
-
+
+ # MySQL doesn't use the SQL standard DEFAULT VALUES.
def insert_default_values_sql
"INSERT INTO #{source_list(@opts[:from])} () VALUES ()"
end
- # Returns a join clause based on the specified join type
- # and condition. MySQL's NATURAL join is 'semantically
- # equivalent to a JOIN with a USING clause that names all
- # columns that exist in both tables. The constraint
- # expression may be nil, so join expression can accept two
- # arguments.
- #
- # === Note
- # Full outer joins (:full_outer) are not implemented in
- # MySQL (as of v6.0), nor is there currently a work around
- # implementation in Sequel. Straight joins with 'ON
- # <condition>' are not yet implemented.
- #
- # === Example
- # @ds = MYSQL_DB[:nodes]
- # @ds.join_table(:natural_left_outer, :nodes)
- # # join SQL is 'NATURAL LEFT OUTER JOIN nodes'
+ # Transforms a CROSS JOIN to an INNER JOIN if the expr is not nil.
+ # Raises an error on use of :full_outer type, since MySQL doesn't support it.
def join_table(type, table, expr=nil, table_alias=nil)
type = :inner if (type == :cross) && !expr.nil?
- raise(Sequel::Error::InvalidJoinType, "MySQL doesn't support FULL OUTER JOIN") if type == :full_outer
+ raise(Sequel::Error, "MySQL doesn't support FULL OUTER JOIN") if type == :full_outer
super(type, table, expr, table_alias)
end
+ # Transforms :natural_inner to NATURAL LEFT JOIN and :straight to
+ # STRAIGHT_JOIN.
def join_type_sql(join_type)
case join_type
when :straight then 'STRAIGHT_JOIN'
@@ -195,7 +198,8 @@ def join_type_sql(join_type)
else super
end
end
-
+
+ # Override the default boolean values.
def literal(v)
case v
when true
@@ -207,16 +211,20 @@ def literal(v)
end
end
+ # MySQL specific syntax for inserting multiple values at once.
def multi_insert_sql(columns, values)
columns = column_list(columns)
values = values.map {|r| literal(Array(r))}.join(COMMA_SEPARATOR)
["INSERT INTO #{source_list(@opts[:from])} (#{columns}) VALUES #{values}"]
end
+ # MySQL uses the nonstandard ` (backtick) for quoting identifiers.
def quoted_identifier(c)
"`#{c}`"
end
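The single multi-row statement that multi_insert_sql builds above can be sketched with plain strings (literal quoting elided, table name hypothetical):

```ruby
# Build one INSERT with a comma-separated list of row tuples.
columns = [:a, :b].join(', ')
values = [[1, 2], [3, 4]].map { |r| "(#{r.join(', ')})" }.join(', ')
sql = "INSERT INTO t (#{columns}) VALUES #{values}"
# => "INSERT INTO t (a, b) VALUES (1, 2), (3, 4)"
```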
+ # MySQL specific syntax for REPLACE (aka UPSERT, or update if exists,
+ # insert if it doesn't).
def replace_sql(*values)
from = source_list(@opts[:from])
if values.empty?
105 lib/sequel_core/adapters/shared/postgres.rb
@@ -1,7 +1,10 @@
module Sequel
module Postgres
+ # Array of exceptions that need to be converted. JDBC
+ # uses NativeException, the native adapter uses PGError.
CONVERTED_EXCEPTIONS = []
-
+
+ # Methods shared by adapter/connection instances.
module AdapterMethods
SELECT_CURRVAL = "SELECT currval('%s')".freeze
SELECT_PK = <<-end_sql
@@ -45,8 +48,11 @@ module AdapterMethods
AND dep.refobjid = '%s'::regclass
end_sql
+ # Depth of the current transaction on this connection, used
+ # to implement multi-level transactions with savepoints.
attr_accessor :transaction_depth
+ # Get the last inserted value for the given table.
def last_insert_id(table)
@table_sequences ||= {}
if !@table_sequences.include?(table)
@@ -63,6 +69,7 @@ def last_insert_id(table)
end
end
+ # Get the primary key and sequence for the given table.
def pkey_and_sequence(table)
execute(SELECT_PK_AND_SERIAL_SEQUENCE % table) do |r|
vals = result_set_values(r, 2, 2)
@@ -74,14 +81,17 @@ def pkey_and_sequence(table)
end
end
+ # Get the primary key for the given table.
def primary_key(table)
execute(SELECT_PK % table) do |r|
result_set_values(r, 0)
end
end
end
-
+
+ # Methods shared by Database instances that connect to PostgreSQL.
module DatabaseMethods
+ PREPARED_ARG_PLACEHOLDER = '$'.lit.freeze
RE_CURRVAL_ERROR = /currval of sequence "(.*)" is not yet defined in this session/.freeze
RELATION_QUERY = {:from => [:pg_class], :select => [:relname]}.freeze
RELATION_FILTER = "(relkind = 'r') AND (relname !~ '^pg|sql')".freeze
@@ -93,15 +103,18 @@ module DatabaseMethods
SQL_RELEASE_SAVEPOINT = 'RELEASE SAVEPOINT autopoint_%d'.freeze
SYSTEM_TABLE_REGEXP = /^pg|sql/.freeze
+ # Always CASCADE the table drop
def drop_table_sql(name)
"DROP TABLE #{name} CASCADE"
end
- def execute_insert(sql, table, values)
+ # Insert the values into the table and return the primary key (if
+ # automatically generated).
+ def execute_insert(sql, table, values, *bind_arguments)
begin
- log_info(sql)
- @pool.hold do |conn|
- conn.execute(sql)
+ log_info(sql, *bind_arguments)
+ synchronize do |conn|
+ conn.execute(sql, *bind_arguments)
insert_result(conn, table, values)
end
rescue => e
@@ -110,6 +123,7 @@ def execute_insert(sql, table, values)
end
end
+ # PostgreSQL specific index SQL.
def index_definition_sql(table_name, index)
index_name = index[:name] || default_index_name(table_name, index[:columns])
expr = literal(Array(index[:columns]))
@@ -129,6 +143,11 @@ def index_definition_sql(table_name, index)
"CREATE #{unique}INDEX #{index_name} ON #{table_name} #{"USING #{index_type} " if index_type}#{expr}#{filter}"
end
+ # The result of the insert for the given table and values. Uses the
+ # last insert id for the table's primary key if it exists,
+ # otherwise determines the primary key for the table and uses the
+ # value of the hash key. If values is an array, assume the first
+ # value is the primary key value and return that.
def insert_result(conn, table, values)
begin
result = conn.last_insert_id(table)
@@ -147,21 +166,28 @@ def insert_result(conn, table, values)
end
end
+ # Dataset containing all current database locks
def locks
- dataset.from("pg_class, pg_locks").
- select("pg_class.relname, pg_locks.*").
- filter("pg_class.relfilenode=pg_locks.relation")
+ dataset.from(:pg_class, :pg_locks).
+ select(:pg_class__relname, :pg_locks.*).
+ filter(:pg_class__relfilenode=>:pg_locks__relation)
end
-
+
+ # Returns the primary key for the given table. This information is
+ # cached, and if the primary key for a table is changed, the
+ # @primary_keys instance variable should be reset manually.
def primary_key_for_table(conn, table)
@primary_keys ||= {}
@primary_keys[table] ||= conn.primary_key(table)
end
+ # PostgreSQL uses the SERIAL pseudo-type instead of AUTOINCREMENT for
+ # managing incrementing primary keys.
def serial_primary_key_options
{:primary_key => true, :type => :serial}
end
+ # The version of the PostgreSQL server, used for determining capability.
def server_version
return @server_version if @server_version
@server_version = pool.hold do |conn|
@@ -174,12 +200,14 @@ def server_version
@server_version
end
+ # Array of symbols specifying table names in the current database.
def tables
- dataset(RELATION_QUERY).filter(RELATION_FILTER).map {|r| r[:relname].to_sym}
+ dataset(RELATION_QUERY).filter(RELATION_FILTER).map{|r| r[:relname].to_sym}
end
+ # PostgreSQL supports multi-level transactions using save points.
def transaction
- @pool.hold do |conn|
+ synchronize do |conn|
conn.transaction_depth = 0 if conn.transaction_depth.nil?
if conn.transaction_depth > 0
log_info(SQL_SAVEPOINT % conn.transaction_depth)
@@ -222,10 +250,20 @@ def transaction
private
+ # Convert the exception to a Sequel::Error if it is in CONVERTED_EXCEPTIONS.
def convert_pgerror(e)
e.is_one_of?(*CONVERTED_EXCEPTIONS) ? Error.new(e.message) : e
end
-
+
+ # Use a dollar sign instead of question mark for the argument
+ # placeholder.
+ def prepared_arg_placeholder
+ PREPARED_ARG_PLACEHOLDER
+ end
+
+ # When the :schema option is used, use the given schema.
+ # When the :schema option is nil, return results for all schemas.
+ # If the :schema option is not used, use the public schema.
def schema_ds_filter(table_name, opts)
filt = super
# Restrict it to the given or public schema, unless specifically requesting :schema = nil
@@ -233,7 +271,8 @@ def schema_ds_filter(table_name, opts)
filt
end
end
-
+
+ # Instance methods for datasets that connect to a PostgreSQL database.
module DatasetMethods
ACCESS_SHARE = 'ACCESS SHARE'.freeze
ACCESS_EXCLUSIVE = 'ACCESS EXCLUSIVE'.freeze
@@ -253,7 +292,8 @@ module DatasetMethods
SHARE = 'SHARE'.freeze
SHARE_ROW_EXCLUSIVE = 'SHARE ROW EXCLUSIVE'.freeze
SHARE_UPDATE_EXCLUSIVE = 'SHARE UPDATE EXCLUSIVE'.freeze
-
+
+ # Return the results of an ANALYZE query as a string
def analyze(opts = nil)
analysis = []
fetch_rows(EXPLAIN_ANALYZE + select_sql(opts)) do |r|
@@ -262,6 +302,7 @@ def analyze(opts = nil)
analysis.join("\r\n")
end
+ # Return the results of an EXPLAIN query as a string
def explain(opts = nil)
analysis = []
fetch_rows(EXPLAIN + select_sql(opts)) do |r|
@@ -270,14 +311,18 @@ def explain(opts = nil)
analysis.join("\r\n")
end
+ # Return a cloned dataset with a :share lock type.
def for_share
clone(:lock => :share)
end
-
+
+ # Return a cloned dataset with a :update lock type.
def for_update
clone(:lock => :update)
end
-
+
+ # PostgreSQL specific full text search syntax, using tsearch2 (included
+ # in 8.3 by default, and available for earlier versions as an add-on).
def full_text_search(cols, terms, opts = {})
lang = opts[:language] ? "#{literal(opts[:language])}, " : ""
cols = cols.is_a?(Array) ? cols.map {|c| literal(c)}.join(" || ") : literal(cols)
@@ -285,11 +330,14 @@ def full_text_search(cols, terms, opts = {})
filter("to_tsvector(#{lang}#{cols}) @@ to_tsquery(#{lang}#{terms})")
end
+ # Insert given values into the database.
def insert(*values)
- @db.execute_insert(insert_sql(*values), source_list(@opts[:from]),
+ execute_insert(insert_sql(*values), source_list(@opts[:from]),
values.size == 1 ? values.first : values)
end
-
+
+ # Handle microseconds for Time and DateTime values, as well as PostgreSQL
+ # specific boolean values and string escaping.
def literal(v)
case v
when LiteralString
@@ -310,11 +358,11 @@ def literal(v)
end
# Locks the table with the specified mode.
- def lock(mode, &block)
+ def lock(mode)
sql = LOCK % [source_list(@opts[:from]), mode]
@db.synchronize do
- if block # perform locking inside a transaction and yield to block
- @db.transaction {@db.execute(sql); yield}
+ if block_given? # perform locking inside a transaction and yield to block
+ @db.transaction{@db.execute(sql); yield}
else
@db.execute(sql) # lock without a transaction
self
@@ -322,6 +370,7 @@ def lock(mode, &block)
end
end
+ # For PostgreSQL versions >= 8.2, allow inserting multiple rows at once.
def multi_insert_sql(columns, values)
return super if @db.server_version < 80200
@@ -331,10 +380,13 @@ def multi_insert_sql(columns, values)
["INSERT INTO #{source_list(@opts[:from])} (#{columns}) VALUES #{values}"]
end
+ # PostgreSQL assumes unquoted identifiers are lower case by default,
+ # so do not upcase the identifier when quoting it.
def quoted_identifier(c)
"\"#{c}\""
end
-
+
+ # Support lock mode, allowing FOR SHARE and FOR UPDATE queries.
def select_sql(opts = nil)
row_lock_mode = opts ? opts[:lock] : @opts[:lock]
sql = super
@@ -346,6 +398,13 @@ def select_sql(opts = nil)
end
sql
end
+
+ private
+
+ # Call execute_insert on the database object with the given values.
+ def execute_insert(sql, table, values)
+ @db.execute_insert(sql, table, values)
+ end
end
end
end
57 lib/sequel_core/adapters/shared/sqlite.rb
@@ -7,6 +7,10 @@ module DatabaseMethods
TABLES_FILTER = "type = 'table' AND NOT name = 'sqlite_sequence'"
TEMP_STORE = {'0' => :default, '1' => :file, '2' => :memory}.freeze
+ # SQLite supports limited table modification. You can add a column
+ # or an index. Dropping columns is emulated by copying the table into
+ # a temporary table, dropping the original table, and creating a new
+ # table without the column, all inside a transaction.
def alter_table_sql(table, op)
case op[:op]
when :add_column
@@ -28,51 +32,59 @@ def alter_table_sql(table, op)
end
end
+ # A symbol signifying the value of the auto_vacuum PRAGMA.
def auto_vacuum
AUTO_VACUUM[pragma_get(:auto_vacuum).to_s]
end
+ # Set the auto_vacuum PRAGMA using the given symbol (:none, :full, or
+ # :incremental).
def auto_vacuum=(value)
value = AUTO_VACUUM.key(value) || (raise Error, "Invalid value for auto_vacuum option. Please specify one of :none, :full, :incremental.")
pragma_set(:auto_vacuum, value)
end
+ # Get the value of the given PRAGMA.
def pragma_get(name)
self["PRAGMA #{name}"].single_value
end
+ # Set the value of the given PRAGMA to value.
def pragma_set(name, value)
execute_ddl("PRAGMA #{name} = #{value}")
end
- def serial_primary_key_options
- {:primary_key => true, :type => :integer, :auto_increment => true}
- end
-
+ # A symbol signifying the value of the synchronous PRAGMA.
def synchronous
SYNCHRONOUS[pragma_get(:synchronous).to_s]
end
+ # Set the synchronous PRAGMA using the given symbol (:off, :normal, or :full).
def synchronous=(value)
value = SYNCHRONOUS.key(value) || (raise Error, "Invalid value for synchronous option. Please specify one of :off, :normal, :full.")
pragma_set(:synchronous, value)
end
-
+
+ # Array of symbols specifying the table names in the current database.
def tables
self[:sqlite_master].filter(TABLES_FILTER).map {|r| r[:name].to_sym}
end
+ # A symbol signifying the value of the temp_store PRAGMA.
def temp_store
TEMP_STORE[pragma_get(:temp_store).to_s]
end
+ # Set the temp_store PRAGMA using the given symbol (:default, :file, or :memory).
def temp_store=(value)
value = TEMP_STORE.key(value) || (raise Error, "Invalid value for temp_store option. Please specify one of :default, :file, :memory.")
pragma_set(:temp_store, value)
end
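The PRAGMA getters and setters above are two directions of the same table lookup; the setters use Hash#key for the reverse mapping (this sketch assumes Ruby 1.9+, where Hash#key replaced Hash#index):

```ruby
# Same mapping as the TEMP_STORE constant above.
TEMP_STORE = {'0' => :default, '1' => :file, '2' => :memory}.freeze

TEMP_STORE['1']        # getter direction: PRAGMA value to symbol, :file
TEMP_STORE.key(:file)  # setter direction: symbol to PRAGMA value, "1"
TEMP_STORE.key(:bogus) # nil, which triggers the raise in the setter
```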
private
-
+
+ # SQLite supports schema parsing using the table_info PRAGMA, so
+ # parse the output of that into the format Sequel expects.
def schema_parse_table(table_name, opts)
rows = self["PRAGMA table_info(?)", table_name].collect do |row|
row.delete(:cid)
@@ -92,15 +104,21 @@ def schema_parse_table(table_name, opts)
end
schema_parse_rows(rows)
end
-
+
+ # SQLite doesn't support getting the schema of all tables at once,
+ # so loop through the output of #tables to get them.
def schema_parse_tables(opts)
schemas = {}
tables.each{|table| schemas[table] = schema_parse_table(table, opts)}
schemas
end
end
-
- module DatasetMethods
+
+ # Instance methods for datasets that connect to an SQLite database
+ module DatasetMethods
+ # SQLite does not support pattern matching via regular expressions.
+ # SQLite is case insensitive (depending on pragma), so use LIKE for
+ # ILIKE.
def complex_expression_sql(op, args)
case op
when :~, :'!~', :'~*', :'!~*'
@@ -113,23 +131,28 @@ def complex_expression_sql(op, args)
end
end
+ # SQLite performs a TRUNCATE style DELETE if no filter is specified.
+ # Since we want to always return the count of records, do a specific
+ # count in the case of no filter.
def delete(opts = nil)
# check if no filter is specified
unless (opts && opts[:where]) || @opts[:where]
@db.transaction do
unfiltered_count = count
- @db.execute_dui delete_sql(opts)
+ execute_dui delete_sql(opts)
unfiltered_count
end
else
- @db.execute_dui delete_sql(opts)
+ execute_dui delete_sql(opts)
end
end
+ # Insert the values into the database.
def insert(*values)
- @db.execute_insert insert_sql(*values)
+ execute_insert insert_sql(*values)
end
-
+
+ # Allow inserting of values directly from a dataset.
def insert_sql(*values)
if (values.size == 1) && values.first.is_a?(Sequel::Dataset)
"INSERT INTO #{source_list(@opts[:from])} #{values.first.sql};"
@@ -138,9 +161,17 @@ def insert_sql(*values)
end
end
+ # SQLite uses the nonstandard ` (backtick) for quoting identifiers.
def quoted_identifier(c)
"`#{c}`"
end
+
+ private
+
+ # Call execute_insert on the database with the given SQL.
+ def execute_insert(sql)
+ @db.execute_insert(sql)
+ end
end
end
end
182 lib/sequel_core/adapters/sqlite.rb
@@ -2,18 +2,28 @@
require 'sequel_core/adapters/shared/sqlite'
module Sequel
+ # Top level module for holding all SQLite-related modules and classes
+ # for Sequel.
module SQLite
+ # Database class for SQLite databases used with Sequel and the
+ # ruby-sqlite3 driver.
class Database < Sequel::Database
include ::Sequel::SQLite::DatabaseMethods
set_adapter_scheme :sqlite
+ # Mimic the file:// URI, where two preceding slashes specify a relative
+ # path and three preceding slashes specify an absolute path.
def self.uri_to_options(uri) # :nodoc:
{ :database => (uri.host.nil? && uri.path == '/') ? nil : "#{uri.host}#{uri.path}" }
end
-
+
private_class_method :uri_to_options
-
+
+ # Connect to the database. Since SQLite is a file based database,
+ # the only options available are :database (to specify the database
+ # name), and :timeout, to specify how long to wait for the database to
+ # be available if it is locked (default is 5 seconds).
def connect
@opts[:database] = ':memory:' if @opts[:database].blank?
db = ::SQLite3::Database.new(@opts[:database])
@@ -26,58 +36,45 @@ def connect
db
end
+ # Return instance of Sequel::SQLite::Dataset with the given options.
def dataset(opts = nil)
SQLite::Dataset.new(self, opts)
end
+ # Disconnect all connections from the database.
def disconnect
@pool.disconnect {|c| c.close}
end
-
- def execute(sql)
- begin
- log_info(sql)
- @pool.hold {|conn| conn.execute_batch(sql); conn.changes}
- rescue SQLite3::Exception => e
- raise Error::InvalidStatement, "#{sql}\r\n#{e.message}"
- end