…efore setting up a database connection
where raise_on_save_failure is set and a record save fails because the validations did not pass
It looks like using extended strings for blobs was a bad idea. Apparently, they were fine with standard strings. The use of extended strings caused problems when standard_conforming_strings was ON (which is the default in Sequel).
…st (Fixes #255)

Before, some adapters (at least PostgreSQL and SQLite) wouldn't raise an error if given a table that didn't exist. Instead, you would end up with an empty schema for those tables. Because the schema existed, even if it was empty, Database#table_exists? would return true if asked about those tables. Generally, this bug would show up in the following code:

    class Blah < Sequel::Model
    end
    Blah.table_exists? # True even if blahs is not a table

This makes Database#schema raise an error if the adapter gives it an empty schema description, since a database table should have at least one column. This prevents the empty description from being added to the schema hash, which keeps table_exists? from giving the wrong answer when asked if the table exists.
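The guard described above boils down to something like this pure-Ruby sketch (the SCHEMAS store, error class, and method bodies are hypothetical stand-ins, not Sequel's actual code):

```ruby
# Toy stand-in for an adapter's schema lookup: a missing table
# yields an empty column list instead of an error.
SCHEMAS = { items: [[:id, { type: :integer }]] }

class SchemaError < StandardError; end

# A real table must have at least one column, so treat an empty
# description as an error instead of caching it in the schema hash.
def schema(table)
  cols = SCHEMAS.fetch(table, [])
  raise SchemaError, "no column information for #{table}" if cols.empty?
  cols
end

def table_exists?(table)
  schema(table)
  true
rescue SchemaError
  false
end

table_exists?(:items) # => true
table_exists?(:blahs) # => false, instead of the old false positive
```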
…r overloading (Thanks bougyman)
…en't model plugins, and add my sequel_postgresql_triggers extension
…nguages on PostgreSQL

Fairly straightforward code to support the CREATE FUNCTION, CREATE TRIGGER, CREATE LANGUAGE, DROP FUNCTION, DROP TRIGGER, and DROP LANGUAGE commands on PostgreSQL. Includes some decent but not all-inclusive specs.
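For illustration, the kind of SQL string such a command wraps could be assembled like this (a hand-rolled sketch; the method name, option keys, and defaults here are hypothetical, not Sequel's actual API):

```ruby
# Hypothetical sketch of building a PostgreSQL CREATE FUNCTION
# statement from options; not Sequel's actual implementation.
def create_function_sql(name, definition, args: [], returns: 'void', language: 'plpgsql')
  "CREATE FUNCTION #{name}(#{args.join(', ')}) RETURNS #{returns} " \
  "LANGUAGE #{language} AS $$#{definition}$$"
end

create_function_sql('set_updated_at',
  'BEGIN NEW.updated_at := now(); RETURN NEW; END;',
  returns: 'trigger')
# => "CREATE FUNCTION set_updated_at() RETURNS trigger LANGUAGE plpgsql AS $$...$$"
```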
…n's Merb And Sequel presentation
This shows that while 100% code coverage may not fix all bugs, it certainly helps find some.
Note to self: using ||= with a boolean variable is a recipe for problems.
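The pitfall: ||= treats a legitimately false value the same as "not yet computed", so a memoized boolean that is false gets recomputed on every call. A minimal pure-Ruby illustration (the class and method names are made up):

```ruby
class Config
  attr_reader :lookups

  def initialize
    @lookups = 0
  end

  # Broken memoization: when the expensive check returns false,
  # @enabled ||= ... sees a falsy value and runs the check again.
  def enabled?
    @enabled ||= expensive_check
  end

  # Correct: test whether the value has been computed,
  # not whether it is truthy.
  def enabled_fixed?
    return @enabled_fixed if defined?(@enabled_fixed)
    @enabled_fixed = expensive_check
  end

  private

  def expensive_check
    @lookups += 1
    false
  end
end

c = Config.new
3.times { c.enabled? }
c.lookups # => 3, the check ran every time

c2 = Config.new
3.times { c2.enabled_fixed? }
c2.lookups # => 1, computed once and remembered even though it is false
```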
… and placeholders
…statements

Before, the following two types of prepared statements did not work:

    DB[:items].filter("id = ?", :$i).call(:select, :i=>1)
    DB[:items].filter(:id=>DB[:items].select(:id).filter(:id=>:$i)).call(:select, :i=>1)

The first issue is because a placeholder string was literalized immediately, before the dataset was extended with the prepared statement code. The second issue is because the arguments given in the main prepared statement weren't passed into any subselects. This commit fixes both of those issues. It also makes the name argument to Dataset#prepare optional.

Fixing the first issue is done by adding an SQL::PlaceholderLiteralString class that holds the string with placeholders as well as the arguments, and not literalizing them until the SQL string is needed.

Fixing the second issue was a lot more work. It is done by adding a private Dataset#subselect_sql method that #literal calls, and overriding it in the PreparedStatement module that extends the dataset; the override takes the subselect dataset, turns it into a prepared statement, and does the magic necessary to pass the args in (if the default emulated support is used). It required changes to the argument mappers so they didn't rely on instance variables. Instead of using a hash, they now use an array that is shared with any subselects. The mapping code is simpler and the code in general is more generic. This does away with prepared_args_hash, as it is no longer necessary.
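The delayed-literalization idea can be sketched in plain Ruby (a simplified stand-in for SQL::PlaceholderLiteralString, not the real class): the placeholder string and its arguments are stored together, and the SQL is only produced on demand, so a prepared-statement layer can substitute its own argument handling first.

```ruby
# Simplified stand-in: keep the raw string and its arguments together,
# and only substitute the placeholders when the SQL string is needed.
class PlaceholderLiteralString
  attr_reader :str, :args

  def initialize(str, args)
    @str = str
    @args = args
  end

  # Literalize lazily: each ? is replaced via the given quoting block,
  # which a prepared-statement layer could override.
  def to_sql(&literalizer)
    literalizer ||= ->(v) { v.is_a?(String) ? "'#{v}'" : v.to_s }
    i = -1
    @str.gsub('?') { literalizer.call(@args[i += 1]) }
  end
end

pls = PlaceholderLiteralString.new('id = ? AND name = ?', [1, 'foo'])
pls.to_sql # => "id = 1 AND name = 'foo'"

# The same object can be rendered differently later, e.g. mapping each
# argument to a numbered bind variable instead of inlining it:
n = 0
pls.to_sql { |_| "$#{n += 1}" } # => "id = $1 AND name = $2"
```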
…pter supports it

This adds a Sequel::DatabaseDisconnectError (subclass of Sequel::DatabaseError) for signaling to the connection pool that the connection was lost. It changes the connection pool code to rescue that exception and remove the offending connection from the connection pool.

In order to implement this, a disconnection_proc had to be added to the connection pool. Sequel uses a generic one that calls Database#disconnect_connection. disconnection_proc is called both when connections are removed because the connection was lost and when Database#disconnect is called. Database#disconnect now calls @pool.disconnect, which now uses the disconnection_proc if no block is provided. All adapters have been modified to remove Database#disconnect and define Database#disconnect_connection, which was fairly easy since all defined #disconnect methods just called @pool.disconnect with a block that disconnected each connection.

The only adapter that currently supports this is PostgreSQL. The PostgreSQL adapter used to silently attempt to reconnect, which might have caused the same SQL to be used twice. I'm not sure it could have happened, but I'm not sure it couldn't have happened either. Now, if the database connection is lost, Sequel raises DatabaseDisconnectError, the connection pool removes the connection from the pool, and the error is raised to the application. If the application wants to continue, it can always retry.

While mucking in the connection pool, I found a bug where the wrong key could be used when new connections were created. This wasn't a huge issue in most cases, but it could have caused as many as twice the number of max_connections connections to be created.
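The pool-side mechanics can be sketched with a toy pool in plain Ruby (hypothetical class names; not Sequel's implementation): the pool rescues the disconnect error, hands the dead connection to the disconnection_proc, drops it from the pool, and re-raises so the application can decide whether to retry.

```ruby
class DatabaseError < StandardError; end
class DatabaseDisconnectError < DatabaseError; end

# Toy connection pool illustrating the disconnection_proc idea.
class TinyPool
  attr_reader :available

  def initialize(disconnection_proc, &make_conn)
    @disconnection_proc = disconnection_proc
    @make_conn = make_conn
    @available = []
  end

  def hold
    conn = @available.pop || @make_conn.call
    begin
      result = yield conn
      @available.push(conn)
      result
    rescue DatabaseDisconnectError
      # Connection was lost: hand it to the disconnection_proc,
      # leave it out of the pool, and re-raise to the application.
      @disconnection_proc.call(conn)
      raise
    end
  end

  # Orderly shutdown uses the same proc for every pooled connection.
  def disconnect
    @available.each { |c| @disconnection_proc.call(c) }
    @available.clear
  end
end

closed = []
pool = TinyPool.new(->(c) { closed << c }) { Object.new }

pool.hold { :ok } # healthy use: connection goes back into the pool
begin
  pool.hold { raise DatabaseDisconnectError }
rescue DatabaseDisconnectError
  # the application sees the error and may retry
end
closed.size         # => 1: the dead connection went to the proc
pool.available.size # => 0: and was removed from the pool
```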
) This commit adds support for database stored procedures, with an API similar to Sequel's prepared statement support, and implemented internally in a similar way. While it is directly callable on the Database object (via #call_sproc), that use is discouraged. Instead it should be used at the dataset level with the following API:

    DB[:table].call_sproc(:select, :mysp, 'first param', 'second param')
    # or
    sp = DB[:table].prepare_sproc(:select, :mysp)
    sp.call('first param', 'second param')
    sp.call('third param', 'fourth param')

The only adapters with support for this are MySQL and JDBC (if using a database that supports it). Other databases don't even expose this API. Adding support to other databases should be fairly easy, though I have no plans to do so at present.

The stored procedure implementation is similar to the prepared statement implementation at the Dataset level: a clone of the dataset is returned, with Dataset#execute and related methods overridden to add options that are used by Database#execute to send the request to the Dataset#call_sproc method.

While working on stored procedure support, it became necessary to fix the MySQL adapter to make it handle multiple results, since MySQL stored procedures require that. This also fixed issues with using multiple statements at once in the MySQL adapter. Before, this would cause a "commands out of sync" error message that wasn't easily recoverable from. The MySQL adapter now supports this, though the JDBC adapter connecting to MySQL still barfs when you attempt it.

Additionally, the socket tests in the MySQL adapter were fixed to use the same user, password, and database.
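The two usage styles above can be mimicked with a toy dataset in plain Ruby (hypothetical internals, just to show the call pattern; not Sequel's implementation):

```ruby
# Toy sketch of the dataset-level stored procedure API: prepare_sproc
# returns an object bound to a procedure name whose #call forwards the
# arguments, mirroring the prepared statement design.
class ToyDataset
  def initialize(table)
    @table = table
  end

  def call_sproc(type, name, *args)
    execute_sproc(type, name, args)
  end

  def prepare_sproc(type, name)
    BoundProcedure.new(self, type, name)
  end

  # Stand-in for the adapter hook that would actually run CALL name(...).
  def execute_sproc(type, name, args)
    "#{type}: CALL #{name}(#{args.join(', ')})"
  end

  BoundProcedure = Struct.new(:dataset, :type, :name) do
    def call(*args)
      dataset.execute_sproc(type, name, args)
    end
  end
end

ds = ToyDataset.new(:table)
ds.call_sproc(:select, :mysp, 1, 2) # => "select: CALL mysp(1, 2)"
sp = ds.prepare_sproc(:select, :mysp)
sp.call(3, 4)                       # => "select: CALL mysp(3, 4)"
```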
…arguments in joins

This allows you to do:

    DB.from(:i.as(:j)).join(:k.as(:l), :a=>:b)
    #=> ... FROM i AS j INNER JOIN k AS l ON (l.a = j.b)
This adds a couple of modules to Sequel::Dataset that can be included in adapters that don't support EXCEPT/INTERSECT. UnsupportedIntersectExcept raises an error if EXCEPT/INTERSECT is used, and UnsupportedIntersectExceptAll raises an error if EXCEPT ALL/INTERSECT ALL is used (but EXCEPT/INTERSECT is allowed). The Informix, MySQL, MSSQL, and Progress adapters use UnsupportedIntersectExcept, and the Oracle and SQLite adapters use UnsupportedIntersectExceptAll. Similar code was already in the MSSQL and Progress adapters; this commit removes that code. Also, it removes some commented-out code from the Informix adapter.
Before, the UNION, INTERSECT, and EXCEPT statements came after ORDER BY by default; this commit moves them before ORDER BY. Some of the adapters were changed, but not all. If you use UNION, INTERSECT, or EXCEPT, please test with your database to make sure it still works.