We execute a lot of similar statements in an ETL process. After a few hundred updates on different entities we get the above error.
It seems that the number of open cursors leaks, i.e. cursors are not closed correctly.
Unfortunately I do not have a code snippet I can share to reproduce it.
As a workaround, the problem is resolved by calling
every now and then (e.g. after processing 100 entities).
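The periodic-cleanup pattern described above can be sketched like this. This is a hypothetical illustration, not the original poster's code: the method name and batch size are assumptions, and the actual cleanup call (elided above) is left as a placeholder comment.

```ruby
# Sketch of the workaround: run a cleanup hook every BATCH_SIZE entities.
# In a real ETL job the hook would be whatever call releases the adapter's
# cached cursors (the specific call was elided in the comment above).
BATCH_SIZE = 100

def process_with_periodic_cleanup(entities, &cleanup)
  entities.each_with_index do |entity, i|
    # ... perform the update for this entity here (ETL work) ...
    cleanup.call if (i + 1) % BATCH_SIZE == 0
  end
end

calls = 0
process_with_periodic_cleanup((1..250).to_a) { calls += 1 }
calls # => 2 (cleanup ran after entities 100 and 200)
```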
Rails 3.1.1 (same on 3.1.2.rc2)
You can check how your application uses cursors:
select user_name,sql_text,count(*) from v$open_cursor
where user_name = 'PUTYOURDATABASEUSERNAME'
group by user_name,sql_text
order by 3 desc;
You can check the current value of the open_cursors initialization parameter:
select name,value from v$parameter where name = 'open_cursors';
The statement pool implementation has not been released for the oracle_enhanced adapter yet, so please try master from GitHub (see #100 for more details). There is also a new parameter available in database.yml, statement_limit (the default is 300 cursors).
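A database.yml entry using statement_limit might look like the sketch below. The statement_limit key and its default of 300 come from the comment above; the connection details are placeholders, and the idea is to keep the adapter's cached cursors comfortably below the database's open_cursors limit.

```yaml
# config/database.yml -- sketch; host, service name and credentials are placeholders
production:
  adapter: oracle_enhanced
  database: //dbhost:1521/ORCL
  username: myuser
  password: mypass
  statement_limit: 200   # cap cached cursors below open_cursors (adapter default: 300)
```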
I have been seeing this recently too, especially in conjunction with delayed_job and after upgrading to oracle_enhanced adapter v1.4.0.
Using Rails 3.1.3
Seeing a lot of "ORA-01000: maximum open cursors exceeded" here too. Rails 3.1.0, oracle_enhanced 1.4
Have you tried oracle_enhanced master with statement_limit?
Switching to the master branch of oracle_enhanced instead of the released gem has worked for me so far. Perhaps there should be a new release soon?
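For anyone else wanting to try master instead of the released gem, a Gemfile entry along these lines should work (assuming the rsim/oracle-enhanced GitHub repository is the upstream, which is the usual source for this adapter):

```ruby
# Gemfile -- point Bundler at the adapter's master branch instead of the gem
gem 'activerecord-oracle_enhanced-adapter',
    :git => 'https://github.com/rsim/oracle-enhanced.git'
```

Then run `bundle install` so Bundler fetches and locks the git revision.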
I get this too with oracle_enhanced 1.4, a recent copy that Bundler just downloaded.
+1. waiting for a new release :)