
with pyodbc.connect(cs_sql) as conn does not close connection after leaving the block #48

Closed
zyzhu2000 opened this issue May 14, 2015 · 3 comments

Comments

@zyzhu2000

    with pyodbc.connect(cs_sql) as conn:
        pass
    print(conn)  # the connection is still open here

You will see that the connection is not closed.

@mkleehammer
Owner

It is not supposed to close the connection - it commits the transaction at the end if no error was raised. If an error was raised it will roll back.
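
A minimal sketch of that behaviour (the connection string cs_sql and table t are placeholders):

    with pyodbc.connect(cs_sql) as conn:
        cursor = conn.cursor()
        cursor.execute("INSERT INTO t (a) VALUES (1)")
    # normal exit from the block: the INSERT has been committed;
    # an exception inside the block would have caused a rollback instead.
    # Either way, the connection is still open here.
    conn.close()  # closing remains the caller's responsibility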

However, there is zero reason to use a context manager in CPython once you understand how it works. It uses reference-counted garbage collection, so:

def func():
    cursor = pyodbc.connect(...).cursor()
    cursor.execute(...)  # 1
    cursor.execute(...)  # 2
    cursor.commit()      # 3

This is just as good. All Python objects are deleted when they go out of scope if there are no other references to them. In this case I didn't keep a copy of the Connection around, so the only reference to it is the Cursor. When the function exits, either by completing or via an exception, the cursor will be deleted. Since the cursor is the only reference to the connection, it will be deleted also - immediately, not "someday" like Java or JavaScript. When a connection is deleted, the connection is closed and any outstanding changes are rolled back. (All DBs roll back uncommitted transactions, but pyodbc will make a rollback call anyway.) If the code made it past the commit (3), your changes are saved. If (2) were to raise an exception, the changes from (1) would be rolled back immediately as the exception unwinds the stack.
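
A slightly more concrete version of that pattern, showing where the commit and the implicit rollback happen (the table and INSERT statement are hypothetical):

    import pyodbc

    def insert_rows(cs_sql, rows):
        # No context manager needed in CPython: when this function returns or
        # raises, the cursor - the only reference to the connection - is
        # deleted, the connection is closed, and uncommitted work is rolled back.
        cursor = pyodbc.connect(cs_sql).cursor()
        for row in rows:
            cursor.execute("INSERT INTO t (a, b) VALUES (?, ?)", row)
        cursor.commit()  # reached only if every execute() succeeded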

@AtomBaf

AtomBaf commented Feb 22, 2018

Hi mkleehammer,

I disagree: the connection object should be closed after the with block. I think most developers will expect it to work that way, and with the current pyodbc behaviour they will leave connection objects unclosed, leaving the way open to memory leaks.

You're right that the object will be cleaned up when a function exits, but if you're using the connection in another context (like a Jupyter notebook), connection objects will stay alive forever. The only way to use pyodbc in that context is to use try/except blocks, which is not very clean and doesn't use the PEP 343 feature.

Thank you
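
For reference, in a long-lived context like a notebook, where nothing goes out of scope, the close has to be explicit. A sketch of the two usual workarounds (cs_sql is a placeholder): try/finally, or contextlib.closing, which calls close() when the block exits:

    import contextlib
    import pyodbc

    # option 1: try/finally
    conn = pyodbc.connect(cs_sql)
    try:
        cursor = conn.cursor()
        cursor.execute("SELECT 1")
    finally:
        conn.close()  # closes even if the query raised

    # option 2: contextlib.closing guarantees close() on exit of the block
    with contextlib.closing(pyodbc.connect(cs_sql)) as conn:
        conn.cursor().execute("SELECT 1")
        conn.commit()  # closing() does not commit for you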

@keitherskine
Collaborator

Also relevant: #43
It appears most Python ODBC modules treat context managers on connections as a way of managing commits rather than the connection itself. I was surprised at this too.
