Make Postgres provider test idempotent #45418
Conversation
Force-pushed from 36d65ab to bc63748
Interestingly, it takes 4 runs of TestPyQgsPostgresProvider.testNonPkBigintField to get a failure.
Force-pushed from 3cf0008 to 884e3ce
I'll file a separate ticket for that one tomorrow.
I think the best approach is to get rid of the external SQL file and drop/create/populate tables in each test, like in QGIS/tests/src/python/test_provider_postgres.py Line 2802 in 41f1924
I'd actually prefer reverting changes while still keeping an upfront load of things; otherwise a bug in the creation of the data tables would make all tests fail. It would also reduce the cost of running multiple tests at once (read-only tests would not need to create the same tables again and again).
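The drop/create/populate-per-test idea suggested above could look roughly like this. This is a minimal sketch only: the `exec_sql` callable, the table layout and the sample rows are hypothetical stand-ins, not the actual QGIS test fixtures.

```python
def setup_test_table(exec_sql, schema, table):
    """Recreate and populate a test table from scratch, so the test
    does not depend on state left behind by earlier runs.

    exec_sql: any callable that executes one SQL statement
    (in the real suite this would be something like execSQLCommand).
    """
    # Drop any leftover copy from a previous (possibly failed) run
    exec_sql('DROP TABLE IF EXISTS {s}.{t} CASCADE'.format(s=schema, t=table))
    # Hypothetical minimal layout; the real tables have more columns
    exec_sql('CREATE TABLE {s}.{t} (pk SERIAL PRIMARY KEY, name TEXT)'.format(s=schema, t=table))
    exec_sql("INSERT INTO {s}.{t} (name) VALUES ('Apple'), ('Orange'), ('Pear')".format(s=schema, t=table))
```

With a real connection each statement would be executed against Postgres; the trade-off discussed above is that this runs for every test, even read-only ones.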
Interestingly, CI is getting my own failure: https://github.com/qgis/QGIS/pull/45418/checks?check_run_id=3808276429#step:13:834
Great: the testPktUpdateBigintPkNonFirst test was actually bogus, in that it changed values in one table and checked whether they were changed in ANOTHER table (a typo).
Force-pushed from 3d2f017 to aedc711
Ok, CI is now happy, but locally I still have failures on the SECOND run in testOrderBy and testOrderByCompiled, as if some of the tests run after those somehow change the data for the next invocation. I'm trying to figure out which tests do that.
It's actually the 3rd run which fails, and I suspect it has to do with sequences not being reset, as the failure on the 3rd run is an additional row with identifier
And the 4th run fails with an additional row with identifier
The "someData_pk_seq" sequence starts as uncalled, with lastVal being 1, while the table contains values from 1 to 5. |
TestPyQgsPostgresProvider.testNestedInsert is responsible for incrementing that sequence (or at least, it is one of the culprits).
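The sequence drift described above could be undone by pinning the sequence back to the table's current maximum after restoring the data. A sketch of the SQL such a fix might use (the helper name is mine, and the assumption that the pk column drives the sequence is mine too, not something the PR states):

```python
def reset_sequence_sql(schema, table, pk_column, sequence):
    """Build a SQL statement that resets a sequence to MAX(pk).

    Uses Postgres setval(seq, value, is_called): with is_called true,
    the next nextval() returns value + 1; with is_called false (the
    empty-table case here), the next nextval() returns value itself.
    """
    return (
        'SELECT setval(\'{s}."{seq}"\', '
        'COALESCE((SELECT MAX({pk}) FROM {s}.{t}), 1), '
        '(SELECT EXISTS (SELECT 1 FROM {s}.{t})))'
    ).format(s=schema, seq=sequence, pk=pk_column, t=table)
```

Running the generated statement after each restore would make nextval() continue from the restored data instead of from wherever a previous test left the sequence.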
Force-pushed from f992a6a to aedc711
It looks like adding a commitChanges (or rollback) fixes the deadlock. Pushed. |
In its current state, this PR gives me a fully idempotent TestPyQgsPostgresProvider run, fixing #45417.
I don't recall the details; in retrospect, the goal was to avoid the deadlock, regardless of whether the action itself fails or succeeds.
The test was checking the *wrong* table for effects of edits performed. This commit fixes that, makes failures easier to understand and makes the test idempotent.
Use from testJson
Also have it commit changes to avoid leaving cursors open
Force-pushed from 96902ad to 8dbc7d2
def scopedTableBackup(self, schema, table):

    class ScopedBackup():
        def __init__(self, tester, schema, table):
            self.schema = schema
            self.table = table
            self.tester = tester
            tester.execSQLCommand('DROP TABLE IF EXISTS {s}.{t}_edit CASCADE'.format(s=schema, t=table))
            tester.execSQLCommand('CREATE TABLE {s}.{t}_edit AS SELECT * FROM {s}.{t}'.format(s=schema, t=table))

        def __del__(self):
            self.tester.execSQLCommand('TRUNCATE TABLE {s}.{t}'.format(s=self.schema, t=self.table))
            self.tester.execSQLCommand('INSERT INTO {s}.{t} SELECT * FROM {s}.{t}_edit'.format(s=self.schema, t=self.table))
            self.tester.execSQLCommand('DROP TABLE {s}.{t}_edit'.format(s=self.schema, t=self.table))

    return ScopedBackup(self, schema, table)
I like this approach! I can't help but feel like it's only a few tweaks away from being a real context manager though. E.g. it would be cleaner/more pythonic to be able to directly call:
def my_test(self):
# ...
with self.backup_table('schema','my_table'):
# some sql
# ...
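The context-manager shape suggested here could be sketched as follows. This is only a sketch: it assumes a tester object exposing execSQLCommand, as in the diff above, and the try/finally structure is my suggestion rather than what the PR merged.

```python
from contextlib import contextmanager


@contextmanager
def backup_table(tester, schema, table):
    """Snapshot a table on entry and restore it on exit.

    tester is assumed to expose execSQLCommand(sql), mirroring the
    ScopedBackup helper in the diff above.
    """
    # Snapshot the table into <table>_edit before the test mutates it
    tester.execSQLCommand('DROP TABLE IF EXISTS {s}.{t}_edit CASCADE'.format(s=schema, t=table))
    tester.execSQLCommand('CREATE TABLE {s}.{t}_edit AS SELECT * FROM {s}.{t}'.format(s=schema, t=table))
    try:
        yield
    finally:
        # Restore runs deterministically even if the test body raised,
        # unlike __del__, whose timing depends on garbage collection
        tester.execSQLCommand('TRUNCATE TABLE {s}.{t}'.format(s=schema, t=table))
        tester.execSQLCommand('INSERT INTO {s}.{t} SELECT * FROM {s}.{t}_edit'.format(s=schema, t=table))
        tester.execSQLCommand('DROP TABLE {s}.{t}_edit'.format(s=schema, t=table))
```

A design note: the with form makes the restore point explicit in the test body, whereas the __del__ approach ties it to object lifetime.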
I thought about that, but some tests use more than a single table, so the with block would need multiple tables to be backed up.
potentially we could nest the contexts... it's not super nice though: 😛
with self.backup_table('....'):
    with self.backup_table('....'):
        ....
Or provide a list of tables?
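The list-of-tables variant could be sketched like this, avoiding the nesting above by looping over the tables inside one context manager. Again a sketch only, assuming the same execSQLCommand interface as the diff above; the function name and signature are hypothetical.

```python
from contextlib import contextmanager


@contextmanager
def backup_tables(tester, schema, tables):
    """Snapshot several tables on entry and restore them all on exit."""
    # Snapshot every table up front, before the test mutates anything
    for t in tables:
        tester.execSQLCommand('DROP TABLE IF EXISTS {s}.{t}_edit CASCADE'.format(s=schema, t=t))
        tester.execSQLCommand('CREATE TABLE {s}.{t}_edit AS SELECT * FROM {s}.{t}'.format(s=schema, t=t))
    try:
        yield
    finally:
        # Restore all of them afterwards, even if the test body raised
        for t in tables:
            tester.execSQLCommand('TRUNCATE TABLE {s}.{t}'.format(s=schema, t=t))
            tester.execSQLCommand('INSERT INTO {s}.{t} SELECT * FROM {s}.{t}_edit'.format(s=schema, t=t))
            tester.execSQLCommand('DROP TABLE {s}.{t}_edit'.format(s=schema, t=t))
```

A test using two tables would then need only one with statement: with backup_tables(self, 'qgis_test', ['tableA', 'tableB']): ...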
See #45417
TODO: