Create `requires_python` column on `release_files` to efficiently query `requires_python` with files #1448

Merged

merged 12 commits into pypa:master from Carreau:mat_view Oct 5, 2016

Conversation

5 participants
@Carreau
Contributor

Carreau commented Aug 31, 2016

Assuming there are far fewer updates of the `releases` and `release_files`
tables than reads, we only rarely pay the overhead of building this
materialized view.

Building it concurrently is slow but does not block reads, so the
building cost might be invisible.

The more efficient alternatives would be:

   - To make this an actual table and update only the affected rows.
     That seems much more complicated and prone to errors and confusion
     during maintenance, as such a table should not be manipulated
     directly, but it would not be that hard since Postgres 9.5 supports
     UPSERTs.

     It would be easy to convert back and forth between a virtual table
     and a real table without actually changing Python code.

   - To duplicate the `requires_python` column on both `releases` and
     `release_files`, which seems redundant and might lead to
     inconsistencies.

   - To transfer `requires_python` to `release_files`, though it is also
     queried from `releases`, so we might still have the overhead of a
     join (just in the other direction), and it would require careful
     checking of the legacy-pypi codebase.

Working with @michaelpacer on pypa/pypi-legacy#506 we came up with this solution for now, and would appreciate any feedback and timing statistics on a replica of the production DB.
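For reference, the materialized-view approach amounts to roughly the following migration. This is a sketch only, not the exact DDL from this PR: the view name, the `(name, version)` join keys, and the unique index (which `REFRESH MATERIALIZED VIEW CONCURRENTLY` requires) are assumptions about the schema.

```python
# Sketch of the materialized-view approach (names and join keys are assumptions).
from alembic import op


def upgrade():
    # Denormalize releases.requires_python onto each file row.
    op.execute("""
        CREATE MATERIALIZED VIEW release_files_requires_python AS
            SELECT rf.name, rf.version, rf.filename, r.requires_python
              FROM release_files rf
              JOIN releases r
                ON rf.name = r.name AND rf.version = r.version
    """)
    # REFRESH MATERIALIZED VIEW CONCURRENTLY (non-blocking for readers)
    # requires a unique index on the view.
    op.execute("""
        CREATE UNIQUE INDEX release_files_requires_python_filename_idx
            ON release_files_requires_python (filename)
    """)


def downgrade():
    op.execute("DROP MATERIALIZED VIEW release_files_requires_python")
```

The view would then need `REFRESH MATERIALIZED VIEW CONCURRENTLY release_files_requires_python` whenever `releases` or `release_files` change, which is the rebuild cost discussed above.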

@Carreau
Contributor

Carreau commented Aug 31, 2016

Pushed a change to try to make the linter happy.

@Carreau
Contributor

Carreau commented Aug 31, 2016

The test on 3.5.2 seems to have failed before starting; can one of you restart this specific Travis build?

Thanks!

@di
Member

di commented Aug 31, 2016

@Carreau I restarted it for you.

@dstufft
Member

dstufft commented Aug 31, 2016

Why not just add a column to the release_files table and set up a trigger to cascade the values over? Using a materialized view here seems to be harder than that.

@Carreau
Contributor

Carreau commented Aug 31, 2016

Why not just add a column to the release_files table and set up a trigger to cascade the values over?

I'm not sure how to cascade on the same table, but I can give it a try; @michaelpacer and I basically learned SQL for this, and I can see many ways to get it wrong:

- Recursion? I don't know whether that is possible with triggers.
- How to react to a row update: if a row in `release_files` is updated and that update changes `release_files.requires_python`, does the trigger revert it back to the corresponding value in `releases.requires_python`? Overall I'm concerned that we would get two internal sources of truth, and that further development on Warehouse might get confused about which of the two columns should be updated.

I might just misunderstand the features available in SQL, though, and there might be a way to make sure there is consistency.

@Carreau
Contributor

Carreau commented Sep 14, 2016

Updated with @michaelpacer's work; this now adds a column to `release_files` and updates only the affected rows on UPDATE or INSERT into `releases` or `release_files`.
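In spirit, the trigger side of this looks roughly like the following migration. This is a sketch only, not the exact code of this PR: the function and trigger names and the `(name, version)` join keys are assumptions.

```python
# Rough sketch of the trigger approach (names and join keys are assumptions).
from alembic import op
import sqlalchemy as sa


def upgrade():
    op.add_column("release_files", sa.Column("requires_python", sa.Text()))
    op.execute("""
        CREATE OR REPLACE FUNCTION update_release_files_requires_python()
        RETURNS TRIGGER AS $$
        BEGIN
            -- Copy the release-level value onto every matching file row.
            UPDATE release_files
               SET requires_python = NEW.requires_python
             WHERE name = NEW.name AND version = NEW.version;
            RETURN NEW;
        END;
        $$ LANGUAGE plpgsql;

        CREATE TRIGGER releases_requires_python
            AFTER INSERT OR UPDATE OF requires_python ON releases
            FOR EACH ROW
            EXECUTE PROCEDURE update_release_files_requires_python();
    """)


def downgrade():
    op.execute("DROP TRIGGER IF EXISTS releases_requires_python ON releases")
    op.execute("DROP FUNCTION IF EXISTS update_release_files_requires_python()")
    op.drop_column("release_files", "requires_python")
```

A companion trigger firing on INSERT into `release_files` is what covers files uploaded after the release row already exists, so both tables keep the column in sync.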

@Carreau
Contributor

Carreau commented Sep 15, 2016

Travis seems to be stuck on Running setup.py bdist_wheel for psycopg2 ...; can you restart this build please?

I improved the code style to pass the linter, and I assume that after rebasing I had to update the database revision number (I assume that's what the following was trying to convey):

self = <alembic.script.base.ScriptDirectory object at 0x7fa690fc8dd8>
ancestor = 'Destination %(end)s is not a valid upgrade target from current head(s)'
multiple_heads = "Multiple head revisions are present for given argument 'head'; please specify a specific target revision, '<branchname>@head' to narrow to a specific head, or 'heads' for all heads"

Is there a simpler way to update the db revision number than to re-run docker-compose run web python -m warehouse db revision and move the content from one file to another?

@dstufft
Member

dstufft commented Sep 15, 2016

I think multiple heads means that other changes to the DB have landed since this change, and you'll need to either rebase your DB revision so it follows linearly from that one, or create a merge DB revision. See http://alembic.zzzcomputing.com/en/latest/branches.html

@Carreau
Contributor

Carreau commented Sep 15, 2016

Thanks for restarting the build.

I think multiple heads means that other changes to the DB has landed since this change and you'll need to either rebase your DB revision so it follows linearly to that one, or you'll need to create a merge db revision. See http://alembic.zzzcomputing.com/en/latest/branches.html

Thanks, it appears I still had the revision wrong but because of a missing key... re-attempting.

@Carreau Carreau changed the title from Create a materialized view to efficiently query requires_python with files to Create `requires_python` column on `release_files` to efficiently query `requires_python` with files Sep 15, 2016

@Carreau
Contributor

Carreau commented Sep 15, 2016

Yay! 🍰 🎊 Tests are passing.

Not sure if you prefer commits to be squashed a bit, whether extra tests are needed for the DB (as this will mostly be used by legacy PyPI), or exactly what needs to be tested.

@Carreau
Contributor

Carreau commented Sep 16, 2016

Why not just add a column to the release_files table and set up a trigger to cascade the values over?

That's now implemented and seems to work.

[edit] fixed typo spotted in next comment.

@mpacer

mpacer commented Sep 17, 2016

Why not just add a column to the release_files table and set up a trigger to cascade the values over?

I think what @Carreau meant was that it's now implemented, since it is not *not* implemented. 😄

Is there anything more that needs to be done on this for it to be included?

@Carreau
Contributor

Carreau commented Sep 17, 2016

I think what @Carreau meant was that it's now implemented.

Oops, typed too fast. I edited my comment.

@Carreau
Contributor

Carreau commented Sep 19, 2016

I'd appreciate comments or hints on the next steps to take on this, in order to tackle things on the PyPI side as well.

Thanks!

@Carreau
Contributor

Carreau commented Sep 26, 2016

Hello there!

It's Monday 🎊, usually the day of the week most people hate because it's time to start working again!

I hope you are all fine and have spent a nice weekend. I personally went to the beach (in Moss Landing) and saw otters. I made a wrong move and my broken foot is hurting again, so I used that as an excuse to watch Mr. Robot on Sunday. I've also read that Postgres 9.6 is going to have BDR, which is cool I guess.

I'm now routinely looking at this piece of code and hoping to get any feedback on things to improve (or to fix), and hopefully to get things merged soon enough to be able to finish my patch on legacy PyPI.

I'm now going to attempt inserting a cat GIF here to cheer you up and give you the strength to review.
Sorry if you are a dog person; the dog GIF is for next week.

(cat gif)

Thanks for all your hard work!

@Carreau
Contributor

Carreau commented Oct 3, 2016

Morning everyone!

It is Monday again! I know, I know, it's often Monday. But according to a rough calculation, only about 14.28% of the time! I definitely need more samples to be sure.

Also, good news: it 💧 rained 💧 this morning in California! Sadly, my ankle did not hurt before it started raining; I wanted to be like the old grandpa in movies who can predict rain just depending on whether* his rheumatism is hurting. [*or should I say weather, haha]

I'm also kind of anti-participating in Hacktoberfest: instead of submitting PRs, I'm trying to get mine merged or closed, as I feel too many open PRs can be scary to new contributors and depressing for maintainers. I know the feeling.

Let me tell you a joke to cheer you up and give you the courage to merge this pull request. You make it hard on me, because I have to find one relevant to this pull request...

A SQL query walks into a bar, where some of its friends are hanging out in the back.
It approaches the tables where they are having multiple rows of drinks and asks:
"Can I join you?"

Well, OK, that was lame. But I hope you enjoyed it and got some courage to press the nice green button you can see below. Also, if you are afraid to ask for some change, or don't have the courage to write a long reply, do not worry: I can take a blunt and short answer, even a negative one. All I want is for this thing to move forward, and I know that writing a long, nice, politically correct message can be tough.

Anyway, thanks for your hard work, and hoping we can move this forward.

Also, I'm still unsure whether you are a dog or a cat person, so I'm going to try a dog GIF this time. If it does not work, I'll likely try squirrels, llamas, rabbits, raptors, or something else next week.

(dog gif)

See you before next Monday, hopefully.

@dstufft
Member

dstufft commented Oct 4, 2016

Sorry, this is on my list of things to review, I've just had a really crappy couple of weeks and haven't been able to focus much.

@Carreau
Contributor

Carreau commented Oct 4, 2016

Sorry, this is on my list of things to review, I've just had a really crappy couple of weeks and haven't been able to focus much.

No worries, we all go through these phases. Hope things will go better for you. I was secretly hoping you were not reviewing because you liked the weekly emails and were hoping to get more.

Thanks a lot for your time, we really appreciate it!

@mpacer

mpacer commented Oct 4, 2016

Thank you for taking the time to look at this. I think the most recent commit addresses everything you want (along with updated comments).

mpacer added some commits Oct 4, 2016

@dstufft

This overall looks good. One thing I missed is that you need to add this to the File model, something like:

from sqlalchemy.orm import validates

class File(db.Model):
    ...

    requires_python = Column(Text)

    @validates("requires_python")
    def validate_requires_python(self, *args, **kwargs):
        raise RuntimeError("Cannot set File.requires_python")
@mpacer

mpacer commented Oct 5, 2016

Is the validates decorator that you're suggesting the same as the sqlalchemy.orm.validates described here, or is there a place in the code where you define your own validates decorator that I'm not finding?

@dstufft
Member

dstufft commented Oct 5, 2016

Yea, it's the SQLAlchemy one, and it will just make it so any attempt in the Warehouse ORM to change the value will bomb out (but the trigger will work fine).

@mpacer

mpacer commented Oct 5, 2016

And it should also be included in the File module?

@dstufft
Member

dstufft commented Oct 5, 2016

It should only be on the File model. Modifying Release.requires_python is perfectly fine, the trigger will keep File.requires_python up to date, but modifying File.requires_python should trigger a validation error.
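The `sqlalchemy.orm.validates` guard can be sketched standalone like this (the table and column names come from the discussion; the model is trimmed to just the relevant column, and the real Warehouse model has many more):

```python
from sqlalchemy import Column, Integer, Text
from sqlalchemy.orm import declarative_base, validates

Base = declarative_base()


class File(Base):
    __tablename__ = "release_files"

    id = Column(Integer, primary_key=True)
    requires_python = Column(Text)

    @validates("requires_python")
    def validate_requires_python(self, *args, **kwargs):
        # Any attempt to assign the attribute through the ORM raises;
        # the database-side trigger can still write the column directly.
        raise RuntimeError("Cannot set File.requires_python")
```

Reading `File.requires_python` works normally; only ORM-side assignment is blocked, which is exactly the "bomb out" behavior described above.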

@mpacer

mpacer commented Oct 5, 2016

Got it, thank you for the pointers! I wouldn't have known to construct it in this way without your help.

@dstufft

dstufft approved these changes Oct 5, 2016

@mpacer

mpacer commented Oct 5, 2016

I see now! The validation accomplishes what we were trying to do by directly overwriting with triggers.

I agree that raising an error is better than silently overwriting what someone did.

Also, I just realised I never imported it.

@mpacer

mpacer commented Oct 5, 2016

@dstufft Does the codecov issue mean that a test needs to be added that hits the RuntimeError and expects a failure?

@dstufft
Member

dstufft commented Oct 5, 2016

Yes.

@mpacer

mpacer commented Oct 5, 2016

Would that go in tests/unit/packaging/test_models.py under the TestFile class?

@dstufft
Member

dstufft commented Oct 5, 2016

Yup.

@willingc
Contributor

willingc commented Oct 5, 2016

@dstufft Thanks for helping out the Jupyter folks.

Sorry about the last few weeks. Your Python friends ❤️ all that you do. Take care. Feel free to ping any of us if we can help. You know how much I love docs ;-)

@dstufft dstufft merged commit 1400ce5 into pypa:master Oct 5, 2016

2 checks passed

codecov/project — 100% (target 100%)
continuous-integration/travis-ci/pr — The Travis CI build passed
@mpacer

mpacer commented Oct 5, 2016

Thank you for merging this!

When can we rely on this being available from test_pypi and pypi, so we can make progress on integrating requires_python into pypi-legacy while using more efficient SQL queries (in re: pypa/pypi-legacy#506)?

@Carreau Carreau deleted the Carreau:mat_view branch Oct 5, 2016

@dstufft
Member

dstufft commented Oct 5, 2016

Now.

Carreau added a commit to Carreau/warehouse that referenced this pull request Dec 2, 2016

Prevent inconsistency between release and release_files tables.
The work in PR #1448 was meant to replicate the `requires_python`
information of the `releases` table to the `release_files` table
for efficiency when generating the list of available packages for pip.

While the work on #1448 seems sufficient for Warehouse itself, it needs to
consider that legacy PyPI also accesses the same database, and legacy PyPI
violates some constraints.

In particular, when using `setup.py register` followed by `twine upload`,
the file upload inserts files into `release_files` after inserting into
`releases`. Thus the value in `releases` is not propagated, leading to an
inconsistency and a listing in pip missing information about
python-version compatibility.

While I doubt any packages were released between the merge of #1448
and a fix, this updates the tables and binds an already existing trigger
to update the information during insertion into `release_files`.

Carreau added a commit to Carreau/warehouse that referenced this pull request Dec 2, 2016

Prevent inconsistency between release and release_files tables.

Carreau added a commit to Carreau/warehouse that referenced this pull request Dec 3, 2016

Prevent inconsistency between release and release_files tables.

Carreau added a commit to Carreau/warehouse that referenced this pull request Dec 3, 2016

Prevent inconsistency between release and release_files tables.

Carreau added a commit to Carreau/warehouse that referenced this pull request Dec 5, 2016

Prevent inconsistency between release and release_files tables.

Carreau added a commit to Carreau/warehouse that referenced this pull request Dec 5, 2016

Prevent inconsistency between release and release_files tables.

dstufft added a commit that referenced this pull request Dec 5, 2016

Prevent inconsistency between release and release_files tables. (#1513)