Environment marker support fails API tests on PyPy #50

Closed
bb-migration opened this Issue Jul 22, 2013 · 8 comments

Comments


bb-migration commented Jul 22, 2013

Originally reported by: jaraco (Bitbucket: jaraco, GitHub: jaraco)


In this Travis-CI run, setuptools fails two published API tests for its environment marker support:

File "/home/travis/build/jaraco/setuptools/tests/api_tests.txt", line 344, in api_tests.txt
Failed example:
    print(im("sys_platform==")) # doctest: +ELLIPSIS
Expected:
    unexpected EOF while parsing (...line 1)
Got:
    invalid syntax (<string>, line 1)

and

File "/home/travis/build/jaraco/setuptools/tests/api_tests.txt", line 356, in api_tests.txt
Failed example:
    print(im("(extra")) # doctest: +ELLIPSIS
Expected:
    unexpected EOF while parsing (...line 1)
Got:
    parenthesis is never closed (<string>, lines 1-2)

It seems the differences in PyPy's parser yield different error messages for the same invalid input.

Either the API expectations need to be relaxed or setuptools needs to wrap this call to translate the error messages when invoked on PyPy.
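A minimal sketch of why the messages leak through (hypothetical code, not setuptools' actual implementation): if the marker check simply compiles the expression and reports the resulting SyntaxError text, the wording is whatever the host interpreter's parser produces, so CPython and PyPy disagree.

```python
# Hypothetical marker check: compile the expression and surface the
# parser's own SyntaxError message. The message text is
# implementation-specific, which is what breaks the doctests.
def invalid_marker(marker):
    """Return the parser's error message for a bad marker, or False."""
    try:
        compile(marker, "<string>", "eval")
    except SyntaxError as exc:
        return exc.msg  # wording differs between CPython and PyPy
    return False

print(invalid_marker("sys_platform=="))           # interpreter-dependent text
print(invalid_marker("sys_platform == 'linux'"))  # parses fine -> False
```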



bb-migration commented Jul 30, 2013

Original comment by iElectric (Bitbucket: iElectric, GitHub: Unknown):


+1, see http://hydra.nixos.org/build/5592031/nixlog/1/raw


bb-migration commented Jul 31, 2013

Original comment by iElectric (Bitbucket: iElectric, GitHub: Unknown):


@jaraco would it be fine to port these two tests to Python code? It's hard to handle non-deterministic output in doctest format.


bb-migration commented Aug 1, 2013

Original comment by jaraco (Bitbucket: jaraco, GitHub: jaraco):


It would be fine, except that these tests not only test the behavior but also document and advertise the API, on which a client should be able to rely. If the message is different on PyPy, we either need to provide a less strict API (not guaranteeing the exact output, only that an error message occurs) or update setuptools to meet the prescribed API even on PyPy.


bb-migration commented Aug 1, 2013

Original comment by iElectric (Bitbucket: iElectric, GitHub: Unknown):


Do we really need to show how not to use the API? And if we do, we could use an ellipsis to hide the output.
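For what it's worth, an ELLIPSIS directive only helps when the stable part of the expected output agrees across interpreters. A quick check with doctest's own matcher, using the two messages from the failure report above, shows the relaxed pattern still failing on the PyPy wording because the prefix itself differs:

```python
import doctest

checker = doctest.OutputChecker()
# Relaxed expectation: pin the message prefix, elide the location details.
want = "invalid syntax (...)\n"

# Messages taken from the Travis failure report above.
got_cpython = "invalid syntax (<string>, line 1)\n"
got_pypy = "parenthesis is never closed (<string>, lines 1-2)\n"

print(checker.check_output(want, got_cpython, doctest.ELLIPSIS))  # True
print(checker.check_output(want, got_pypy, doctest.ELLIPSIS))     # False
```

So the ellipsis hides the "(<string>, line 1)" tail, but not the differing message text itself.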


bb-migration commented Aug 5, 2013

Original comment by pje (Bitbucket: pje, GitHub: pje):


That isn't showing how not to use the API; it's showing that the API correctly detects invalid environment markers. Part of its contract is being able to tell you whether a marker is valid.

(For the parse errors provided by Python, though, it doesn't especially matter what the error output is.)


bb-migration commented Aug 10, 2013

Original comment by jaraco (Bitbucket: jaraco, GitHub: jaraco):


Replaced overly-specific error messages with more general ones for improved cross-implementation compatibility. Fixes #50.


bb-migration commented Aug 10, 2013

Original comment by jaraco (Bitbucket: jaraco, GitHub: jaraco):


I created one implementation that normalizes the error messages returned on the various implementations. If the error messages are unimportant, perhaps the API tests should just report/guarantee the boolean output, and add some internal unit tests to exercise the other guarantees.
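A sketch of that normalization approach (names invented here, not the actual setuptools patch): catch the interpreter's SyntaxError and substitute one fixed message, so the documented API output is identical on CPython and PyPy.

```python
# Implementation-neutral message; the specific wording is an assumption
# for this sketch, not what setuptools actually emits.
NORMALIZED = "invalid marker"

def normalized_invalid_marker(marker):
    """Return a SyntaxError with a stable message for a bad marker,
    or False if the marker parses."""
    try:
        compile(marker, "<string>", "eval")
    except SyntaxError:
        # Discard "unexpected EOF while parsing" / "parenthesis is
        # never closed" / etc. in favor of one stable message.
        return SyntaxError(NORMALIZED)
    return False

print(normalized_invalid_marker("(extra"))                   # invalid marker
print(normalized_invalid_marker("sys_platform == 'linux'"))  # False
```

With this wrapper, the doctests can assert the exact normalized message on every implementation.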


bb-migration commented Aug 10, 2013

Original comment by pje (Bitbucket: pje, GitHub: pje):


Looks good to me! Thanks.
