3.0.0-alpha-2 won't pip install #333

Closed
dhermes opened this issue May 1, 2015 · 33 comments · Fixed by #351
Comments

@dhermes

dhermes commented May 1, 2015

https://pypi.python.org/pypi/protobuf/3.0.0-alpha-2

For example:

$ virtualenv venv
New python executable in venv/bin/python
Installing setuptools, pip...done.
$ source venv/bin/activate
(venv) $ pip install protobuf==3.0.0-alpha-2
Collecting protobuf==3.0.0-alpha-2
  Using cached protobuf-3.0.0-alpha-2.tar.gz
Requirement already satisfied (use --upgrade to upgrade): setuptools in ./venv/lib/python2.7/site-packages (from protobuf==3.0.0-alpha-2)
Installing collected packages: protobuf
  Running setup.py install for protobuf
    Complete output from command /path/venv/bin/python -c "import setuptools, tokenize;__file__='/tmp/pip-build-oncqNy/protobuf/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-E2WkTi-record/install-record.txt --single-version-externally-managed --compile --install-headers /path/venv/include/site/python2.7/protobuf:
    /path/venv/local/lib/python2.7/site-packages/setuptools/dist.py:282: UserWarning: Normalizing '3.0.0-alpha-2' to '3.0.0a2'
      normalized_version,
    running install
    running build
    running build_py
    Can't find required file: ../src/google/protobuf/unittest.proto
    Generating google/protobuf/unittest_pb2.py...

    ----------------------------------------
    Command "/path/venv/bin/python -c "import setuptools, tokenize;__file__='/tmp/pip-build-oncqNy/protobuf/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-E2WkTi-record/install-record.txt --single-version-externally-managed --compile --install-headers /path/venv/include/site/python2.7/protobuf" failed with error code 255 in /tmp/pip-build-oncqNy/protobuf

Discovered in https://travis-ci.org/GoogleCloudPlatform/gcloud-python/builds/60881924
/cc @tseaver

@xfxyjwf
Contributor

xfxyjwf commented May 1, 2015

@haberman

When I uploaded alpha-1 to pypi, I commented out this GenerateUnittestProtos() line: https://github.com/google/protobuf/blob/master/python/setup.py#L124

That obviously indicates the call belongs somewhere else, though I haven't been able to figure out where. For alpha-2 I think we can use the same temporary workaround and then refactor setup.py to address this issue properly.

@dhermes
Author

dhermes commented May 1, 2015

Thanks for the quick reply. We understood the implications of using an alpha release :) but were using alpha-1 because of the Python 3 support. (Discussion in #250 made us feel better about alpha-1)

@tseaver
Contributor

tseaver commented May 2, 2015

gcloud-python has four failing PRs due to this bug: we may be forced to pin protobuf==3.0.0-alpha1 to work around it. Note that PyPI will no longer allow re-uploading a distribution file: fixing this issue will require releasing / uploading a new version (could be protobuf-3.0.0-alpha-2.1, if desired).

@tseaver
Contributor

tseaver commented May 2, 2015

ISTM that the correct long-term solution would be to automate the process of generating the *_pb2.py files as part of making the release (so that they are part of the sdist source distribution), rather than generating them in setup.py.

@tamird
Contributor

tamird commented May 6, 2015

@tseaver would you mind contributing a PR that fixes this on top of #281?

@nicolasnoble
Contributor

This is also causing trouble with gRPC. We are currently pinned to alpha-1 for Python and alpha-2 for Java, which is confusing for people and causes integration issues where we can't have grpc-java and grpc-python running on the same machine.

@haberman
Member

Sorry for the confusion. I'm looking into this now, with a fix targeted for alpha-3.

@haberman
Member

Ok, what if we make generate_proto() succeed without doing anything if:

  1. the destination _pb2.py file is already present, and
  2. the input file and/or ../src/protoc is not available.

Then the "build" step won't fail for the PyPI packages, but development should still work as desired (protos will get generated eagerly).

@haberman
Member

@tseaver @tamird Any thoughts on my proposed solution to this?

@tamird
Contributor

tamird commented May 12, 2015

Seems reasonable; let's just make sure we add some test coverage for this.

@tseaver
Contributor

tseaver commented May 12, 2015

I'm kind of confused by the need for having setup.py trigger generation. Here is how I understand the picture:

  • In a development checkout, make feels like the natural tool for generating the *_pb2.py files (similar to generating .c sources using lex and yacc). E.g.: make python-protos.
  • When releasing a distribution to PyPI, we have to ship at least some of the generated files (google/protobuf/descriptor_pb2.py, google/protobuf/compiler/plugin_pb2.py, google/protobuf/pyext/python_pb2.py, I think), which means that they need to have been generated before (or when) running setup.py sdist: ISTM that using make would be the Right Thing(TM) for those cases too. E.g., a python-release target would depend on python-protos, and would then run python setup.py sdist register upload.
  • AFAICT, normal users of the library should never need to regenerate the *_pb2.py files.

@attilaolah

I fully agree with @tseaver on this.

@haberman
Member

I can see where you are coming from. The main downside of your approach AFAICS is that it requires people who use the Git repo to run an extra make python-protos command before they run setup.py. If they forget to do this they'll get import errors, which could possibly be confusing to track down. And they have to remember to do this again if they change any of the .proto files, though this is probably relatively uncommon.

We would also need to make sure that people run make python-protos before running setup.py sdist, or they would create a broken sdist. If setup.py build triggered proto generation, it wouldn't be possible to make this mistake, since setup.py sdist automatically runs setup.py build AFAIK. But we could mitigate this by having a python-release target in the Makefile like you suggest, and ideally test the output before uploading it too.

So I'm ok with this approach if you prefer it. Want to send a PR?

@tseaver
Contributor

tseaver commented May 12, 2015

@haberman Don't folks running from a Git clone need to run make first to get everything else built? We could make the python-protos target one of the dependencies of the default target, and they would be covered.

@haberman
Member

@xfxyjwf What do you think about the idea of generating Python protos as part of the top-level "make" target?

The one thing that comes to mind is that the "python" directory will only exist in the Python .tar.gz AFAIK, so it should only try to descend into the "python" directory if it is there.

@xfxyjwf
Contributor

xfxyjwf commented May 13, 2015

@haberman I would prefer your proposed approach (i.e., making generate_proto() do nothing in certain scenarios).

Currently we only use Makefiles for the C++ build. Other languages have their own build scripts to build their runtime libraries. It's not natural (for how we do things in protobuf) to have the C++ Makefiles interfere with other languages' build processes. And it's not true that users need to run "make" before building the Python runtime: what's actually needed is the protoc binary, not "make" itself. Users have to get a protoc binary before they can build the Python code, and "make" is just one way to get it. For example, for users who want to build protobuf Python on Windows, we provide a protoc binary for download directly, so they don't have to go through the hassle of running "make" (which is very inconvenient on Windows).

@haberman
Member

That is a good point about Windows. "python setup.py build" is probably a lot more convenient than "make" for Windows users.

Ok, I'm inclined to agree with Feng here and keep using setup.py to generate protos, but modify it to do nothing if the output file already exists.

@tseaver
Contributor

tseaver commented May 14, 2015

FWIW, I see the make approach as useful to maintainers of the library more than to consumers, who will normally just use the sdist published to PyPI. I still don't see any reason to have setup.py run protoc; if needed, #360 addresses the issue by moving generation into a dedicated setup.py generate_protos command, which I imagine maintainers would run immediately before making a release to PyPI, as well as during testing.
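As a hedged sketch only (the actual change in #360 may be structured differently, and the proto list and protoc path below are assumptions), such a dedicated command could look roughly like this:

import os
import subprocess
from distutils.cmd import Command

class generate_protos(Command):
    """Regenerate *_pb2.py modules; meant for maintainers and CI, not end users."""
    description = "generate _pb2.py modules from the .proto sources"
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        protoc = os.environ.get("PROTOC", "../src/protoc")
        for proto in ("../src/google/protobuf/descriptor.proto",
                      "../src/google/protobuf/compiler/plugin.proto"):
            if subprocess.call([protoc, "-I../src", "--python_out=.", proto]) != 0:
                raise RuntimeError("protoc failed on " + proto)

It would be wired up via setup(..., cmdclass={"generate_protos": generate_protos}) and invoked with python setup.py generate_protos before cutting an sdist.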

@haberman
Member

@tseaver I think it boils down to:

  1. On Windows, running make is inconvenient (it's not installed by default, no UNIX shell for running ./configure, etc). This would affect anyone who is building from Git on Windows.
  2. None of our other non-C++ builds use make; they all use the language-specific build system (Maven, Rake, etc). Using make to generate Python protos would diverge from this for no clear reason.
  3. Requiring python setup.py generate_protos would add an extra command people have to remember to run.

I think it's simplest if users only have to run the standard python setup.py build instead of also having to know about and remember a separate python setup.py generate_protos command.

We brainstormed another idea which might make this even simpler. For the release we can add an extra file into the distribution called RELEASE_DIST or something like that. If this file is present, setup.py won't try to regenerate protos. Then it's really predictable that the Git development tree will always regenerate the protos, but the release distribution never will.

@tseaver
Contributor

tseaver commented May 14, 2015

@haberman if you want to do the last thing, you could just look for the .git/ directory: it won't be present in an sdist.
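Either check is only a few lines in setup.py; a hypothetical sketch (the RELEASE_DIST marker comes from the previous comment, the .git heuristic from this one; neither is the actual implementation):

import os

here = os.path.dirname(os.path.abspath(__file__))  # the python/ directory

def in_development_tree():
    # An explicit marker file shipped only in release tarballs would force
    # the "never regenerate" behaviour...
    if os.path.exists(os.path.join(here, "RELEASE_DIST")):
        return False
    # ...otherwise fall back to detecting the Git metadata at the repository
    # root, which an sdist unpacked from PyPI never contains.
    return os.path.isdir(os.path.join(here, "..", ".git"))

generate_proto() would then regenerate the *_pb2.py files only when in_development_tree() returns True.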

@jtattermusch
Contributor

Since this seems to be blocking grpc/grpc#1455, it's also transitively blocking the gRPC C# alpha release (I need to upgrade gRPC's protobuf dependency to a version that supports C#).

@tamird
Contributor

tamird commented May 15, 2015

I have a passing build with this fixed in #351

@jtattermusch
Contributor

Great!


@nicolasnoble
Contributor

BTW, I'm not sure this issue is, in fact, closed. While the repository has been updated with the fix, doing a pip install protobuf==3.0.0-alpha-2 still doesn't work. As far as I understand, the package needs to be republished. Is this happening?

@tseaver
Contributor

tseaver commented May 18, 2015

@nicolasnoble that is infeasible -- PyPI does not allow re-uploading a changed distribution file under the same filename. The most straightforward way around that is to make a new release (maybe protobuf-3.0.0-alpha2.1).

@nicolasnoble
Contributor

Fair enough. Let's do that?

@haberman
Member

I've uploaded protobuf-3.0.0a2. I was able to successfully install it with pip just now.

Sorry for the delay on this. Let me know if there are any other problems.

@xiaofeng-gg

Why is it named protobuf-3.0.0a2? It seems you have successfully deleted protobuf-3.0.0-alpha-2 on PyPI.


@haberman
Member

It wouldn't let me upload it as protobuf-3.0.0-alpha-2.1 or anything like that. It gave me the error:

Upload failed (400): Invalid version, cannot be parsed as a valid PEP 440 version.

I have no idea how the previous version was uploaded with what appears to be a non-PEP 440 version.

@attilaolah

3.0.0-alpha-2 seems to be allowed by PEP 440.

@Ivoz
Contributor

Ivoz commented Jul 9, 2015

packaging gives 3.0.0-alpha-2 as valid:

>>> from packaging import version
>>> version.parse("3.0.0-alpha-2")
<Version('3.0.0a2')>

@dstufft could you say what would stop an upload with this? Maybe a bug to be filed for PyPI?

@dstufft

dstufft commented Jul 10, 2015

I'm confused, didn't @haberman say it wouldn't allow 3.0.0-alpha-2.1 as a version, not 3.0.0-alpha-2? The first one is not a valid PEP 440 version.
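For what it's worth, the packaging library used in the earlier snippet agrees, at least when using the strict Version constructor (output shown roughly):

>>> from packaging import version
>>> version.Version("3.0.0-alpha-2")
<Version('3.0.0a2')>
>>> version.Version("3.0.0-alpha-2.1")
Traceback (most recent call last):
  ...
packaging.version.InvalidVersion: Invalid version: '3.0.0-alpha-2.1'

The dashed alpha spelling normalizes to 3.0.0a2, but appending ".1" after the pre-release number is not a valid PEP 440 version.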

@attilaolah

Ah, you're right @dstufft. Sorry for the noise — I must have misread that comment.
