Cannot pipenv install confluent-kafka 1.4.0 from source #830
Comments
We're also affected. We use Alpine Linux, which doesn't support wheel packages, so the library must be built from the source package.
Sorry about that, you should be able to install the sdist now.
Could it be, by any chance, that the requirements files were not included in the tar file with the source code? I've tried to install 1.4.0 in an Alpine distro, and I'm seeing this error:
When I downloaded the tar.gz from the URL in the previous comment and unpacked it, I couldn't see any requirements files.
Thanks for the quick fix. Unfortunately, I see the same behavior as @marzetas now.
I see; apparently the requirements.txt files were left behind. Stand by, I'll see what I can do.
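The missing-files symptom can be checked directly against a downloaded sdist without unpacking it. A minimal sketch (the tarball built here is a stand-in; for the real check, point `tar` at the `confluent-kafka-1.4.0.tar.gz` downloaded from PyPI):

```shell
# Build a tiny stand-in sdist so the sketch is self-contained.
mkdir -p demo-1.4.0
printf 'confluent-kafka\n' > demo-1.4.0/requirements.txt
tar czf demo.tar.gz demo-1.4.0

# List the tarball's members and look for requirements files; no match
# means the sdist is missing files that setup.py references, so a
# source install will fail.
tar tzf demo.tar.gz | grep -i requirements || echo "no requirements files in sdist"
```

Here the grep matches and prints `demo-1.4.0/requirements.txt`; against the broken 1.4.0 sdist it would fall through to the "no requirements files" branch.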
Alright, I created a micro version and put it on test.pypi to ensure it worked.
Prior to opening a PR to get this onto mainline PyPI, do you mind trying it out as well? I'll also see about adding some automated testing for sdist installs, as we have for wheels.
Working on my side 👍
However, librdkafka 1.4 is not available in
Not working with how Monasca is using it. The error is still:

ubuntu-bionic | ERROR: Traceback (most recent call last):
@nicocti - I build librdkafka from source.
Librdkafka 1.4 ships with CP (Confluent Platform) 5.5, which should be out any day now. I too built librdkafka; I should have mentioned that.
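Since librdkafka 1.4 wasn't packaged for these distros yet, the commenters built it from source. A minimal sketch of that build; the tag, prefix, and repository URL are assumptions, not the commenters' exact commands:

```shell
# Sketch only: build librdkafka v1.4.0 from source.
# Assumes git, a C compiler, and make are already installed.
git clone --branch v1.4.0 --depth 1 https://github.com/edenhill/librdkafka.git
cd librdkafka
./configure --prefix=/usr
make
make install   # may need sudo outside a container
```

With the library and headers installed under `/usr`, a subsequent source build of the Python client can link against it.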
Just to clarify, you are installing 1.4.0.1 from test.pypi, right? The error you are seeing seems to imply that distutils is still looking for requirements.txt; however, I removed all references to the requirements files from setup.py.
I'll try to get it approved, merged, and pushed to PyPI proper tomorrow, now that others have had success as well.
Btw, many thanks for the quick turnaround, @rnpridgeon
The OpenStack CI still uses 1.4.0. Once you get 1.4.0.1 released, the "constraints" will get updated to 1.4.0.1 and I can retest.
Yes, I'll do it tonight/first thing tomorrow. I'm going through to see if there are any other small things that can be included.
Any updates?
confluent-kafka release 1.4.0 includes a buggy source package [1] which prevents installing it from source. As a consequence, building Docker images on Alpine Linux fails [2, 3]. [1] confluentinc/confluent-kafka-python#830 [2] https://zuul.openstack.org/builds?job_name=build-monasca-docker-image [3] https://zuul.openstack.org/build/6a5270dd1b4d482da8bfc1a72331f48a/log/job-output.txt#1553 Change-Id: I81f5df877f2c183e563c7f9e74a4f775925a86e2
Sorry for the hold-up; the packages are up and available. Do note we decided to just go ahead and bump it to v1.4.1, as opposed to v1.4.0.1.
Description

`pipenv install confluent-kafka` works fine, but `PIP_NO_BINARY=confluent-kafka pipenv install confluent-kafka==1.4.0` fails, indicating that version 1.4.0 is not available; the maximum version shown as available is 1.3.0. Also, `PIP_NO_BINARY=confluent-kafka pipenv install confluent-kafka==1.3.0` works fine.

How to reproduce
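The commands from the description, as a sketch; the plain-pip line is an assumption about the equivalent invocation, not something the reporter ran:

```shell
# Wheel install succeeds:
pipenv install confluent-kafka

# Forcing a source build of 1.4.0 fails (sdist is broken):
PIP_NO_BINARY=confluent-kafka pipenv install confluent-kafka==1.4.0

# Plain-pip equivalent (assumed to show the same failure mode):
pip install --no-binary confluent-kafka confluent-kafka==1.4.0
```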
Checklist

Please provide the following information:

- confluent-kafka-python and librdkafka version (`confluent_kafka.version()` and `confluent_kafka.libversion()`):
- Client configuration: `{...}`
- Provide logs (with `'debug': '..'` as necessary)