unable to use local package uploads #98
Comments
I had to use
So what is the solution if you have mirrored packages and you'd like to submit a fork with a different version of the package?
It seems like there's only one field in the schema that specifies whether a package is local or not, and I don't think you could distinguish which one you meant when you went to install it, except by name. There are a couple of possible workarounds, though.

You could edit the setup.py of the project you've forked and rename it in some way. I don't think you even have to change the packages it provides, just the outer identifier.

If you really want them to live under the same name, you could change the version number to a range the upstream project doesn't use (higher or lower, depending on what you want installed when version numbers are unpinned) and upload with that. Then you can pin the projects that need the fork to that custom version number. The downside is that the package would be 'local' forever, and you'd have to upload upstream versions yourself.

In any case, to get localshop to drop a package it has already mirrored so you can upload one yourself, you will need to remove it from the database. You can use the Django admin (there's a link in the top-right menu) and delete it from the packages table; that cleans up all the release file entries too.
I think for now the best solution is indeed to upload a renamed version. But I sometimes run into the same issue myself, so I'll look for something better.
In this type of case, I have also found renaming the "outer identifier" of the package, as @xandercrews mentions, to be the easiest and least error-prone approach. For example, if a dependency needs to be forked for some reason and the package name (outer identifier) stays the same, then someone who is not configured to use localshop will go out to the official public PyPI instead and possibly grab the wrong package. Changing the outer identifier also serves as a small piece of documentation for why the package needed to be changed. I would be interested in other possible solutions, though.
You must rename the egg if you want to upload a forked package; that's how PyPI works. One notable example of this practice is suds-jurko, a fork of the long-dead suds. I am closing this issue as "won't fix".
Hi,
I'm using localshop 1.4.3, installed per the instructions.
I'm able to install packages via my localshop server per the instructions (or by setting the PIP_INDEX_URL environment variable, which amounts to the same thing).
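For reference, the ways of pointing pip at the localshop index really are interchangeable; a sketch, using the sanitized URL from this report:

```shell
# Three equivalent ways to point pip at a localshop index
# (the URL is the sanitized one from this issue).

# 1. Environment variable (what I used):
export PIP_INDEX_URL=http://10.77.0.5/simple/

# 2. Per-invocation flag:
#    pip install --index-url http://10.77.0.5/simple/ gunicorn

# 3. Persistent pip config (pip.conf):
#    [global]
#    index-url = http://10.77.0.5/simple/
```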
However, I'm trying to upload a fork of gunicorn, in which I've bumped the major version to 180, to my localshop server, and that fails in one of two ways depending on my .pypirc configuration file.
Method 1 (the documented method), with this .pypirc:
[distutils]
index-servers =
internal
[internal]
repository: http://10.77.0.5/simple/
username: 2cebb3ed0e6249de947a20e328a43dd5
password: 438c5415b8944beab6027366f89981a8
$ python setup.py sdist upload -r internal
<...>
running upload
Submitting dist/gunicorn-180.2.tar.gz to http://10.77.0.5/simple/
Upload failed (400): BAD REQUEST
Looking at my server logs, I can see that nginx returned the 400, but there's nothing in the localshop or celery logs indicating what the problem was.
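For what it's worth, the method-1 .pypirc parses the way distutils reads it. The sketch below only inspects the configured endpoint; whether the upload URL should or should not include "/simple" is exactly the open question here, so it reports rather than decides.

```python
# Sketch: parse the .pypirc from this report (credentials are the
# sanitized throwaway values above) and inspect the upload endpoint.
import configparser
import io

PYPIRC = """
[distutils]
index-servers =
    internal

[internal]
repository: http://10.77.0.5/simple/
username: 2cebb3ed0e6249de947a20e328a43dd5
password: 438c5415b8944beab6027366f89981a8
"""

config = configparser.ConfigParser()
config.read_file(io.StringIO(PYPIRC))

repo = config["internal"]["repository"]
print("upload endpoint:", repo)
print("includes /simple:", repo.rstrip("/").endswith("simple"))
```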
Method 2 (something I saw on a blog post): the .pypirc file is the same as in method 1, except the repository URL has been stripped of its "/simple", i.e.:
repository: http://10.77.0.5/
$ python setup.py sdist upload -r internal
<...>
running upload
Submitting dist/gunicorn-180.2.tar.gz to http://10.77.0.5/
Server response (200): OK
But I don't see gunicorn-180.2 in my localshop packages list!
In either case, when I try to install gunicorn through my localshop, it can see the main PyPI versions but never my fork.
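Assuming the upload eventually succeeds, pinning the fork through the localshop index would look something like the sketch below (the `--index-url` option line is standard pip requirements-file syntax; the URL and version are the ones from this report):

```
# requirements.txt (sketch): route pip through the localshop index and
# pin the forked version so upstream gunicorn can never satisfy it.
--index-url http://10.77.0.5/simple/
gunicorn==180.2
```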
I've also tried using the Django superuser's credentials, but no luck there either.
So this is a shame, as I was really hoping to use localshop both to cache all packages and to serve some internal ones, but I can't figure out what's going wrong here.