Explicitly ignore rel='download' links while looking for html pages. #677
Yesterday I got tired of having to wait 15 to 20 minutes every time I need to reinstall the virtualenv of my current project (a bad connection also made this more noticeable). That is clearly too long considering I have a requirements.txt explicitly pinning each package version, with every package archive already present in the download cache folder.
Ideally I hoped to find an easy way to check the cache first when an explicit version is specified, but that would require too much work: the cache is currently only consulted after every link of every requested page has been analyzed.
Nevertheless, after profiling several pip installations, I found that much of the time is spent retrieving pages that cannot contain useful links.
I propose not to include rel='download' links when looking for HTML pages, since those point directly at downloadable archives.
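A minimal sketch of the kind of filtering proposed here (this is not the actual pip patch; the class and variable names are mine), assuming an HTML index page whose anchors may carry a rel="download" attribute:

```python
# Hypothetical sketch, not pip's real link parser: collect candidate
# page links from an index page, skipping anchors marked rel="download".
from html.parser import HTMLParser

class PageLinkFilter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.page_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # rel="download" marks a direct archive link, not an HTML page
        # worth fetching and scanning, so skip it here.
        if "download" in attrs.get("rel", "").split():
            return
        if "href" in attrs:
            self.page_links.append(attrs["href"])

parser = PageLinkFilter()
parser.feed('<a rel="download" href="pkg-1.0.tar.gz">dl</a>'
            '<a href="http://example.com/pkg/">home</a>')
print(parser.page_links)  # only the non-download link remains
```

The point of skipping these anchors up front is that each one would otherwise cost a full HTTP round trip just to discover the response is an archive, not HTML.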
A simple comparison with the unpatched develop branch:
*Patched develop branch:
As you can see, we save almost half of the time spent in page retrieval.
I ensured all tests pass.
The test verifies that pip correctly filters links when told to. It doesn't check that we actually exclude the rel='download' links.
added a commit to this pull request on Oct 24, 2012
Bummer. The following test fails:
INITools ends up being correctly installed; the problem is that nose catches an httplib.BadStatusLine exception.
According to the Python docs (http://docs.python.org/library/httplib.html#httplib.BadStatusLine):
Could it be there was a problem with the PyPI mirrors during the test? I'm not really sure where to start looking.
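If the mirrors are the culprit, one way to confirm (and tolerate) the intermittent failure is to retry on that specific exception. This is just a hedged sketch, not part of the patch; the function name and retry count are illustrative, and it uses Python 3's http.client spelling of httplib.BadStatusLine:

```python
# Illustrative retry wrapper (not from this pull request): retries a
# fetch when the server closes the connection or returns a malformed
# status line, which is what raises BadStatusLine.
import http.client
import urllib.request

def fetch_with_retry(url, attempts=3):
    last_error = None
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url) as response:
                return response.read()
        except http.client.BadStatusLine as exc:
            # A flaky mirror can drop the connection mid-handshake;
            # remember the error and try again instead of failing.
            last_error = exc
    raise last_error
```

If the test only fails under such a wrapper after several attempts, that would point at the mirrors rather than the patch.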