pipper: account for extra build deps download #36

Merged · 1 commit · Jun 19, 2019
johnnydep/pipper.py (21 changes: 15 additions & 6 deletions)
@@ -10,6 +10,7 @@
 import sys
 import tempfile
 from argparse import ArgumentParser
+from collections import OrderedDict
 
 import pip
 import pkg_resources
@@ -115,18 +116,26 @@ def get(dist_name, index_url=None, env=None, extra_index_url=None):
         raise
     log.debug("wheel command completed ok")
     out = out.decode("utf-8")
-    links = set()
+    links = []
     for line in out.splitlines():
         line = line.strip()
         if line.startswith("Downloading from URL"):
             link = line.split()[3]
-            links.add(link)
+            links.append(link)
         elif line.startswith("Source in ") and "which satisfies requirement" in line:
             link = line.split()[-1]
-            links.add(link)
-    if len(links) != 1:
-        log.warning(out, links=links)
-        raise Exception("Expected exactly 1 link downloaded")
+            links.append(link)
+    links = list(OrderedDict.fromkeys(links))  # order-preserving dedupe
+    if not links:
+        log.warning("could not find download link", out=out)
+        raise Exception("failed to collect dist")
+    if len(links) > 1:
+        log.debug("more than 1 link collected", out=out, links=links)
+        # Since PEP 517, maybe an sdist will also need to collect other distributions
+        # for the build system, even with --no-deps specified. pendulum==1.4.4 is one
+        # example, which uses poetry and doesn't publish any python37 wheel to PyPI.
+        # However, the dist itself should still be the first one downloaded.
+    link = links[0]
     with working_directory(scratch_dir):
         [whl] = [os.path.abspath(x) for x in os.listdir(".") if x.endswith(".whl")]
         url, _sep, checksum = link.partition("#")
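For context (not part of the committed diff): below is a minimal standalone sketch of the link-collection approach this change adopts. The function name collect_dist_link and the sample pip output are invented for illustration; only the "Downloading from URL" / "Source in ... which satisfies requirement" parsing and the order-preserving dedupe mirror the diff.

from collections import OrderedDict

def collect_dist_link(pip_output):
    """Sketch of the link-collection logic above (not part of the PR).

    Every "Downloading from URL ..." or "Source in ... which satisfies
    requirement ..." line in pip's output contributes a candidate link.
    Extra links can appear when pip fetches PEP 517 build requirements
    even with --no-deps, but the requested dist is downloaded first,
    so the first deduped link is the one we want.
    """
    links = []
    for line in pip_output.splitlines():
        line = line.strip()
        if line.startswith("Downloading from URL"):
            links.append(line.split()[3])
        elif line.startswith("Source in ") and "which satisfies requirement" in line:
            links.append(line.split()[-1])
    links = list(OrderedDict.fromkeys(links))  # order-preserving dedupe
    if not links:
        raise Exception("failed to collect dist")
    return links[0]

# Hypothetical log excerpt: the requested sdist appears first, followed by a
# build-system wheel pulled in despite --no-deps (the pendulum/poetry case).
sample_output = """\
Downloading from URL https://files.pythonhosted.org/packages/aa/bb/pendulum-1.4.4.tar.gz#sha256=... (from https://pypi.org/simple/pendulum/)
Downloading from URL https://files.pythonhosted.org/packages/cc/dd/poetry-0.12.17-py2.py3-none-any.whl#sha256=... (from https://pypi.org/simple/poetry/)
"""
print(collect_dist_link(sample_output))  # prints the pendulum sdist link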