WIP: Feature: creation of relocatable tarballs from installed packages #1013

Closed (wants to merge 193 commits)

Commits (193 total; changes shown are from 105 commits)
e67b7eb
add create_tarball command to package build artifacts into tarballs
hegner Feb 17, 2016
84de4be
move to non-CERN specific naming conventions
hegner Feb 17, 2016
9c67751
first test version of binary install procedures
hegner Feb 17, 2016
c053647
remove debug printout
hegner Feb 17, 2016
d31ba16
add version to tarball name
hegner Feb 18, 2016
2ef4768
Merge branch 'develop' of https://github.com/LLNL/spack into binary_p…
hegner Feb 18, 2016
90fcba0
Add recursion and force-overwrite to create-tarball.
brettviren Feb 19, 2016
95fe15a
Merge pull request #1 from brettviren/features/binary-packages
hegner Feb 19, 2016
860e5df
Fix dependency recursion.
brettviren Feb 19, 2016
7719192
Merge pull request #2 from brettviren/features/binary-packages
hegner Feb 19, 2016
4498737
Extend binary packaging features
hegner Feb 20, 2016
e7773ce
remove accidentally added files
hegner Feb 20, 2016
4c8c82a
Merge branch 'develop' of https://github.com/LLNL/spack into binary_p…
hegner May 7, 2016
8324fcf
remove duplicate creation of install layout
hegner May 7, 2016
3210dd2
use mirrors for downloading binary tarballs
hegner May 7, 2016
5fb14db
use full version of linux distribution
hegner May 7, 2016
8446096
remove duplicated fetch; remnant from last merge which lead to duplic…
hegner May 8, 2016
80dbd10
add relocation for shebang and config files
hegner May 8, 2016
6bad462
reduce verbosity
hegner May 9, 2016
bfa5f2c
refine relocation
hegner May 9, 2016
5911b0b
fix wrong ident
hegner May 9, 2016
779b8dc
Merge branch 'develop' of https://github.com/LLNL/spack into binary_p…
hegner May 12, 2016
1f9a01d
address flake8 checks
hegner May 12, 2016
a476bd7
quote pathnames in case there are spaces
gartung May 11, 2016
132993d
fix compression flag
hegner May 12, 2016
a506bf9
making this work on OSX
gartung May 11, 2016
ebd7fe9
clean up mac relocation
hegner May 12, 2016
4282850
code cleanup
hegner May 12, 2016
f9cd9a2
refactoring; documentation; testing
hegner May 15, 2016
66a812d
pep8
hegner May 15, 2016
0607152
various fixes
hegner May 17, 2016
a3783c1
use only otool and install_name_tool
gartung May 17, 2016
5f85533
fixes
gartung May 17, 2016
6e1257e
fixes #967
alalazo May 18, 2016
164da8e
Version bump to 0.9.1
tgamblin May 18, 2016
987c549
using @rpath was a bust; switched replacing olddir with newdir; add o…
gartung May 18, 2016
b48047e
revert change to cc script
gartung May 18, 2016
4d3a7dc
fix bug that kept LC_LOAD_DYLIB from being changed
gartung May 19, 2016
a285488
fix incorrect change
gartung May 23, 2016
4033646
typo
gartung May 23, 2016
1602d70
Merge branch 'llnl-develop' into binary_packages
gartung Jun 22, 2016
acaff17
fix typo from merge
gartung Jun 22, 2016
a8e2230
fixup merge
gartung Jun 22, 2016
f2bd84e
more merge fixups
gartung Jun 22, 2016
f42ff74
even more merge fixups
gartung Jun 22, 2016
20953d7
these were removed during merge
gartung Jun 23, 2016
b26f811
do not fail if tarball exists
gartung Jun 23, 2016
c67e4a7
only create tarball if it does not exist
gartung Jun 23, 2016
8c32777
only create tarball if it does not exist
gartung Jun 23, 2016
6f7bcd2
Merge remote-tracking branch 'origin/develop' into binary_packages
gartung Jun 27, 2016
7446f7c
Merge remote-tracking branch 'llnl/develop' into binary_packages
gartung Jun 30, 2016
436d734
Merge branch 'binary_packages' into llnl-develop
gartung Aug 1, 2016
e9bc73c
Merge branch 'develop' of https://github.com/LLNL/spack into llnl-dev…
gartung Aug 1, 2016
90bbdca
Loop over all mirrors trying to find binary tarball
gartung Aug 4, 2016
9ce4863
Merge remote-tracking branch 'llnl/develop' into binary_packages
gartung Aug 9, 2016
22b0e4b
install-policy -> fetch-binary
gartung Aug 11, 2016
4f8167b
Don't assume spack is in the path when building docs.
tgamblin Aug 15, 2016
1d73ce4
Merge branch 'develop' of https://github.com/LLNL/spack into binary_p…
gartung Aug 30, 2016
29e4233
Merge branch 'develop' of https://github.com/LLNL/spack into binary_p…
gartung Aug 30, 2016
cb74a9b
Merge remote-tracking branch 'llnl/develop' into binary_packages
gartung Sep 22, 2016
5f2243a
Merge branch 'develop' of https://github.com/LLNL/spack into binary_p…
gartung Sep 22, 2016
1cf9e76
Merge remote-tracking branch 'my/binary_packages' into binary_packages
gartung Sep 22, 2016
dc5cf9f
new opsnssl version
gartung Sep 23, 2016
772668e
Merge branch 'binary_packages' of github.com:gartung/spack into binar…
gartung Sep 27, 2016
04dc337
Merge branch 'develop' of https://github.com/LLNL/spack into binary_p…
gartung Sep 28, 2016
cfee296
Merge branch 'binary_packages' of github.com:gartung/spack into binar…
gartung Sep 30, 2016
42a2553
add sqlite 3.12.2
gartung Sep 30, 2016
53b15b5
Merge branch 'binary_packages' of github.com:gartung/spack into binar…
gartung Sep 30, 2016
bf3b163
Merge branch 'develop' of https://github.com/LLNL/spack into binary_p…
gartung Nov 3, 2016
bf7f1e0
changes needed after merging
gartung Nov 10, 2016
8576e76
missed
gartung Nov 11, 2016
fa01377
pass fetch_binary with the new api
gartung Nov 11, 2016
6e99a79
Merge branch 'binary_packages' of github.com:gartung/spack into binar…
gartung Nov 15, 2016
9eb6e1a
this got removed in merge
gartung Nov 15, 2016
7da99bb
Merge branch 'binary_packages' of github.com:gartung/spack into binar…
gartung Nov 15, 2016
6f47aa2
this is the correct path to use
gartung Nov 15, 2016
b0206ba
fix errors travis found
gartung Nov 15, 2016
47f5133
sigh
gartung Nov 15, 2016
aa378db
use upstream sqlite
gartung Nov 15, 2016
e18fddd
now avaliable in upstream
gartung Nov 15, 2016
1d475a7
Merge remote-tracking branch 'llnl/develop' into binary_packages
gartung Nov 15, 2016
ac68a2d
this is really needed for the test
gartung Nov 15, 2016
6000d15
Merge branch 'binary_packages' of github.com:gartung/spack into binar…
gartung Nov 16, 2016
41722b8
more nit picking
gartung Nov 16, 2016
e15a3f0
Merge branch 'binary_packages' of github.com:gartung/spack into binar…
gartung Nov 16, 2016
2245c00
more format fixes
gartung Nov 16, 2016
95e8b91
apple-clang does have cxx14 support
gartung Dec 5, 2016
9d94d24
Merge remote-tracking branch 'my/binary_packages' into binary_packages
gartung Dec 5, 2016
a7c0408
more test fixups
gartung Dec 14, 2016
d9fedc4
Merge remote-tracking branch 'llnl/develop' into binary_packages
gartung Dec 16, 2016
9cd1543
no spaces in docstring???
gartung Dec 16, 2016
c881b54
no leading spaces???
gartung Dec 16, 2016
9412ee1
add binary_caches to index.rst???
gartung Dec 16, 2016
52a9e5d
Merge branch 'releases/v0.10.0'
tgamblin Jan 17, 2017
b2019a6
Merge tag 'v0.10.0' into binary_packages
gartung Mar 15, 2017
8830cfc
Merge remote-tracking branch 'llnl/develop' into binary_packages
gartung Apr 28, 2017
9a4bccf
move fetch_binary actions out of install method
gartung May 12, 2017
9552235
Merge branch 'develop' into binary_packages
gartung May 12, 2017
a1d05c0
move the installation of binary tarballs to its own command
gartung May 12, 2017
1222851
autopep8
gartung May 12, 2017
bb117f6
fixes from testing
gartung May 17, 2017
847ba89
add to description
gartung May 17, 2017
2093e57
remove error for create_tarball and install_tarball when spack is run…
gartung May 17, 2017
eaff47b
more fixes from testing
gartung May 17, 2017
3b74ac1
Add generic variant to gmp for CXXFLAGS=-mtune=generic. This allows a…
gartung May 18, 2017
23265aa
mtune=generic is specific to gcc
gartung May 18, 2017
3148849
autopep8
gartung May 18, 2017
5a58895
recommended fix for Illegal instruction error @ http://www.linuxfroms…
gartung May 18, 2017
01f3887
Merge branch 'develop' into binary_packages
gartung May 22, 2017
b60f74d
Merge branch 'develop' into gmp-generic
gartung May 22, 2017
78e0eb5
WIP: binary_distribution: pseudocode for GPG-signed binaries
mathstuf May 22, 2017
d8e2d17
Expand on Ben Boeckel's use of python tarfile
gartung May 23, 2017
7d64020
update binary_cache doc
gartung May 23, 2017
1961e68
Extract the signature files and the tarball in stage directory.
gartung May 23, 2017
99ed0cc
remove outdated packaging.py
gartung May 23, 2017
beb878d
fix the error from spack help -a
gartung May 23, 2017
7f6c669
replace use of getstatusoutput() with which(command) followed by comm…
gartung May 23, 2017
7a269f8
use filter_file instead of os.system('sed....')
gartung May 23, 2017
e06576f
don't need symbolic links
gartung May 23, 2017
30aa688
spack does not seem to like dashes or underscores in the command names
gartung May 23, 2017
52b614c
unknown command error caused by import of relocate.py failing
gartung May 23, 2017
589f119
autopep8
gartung May 23, 2017
2109078
does spack really hate dashes in command names???
gartung May 23, 2017
80ccb59
Make installtarball and createtarball a multi level command
gartung May 23, 2017
4ef502d
autopep8
gartung May 23, 2017
0a41a51
add context.closing around tarfile.open
gartung Jun 15, 2017
7d38571
only make buildcache for link and run depdencies
gartung Jun 15, 2017
e60249b
Merge remote-tracking branch 'llnl/develop' into binary_packages
gartung Jun 15, 2017
6190135
flake8
gartung Jun 16, 2017
719a471
Merge branch 'develop' into gmp-generic
gartung Jun 16, 2017
cf22340
add rpath derived from compiler used by spack
gartung Jun 19, 2017
ec856d8
don't add this for now
gartung Jun 19, 2017
4d3c062
skip adding gcc_prefix if it comes from CLT
gartung Jun 19, 2017
7b87b44
don't try to relocate libgcc_*.dylib on macOS
gartung Jun 21, 2017
3337724
Merge remote-tracking branch 'llnl/develop' into build_caches
gartung Jun 21, 2017
23d878a
keep a copy of spec.yaml out of the .spack tarball so it can be acces…
gartung Jun 21, 2017
12393eb
keep a copy of spec.yaml out of the .spack tarball so it can be acces…
gartung Jun 21, 2017
c184489
Add buildcache list subcommand. Install buildcaches based on listed s…
gartung Jun 22, 2017
d77e971
Merge branch 'build_caches' of github.com:gartung/spack into build_ca…
gartung Jun 22, 2017
04b0076
construct compiler spec to set gcc_prefix
gartung Jun 22, 2017
ff217da
put spec.yaml in build_cache dir
gartung Jun 22, 2017
3670141
find dependcies the correct way
gartung Jun 22, 2017
a7fef8e
always relocate in case compiler prefix has changed
gartung Jun 22, 2017
dc2b964
Merge remote-tracking branch 'llnl/develop' into build_caches
gartung Jun 22, 2017
38ec75a
perl seems to works more reliably
gartung Jun 22, 2017
c3f6efc
Add relative rpaths to elf objects if original rpath is in spack inst…
gartung Jun 26, 2017
3bafaee
Add ability to install by hash
gartung Jun 27, 2017
780cf4c
turn off debug print
gartung Jun 27, 2017
232a1fd
temp fix
gartung Jun 27, 2017
8b57f11
skip buildcache of externals
gartung Jun 27, 2017
478a7cc
format spec when printing
gartung Jun 27, 2017
9cfc344
Reconstruct install directory from spec.yaml
gartung Jun 27, 2017
2b0fa1f
prelocate for Mach-o objects too
gartung Jun 27, 2017
45485c9
This changes the library ID of libgcc_s.1.dylib from an absolute path…
gartung Jun 28, 2017
714c857
autopep8
gartung Jun 28, 2017
72b8d3f
make changing to relative rpaths before tarball creation optional
gartung Jun 28, 2017
90ce619
Merge branch 'develop' of https://github.com/LLNL/spack into gcc-libg…
gartung Jun 28, 2017
a69f312
make this a variant
gartung Jun 28, 2017
d875e54
Merge branch 'gcc-libgcc-rpath-resub' of github.com:gartung/spack int…
gartung Jun 28, 2017
9dd3b57
Merge branch 'gcc-libgcc-rpath-resub' into build_caches
gartung Jun 28, 2017
dd25a40
Document the need for setting extra_rpaths. Use conflicts instead of …
gartung Jun 28, 2017
b7f5fd2
Merge branch 'gcc-libgcc-rpath-resub' into build_caches
gartung Jun 28, 2017
bc5e7c9
Add two python packages for editing mach-O headers. Much faster than …
gartung Jul 11, 2017
bb85cde
Merge branch 'py-mach-o-tools' into build_caches
gartung Jul 11, 2017
4e3620f
autopep8
gartung Jul 11, 2017
cede86f
Merge branch 'gcc-libgcc-rpath-resub' into build_caches
gartung Jul 11, 2017
29155a7
autopep8
gartung Jul 11, 2017
bae8c04
Merge branch 'gmp-generic' into build_caches
gartung Jul 11, 2017
72f1ab2
remove perl ~ backup file
gartung Jul 12, 2017
968fd77
Merge branch 'build_caches' of github.com:gartung/spack into build_ca…
gartung Jul 12, 2017
e5d32d5
remove non-overriding function
gartung Jul 13, 2017
39a351c
Document the need for setting extra_rpaths. Use conflicts instead of …
gartung Jun 28, 2017
fe4eefe
Merge branch 'develop' of https://github.com/LLNL/spack into gcc-libg…
gartung Jul 13, 2017
0729d67
Merge branch 'gcc-libgcc-rpath-resub' into build_caches
gartung Jul 13, 2017
74afb10
All previous commits squashed into one.
gartung Jul 13, 2017
f3b2c9d
autopep8
gartung Jul 13, 2017
a2a66bf
Merge branch 'py-mach-o-tools' into build_caches
gartung Jul 13, 2017
3d77dd0
Merge remote-tracking branch 'origin/binary_packages' into binary_pac…
gartung Jul 13, 2017
431ae5f
don't submit these on this branch
gartung Jul 13, 2017
47485f2
not needed since testing was removed
gartung Jul 13, 2017
5f993d0
pep8 fixes
gartung Jul 14, 2017
674d7b9
add the options to use pgp2 to sign and verify the build caches
gartung Jul 14, 2017
b9ab9a3
flake8 fixes
gartung Jul 14, 2017
ed301e8
move verify parsing
gartung Jul 14, 2017
b6f1f4d
fix order of arguments
gartung Jul 14, 2017
4ac6d50
change order of arguments in correct function call
gartung Jul 14, 2017
4a81550
make buildcache install -f remove install prefix before unpacking tar…
gartung Jul 14, 2017
8a05baa
Set gpg2 signing and verifying to True by default.
gartung Jul 14, 2017
169afab
Don't include gartung.key. The trusted keys should be copied into var…
gartung Jul 14, 2017
685e458
add buildcache keys command to get keys on mirror and option to trust…
gartung Jul 14, 2017
fb46610
Ask before trusting keys downloaded from mirror.
gartung Jul 17, 2017
a610232
keep the spec file
gartung Jul 18, 2017
4316403
Merge remote-tracking branch 'llnl/develop' into binary_packages
gartung Jul 21, 2017
66 changes: 66 additions & 0 deletions lib/spack/docs/binary_caches.rst
@@ -0,0 +1,66 @@
.. _binary_caches:

Binary caches
============================

.. warning:: The binary cache feature is still experimental, and the chosen
   conventions may evolve over time.

Some sites encourage users to set up their own test environments before
carrying out central installations, and some users simply prefer to maintain
such environments themselves. To avoid recompiling otherwise identical
package specs in different installations, build artifacts can be packed into
binary tarballs, uploaded to your Spack mirror, and then downloaded and
installed by others.


Creating binary tarballs
-----------------------------

Tarballs of built software can be created with ``spack create-tarball``.
It can tar up either a single package or a package together with all of
its dependencies (``-r``, ``--recurse``). The location for the tarballs
can be given via the ``--directory`` option:

.. code-block:: sh

   $ spack create-tarball -d ~/caches -r bison
   ==> recursing dependencies
   ==> adding dependency bison@3.0.4%gcc@5.3.1=linux-x86_64^libsigsegv@2.10%gcc@5.3.1=linux-x86_64^m4@1.4.17%gcc@5.3.1+sigsegv=linux-x86_64
   ==> adding dependency m4@1.4.17%gcc@5.3.1+sigsegv=linux-x86_64^libsigsegv@2.10%gcc@5.3.1=linux-x86_64
   ==> adding dependency libsigsegv@2.10%gcc@5.3.1=linux-x86_64
   ==> creating tarball for package bison@3.0.4%gcc@5.3.1=linux-x86_64^libsigsegv@2.10%gcc@5.3.1=linux-x86_64^m4@1.4.17%gcc@5.3.1+sigsegv=linux-x86_64
   ==> /home/hegner/caches/ubuntu16_04-x86_64/gcc-5.3.1/bison/ubuntu16_04-x86_64-bison-3.0.4-n6naf2v2wt2p5tg3jdveuqufhjwlba7o.tar.gz
   ==> creating tarball for package libsigsegv@2.10%gcc@5.3.1=linux-x86_64
   ==> /home/hegner/caches/ubuntu16_04-x86_64/gcc-5.3.1/libsigsegv/ubuntu16_04-x86_64-libsigsegv-2.10-klc6t4jq2w6ochuz6xosu6vaujbwszds.tar.gz
   ==> creating tarball for package m4@1.4.17%gcc@5.3.1+sigsegv=linux-x86_64^libsigsegv@2.10%gcc@5.3.1=linux-x86_64
   ==> /home/hegner/caches/ubuntu16_04-x86_64/gcc-5.3.1/m4/ubuntu16_04-x86_64-m4-1.4.17-6hpdn55vhztd25vxwuamxqo7edmootwv.tar.gz


The created tarballs are placed in the directory structure expected by a
Spack mirror.


Installing binary tarballs
--------------------------------

To install binary tarballs, first add the corresponding Spack mirror with
``spack mirror add <name> <url>``. Afterwards, binaries can be installed
via:

.. code-block:: sh

   $ spack install-tarball bison

The package ``bison`` and all of its dependencies will be downloaded from the
specified mirror(s). The command fails if any package cannot be downloaded.


Relocation
-------------------------------

The initial build and a later installation do not necessarily happen at the
same location. Spack provides a basic relocation capability that corrects
RPATHs and non-relocatable scripts. However, many packages compile paths
directly into their binary artifacts. In such cases, the build instructions
of the package would need to be adjusted for better relocatability.
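
For scripts and other text files, this relocation essentially amounts to
substituting the original install prefix with the new one. A minimal sketch
of that idea (illustrative only; the actual logic lives in ``spack.relocate``,
which also rewrites RPATHs in binaries):

.. code-block:: python

   def relocate_text_file(path, old_prefix, new_prefix):
       # Illustrative sketch; spack.relocate.relocate_text is the real
       # implementation. Rewrite a text file in place, replacing every
       # occurrence of the old install prefix with the new one.
       with open(path, 'r') as f:
           contents = f.read()
       with open(path, 'w') as f:
           f.write(contents.replace(old_prefix, new_prefix))
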
1 change: 1 addition & 0 deletions lib/spack/docs/index.rst
@@ -65,6 +65,7 @@ or refer to the full manual below.
   repositories
   command_index
   package_list
   binary_caches

.. toctree::
   :maxdepth: 2
230 changes: 230 additions & 0 deletions lib/spack/spack/binary_distribution.py
@@ -0,0 +1,230 @@
# "Benedikt Hegner (CERN)"

import os
import platform
import yaml

import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp

from spack.util.executable import which
import spack.cmd
import spack
from spack.stage import Stage
import spack.fetch_strategy as fs
import spack.relocate


def get_full_system_from_platform():
import platform
import re
system = platform.system()
if system == "Linux":
pf = platform.linux_distribution(full_distribution_name=0)[0]
version = platform.linux_distribution(full_distribution_name=0)[1]
if pf != 'Ubuntu':
# For non-Ubuntu major version number is enough
# to understand compatibility
version = version.split('.')[0]
elif system == "Darwin":
pf = "macos10"
version = platform.mac_ver()[0].split(".")[1]
else:
raise "System %s not supported" % system
sys_type = pf + version + '-' + platform.machine()
sys_type = re.sub(r'[^\w-]', '_', sys_type)
return sys_type.lower()


def prepare():
"""
Install patchelf as pre-requisite to the
required relocation of binary packages
"""
if platform.system() == 'Darwin':
return
dir = os.getcwd()
patchelf_spec = spack.cmd.parse_specs("patchelf", concretize=True)[0]
Member:
Spec('patchelf').concretized() might be simpler here, but there's a better way now.

Contributor:
@tgamblin - thanks, didn't know. What do you mean by "there's a better way now"?

Member:
Adding comments as I go 😄 -- I mean you don't need to concretize this. installed_db.query(), described below, should match against specs that are installed, even if they don't exactly match what patchelf currently concretizes to.

    if not spack.store.layout.check_installed(patchelf_spec):
        patchelf = spack.repo.get(patchelf_spec)
        patchelf.do_install()
    os.chdir(dir)
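
# Hypothetical helper sketching the reviewers' suggestion above: instead of
# concretizing a fresh "patchelf" spec, query the database of installed
# packages and only build patchelf when nothing suitable is installed yet.
# Assumes spack.installed_db.query() returns the matching installed specs,
# as mentioned in the review discussion.
def _ensure_patchelf_installed():
    installed = spack.installed_db.query('patchelf')
    if not installed:
        spec = spack.cmd.parse_specs('patchelf', concretize=True)[0]
        spack.repo.get(spec).do_install()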


def buildinfo_file_name(spec):
    """
    Filename of the binary package meta-data file
    """
    return os.path.join(spec.prefix, ".spack", "binary_distribution")


def read_buildinfo_file(package):
    """
    Read buildinfo file
    """
    filename = buildinfo_file_name(package)
    with open(filename, 'r') as inputfile:
        content = inputfile.read()
    buildinfo = yaml.load(content)
    return buildinfo


def write_buildinfo_file(spec):
    """
    Create a cache file containing information
    required for the relocation
    """
    text_to_relocate = []
    binary_to_relocate = []
    blacklist = (".spack", "man")
Member:
What if they put binaries in man? I mean, this is HPC... I've seen a lot of crazy things.

Contributor:
We removed it because it was just time wasted in there. Scanning plenty of files. I will remove that exclusion.

Member:
If it saves tons of time it might be worth it 😄. I actually haven't seen anyone put binaries in man, but it does seem like a good idea to be complete.

    for root, dirs, files in os.walk(spec.prefix, topdown=True):
Member:
If it's not ridiculously expensive, we should consider just making this file get generated and installed to .spack after every install, so it's easy to make a tarball later.

Contributor:
I'd prefer doing that once we went through the first iterations and know the format is really stable.

Member:
sounds reasonable

        dirs[:] = [d for d in dirs if d not in blacklist]
        for filename in files:
            path_name = os.path.join(root, filename)
            filetype = spack.relocate.get_filetype(path_name)
            if spack.relocate.needs_binary_relocation(filetype):
                rel_path_name = os.path.relpath(path_name, spec.prefix)
                binary_to_relocate.append(rel_path_name)
            elif spack.relocate.needs_text_relocation(filetype):
                rel_path_name = os.path.relpath(path_name, spec.prefix)
                text_to_relocate.append(rel_path_name)

    # Create buildinfo data and write it to disk
    buildinfo = {}
    buildinfo['buildpath'] = spack.store.layout.root
    buildinfo['relocate_textfiles'] = text_to_relocate
    buildinfo['relocate_binaries'] = binary_to_relocate
    filename = buildinfo_file_name(spec)
    with open(filename, 'w') as outfile:
        outfile.write(yaml.dump(buildinfo, default_flow_style=True))
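
# For reference, the generated .spack/binary_distribution file is a small
# YAML document of the following shape (paths below are made-up examples):
#
#   buildpath: /home/user/spack/opt/spack
#   relocate_textfiles: [bin/mylib-config, share/pkgconfig/mylib.pc]
#   relocate_binaries: [bin/mytool, lib/libmylib.so]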


def tarball_directory_name(spec):
    """
    Return the name of the tarball directory according to the convention
    <os>-<architecture>/<compiler>/<package>/
    """
    return "%s/%s/%s" % (get_full_system_from_platform(),
                         str(spec.compiler).replace("@", "-"),
                         spec.name)


def tarball_name(spec):
    """
    Return the name of the tarfile according to the convention
    <os>-<architecture>-<package>-<version>-<dag_hash>.tar.gz
    """
    return "%s-%s-%s-%s.tar.gz" % (get_full_system_from_platform(),
Member:
This should use spack.architecture.sys_type(), which contains the "platform", OS, and target... Any reason the package isn't first? What about:

<package>-<platform>-<os>-<target>-<dag_hash>.tar.gz

Contributor:
It is for convenience, as this is how we order/group the binary archives in the end. From a single-package perspective your comment makes sense, though.

                                   spec.name,
                                   spec.version,
                                   spec.dag_hash())


def tarball_path_name(spec):
    """
    Return the full path+name for a given spec according to the convention
    <tarball_directory_name>/<tarball_name>
    """
    return os.path.join(tarball_directory_name(spec),
                        tarball_name(spec))
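
# Example of the resulting layout for a concrete spec (taken from the
# documentation added in this PR; hash shortened):
#
#   tarball_directory_name(spec) -> ubuntu16_04-x86_64/gcc-5.3.1/bison
#   tarball_name(spec)           -> ubuntu16_04-x86_64-bison-3.0.4-n6naf2v2....tar.gz
#   tarball_path_name(spec)      -> <tarball_directory_name>/<tarball_name>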


def build_tarball(spec, outdir, force=False):
    """
    Build a tarball from given spec and put it into the directory structure
    used at the mirror (following <tarball_directory_name>).
    """
    tarfile_dir = os.path.join(outdir, tarball_directory_name(spec))
    tarfile = os.path.join(outdir, tarball_path_name(spec))
    if os.path.exists(tarfile):
        if force:
            os.remove(tarfile)
        else:
            tty.warn("file exists, use -f to force overwrite: %s" % tarfile)
            return
    if not os.path.exists(tarfile_dir):
        mkdirp(tarfile_dir)

    tar = which('tar', required=True)
Contributor:
Is there a reason we're using the command line and not the tarfile module?

Member Author:
No. It may not have been available at the inception of this branch.

Member:
For the record, we use tar from the command line in other places across Spack. Are there any advantages to using tarfile?

Member Author:
I did not author that part so I was guessing.

Member Author:
Actually what appears to be missing is the path to the tarfile being created.

Member Author:
Bug is in create_tarball.py. Need to set the default output directory to $CWD there.

Member Author:
Or it is elsewhere. Tracing now.

Member Author:
@mathstuf python tarfile is much slower than command line tar.

Member Author:
Actually tarfile is fine. I was creating a tarball of everything installed.

    dirname = os.path.dirname(spec.prefix)
    basename = os.path.basename(spec.prefix)

    # handle meta-data
    cp = which("cp", required=True)
    spec_file = os.path.join(spec.prefix, ".spack", "spec.yaml")
    target_spec_file = tarfile + ".yaml"
Member (@tgamblin, Aug 11, 2016):
@hegner @gartung: is this for easily querying the specs once the tarballs are generated? I couldn't see where this gets used.

Contributor:
Yep. This is there to facilitate the mirroring and querying. You don't want to extract the data from every package "manually" again.

Member:
OK, this would be useful, then, for partial matching if we implement that. This can be used to check if a spec is "good enough" to satisfy a build.

    cp(spec_file, target_spec_file)

    # create info for later relocation and create tar
    write_buildinfo_file(spec)
    tar("--directory=%s" % dirname, "-czf", tarfile, basename)
    tty.msg(tarfile)
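
# Sketch of the tarfile-based alternative discussed in the review above
# (an assumption, using gzip compression and the same archive layout as the
# "tar -czf" call in build_tarball; 'tarball_path' is a hypothetical name
# for the output path):
#
#     import contextlib
#     import tarfile as tarfile_module
#
#     with contextlib.closing(
#             tarfile_module.open(tarball_path, 'w:gz')) as archive:
#         archive.add(name=spec.prefix, arcname=basename)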


def download_tarball(package):
    """
    Download binary tarball for given package into stage area
    Return True if successful
    """
    mirrors = spack.config.get_config('mirrors')
    if len(mirrors) == 0:
        tty.die("Please add a spack mirror to allow " +
                "download of pre-compiled packages.")
    tarball = tarball_path_name(package.spec)
    for key in mirrors:
        url = mirrors[key] + "/" + tarball
        # stage the tarball into standard place
        stage = Stage(url, name=package.stage.path)
Member (@tgamblin, Aug 11, 2016):
@gartung @hegner: This will construct a URLFetchStrategy with no associated checksum, and it'll just trust the remote source that the binary is ok. Before we can put this in the mainline, we need to figure out a way to embed checksums in the packages, although that could result in a lot of information as the number of binary builds grows.

Homebrew's bottle scheme is pretty simple, and it makes it pretty easy to put binary hashes into formulae. Spack has a slightly more difficult situation because a) there can be a lot of different builds for the same package and b) you can't get the symbolic spec straight from a hash.

I think a) and b) can eventually be mitigated if binary archive sites had a way to provide indices of available hashes, so that Spack could download an index of available binaries and find a "close enough" match. For a first cut at this, exact matching of the DAG hash is probably fine, especially if you expect users typically not to pull straight from develop (which would mean that things don't change as much).

Contributor:
@tgamblin - it seems you are mixing quite a few concepts into your comment and I am not sure I get all the points you are trying to make. I believe that is something for the telecon.

And yes, mirrors should have a way to fetch trustworthy hashes. I don't think the Homebrew approach will work. The extraction of the metadata yaml was a first step in that direction.

Member:
OK -- we can discuss more on the call. I think it's worthwhile to have a more detailed discussion about this.

I'm not sure you want to actually fetch the hashes directly from the mirror. You need to have a way to guarantee that the hash itself came from a secure channel. If you trust the server to provide both a reliable checksum and a reliable package, you're vulnerable to a compromised server. The hashes or other verification mechanism need to come along some secure channel. The approach I mentioned of signing packages like Debian seems promising for this, as it's easy to transfer a public key once and verify packages with it from then on.

By the way -- LLNL has some S3 space where we can start to put up a mirror. CERN may have something similar -- we can talk about that on the call as well.

Contributor:
Correct - trustworthiness is the point. More for the telecon.

As for mirrors - we could set something up for a start.

Member Author:
@tgamblin I added this kludge because the previous method of fetching the tarball from a mirror was broken after merging in llnl/develop. I have been using my Linux desktop as the mirror for testing, but it only works onsite at Fermilab. The hashes are there so we can match the built binary to the spec used to build it.

        try:
            stage.fetch()
            return True
        except fs.FetchError:
            continue
    return False
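
# Hypothetical sketch of the integrity check discussed in the review above:
# verify a downloaded tarball against a checksum obtained over a trusted
# channel before extracting it. How such checksums would be distributed is
# exactly the open question in that discussion; 'expected_sha256' is an
# assumed input.
def _tarball_checksum_ok(local_tarball, expected_sha256):
    import hashlib
    sha = hashlib.sha256()
    with open(local_tarball, 'rb') as f:
        for chunk in iter(lambda: f.read(1 << 20), b''):
            sha.update(chunk)
    return sha.hexdigest() == expected_sha256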


def extract_tarball(package):
    """
    Extract the binary tarball for the given package into the install area.
    """
    tarball = tarball_name(package.spec)
    tar = which("tar")
    local_tarball = package.stage.path + "/" + tarball
    mkdirp('%s' % package.prefix)
    tar("--strip-components=1",
        "-C%s" % package.prefix,
        "-xf",
        local_tarball)


def relocate_package(package):
    """
    Relocate the given package.
    """
    buildinfo = read_buildinfo_file(package)
    new_path = spack.store.layout.root
    old_path = buildinfo['buildpath']
    if old_path == new_path:
        return True  # No need to relocate

    tty.warn("Using experimental feature for relocating package from",
             "%s to %s." % (old_path, new_path))

    # as we may need patchelf, find out where it is
    patchelf_executable = ''
    if platform.system() != 'Darwin':
        patchelf_spec = spack.cmd.parse_specs("patchelf", concretize=True)[0]
Contributor:
This is cool; I'll port it over to the GPG branch.

Member Author:
You can thank @hegner

Contributor:
So it actually isn't working here; it keeps throwing NoCompilersForArchError: No compilers found for operating system debian6 and target x86_64. on me. I have fedora25-x86_64. How do I get it to just use any compiler?

Member Author (@gartung, May 23, 2017):
Sorry. I never tested this on anything but RHEL7 clones and macOS.

Contributor:
I'm a little bit behind in following the developments here. It fails in the concretization of that spec. Any compiler would be fine if there is one for your platform. Question would be why you get debian6 on one side and fedora25 on another. Does a command line install of patchelf work?

Contributor:
It also seems that the cmd infrastructure just swallows exceptions? I see the exceptions in the test, but the command just silently does nothing :/ . tty also doesn't actually output anything.

Contributor:
It isn't patchelf, but gnupg@2 I'm doing (same results without the @2). But yes, a command line install seems to work.

Contributor:
Since you have it now installed, can you check whether spack.installed_db.query("gnupg") inside the code yields the proper result?

Contributor:
This returns the right thing: spack.database.Database('opt/spack').query('gnupg@2:'), but I cannot call concretize on a spec :/ .

        patchelf = spack.repo.get(patchelf_spec)
        patchelf_executable = os.path.join(patchelf.prefix, "bin", "patchelf")

    # now do the actual relocation
    for filename in buildinfo['relocate_binaries']:
        path_name = os.path.join(package.prefix, filename)
        spack.relocate.relocate_binary(path_name,
                                       old_path,
                                       new_path,
                                       patchelf_executable)
    for filename in buildinfo['relocate_textfiles']:
        path_name = os.path.join(package.prefix, filename)
        spack.relocate.relocate_text(path_name, old_path, new_path)
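
# Hypothetical helper illustrating what the ELF part of relocate_binary
# boils down to on Linux: read the RPATH recorded in the binary with
# patchelf, substitute the old install prefix, and write it back. The real
# implementation lives in spack.relocate and, per the commit history, also
# handles Mach-O objects with install_name_tool on macOS.
def _rewrite_elf_rpath(patchelf_executable, path_name, old_path, new_path):
    import subprocess
    rpath = subprocess.check_output(
        [patchelf_executable, '--print-rpath', path_name]).decode().strip()
    subprocess.check_call(
        [patchelf_executable, '--set-rpath',
         rpath.replace(old_path, new_path), path_name])
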
68 changes: 68 additions & 0 deletions lib/spack/spack/cmd/create_tarball.py
@@ -0,0 +1,68 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import argparse

import llnl.util.tty as tty

import spack
import spack.cmd
from spack.binary_distribution import build_tarball


description = "Create binary cache tarballs for given packages"
section = "packaging"
level = "long"


def setup_parser(subparser):
    subparser.add_argument('-r', '--recurse', action='store_true',
                           help="also make tarballs for dependencies.")
    subparser.add_argument('-f', '--force', action='store_true',
                           help="overwrite tarball if it exists.")
    subparser.add_argument('-d', '--directory', default=".",
                           help="directory in which to save the tarballs.")

    subparser.add_argument(
        'packages', nargs=argparse.REMAINDER,
        help="specs of packages to package")


def create_tarball(parser, args):
    if not args.packages:
        tty.die("binary cache tarball creation requires "
                "at least one package argument")

    pkgs = set(args.packages)
    specs = set()
    for pkg in pkgs:
        for spec in spack.cmd.parse_specs(pkg, concretize=True):
            specs.add(spec)
            if args.recurse:
                tty.msg('recursing dependencies')
                for d, node in spec.traverse(order='pre', depth=True):
                    tty.msg('adding dependency %s' % node)
                    specs.add(node)
    for spec in specs:
        tty.msg('creating binary cache tarball for package %s ' % spec)
        build_tarball(spec, args.directory, args.force)
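
# Example invocation, taken from the documentation added in this PR:
#
#   spack create-tarball -d ~/caches -r bison
#
# This tars up bison together with its dependencies and writes the tarballs
# into the mirror-style directory layout under ~/caches.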