Get packaged for debian #1432

Open
jeffkaufman opened this issue Nov 15, 2016 · 41 comments

@jeffkaufman
Contributor

Debian has an existing bug open asking for mod_pagespeed integration: debian:602316.

Their docs on how to get started as a package maintainer: Debian New Maintainers' Guide

I think the process here would be something like:

  1. Start with the tarball for 1.11.33.4 (1.11.33.4-r0.tar.bz2)
  2. Using the debian quilt patching workflow, modify it to build against debian packages instead of bundled dependencies. (See the list I posted at the bottom of debian:602316; a sketch of the quilt workflow is below.)
  3. When we have something that works, start talking to debian. We need to find a sponsor, since we're not debian maintainers ourselves.

(Everything seems pretty resolvable, except for the situation with serf. Not sure what to do there; we should talk to someone at debian. But we can start the process of getting a debian-compliant tarball without that.)
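For step 2, a minimal sketch of the quilt workflow (the patch name and the file edited here are just illustrative, not the actual set of changes we'll need):

# Unpack the upstream tarball and work inside it (paths are assumptions).
tar -xjf 1.11.33.4-r0.tar.bz2 && cd modpagespeed-1.11.33.4
export QUILT_PATCHES=debian/patches
quilt new use-system-libjpeg.patch             # start a new patch
quilt add src/net/instaweb/mod_pagespeed.gyp   # register the file before editing it
# ... edit the gyp file to point at the Debian -dev package instead of the bundled copy ...
quilt refresh                                  # write the change out to debian/patches/
dpkg-buildpackage -us -uc                      # unsigned test build of the package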

@jeffkaufman
Contributor Author

Debian's "no new packages" date for their next release is right after 1/1, so if we want to get in for Debian 9 we need to move quickly.

@morlovich
Contributor

Hmm, what distro should we develop on? I don't think Debian 9 pre-releases are available on GCE; I wonder whether the difference from 8 would be material or not.


@jeffkaufman
Contributor Author

It looks like you set up Debian 8 and then upgrade it to unstable: how do I install unstable
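For reference, a rough sketch of that upgrade path (the sed over sources.list is just one way to do it; adjust to taste):

# Point apt at unstable ("sid") instead of jessie, then upgrade.
sudo sed -i 's/jessie/unstable/g' /etc/apt/sources.list
sudo apt-get update
sudo apt-get dist-upgrade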

@morlovich
Contributor

Made some progress --- getting something that probably loads our .so (assuming I actually have everything in patches and not directly in the source tree), and mostly links natively.
Attaching the Debian dir sketch thus far --- still lots of things incomplete, but I feel uncomfortable when I don't have a version control system backing me up.

debian_2.tar.gz

@morlovich
Contributor

Will attach some more after I work around github not liking tar.xz.
This actually seems to build self-contained (though off smaller tarballs than we start with --- will send a review for the build script for that). The package installs our configs, etc. pagespeed.conf needs some missing substitutions fixed, and I need to work on the docs part too, but this is kinda close.

Couldn't use the system modp_b64 and google-sparsehash; those packages seem to have trouble. Didn't touch closure, either.

(Also should remember to limit archs to i386 and amd64)
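A sketch of how that arch restriction might look in the debian/control binary stanza (the Depends line here is illustrative; the real one is filled in at build time):

Package: libapache2-mod-pagespeed
# Only build/ship on the architectures mod_pagespeed supports:
Architecture: i386 amd64
Depends: ${shlibs:Depends}, ${misc:Depends}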

dpkg-deb -I libapache2-mod-pagespeed_1.11.33.4-1_amd64.deb
new debian package, version 2.0.
size 1311220 bytes: control archive=1103 bytes.
139 bytes, 3 lines conffiles
696 bytes, 11 lines control
338 bytes, 4 lines md5sums
596 bytes, 22 lines * postinst #!/bin/sh
680 bytes, 28 lines * postrm #!/bin/sh
738 bytes, 28 lines * prerm #!/bin/sh
Package: libapache2-mod-pagespeed
Source: modpagespeed
Version: 1.11.33.4-1
Architecture: amd64
Maintainer: Maksim Orlovich morlovich@google.com
Installed-Size: 5642
Depends: libc6 (>= 2.14), libgcc1 (>= 1:3.4), libgflags2v5, libgif7 (>= 5.1), libicu57 (>= 57.1-1~), libjpeg9 (>= 9a), libjsoncpp1 (>= 1.7.4), libpng16-16 (>= 1.6.2-1), libprotobuf10, libre2-3 (>= 20160901), libssl1.0.2 (>= 1.0.2d), libstdc++6 (>= 5.2), libwebp6 (>= 0.5.1), libwebpmux2 (>= 0.5.1), zlib1g (>= 1:1.1.4), apache2-api-20120211, apache2-bin (>= 2.4.16)
Section: web
Priority: optional
Homepage: https://developers.google.com/speed/pagespeed/module/
Description: Apache module for rewriting web pages for efficiency

@morlovich
Contributor

modpagespeed_1.11.33.4-1.debian.tar.gz

More metadata filled in, fixes for pagespeed.conf generation, and a fix for grey_alpha handling in libpng (preliminary; needs more review and a trunk link).

@jeffkaufman
Contributor Author

Let me know when you have something you'd like me to look at.

We should also (at some point) start looping debian people in to see what they think.

@morlovich
Contributor

It's pretty close --- see the doc for status. Actually, there is one thing I need your help with:
We need a paragraph-length or so package description. (And perhaps a better synopsis than what I have).
Spec for what's needed: https://www.debian.org/doc/debian-policy/ch-binary.html#s-descriptions
Currently have:
Description: Apache module for rewriting web pages for efficiency

@jeffkaufman
Contributor Author

How about:

Apache module for rewriting web pages for efficiency. mod_pagespeed optimizes web pages and resources as they flow through your server, saving its optimizations in a cache. Core optimizations include long caching, minifying, combining, and inlining, which it can apply to JavaScript, CSS, and images.
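In debian/control terms that would become the one-line synopsis plus an indented extended description (continuation lines start with a space), roughly:

Description: Apache module for rewriting web pages for efficiency
 mod_pagespeed optimizes web pages and resources as they flow through your
 server, saving its optimizations in a cache. Core optimizations include long
 caching, minifying, combining, and inlining, which it can apply to
 JavaScript, CSS, and images.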

@morlovich
Contributor

Used it, thanks.

@morlovich
Contributor

modpagespeed_1.11.33.4-1.debian.tar.gz

So this is really close. Probably --- I should actually try running it :)

Wondering where a good place would be to put the .xz version of that, the .dsc file, and the slimmed-down tarball --- the three together basically correspond to a Debian source package.

(Might also make sense to put the contents of the above tarball on GitHub somehow?)

@morlovich
Contributor

... Crashes at startup with a bad_alloc thrown in some proto stuff. Might need to do something with initialization:

#1  0x00007ffff718440a in __GI_abort () at abort.c:89
#2  0x00007fffefa8104d in __gnu_cxx::__verbose_terminate_handler() () from /usr/lib/x86_64-linux-gnu/libstdc++.so.6
#3  0x00007fffefa7f016 in ?? () from /usr/lib/x86_64-linux-gnu/libstdc++.so.6
#4  0x00007fffefa7f061 in std::terminate() () from /usr/lib/x86_64-linux-gnu/libstdc++.so.6
#5  0x00007fffefa7f279 in __cxa_throw () from /usr/lib/x86_64-linux-gnu/libstdc++.so.6
#6  0x00007fffefa7f79c in operator new(unsigned long) () from /usr/lib/x86_64-linux-gnu/libstdc++.so.6
#7  0x00007ffff02b2795 in google::protobuf::internal::ArenaStringPtr::CreateInstanceNoArena(std::__cxx11::basic_string<char, std::char_traits, std::allocator > const*) () from /usr/lib/x86_64-linux-gnu/libprotobuf.so.10
#8  0x00007ffff031ed12 in google::protobuf::FileDescriptorProto::MergePartialFromCodedStream(google::protobuf::io::CodedInputStream*) () from /usr/lib/x86_64-linux-gnu/libprotobuf.so.10
#9  0x00007ffff02c5295 in google::protobuf::MessageLite::ParseFromArray(void const*, int) () from /usr/lib/x86_64-linux-gnu/libprotobuf.so.10
#10 0x00007ffff02fc166 in google::protobuf::EncodedDescriptorDatabase::Add(void const*, int) () from /usr/lib/x86_64-linux-gnu/libprotobuf.so.10
#11 0x00007ffff02d4b25 in google::protobuf::DescriptorPool::InternalAddGeneratedFile(void const*, int) () from /usr/lib/x86_64-linux-gnu/libprotobuf.so.10
#12 0x00007ffff3e3d324 in net_instaweb::protobuf_AddDesc_net_2finstaweb_2frewriter_2fmobilize_5flabeling_2eproto () at out/Release/obj/gen/protoc_out/instaweb/net/instaweb/rewriter/mobilize_labeling.pb.cc:92
#13 0x00007ffff7de95da in call_init (l=, argc=argc@entry=1, argv=argv@entry=0x7fffffffe628, env=env@entry=0x7fffffffe638) at dl-init.c:72
#14 0x00007ffff7de96eb in call_init (env=0x7fffffffe638, argv=0x7fffffffe628, argc=1, l=) at dl-init.c:30
#15 _dl_init (main_map=main_map@entry=0x555555822170, argc=1, argv=0x7fffffffe628, env=0x7fffffffe638) at dl-init.c:120

@morlovich
Contributor

~ProcessContext calling ShutDownProtobufLibrary appears to be the proximate cause.

@morlovich
Contributor

Hmm, more like it does its job right, but on the second load of the load/unload/load sequence some things don't seem to be reinitialized by GoogleOnceInit...

==5556== Invalid read of size 8
==5556== at 0xCA94713: google::protobuf::internal::ArenaStringPtr::CreateInstanceNoArena(std::__cxx11::basic_string<char, std::char_traits, std::allocator > const*) (in /usr/lib/x86_64-linux-gnu/libprotobuf.so.10.0.0)
==5556== by 0xCAFFC76: google::protobuf::DescriptorProto::MergePartialFromCodedStream(google::protobuf::io::CodedInputStream*) (in /usr/lib/x86_64-linux-gnu/libprotobuf.so.10.0.0)
==5556== by 0xCB0078B: google::protobuf::FileDescriptorProto::MergePartialFromCodedStream(google::protobuf::io::CodedInputStream*) (in /usr/lib/x86_64-linux-gnu/libprotobuf.so.10.0.0)
==5556== by 0xCAA7294: google::protobuf::MessageLite::ParseFromArray(void const*, int) (in /usr/lib/x86_64-linux-gnu/libprotobuf.so.10.0.0)
==5556== by 0xCADE145: google::protobuf::EncodedDescriptorDatabase::Add(void const*, int) (in /usr/lib/x86_64-linux-gnu/libprotobuf.so.10.0.0)
==5556== by 0xCAB6B24: google::protobuf::DescriptorPool::InternalAddGeneratedFile(void const*, int) (in /usr/lib/x86_64-linux-gnu/libprotobuf.so.10.0.0)
==5556== by 0x8CDE323: net_instaweb::protobuf_AddDesc_net_2finstaweb_2frewriter_2fmobilize_5flabeling_2eproto() (mobilize_labeling.pb.cc:92)
==5556== by 0x400F5D9: call_init.part.0 (dl-init.c:72)
==5556== by 0x400F6EA: call_init (dl-init.c:30)
==5556== by 0x400F6EA: _dl_init (dl-init.c:120)
==5556== by 0x4013C67: dl_open_worker (dl-open.c:575)
==5556== by 0x400F483: _dl_catch_error (dl-error.c:187)
==5556== by 0x4013418: _dl_open (dl-open.c:660)
==5556== Address 0x6564ea0 is 0 bytes inside a block of size 32 free'd
==5556== at 0x4C2C2DB: operator delete(void*) (vg_replace_malloc.c:576)
==5556== by 0xCA89BFA: google::protobuf::ShutdownProtobufLibrary() (in /usr/lib/x86_64-linux-gnu/libprotobuf.so.10.0.0)
==5556== by 0x8E0997D: net_instaweb::ProcessContext::~ProcessContext() (process_context.cc:68)
==5556== by 0x8C0CB14: net_instaweb::(anonymous namespace)::ApacheProcessContext::~ApacheProcessContext() (mod_instaweb.cc:311)
==5556== by 0x5758C8E: __cxa_finalize (cxa_finalize.c:56)
==5556== by 0x8BFF0F2: ??? (in /usr/lib/apache2/modules/mod_pagespeed.so)
==5556== by 0x40147D8: _dl_close_worker (dl-close.c:286)
==5556== by 0x401539D: _dl_close (dl-close.c:822)
==5556== by 0x400F483: _dl_catch_error (dl-error.c:187)
==5556== by 0x6107520: dlerror_run (dlerror.c:163)
==5556== by 0x6106FDE: dlclose (dlclose.c:46)
==5556== by 0x52E88B6: ??? (in /usr/lib/x86_64-linux-gnu/libapr-1.so.0.5.2)
==5556== Block was alloc'd at
==5556== at 0x4C2B21F: operator new(unsigned long) (vg_replace_malloc.c:334)
==5556== by 0xCAA62FD: google::protobuf::internal::InitEmptyString() (in /usr/lib/x86_64-linux-gnu/libprotobuf.so.10.0.0)
==5556== by 0xCA8B0A6: google::protobuf::GoogleOnceInitImpl(long*, google::protobuf::Closure*) (in /usr/lib/x86_64-linux-gnu/libprotobuf.so.10.0.0)
==5556== by 0xCAEB873: google::protobuf::FileDescriptorProto::SharedCtor() (in /usr/lib/x86_64-linux-gnu/libprotobuf.so.10.0.0)
==5556== by 0xCAEBA16: google::protobuf::FileDescriptorProto::FileDescriptorProto() (in /usr/lib/x86_64-linux-gnu/libprotobuf.so.10.0.0)
==5556== by 0xCADE137: google::protobuf::EncodedDescriptorDatabase::Add(void const*, int) (in /usr/lib/x86_64-linux-gnu/libprotobuf.so.10.0.0)
==5556== by 0xCAB6B24: google::protobuf::DescriptorPool::InternalAddGeneratedFile(void const*, int) (in /usr/lib/x86_64-linux-gnu/libprotobuf.so.10.0.0)
==5556== by 0xCAAE103: google::protobuf::protobuf_AddDesc_google_2fprotobuf_2fany_2eproto() (in /usr/lib/x86_64-linux-gnu/libprotobuf.so.10.0.0)
==5556== by 0x400F5D9: call_init.part.0 (dl-init.c:72)
==5556== by 0x400F6EA: call_init (dl-init.c:30)
==5556== by 0x400F6EA: _dl_init (dl-init.c:120)
==5556== by 0x4013C67: dl_open_worker (dl-open.c:575)
==5556== by 0x400F483: _dl_catch_error (dl-error.c:187)

... while removing the ShutDown just produces:
[libprotobuf ERROR google/protobuf/descriptor_database.cc:57] File already exists in database: net/instaweb/rewriter/mobilize_labeling.proto
[libprotobuf FATAL google/protobuf/descriptor.cc:1164] CHECK failed: generated_database_->Add(encoded_file_descriptor, size):

... Which also suggests a bunch of stuff sticks around after the unload (I would think it unreachable).

@morlovich
Contributor

I think I am semi-understanding this.
libprotobuf isn't actually unloaded because of something called STB_GNU_UNIQUE.

So, 2 scenarios:
A) We call protobuf shutdown on our unload: we delete a whole bunch of stuff... but since protobuf isn't actually unloaded, it doesn't think it needs to reinitialize things. (Perhaps the cleanup routines should reset the GoogleOnce stuff just in case... but they don't.)
B) If we don't call protobuf shutdown, it remembers everything just fine, but we don't, so protobuf check-fails when we try to re-register types.

Not sure if there is any option better than just picking ourselves.

@morlovich
Contributor

pinning, rather.

@morlovich
Contributor

This works, and I get quite a bit through integration tests (though not entirely):

--- a/src/net/instaweb/rewriter/process_context.cc
+++ b/src/net/instaweb/rewriter/process_context.cc
@@ -62,10 +62,10 @@
   // starting up.
   ShutDownCommandLineFlags();
 
-  // The protobuf shutdown infrastructure is lazily initialized in a threadsafe
-  // manner. See third_party/protobuf/src/google/protobuf/stubs/common.cc,
-  // function InitShutdownFunctionsOnce.
+#if 0
+  // See comments in ApacheProcessContext ctor for why we can't use this here.
   google::protobuf::ShutdownProtobufLibrary();
+#endif
 
   url::Shutdown();
   HtmlKeywords::ShutDown();
--- a/src/net/instaweb/mod_pagespeed.gyp
+++ b/src/net/instaweb/mod_pagespeed.gyp
@@ -29,6 +29,9 @@
       'dependencies+': [
         '<(DEPTH)/third_party/httpd/httpd.gyp:include',
       ],
+      'libraries': [
+        '-ldl',
+      ],
     },
     {
       'target_name': 'mod_pagespeed_ap24',
--- a/src/pagespeed/apache/mod_instaweb.cc
+++ b/src/pagespeed/apache/mod_instaweb.cc
@@ -19,6 +19,7 @@
 // that mod_pagespeed needs to do to be an Apache module.
 
 #include <unistd.h>
+#include <dlfcn.h>
 
 #include
 #include
@@ -304,6 +305,27 @@
 class ApacheProcessContext {
  public:
   ApacheProcessContext() : apache_cmds_(NULL) {
+    // One of our dependencies is Protocol Buffers. Dynamically linking
+    // to them causes a problem if the following sequence of steps happens:
+    //
+    // 1) unload mod_pagespeed.so
+    // 2) libprotobuf.so sticks around.
+    // 3) load mod_pagespeed.so again
+    //
+    // When that occurs, a lot of global state ends up being out of sync
+    // between the two .so's regardless of what we do.
+    //
+    // Oh, and w/o this patch, this happens at server startup. Steps 1 and 3
+    // happen because this is just how Apache + apr work. Step 2 is because it
+    // turns out that libprotobuf triggers something called STB_GNU_UNIQUE,
+    // which, among other things, prevents the library from being unloaded.
+    // (Even if that wasn't there, another client could potentially keep
+    // libprotobuf.so around just via refcount).
+    //
+    // The result being, we can't let ourselves be unloaded... so this asks
+    // the dynamic linker to bump the refcount... and sadly hardcodes our path.
+    dlopen("/usr/lib/apache2/modules/mod_pagespeed.so", RTLD_GLOBAL | RTLD_NOW);
+    ApacheRewriteDriverFactory::Initialize();
     InstallCommands();
   }

@morlovich
Contributor

modpagespeed_1.11.33.4-1.debian.tar.gz

Looking at integration tests now. A non-trivial number do fail (haven't sussed out the pattern yet), but also most of them do work.

@morlovich
Contributor

modpagespeed_1.11.33.4-1.debian.tar.gz

Found another problem involving tail padding.
Minor notes on this:

  1. libpng-padidng-init.patch may be tweaked a bit; it's out for review right now.
  2. test-hacking.patch is me trying to debug some integration test trouble; not meant to be final/shippable.

@morlovich
Contributor

debian-source-package.tar.gz

OK, this is everything that should be in the source package --- including the original tarball[1], the debian dir, and the .dsc file linking the two.

[1] Should probably find a proper home for that, and export the script to trunk --- it's basically a subset of the existing tarball.

@jeffkaufman
Contributor Author

Working on getting this uploaded to mentors.debian.org so I can file an RFS (request for sponsorship). Steps so far:

gpg --gen-key
# made account on mentors.debian.org, uploaded my gpg public key
wget https://github.com/pagespeed/mod_pagespeed/files/640112/debian-source-package.tar.gz
tar -xvzf debian-source-package.tar.gz
dpkg-source -x modpagespeed_1.11.33.4-1.dsc
cd modpagespeed-1.11.33.4
# edited debian/control to change the Maintainer line to me, to match the gpg key I generated,
# this wasn't enough, needed to explicitly override `--sign-key` to be my gpg key
dpkg-buildpackage --sign-key=8EB3255A1B4B6174628F2D0F5EE668494E566201

@jeffkaufman
Contributor Author

Following https://mentors.debian.net/intro-maintainers I put:

[mentors]
fqdn = mentors.debian.net
incoming = /upload
method = http
allow_unsigned_uploads = 0
progress_indicator = 2
# Allow uploads for UNRELEASED packages                                          
allowed_distributions = .*

in my ~/.dput.cf and then ran:

 dput mentors ../modpagespeed_1.11.33.4-1_amd64.changes

@jeffkaufman
Contributor Author

The docs say "If you did everything right, you will get a confirmation mail from our site and you can start seeking a sponsor for your package" but I don't see an email from them and I don't see any packages on https://mentors.debian.net/packages/my

@jeffkaufman
Contributor Author

https://mentors.debian.net/qa says:

If you upload via HTTP, which is what we recommend, then it will take between 0 and 2 minutes. ... During those 0-2 minutes, the server does quality assurance and other checks on your package. You will receive an email when it is ready.

but it's been more like 10 minutes. I can try being more patient.

@jeffkaufman
Contributor Author

Output of dput was:

Checking signature on .changes
gpgme: ../modpagespeed_1.11.33.4-1_amd64.changes: Valid signature from 5EE668494E566201
Checking signature on .dsc
gpgme: ../modpagespeed_1.11.33.4-1.dsc: Valid signature from 5EE668494E566201
Uploading to mentors (via http to mentors.debian.net):
  Uploading modpagespeed_1.11.33.4-1.dsc: done.
  Uploading modpagespeed_1.11.33.4.orig.tar.bz2: done.        
  Uploading modpagespeed_1.11.33.4-1.debian.tar.xz: done.  
  Uploading libapache2-mod-pagespeed-dbgsym_1.11.33.4-1_amd64.deb: done.        
  Uploading libapache2-mod-pagespeed_1.11.33.4-1_amd64.deb: done.      
  Uploading modpagespeed_1.11.33.4-1_amd64.buildinfo: done.
  Uploading modpagespeed_1.11.33.4-1_amd64.changes: done.
Successfully uploaded packages.

@jeffkaufman
Contributor Author

The docs say to contact their support email "if problems persist" so it sounds like I should rerun the tool. Trying that now.

@jeffkaufman
Contributor Author

$ dput mentors ../modpagespeed_1.11.33.4-1_amd64.changes
Package has already been uploaded to mentors on mentors.debian.net
Nothing more to do for ../modpagespeed_1.11.33.4-1_amd64.changes

@jeffkaufman
Contributor Author

Apparently that message just means that ../modpagespeed_1.11.33.4-1_amd64.mentors.upload exists, so renaming the file and trying again:

 $ mv ../modpagespeed_1.11.33.4-1_amd64.mentors.upload ../modpagespeed_1.11.33.4-1_amd64.mentors.upload.first-try
$ dput mentors ../modpagespeed_1.11.33.4-1_amd64.changes
Checking signature on .changes
gpgme: ../modpagespeed_1.11.33.4-1_amd64.changes: Valid signature from 5EE668494E566201
Checking signature on .dsc
gpgme: ../modpagespeed_1.11.33.4-1.dsc: Valid signature from 5EE668494E566201
Uploading to mentors (via http to mentors.debian.net):
  Uploading modpagespeed_1.11.33.4-1.dsc: done.
  Uploading modpagespeed_1.11.33.4.orig.tar.bz2: done.        
  Uploading modpagespeed_1.11.33.4-1.debian.tar.xz: done.  
  Uploading libapache2-mod-pagespeed-dbgsym_1.11.33.4-1_amd64.deb: done.        
  Uploading libapache2-mod-pagespeed_1.11.33.4-1_amd64.deb: done.      
  Uploading modpagespeed_1.11.33.4-1_amd64.buildinfo: done.
  Uploading modpagespeed_1.11.33.4-1_amd64.changes: done.
Successfully uploaded packages.

@jeffkaufman
Contributor Author

Checked my spam folder, no response from debian there.

@jeffkaufman
Contributor Author

It sent the rejection to morlovich@; I probably need to change another file

@jeffkaufman
Contributor Author

Changed debian/changelog:

  • changed the name and email to Jeff Kaufman <jefftk@google.com>
  • also set the bug number that this closes to #602316 (our ITP); a sketch of the resulting entry is below
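A sketch of what the resulting debian/changelog entry would look like (the urgency and date line are illustrative):

modpagespeed (1.11.33.4-1) unstable; urgency=medium

  * Initial Debian packaging. (Closes: #602316)

 -- Jeff Kaufman <jefftk@google.com>  Fri, 09 Dec 2016 12:00:00 -0500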

@jeffkaufman
Contributor Author

Running dpkg-buildpackage without --sign-key, since whatever email it uses to figure out which signing key to use is probably the same one their connect-this-package-to-a-user-account system is using.
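In other words (a sketch; with no --sign-key, dpkg-buildpackage/debsign pick the key matching the name and email in the latest debian/changelog entry):

dpkg-buildpackage    # key chosen from the changelog's "Jeff Kaufman <jefftk@google.com>"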

@jeffkaufman
Contributor Author

Package upload was successful!

@morlovich there are a bunch of errors it finds with the package: https://mentors.debian.net/package/modpagespeed -- are these expected?

@jeffkaufman
Contributor Author

@morlovich why is genfiles in this package at all?

@morlovich
Contributor

morlovich commented Dec 9, 2016 via email

@jeffkaufman
Contributor Author

the JS ones are since we are using precompiled JS stuff by default

I think this is incompatible with their policy guidelines. I think we probably need to set things up so it builds the js from source by default.

(I said this by IM as well, but I wanted to make it public for future people reading this.)

@morlovich
Contributor

morlovich commented Dec 9, 2016 via email

@jeffkaufman
Contributor Author

Filed the RFS https://bugs.debian.org/847603

@morlovich
Contributor

work.diff.txt

Also: packaging of pagespeed_js_minify, and a license for one thing that ought to be in the tarball but isn't.

@morlovich
Contributor

... And now I notice that the extra in the copyright file should be externs
