Various updates to the NCO package and dependencies #2639
Conversation
Actually, just noticed this:

So there is MPI support, but the developers don't recommend using it for now.

Still WIP?

Yeah, I want to hear what the NCO devs have to say.
@@ -38,23 +38,12 @@ class Nco(AutotoolsPackage):

    # See "Compilation Requirements" at:
    # http://nco.sourceforge.net/#bld
    variant('mpi', default=True)

    depends_on('netcdf')
    depends_on('antlr@2.7.7+cxx')  # required for ncap2
    depends_on('gsl')              # desirable for ncap2
    depends_on('udunits2')         # allows dimensional unit transformations
@citibeth I think you were the one who added the udunits2 package. Any reason to go with udunits2 instead of just udunits? The tarball calls it udunits, and we usually go by the tarball.
It's a mess due to inconsistent branding of UDUNITS vs. UDUNITS-2 coming from upstream. Feel free to suggest changing it, but ONLY after you've researched the issue. See here for an explanation, plus a living example of the inconsistent branding:
http://www.unidata.ucar.edu/software/udunits/udunits-2.1.24/udunits2.html
        make()
        make("install")
        return [
            '--disable-csharp',
Spack doesn't support C# and probably won't for a while. There's no reason to have a variant that immediately results in an error.
    version('2.2.0', '2f47841c829facb346eb6e3fab5212e2',
            url="http://downloads.sourceforge.net/project/expat/expat/2.2.0/expat-2.2.0.tar.bz2")
    version('2.1.0', 'dd7dab7a5fea97d2a6a43f511449b7cd',
            url="http://downloads.sourceforge.net/project/expat/expat/2.1.0/expat-2.1.0.tar.gz")
I removed version 2.1.0 as it is no longer available for download.
@@ -52,27 +49,19 @@ class Antlr(Package):
    # Unpatched version
    # url='http://dust.ess.uci.edu/nco/antlr-2.7.7.tar.gz')

-   variant('cxx', default=False, description='Enable ANTLR for C++')
    variant('java', default=False, description='Enable ANTLR for Java')
+   variant('cxx', default=True, description='Enable ANTLR for C++')
I don't see why this shouldn't default to True. A C++ compiler is almost always available.
@adamjstewart I added …
For all these reasons, I would suggest caution in changing ANTLR; if it ain't broke, don't fix it. In particular, I would suggest revisiting your changes to the variant() lines.

There is no point in disabling C# language generation. Who knows? Maybe someone wants to use ANTLR to generate C# code, which they will build outside of Spack on their Mono or Windows system. Similar thing with C++ code generation... the original idea was to turn off all code-generation options by default, and have the user specify which ones they need.

Similarly, ANTLR should not extend Python. And as for depending on JDK... it needs either JDK or GCJ to run, whether or not Java code generation is enabled. Best would be a variant that controls whether it is built with JDK or GCJ, along with appropriate dependencies. Maybe GCJ could be added to …

Apparently, it is used as a build-time dependency to generate C++ code. It …

I would recommend that you do not touch …
@@ -36,9 +36,6 @@ class NetcdfCxx4(Package):
    depends_on('netcdf')
    depends_on("autoconf", type='build')

-   def install(self, spec, prefix):
+   def autoreconf(self, spec, prefix):
@alalazo
Is there a more "clever" way in AutotoolsPackage to just say "this needs autoreconf"?
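For illustration, here's a toy sketch in plain Python (no Spack imports; the class and method names are assumptions that only mimic the idea behind AutotoolsPackage's build phases, not Spack's actual API) of how a phase-driven base class can let a subclass opt into autoreconf by overriding a single method instead of reimplementing install():

```python
# Toy sketch of a phase-driven build class, loosely modeled on the idea
# behind Spack's AutotoolsPackage. All names here are illustrative.
class PhasedPackage:
    phases = ['autoreconf', 'configure', 'build', 'install']

    def run_all(self):
        # Run each phase in order, collecting what actually happened.
        return [getattr(self, phase)() for phase in self.phases]

    def autoreconf(self):
        return 'autoreconf skipped'  # default: do nothing

    def configure(self):
        return 'configure ran'

    def build(self):
        return 'build ran'

    def install(self):
        return 'install ran'


class NeedsAutoreconf(PhasedPackage):
    # A subclass opts in by overriding just this one phase.
    def autoreconf(self):
        return 'autoreconf ran'


print(PhasedPackage().run_all())
print(NeedsAutoreconf().run_all())
```

The point of the pattern: the base class owns the phase ordering, so a package only declares the one step it deviates on.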
On Wed, Dec 21, 2016 at 3:08 PM, Adam J. Stewart ***@***.***> wrote:
1. antlr should be labeled as a build-time dependency for nco.
Tried it and it didn't work. Fails with the following error message:
ld: cannot find -lantlr
Hmmm... then a deeper understanding of ANTLR is needed. My guess is it
generates C++ code, but you also need `libantlr.so` to run that code. No
idea whether similar issues also exist for C#, Python, etc.
2. ANTLR can be built and run in two ways:
a) Build with GCC (gij). It then runs as a regular binary.
b) Build with JDK. It then requires a JDK to run.
I'm not 100% sure how it was running for me in the current antlr
package; that all depends on whether ANTLR managed to find a JDK or a gcj
on my system.
I built it as seen in this PR and the only difference was that it added
.jar files for ANTLR. I'm not sure whether these were built with JDK or
not.
Probably was built with IcedTea/JDK, not gcj. You can confirm with `which
javac`.
3. ANTLR is up to version 4. Apparently, the C++ parsers generated by
ANTLR 3 and 4 don't work well. Therefore, NCO uses ANTLR2; actually, a
modified/hacked version of ANTLR2. It would not be unreasonable to rename
the current antlr to antlr-nco, which is what it really is; an ANTLR
package that's been hacked for the purposes of NCO. If someone wants to
install a real (and up-to-date) ANTLR, they can do it in antlr/package.py.
I don't think this is necessary. According to the NCO homepage:
ANTLR executables from major distributions are pre-built with the source
patch necessary to allow NCO to link to ANTLR.
So it sounds like this is a fairly standard hack.
Still makes me uncomfortable. This is not an official ANTLR tarball; it
comes from NCO. The official ANTLR 2.7.7 doesn't even exist anymore,
apparently. It all feels dodgy.
It might have even been included in newer releases.
Who knows?
If someone wants a newer release, they can just add it to the ANTLR
package. The only reason we might have a problem is if someone wanted the
unhacked 2.7.7, and even then I would rather make it a variant and patch
it ourselves.
They'll have problems downloading it. We should at least rename the
version to `2.7.7-nco` or something, to indicate it is a hacked version.
There is no point in disabling C# language generation. Who knows? Maybe
someone wants to use ANTLR to generate C# code, which they will build
outside of Spack on their Mono or Windows system.
Fair enough. I can allow configure to decide whether or not to build with
C#. It just prints out a bunch of warning messages about not finding C# if
I don't explicitly disable it.
Similar thing with C++ code-generation... the original idea was to turn
off all code generation options by default, and have the user specify which
ones they need.
Meh. I don't see why it would be a problem defaulting to True.
Why not just default everything to True? This changes the original design
of the package for no apparent reason.
Maybe we should just remove this variant altogether and let configure
decide?
No. That's a recipe for unexplained "why doesn't this work with C++"
questions.
In general I'm in favor of enabling all variants that don't add any extra
dependencies or cause any incompatibilities, especially if they add new
features.
Then maybe try enabling all by default.
Similarly, ANTLR should not extend Python.
Why not? It builds the appropriate antlr.py file if I do.
OK I see; similar situation to C++, where a runtime library is needed.
I would suggest revisiting your changes to the variant() lines, with the
understanding that the languages in question are target languages, not
bindings. ANTLR itself does not link to any of those languages; it only
generates code in them.
Can you elaborate on this? Maybe we should just add these dependencies as
run dependencies then. But it does search for them during configure.
The problem here is that the ANTLR package does two things:
(1) Generate code for target language
(2) Produce libraries to be used with generated code
Function (2) is potentially useful, with or without the target language
installed on the local machine. Function (1) is not, of course.
Ehh... if I had to say, I'd lean toward simplifying things, and requiring
the runtime system to be there if you're going to build code for a target
language. Which would mean leaving things pretty much the way you've
changed them.
Tried making … so it either needs to be a run dependency, or the default.

EDIT: This was actually what happened when I tried making it a …
So it is picking up the correct Java.

I don't want to enable all by default because the other variants add extra deps.
Here's a good reason. Let's say I want to build …

Ended up going with … Now that I think about it, should all Python packages list Python as a …
Except the MPI C++ bindings are deprecated and often cause linking issues for packages that don't use them. In many MPI versions, you have to add a #define to skip the C++ interface, and if you don't, you can end up getting undefined symbols if your program links with the MPI C compiler. I'd leave MPI C++ off, especially if antlr is the only thing using it.
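As a concrete sketch of the #define mentioned above: Open MPI and MPICH each document a macro that, when defined before including mpi.h, omits the deprecated C++ bindings. The macro names below come from those MPI implementations, not from this PR, and whether a given package actually needs them is case-by-case:

```python
# Sketch: preprocessor flags that disable the deprecated MPI C++ bindings
# when compiling against mpi.h. Macro names are the ones documented by
# Open MPI and MPICH respectively; a build would add these to CPPFLAGS.
def skip_mpicxx_flags():
    return ['-DOMPI_SKIP_MPICXX',   # Open MPI
            '-DMPICH_SKIP_MPICXX']  # MPICH and derivatives

print(' '.join(skip_mpicxx_flags()))
```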
@tgamblin Where do you see that it is deprecated? Is it only deprecated in ANTLR 3 and 4?

Do you mean if NCO is the only thing using ANTLR? NCO specifically requires …
I acknowledge I misread this somehow (phone, maybe?). I meant that the MPI C++ bindings are deprecated, not antlr. Sorry for the confusion.

@tgamblin Is this a problem? ANTLR doesn't depend on MPI.
On Wed, Dec 21, 2016 at 3:29 PM, Adam J. Stewart ***@***.***> wrote:
(on the topic of changing cxx to default to True)
This changes the original design of the package for no apparent reason.
Here's a good reason. Let's say I want to build antlr before building nco
just to test it out and see the output. Once I finish spack install antlr,
I run spack install nco, only to find it building a new antlr+cxx.
That doesn't bother me.
On Wed, Dec 21, 2016 at 3:22 PM, Adam J. Stewart ***@***.***> wrote:
Tried making jdk a build dependency:
checking for java... no
checking for gij... /usr/bin/gij
checking for jikes... no
checking for javac... no
checking for gcj... no
============================================================
Warning:
Support for JAVA has been disabled as I have not been able
locate to locate a mandatory program. Please change $PATH or run
with option --help on how to overcome this problem.
============================================================
so it either needs to be a run dependency, or the default.
Documentation of these efforts should go in comments. It was non-obvious,
and required experimentation to figure out.
@adamjstewart: no. I'm just spacing out.
But defaulting to +cxx does bother you? Do you even use antlr or do you only need it to build nco?
But defaulting to +cxx does bother you? Do you even use antlr or do you
only need it to build nco?
ANTLR is only used by NCO. No one else (including myself) has shown any
desire to use it. I guess what bothers me is spending time on this package
when, really, nothing was broken.
I've already taken the time today to share what I know about ANTLR and
NCO. As long as whatever changes take place don't break NCO, I suppose it
doesn't matter to me. But the easiest thing to do would have been to leave
it alone.
I can confirm that the changes don't break NCO, as they don't affect any of the configure options used. I generally take a look at all dependencies of a package if I'm working on it, and ANTLR in particular was bad. It wasn't necessarily broken, but it had several variants that raised an error as soon as they were activated. That seems like poor design to me. If I see a package with a python or java variant, I expect that enabling the variant builds support for Python or Java. Previously it would crash; now it builds support for those languages.
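A minimal sketch of the design principle above (plain Python, deliberately not Spack's actual API; the variant names are illustrative): an advertised variant should translate into real --enable/--disable configure flags rather than raising an error when activated.

```python
# Illustrative sketch: map language variants to autotools configure flags.
# In a real Spack package this logic would live in configure_args() and
# test '+cxx' etc. against self.spec; here it's a standalone function.
def configure_args(variants):
    """Translate variant states into --enable/--disable flags."""
    args = []
    for name in ('cxx', 'java', 'python'):
        state = 'enable' if variants.get(name) else 'disable'
        args.append('--{0}-{1}'.format(state, name))
    return args

print(configure_args({'cxx': True, 'java': False, 'python': True}))
```

With this shape, enabling a variant changes the build instead of crashing it, which is the behavior the comment above argues for.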
                'netcdf+mpi, but spec asked for netcdf~mpi.')
    depends_on('flex', type='build')
    depends_on('bison', type='build')
    depends_on('texinfo@4.12:', type='build', when='+doc')
Builds with --enable-doc by default. The system texinfo on macOS is too old, so installation was crashing for me. Added a +doc variant, defaulting to False, that adds the proper dependency and flag. See nco/nco#10 for details.
        make("install")

    def check(self):
        # h5_test fails when run in parallel
        make('check', parallel=False)
Confirmed with the NetCDF developers that this is known behavior.
All good now?

I think so. If the NCO devs have any more suggestions I can add them in another PR. This should be safe to merge.
More updates to NCO. This time I'm removing the +mpi variant, as it doesn't do anything for this package. If you want to build NCO with NetCDF MPI support, you need to run spack install nco ^netcdf+mpi.

P.S. I'm currently working with one of the NCO developers, Charlie Zender, to get this package installed properly on our clusters, so I may have further updates coming. Don't merge just yet.