
Optional package openmpi-1.4.3 fails to install on Solaris 10 (SPARC) with gcc (OK with Sun compiler) #10869

Closed
sagetrac-drkirkby mannequin opened this issue Mar 3, 2011 · 6 comments

sagetrac-drkirkby mannequin commented Mar 3, 2011

Hardware & associated software

  • Sun T5240
  • 2 x 1157 MHz T2+ processors (each with 8 cores and 64 threads)
  • 32 GB RAM
  • Solaris 10 update 5 (May 2009 release)
  • gcc 4.5.1 (uses Sun linker and Sun assembler)
  • Sage sage-4.6.2.alpha4

How gcc was configured on t2.math

Since the configuration of gcc is important when building Sage, it is shown below. Note that both the Sun linker and the Sun assembler are used.

kirkby@t2:32 ~$ gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/local/gcc-4.5.1/libexec/gcc/sparc-sun-solaris2.10/4.5.1/lto-wrapper
Target: sparc-sun-solaris2.10
Configured with: ../gcc-4.5.1/configure --prefix=/usr/local/gcc-4.5.1 --with-gmp=/usr/local/gcc-4.5.1 --with-mpfr=/usr/local/gcc-4.5.1 --with-mpc=/usr/local/gcc-4.5.1 --enable-languages=c,c++,fortran --disable-nls --enable-checking=release --enable-werror=no --enable-multilib --with-system-zlib --enable-bootstrap --without-gnu-as --with-as=/usr/ccs/bin/as --without-gnu-ld --with-ld=/usr/ccs/bin/ld --with-pkgversion='Built by D. Kirkby gmp-5.0.1 mpfr-3.0.0 mpc-0.8.2'
Thread model: posix
gcc version 4.5.1 (Built by D. Kirkby gmp-5.0.1 mpfr-3.0.0 mpc-0.8.2) 

The problem

As reported at #8522, the optional OpenMPI package failed to install on Solaris 10 SPARC. But I will close that ticket as "wontfix" because:

  • That report is against a very old version of OpenMPI, which was updated in Sage recently (Update Open MPI package to latest - Sage version is 3 years old! #8537)
  • It used hardware not generally available to everyone (my own Sun Blade 2000)
  • It's against an old version of Sage, which needed a patch before any optional packages could be installed.
  • The error message when installing 1.4.3 is completely different from what was seen with the older versions of Sage and OpenMPI.

Error message seen on t2.math with the latest OpenMPI in Sage

Creating mpi/man/man3/MPI_Type_struct.3 man page...
Creating mpi/man/man3/MPI_Type_ub.3 man page...
Creating mpi/man/man3/MPI_Type_vector.3 man page...
Creating mpi/man/man3/MPI_Unpack.3 man page...
Creating mpi/man/man3/MPI_Unpack_external.3 man page...
Creating mpi/man/man3/MPI_Unpublish_name.3 man page...
Creating mpi/man/man3/MPI_Wait.3 man page...
Creating mpi/man/man3/MPI_Waitall.3 man page...
Creating mpi/man/man3/MPI_Waitany.3 man page...
Creating mpi/man/man3/MPI_Waitsome.3 man page...
Creating mpi/man/man3/MPI_Win_c2f.3 man page...
Creating mpi/man/man3/MPI_Win_call_errhandler.3 man page...
Creating mpi/man/man3/MPI_Win_complete.3 man page...
Creating mpi/man/man3/MPI_Win_create.3 man page...
Creating mpi/man/man3/MPI_Win_create_errhandler.3 man page...
Creating mpi/man/man3/MPI_Win_create_keyval.3 man page...
Creating mpi/man/man3/MPI_Win_delete_attr.3 man page...
Creating mpi/man/man3/MPI_Win_f2c.3 man page...
Creating mpi/man/man3/MPI_Win_fence.3 man page...
Creating mpi/man/man3/MPI_Win_free.3 man page...
Creating mpi/man/man3/MPI_Win_free_keyval.3 man page...
Creating mpi/man/man3/MPI_Win_get_attr.3 man page...
Creating mpi/man/man3/MPI_Win_get_errhandler.3 man page...
Creating mpi/man/man3/MPI_Win_get_group.3 man page...
Creating mpi/man/man3/MPI_Win_get_name.3 man page...
Creating mpi/man/man3/MPI_Win_lock.3 man page...
Creating mpi/man/man3/MPI_Win_post.3 man page...
Creating mpi/man/man3/MPI_Win_set_attr.3 man page...
Creating mpi/man/man3/MPI_Win_set_errhandler.3 man page...
Creating mpi/man/man3/MPI_Win_set_name.3 man page...
Creating mpi/man/man3/MPI_Win_start.3 man page...
Creating mpi/man/man3/MPI_Win_test.3 man page...
Creating mpi/man/man3/MPI_Win_unlock.3 man page...
Creating mpi/man/man3/MPI_Win_wait.3 man page...
Creating mpi/man/man3/MPI_Wtick.3 man page...
Creating mpi/man/man3/MPI_Wtime.3 man page...
Creating mpi/man/man3/OpenMPI.3 man page...
make[2]: Leaving directory `/rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/spkg/build/openmpi-1.4.3/src/ompi'
Making all in mpi/cxx
make[2]: Entering directory `/rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/spkg/build/openmpi-1.4.3/src/ompi/mpi/cxx'
depbase=`echo mpicxx.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ../../../libtool --tag=CXX   --mode=compile g++ -DHAVE_CONFIG_H -I. -I../../../opal/include -I../../../orte/include -I../../../ompi/include -I../../../opal/mca/paffinity/linux/plpa/src/libplpa  -DOMPI_BUILDING_CXX_BINDINGS_LIBRARY=1 -DOMPI_SKIP_MPICXX=1 -I../../..  -D_REENTRANT  -O3 -DNDEBUG -finline-functions  -MT mpicxx.lo -MD -MP -MF $depbase.Tpo -c -o mpicxx.lo mpicxx.cc &&\
mv -f $depbase.Tpo $depbase.Plo
libtool: compile:  g++ -DHAVE_CONFIG_H -I. -I../../../opal/include -I../../../orte/include -I../../../ompi/include -I../../../opal/mca/paffinity/linux/plpa/src/libplpa -DOMPI_BUILDING_CXX_BINDINGS_LIBRARY=1 -DOMPI_SKIP_MPICXX=1 -I../../.. -D_REENTRANT -O3 -DNDEBUG -finline-functions -MT mpicxx.lo -MD -MP -MF .deps/mpicxx.Tpo -c mpicxx.cc  -fPIC -DPIC -o .libs/mpicxx.o
depbase=`echo intercepts.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ../../../libtool --tag=CXX   --mode=compile g++ -DHAVE_CONFIG_H -I. -I../../../opal/include -I../../../orte/include -I../../../ompi/include -I../../../opal/mca/paffinity/linux/plpa/src/libplpa  -DOMPI_BUILDING_CXX_BINDINGS_LIBRARY=1 -DOMPI_SKIP_MPICXX=1 -I../../..  -D_REENTRANT  -O3 -DNDEBUG -finline-functions  -MT intercepts.lo -MD -MP -MF $depbase.Tpo -c -o intercepts.lo intercepts.cc &&\
mv -f $depbase.Tpo $depbase.Plo
libtool: compile:  g++ -DHAVE_CONFIG_H -I. -I../../../opal/include -I../../../orte/include -I../../../ompi/include -I../../../opal/mca/paffinity/linux/plpa/src/libplpa -DOMPI_BUILDING_CXX_BINDINGS_LIBRARY=1 -DOMPI_SKIP_MPICXX=1 -I../../.. -D_REENTRANT -O3 -DNDEBUG -finline-functions -MT intercepts.lo -MD -MP -MF .deps/intercepts.Tpo -c intercepts.cc  -fPIC -DPIC -o .libs/intercepts.o
depbase=`echo comm.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ../../../libtool --tag=CXX   --mode=compile g++ -DHAVE_CONFIG_H -I. -I../../../opal/include -I../../../orte/include -I../../../ompi/include -I../../../opal/mca/paffinity/linux/plpa/src/libplpa  -DOMPI_BUILDING_CXX_BINDINGS_LIBRARY=1 -DOMPI_SKIP_MPICXX=1 -I../../..  -D_REENTRANT  -O3 -DNDEBUG -finline-functions  -MT comm.lo -MD -MP -MF $depbase.Tpo -c -o comm.lo comm.cc &&\
mv -f $depbase.Tpo $depbase.Plo
libtool: compile:  g++ -DHAVE_CONFIG_H -I. -I../../../opal/include -I../../../orte/include -I../../../ompi/include -I../../../opal/mca/paffinity/linux/plpa/src/libplpa -DOMPI_BUILDING_CXX_BINDINGS_LIBRARY=1 -DOMPI_SKIP_MPICXX=1 -I../../.. -D_REENTRANT -O3 -DNDEBUG -finline-functions -MT comm.lo -MD -MP -MF .deps/comm.Tpo -c comm.cc  -fPIC -DPIC -o .libs/comm.o
depbase=`echo datatype.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ../../../libtool --tag=CXX   --mode=compile g++ -DHAVE_CONFIG_H -I. -I../../../opal/include -I../../../orte/include -I../../../ompi/include -I../../../opal/mca/paffinity/linux/plpa/src/libplpa  -DOMPI_BUILDING_CXX_BINDINGS_LIBRARY=1 -DOMPI_SKIP_MPICXX=1 -I../../..  -D_REENTRANT  -O3 -DNDEBUG -finline-functions  -MT datatype.lo -MD -MP -MF $depbase.Tpo -c -o datatype.lo datatype.cc &&\
mv -f $depbase.Tpo $depbase.Plo
libtool: compile:  g++ -DHAVE_CONFIG_H -I. -I../../../opal/include -I../../../orte/include -I../../../ompi/include -I../../../opal/mca/paffinity/linux/plpa/src/libplpa -DOMPI_BUILDING_CXX_BINDINGS_LIBRARY=1 -DOMPI_SKIP_MPICXX=1 -I../../.. -D_REENTRANT -O3 -DNDEBUG -finline-functions -MT datatype.lo -MD -MP -MF .deps/datatype.Tpo -c datatype.cc  -fPIC -DPIC -o .libs/datatype.o
depbase=`echo win.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ../../../libtool --tag=CXX   --mode=compile g++ -DHAVE_CONFIG_H -I. -I../../../opal/include -I../../../orte/include -I../../../ompi/include -I../../../opal/mca/paffinity/linux/plpa/src/libplpa  -DOMPI_BUILDING_CXX_BINDINGS_LIBRARY=1 -DOMPI_SKIP_MPICXX=1 -I../../..  -D_REENTRANT  -O3 -DNDEBUG -finline-functions  -MT win.lo -MD -MP -MF $depbase.Tpo -c -o win.lo win.cc &&\
mv -f $depbase.Tpo $depbase.Plo
libtool: compile:  g++ -DHAVE_CONFIG_H -I. -I../../../opal/include -I../../../orte/include -I../../../ompi/include -I../../../opal/mca/paffinity/linux/plpa/src/libplpa -DOMPI_BUILDING_CXX_BINDINGS_LIBRARY=1 -DOMPI_SKIP_MPICXX=1 -I../../.. -D_REENTRANT -O3 -DNDEBUG -finline-functions -MT win.lo -MD -MP -MF .deps/win.Tpo -c win.cc  -fPIC -DPIC -o .libs/win.o
depbase=`echo file.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/bash ../../../libtool --tag=CXX   --mode=compile g++ -DHAVE_CONFIG_H -I. -I../../../opal/include -I../../../orte/include -I../../../ompi/include -I../../../opal/mca/paffinity/linux/plpa/src/libplpa  -DOMPI_BUILDING_CXX_BINDINGS_LIBRARY=1 -DOMPI_SKIP_MPICXX=1 -I../../..  -D_REENTRANT  -O3 -DNDEBUG -finline-functions  -MT file.lo -MD -MP -MF $depbase.Tpo -c -o file.lo file.cc &&\
mv -f $depbase.Tpo $depbase.Plo
libtool: compile:  g++ -DHAVE_CONFIG_H -I. -I../../../opal/include -I../../../orte/include -I../../../ompi/include -I../../../opal/mca/paffinity/linux/plpa/src/libplpa -DOMPI_BUILDING_CXX_BINDINGS_LIBRARY=1 -DOMPI_SKIP_MPICXX=1 -I../../.. -D_REENTRANT -O3 -DNDEBUG -finline-functions -MT file.lo -MD -MP -MF .deps/file.Tpo -c file.cc  -fPIC -DPIC -o .libs/file.o
/bin/bash ../../../libtool --tag=CXX   --mode=link g++  -O3 -DNDEBUG -finline-functions  -version-info 0:1:0 -export-dynamic   -o libmpi_cxx.la -rpath /rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/local/lib mpicxx.lo intercepts.lo comm.lo datatype.lo win.lo file.lo ../../../ompi/libmpi.la -lsocket -lnsl  -lrt -lm -lthread
libtool: link: g++ -shared -nostdlib -export-dynamic   /usr/local/gcc-4.5.1/lib/gcc/sparc-sun-solaris2.10/4.5.1/crti.o /usr/ccs/lib/values-Xa.o /usr/local/gcc-4.5.1/lib/gcc/sparc-sun-solaris2.10/4.5.1/crtbegin.o  .libs/mpicxx.o .libs/intercepts.o .libs/comm.o .libs/datatype.o .libs/win.o .libs/file.o   -Wl,-R -Wl,/rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/spkg/build/openmpi-1.4.3/src/ompi/.libs -Wl,-R -Wl,/rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/spkg/build/openmpi-1.4.3/src/orte/.libs -Wl,-R -Wl,/rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/spkg/build/openmpi-1.4.3/src/opal/.libs -Wl,-R -Wl,/usr/local/gcc-4.5.1/lib -Wl,-R -Wl,/rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/local/lib -Wl,-R -Wl,/usr/local/gcc-4.5.1/lib -L/rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/spkg/build/openmpi-1.4.3/src/orte/.libs -L/rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/spkg/build/openmpi-1.4.3/src/opal/.libs ../../../ompi/.libs/libmpi.so /rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/spkg/build/openmpi-1.4.3/src/orte/.libs/libopen-rte.so /rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/spkg/build/openmpi-1.4.3/src/opal/.libs/libopen-pal.so -lsocket -lnsl -lrt -lthread -L/rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/local/lib -L/usr/local/gcc-4.5.1/lib/gcc/sparc-sun-solaris2.10/4.5.1 -L/usr/ccs/lib -L/usr/local/gcc-4.5.1/lib/gcc/sparc-sun-solaris2.10/4.5.1/../../.. /usr/local/gcc-4.5.1/lib/libstdc++.so -lm -lgcc_s /usr/local/gcc-4.5.1/lib/gcc/sparc-sun-solaris2.10/4.5.1/crtend.o /usr/local/gcc-4.5.1/lib/gcc/sparc-sun-solaris2.10/4.5.1/crtn.o    -Wl,-h -Wl,libmpi_cxx.so.0 -o .libs/libmpi_cxx.so.0.0.1
collect2: ld returned 1 exit status
make[2]: *** [libmpi_cxx.la] Error 1
make[2]: Leaving directory `/rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/spkg/build/openmpi-1.4.3/src/ompi/mpi/cxx'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/spkg/build/openmpi-1.4.3/src/ompi'
make: *** [all-recursive] Error 1
Error building

real    29m10.533s
user    17m30.476s
sys     12m24.355s
sage: An error occurred while installing openmpi-1.4.3
Please email sage-devel http://groups.google.com/group/sage-devel

Linker version on t2.math

kirkby@t2:32 ~$ ld -V
ld: Software Generation Utilities - Solaris Link Editors: 5.10-1.496
ld: fatal: no files on input command line

Error messages seen on host 'mark' on Skynet (Solaris 10 SPARC)

Although I've not tried this myself, the errors on #8522, which were seen on a Sun Blade 2500, are a bit different:

make[2]: Entering directory `/home/vbraun/mark/sage-4.6.1.alpha2/spkg/build/openmpi-1.4.3/src/opal/tools/wrappers'
depbase=`echo opal_wrapper.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`;\
        gcc "-DEXEEXT=\"\"" -I. -I../../../opal/include -I../../../orte/include -I../../../ompi/include -I../../../opal/mca/paffinity/linux/plpa/src/libplpa   -I../../..  -D_REENTRANT  -O3 -DNDEBUG -finline-functions -fno-strict-aliasing  -fvisibility=hidden -MT opal_wrapper.o -MD -MP -MF $depbase.Tpo -c -o opal_wrapper.o opal_wrapper.c &&\
        mv -f $depbase.Tpo $depbase.Po
/bin/bash ../../../libtool --tag=CC   --mode=link gcc  -O3 -DNDEBUG -finline-functions -fno-strict-aliasing  -fvisibility=hidden  -export-dynamic   -o opal_wrapper opal_wrapper.o ../../../opal/libopen-pal.la -lsocket -lnsl  -lrt -lm -lthread
libtool: link: gcc -O3 -DNDEBUG -finline-functions -fno-strict-aliasing -fvisibility=hidden -o .libs/opal_wrapper opal_wrapper.o  ../../../opal/.libs/libopen-pal.so -lsocket -lnsl -lrt -lm -lthread -R/home/vbraun/mark/sage-4.6.1.alpha2/local/lib
opal_wrapper.o: In function `main':
opal_wrapper.c:(.text+0xb0c): undefined reference to `opal_basename'
opal_wrapper.c:(.text+0x1700): undefined reference to `opal_few'
../../../opal/.libs/libopen-pal.so: undefined reference to `mca_base_select'
../../../opal/.libs/libopen-pal.so: undefined reference to `mca_base_component_list_item_t_class'
../../../opal/.libs/libopen-pal.so: undefined reference to `lt_dlexit'
../../../opal/.libs/libopen-pal.so: undefined reference to `lt_dlclose'
../../../opal/.libs/libopen-pal.so: undefined reference to `mca_base_components_close'
../../../opal/.libs/libopen-pal.so: undefined reference to `mca_base_components_open'
../../../opal/.libs/libopen-pal.so: undefined reference to `lt_dlinit'
collect2: ld returned 1 exit status
make[2]: *** [opal_wrapper] Error 1
make[2]: Leaving directory `/home/vbraun/mark/sage-4.6.1.alpha2/spkg/build/openmpi-1.4.3/src/opal/tools/wrappers'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/home/vbraun/mark/sage-4.6.1.alpha2/spkg/build/openmpi-1.4.3/src/opal'
make: *** [all-recursive] Error 1
Error building

So two different SPARC machines give different error messages with the same version of OpenMPI. This could be due to different linker versions, or to how the systems are configured.

Issues on OpenSolaris too

OpenMPI is failing to install on OpenSolaris too; that's the subject of #10866.

Thoughts on OpenMPI in general

I wonder if this package should be made "experimental" rather than "optional", because:

  • It is not working on all supported systems.
  • There is no information in the Sage users' guide on how to use it.
  • Nobody appears to have actually managed to do anything with a cluster of computers using this in Sage; I can't find any reference to anyone having done anything useful with a cluster of machines.

CC: @sagetrac-maldun @vbraun @jhpalmieri @fchapoton @dimpase

Component: packages: optional

Issue created by migration from https://trac.sagemath.org/ticket/10869

sagetrac-drkirkby added this to the sage-5.11 milestone Mar 3, 2011

vbraun commented Mar 3, 2011

comment:3

It's a pretty low-level C library, so I don't think Sage developers should write any documentation for it. We don't have documentation for libntl, say, either. You need to know MPI and write Cython code to use it.

You don't necessarily need a cluster, you can also use it to distribute work over multiple cores of a single computer. Although not quite as efficient as threads, that gives you the opportunity to run your code unmodified on a cluster should you need to in the future.
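
For concreteness, here is a minimal MPI "hello world" of the sort this implies. It is only a sketch, assuming a working Open MPI install whose mpicc and mpirun wrapper scripts are on the PATH; the file name hello.c is arbitrary.

cat > hello.c <<'EOF'
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);                /* start the MPI runtime */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* rank of this process */
    MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of processes */
    printf("Hello from rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}
EOF
mpicc hello.c -o hello
mpirun -np 4 ./hello   # 4 processes on the local cores; add a hostfile to spread over a cluster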

Since it's quite a mature package, I don't see a reason why it shouldn't be optional. Most of the experimental spkgs are completely broken by comparison. I have also never seen a SPARC cluster, though we should fix it eventually, of course. Given that Solaris uses OpenMPI as its official MPI library, it must be possible to compile it :-)

sagetrac-drkirkby mannequin commented Mar 3, 2011

comment:4

Replying to @vbraun:

It's a pretty low-level C library, so I don't think Sage developers should write any documentation for it. We don't have documentation for libntl, say, either. You need to know MPI and write Cython code to use it.

I can't help feeling that it would be better if there were some examples of how to use a cluster. If you take a look at Mathematica's documentation

http://reference.wolfram.com/mathematica/ParallelTools/tutorial/Overview.html

it is pretty clear about how to set up a cluster of machines. For cores on the same machine, one does not need to do anything:

drkirkby@hawk:~$ math
Mathematica 7.0 for Sun Solaris x86 (64-bit)
Copyright 1988-2009 Wolfram Research, Inc.

In[1]:= ParallelEvaluate[$ProcessID]

LaunchKernels::launch: Launching 8 kernels...

Out[1]= {17926, 18049, 18172, 18295, 18418, 18541, 18664, 18787}

In[2]:= ParallelSum[i^2, {i, 1000}]

Out[2]= 333833500

For kernels on other machines, it's a bit more complex, but I got it set up in about 30 minutes, using a GUI where you select the kernels.

It seems Sage has the OpenMPI library, but without some examples of how to use it, it is going to be next to impossible for anyone other than the people who implemented it in Sage to use it.

Do you know of any Sage user who has used the library to do anything, and who is not a developer involved in implementing it in Sage?

I've got a book on OpenMPI here (I got it free after reviewing another book), but even if I knew how to use OpenMPI, I fear doing anything with it in Sage would be hard.

You don't necessarily need a cluster, you can also use it to distribute work over multiple cores of a single computer. Although not quite as efficient as threads, that gives you the opportunity to run your code unmodified on a cluster should you need to in the future.

Same with Mathematica. The kernels can be local, remote, or a mix of the two.

Since it's quite a mature package, I don't see a reason why it shouldn't be optional. Most of the experimental spkgs are completely broken by comparison. I have also never seen a SPARC cluster, though we should fix it eventually, of course. Given that Solaris uses OpenMPI as its official MPI library, it must be possible to compile it :-)

Yes, it must be possible to compile it. I'm not sure why it does not build.

I just tried the package on my OpenSolaris machine using Sun Studio rather than gcc. That builds OK, which probably means it would on SPARC too, though I've not tried it. I updated #10866 to indicate that the library builds if the Sun compiler is used. It took 9 minutes to build on my 3.33 GHz OpenSolaris machine, so I hate to think how long it would take on t2.math. It would not surprise me if it took an hour or more, as that machine is very slow.
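
For the record, an untested sketch of how one might force the Sun compilers for just this package. The /opt/SUNWspro install location for Sun Studio is an assumption about the local setup; sage -i installs an optional spkg.

# Point the build at Sun Studio's C and C++ compilers instead of gcc/g++.
export CC=/opt/SUNWspro/bin/cc
export CXX=/opt/SUNWspro/bin/CC
./sage -i openmpi-1.4.3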

Dave

sagetrac-drkirkby mannequin commented Mar 3, 2011

comment:5

Building with the Sun compiler

This does build OK on t2.math if the Sun compiler is used:

make[2]: Nothing to be done for `install-exec-am'.
make[2]: Nothing to be done for `install-data-am'.
make[2]: Leaving directory `/rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/spkg/build/openmpi-1.4.3/src'
make[1]: Leaving directory `/rootpool2/local/kirkby/t2/32/sage-4.6.2.alpha4/spkg/build/openmpi-1.4.3/src'

real    90m1.942s
user    63m26.692s
sys     27m13.200s
Successfully installed openmpi-1.4.3

However, this means the Sun C and C++ compilers are used together with the GNU Fortran compiler. One would need to manually edit the sage_fortran script to build the Fortran parts with the Sun compiler as well.
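
To illustrate what that edit might look like (a hypothetical sketch only: the real sage_fortran wrapper in $SAGE_LOCAL/bin does more than this, and the path to Sun's f95 is an assumption):

#!/bin/sh
# sage_fortran: wrapper that Sage invokes in place of a Fortran compiler.
# Call Sun Studio's f95 instead of gfortran, passing all arguments through.
exec /opt/SUNWspro/bin/f95 "$@"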

sagetrac-drkirkby changed the title from "Optional package openmpi-1.4.3 fails to install on Solaris 10 (SPARC)" to "Optional package openmpi-1.4.3 fails to install on Solaris 10 (SPARC) with gcc (OK with Sun compiler)" Mar 3, 2011
jdemeyer modified the milestones: sage-5.11, sage-5.12 Aug 13, 2013
sagetrac-vbraun-spam modified the milestones: sage-6.1, sage-6.2 Jan 30, 2014
sagetrac-vbraun-spam modified the milestones: sage-6.2, sage-6.3 May 6, 2014
sagetrac-vbraun-spam modified the milestones: sage-6.3, sage-6.4 Aug 10, 2014

mkoeppe commented Jun 19, 2020

comment:10

Solaris tickets should be closed as outdated.

mkoeppe removed this from the sage-6.4 milestone Jun 19, 2020
fchapoton commented

comment:11

ok
