Appveyor: enable builds #7616
Conversation
Does anyone know why the test output is only on stderr?
Also it seems quite verbose!
related: #7597 and numpy/numpy#9429
3 failures:
+ 'superlu_src',
+ 'sc_c_misc',
+ 'sc_cephes',
+ ]
ghost (Author), Jul 17, 2017
The build_clib is reordered so that fortran libs can be linked against their dependencies.
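As a hedged illustration of what "reordered" means here (the helper function and the dependency map below are hypothetical, not the actual numpy.distutils code), the idea is a dependency-first ordering:

```python
def order_libraries(libs, deps):
    """Return libs ordered so that every library appears after its
    prerequisites (depth-first topological sort; illustrative only)."""
    ordered, seen = [], set()

    def visit(name):
        if name in seen:
            return
        seen.add(name)
        for dep in deps.get(name, ()):  # build prerequisites first
            visit(dep)
        ordered.append(name)

    for name in libs:
        visit(name)
    return ordered

# If 'superlu_src' linked against 'sc_cephes', 'sc_cephes' must come first.
print(order_libraries(['superlu_src', 'sc_c_misc', 'sc_cephes'],
                      {'superlu_src': ['sc_cephes']}))
```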
+ '-Wl,--enable-auto-import',
+ ],
+ debug=self.debug)
+
ghost (Author), Jul 17, 2017
Here we may be building C, fortran, or both. If we need to build fortran, then we need to generate a fortran DLL and a shared lib (see numpy/numpy#9427).
+ libs += [lib_name + '_gfortran.lib']
+ self.dlls.append(os.path.join(
+     self.build_clib, lib_name + '_gfortran.dll'))
+
ghost (Author), Jul 17, 2017
Each time we build a DLL, we need to keep track of it so that dependent fortran code can link to the DLL. However, fortran code cannot call C code.
build_clib = self.get_finalized_command('build_clib')
self.library_dirs.append(build_clib.build_clib)
+ self.dlls.extend(build_clib.dlls)
+ self.build_clib = build_clib.build_clib
ghost (Author), Jul 17, 2017
We need to be able to link fortran code built here to potentially any DLL built in build_clib.
+ '-Wl,--enable-auto-import',
+ ],
+ debug=self.debug)
+
ghost (Author), Jul 17, 2017
Again, if there is fortran code, then we need to build it into a separate DLL and then generate an import library for MSVC.
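For reference, the usual recipe for "generate a lib for MSVC" from a gfortran-built DLL is mingw-w64's gendef followed by MSVC's lib.exe. The helper below only assembles those command lines as a sketch (the file names and /machine value are illustrative, and nothing is executed):

```python
import os

def import_lib_commands(dll_path, machine='X64'):
    """Command lines (not executed) to derive an MSVC import library
    from a DLL: gendef writes a .def file, lib.exe turns it into a .lib."""
    base, _ = os.path.splitext(dll_path)
    return [
        ['gendef', dll_path],                                # -> base.def
        ['lib', '/machine:' + machine,
         '/def:' + base + '.def', '/out:' + base + '.lib'],  # -> base.lib
    ]

for cmd in import_lib_commands('odepack_gfortran.dll'):
    print(' '.join(cmd))
```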
+ os.makedirs(dll_folder)
+ for dll in dlls:
+     shutil.copy(dll, dll_folder)
+
ghost (Author), Jul 17, 2017
We need to copy all of the DLLs that we have built plus the mingw DLLs plus the OpenBLAS DLL so that our extension can access it.
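The copy step in the diff above can be sketched as a small staging helper (the folder layout and names are illustrative, not the PR's actual code):

```python
import os
import shutil
import tempfile

def stage_dlls(dlls, dll_folder):
    """Copy every DLL we built (plus any runtime DLLs such as the
    OpenBLAS one) into a single folder shipped with the package."""
    os.makedirs(dll_folder, exist_ok=True)
    for dll in dlls:
        shutil.copy(dll, dll_folder)
    return sorted(os.listdir(dll_folder))

# Demo with a throwaway file standing in for a real DLL.
with tempfile.TemporaryDirectory() as tmp:
    fake = os.path.join(tmp, 'odepack_gfortran.dll')
    open(fake, 'wb').close()
    print(stage_dlls([fake], os.path.join(tmp, '.libs')))
```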
The appveyor/ folder should preferably go under tools/, since the rest of the CI files are there.
@xoviat in the meantime, do you have AppVeyor set up to run on your SciPy fork? If not, you can set that up and test there first.
@Eric89GXL Yeah I posted the link above. It works for the most part; now I'm just trying to clean up numpy.distutils.
The distributor_init stuff also shouldn't do anything by default, or at least it should be guarded by "if sys.platform == 'win32'" etc.
@@ -0,0 +1,3 @@
numpy
pv (Member), Jul 17, 2017
This probably should pin the version number, so that the patch can always be applied.
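For example, a pinned requirements line (the exact version below is illustrative, not taken from this PR) would look like:

```
numpy==1.13.1
```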
ghost (Author), Jul 17, 2017
Done.
Sorry - I mean when #7613 is merged.
+ if f_objects:
+     if lib_name == 'odepack':
+         f_objects = [obj for obj in f_objects
+                      if 'xsetf' not in obj and 'xsetun' not in obj]
matthew-brett (Contributor), Jul 17, 2017
Sorry for my ignorance - but does it make sense to delete these objects from odepack? Are they not necessary? Or are they duplicates?
I wasn't able to get the 32-bit builds working (someone needs to look at the symbols coming out of the gfortran DLLs on 32-bit) or Python 2 (I really don't spend my time there), but that's all I have time for as of now. This can be merged with AppVeyor set up, and then the 32-bit issues can be fixed later.
The 32-bit builds appear to have been fixed.
No idea.
@matthew-brett @pv @carlkl The DLL conflict issues have been resolved. Here are the DLLs that could be loaded by scipy: as you can see, a name conflict would be exceedingly unlikely.
In addition, there is no longer any dependency on mingw DLLs. Only the mingwpy_v0.19 DLL, but the mingwpy people control that, so there shouldn't be issues there.
The problem with the mingw DLL is that it is reasonably likely that someone will also be linking to a DLL with the same name, for the same reason. Therefore we have to hope that their copy of the DLL is the same as ours. How about using Nathaniel's tool as a post-processing step, to rename this DLL as well?
Possibly, but I would continue this discussion separately. I will now squash the commits in this PR, update the urls to point to rackcdn, and force-push to this contributor branch. If it passes on 64-bit, I'll merge and we continue the 32-bit discussion.
Looks ok
Thanks, merged! Thanks @xoviat, @matthew-brett and everyone involved.
@matthew-brett: we can try a new PR to swap the 32-bit BLAS library used. I suspect the 32-bit hang, at least, is an issue in CDFLIB, which at least locally, according to gdb, enters an infinite loop; the code is ancient and not written with NaNs in mind. Moreover, it has persistent state that it apparently doesn't reset properly, giving a plausible mechanism for spurious failures, e.g. due to test ordering. Not going to look at this today though.
Concerning the OpenBLAS 32-bit errors: it turns out that OpenBLAS uses some C code instead of vanilla LAPACK Fortran code for some routines (called …). Compiling OpenBLAS with mingw-w64 with a recent gcc and testing with the scipy test suite seems appropriate to me in this case.
Hi Carl - thanks for tracking that down further.
Seconding Pauli - a big thank you to @xoviat for all your work here, and thank you too Pauli, Carl, Eric and Ralf for keeping this moving and getting it in. It's great to see the light at the end of the Scipy wheels tunnel.
I remember the days when you would type … On another note, I wonder whether it was really necessary to put everyone through what amounts to 17 years of packaging misery, which was directly due to the fact that the PSF could not come up with a binary standard (wheel) until over a decade after the first python release. Oh well, I guess you can only look forward at this point.
But definitely, thanks so much for the help on this.
@xoviat, kudos for your innovative method of combining the MS compiler with gfortran (mingw-w64).
My hope is that once we get these wheels up, that will be a stimulus for other scientific Python packages to provide wheels. Scipy has been a big blocker for a while now.
Thanks to all involved with this, it's great to see this automated build process happening.
Dr. Andrew Nelson (replying by email, 15 August 2017)
Enabling potential users of packages to use them without going through a whole build process (for those that need compiled extensions) is always good. One roadblock that package authors face in making packages available is:
At least the modern CIs make it easier to solve (1). (2) is harder to achieve and is one reason why conda-forge is growing in popularity.
Sure - for conda-forge - but there is automation machinery for doing OSX and Manylinux wheels which many projects are using. Up until now, there hasn't been much motivation to improve Windows wheel-building tools, because many packages need Scipy, and so cannot install from pip. I'd imagine there will now be a lot more interest in that problem - I'm looking forward to seeing what happens next.
I'm printing this and hanging it on the wall.
Seriously, this is the most exciting infrastructure change I have seen in SciPy since I started using the project five years ago... I am so thankful to all the people who were involved in this, and also those who spent long hours without success. Congratulations to all, you deserve all our admiration.