Wheels for linux #138

Closed
AbdealiLoKo opened this issue Jun 18, 2016 · 41 comments

@AbdealiLoKo
Contributor

Numpy, Scipy, etc now provide wheels for linux using manylinux which makes installing them much easier.
It would be nice if dlib did so too, as the compilation does take quite a while.

ManyLinux: https://github.com/pypa/manylinux
Example project using manylinux: https://github.com/pypa/python-manylinux-demo

@davisking
Owner

Cool. You setting this up? :)

@AbdealiLoKo
Contributor Author

From my understanding of this, there isn't really anything to set up per se.
When making a PyPI release, we just need to pull the docker images quay.io/pypa/manylinux1_i686 and quay.io/pypa/manylinux1_x86_64 and then run /io/travis/build-wheels.sh.

The build-wheels.sh shell script will automatically create the wheels in wheelhouse/, and those need to be uploaded to PyPI along with the python setup.py upload.

So, it's an extra step before making a release.

Note: Again, I've never really tried it myself. This is from looking at other projects.

@davisking
Owner

I'm not seeing these instructions, or any instructions really. What specifically does one type to do this?


@AbdealiLoKo
Contributor Author

AbdealiLoKo commented Jun 18, 2016

The instructions I've gotten are from here : https://github.com/pypa/python-manylinux-demo/blob/master/.travis.yml

So, (untested) commands would be like:

$ docker pull quay.io/pypa/manylinux1_x86_64
$ docker run -it -v $(pwd):/io quay.io/pypa/manylinux1_x86_64 /bin/bash

# Now... inside the docker image:
# Install dependencies of dlib into the docker here
$ pip install numpy
$ yum install cmake boost-devel

# Compile wheels for the various python versions
$ for PYBIN in /opt/python/*/bin; do
    ${PYBIN}/pip install -r /io/dev-requirements.txt
    ${PYBIN}/pip wheel <path_to_package> -w wheelhouse/
done

# Bundle external shared libraries into the wheels (if any)
$ for whl in wheelhouse/*.whl; do
    auditwheel repair "$whl" -w /io/wheelhouse/
done

# Now the wheels should be present in /io/wheelhouse. Install and test them.

Here's the build-wheels.sh found in the example project - https://github.com/pypa/python-manylinux-demo/blob/master/travis/build-wheels.sh

PS: I've also made an issue in manylinux asking for these "raw" instructions to be documented, to make it easier for users who do not want travis - pypa/manylinux#73

@davisking
Owner

davisking commented Jun 19, 2016 via email

@AbdealiLoKo
Contributor Author

@davisking making an automated script is a good idea. I'm a little busy this week though.

A few questions:
When is the next release planned?
What distro do you plan to run the script on? (in case I need distro-specific commands...)

@davisking
Owner

davisking commented Jun 19, 2016 via email

@matthew-brett

I can help set up wheel building - but I don't know how to run the tests on the Python code - can you give me any pointers?

@AbdealiLoKo
Contributor Author

@matthew-brett Awesome to see you here :) And thanks for looking into this!

I can't seem to find any tests for the Python bindings either. @davisking are there any o.O ?
Would running ./dtest at /dlib/tests after installing dlib with pip install dlib be a correct way to test whether the dlib installation/compilation is correct?
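A cheaper first check than the full C++ test suite would be a bare import of the freshly installed wheel. A minimal sketch along these lines (the helper name `wheel_import_ok` is hypothetical, not part of dlib):

```python
import importlib.util

def wheel_import_ok(module="dlib"):
    """Return True if the running interpreter can locate `module`.

    A bare import is the cheapest sanity check for a freshly installed
    wheel; ./dtest exercises the C++ library rather than the Python
    bindings, so an import test complements it rather than replaces it.
    """
    return importlib.util.find_spec(module) is not None
```

After `pip install dlib`, `wheel_import_ok("dlib")` returning False would immediately flag a broken wheel.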

@davisking
Owner

davisking commented Jun 30, 2016 via email

@matthew-brett

There's a draft of the manylinux build here : https://github.com/MacPython/dlib-wheels

These are building here : https://travis-ci.org/MacPython/dlib-wheels/jobs/141487558

But - the problem is that I don't know anything about boost, and so I'm doing a huge brute force install of the boost libraries : https://github.com/MacPython/dlib-wheels/blob/master/config.sh#L29

That big build is proving very slow indeed. Can you give any hints on what I need to install from boost in order for the dlib Python wheel to compile?

@davisking
Owner

davisking commented Jul 1, 2016 via email

@matthew-brett

matthew-brett commented Jul 1, 2016

Am I right that dlib requires boost-python version exactly equal to 1.41.0 ? https://github.com/davisking/dlib/blob/master/dlib/cmake_utils/add_python_module#L65

EDIT - no - this is a minimum dependency - see below.

@matthew-brett

Sorry - scratch that - I see that 1.41.0 is a specification of a minimum dependency.

Watching the installs I can get working, I see that you have an optional dependency on a cblas implementation such as OpenBLAS.

There's some code involving "numpy", but it looks like you don't use the standard numpy headers obtained with numpy.get_include. How do you deal with different numpy ABIs?

I can build the dlib code without error with cmake . && make, with the compiled boost library location specified with export BOOST_ROOT=/path/to/boost_root. In the same shell, python setup.py build still complains that it cannot find boost:

$ python setup.py build
running build
Detected Python architecture: 64bit
Detected platform: darwin
Configuring cmake ...
-- The C compiler identification is AppleClang 6.0.0.6000057
-- The CXX compiler identification is AppleClang 6.0.0.6000057
-- Check for working C compiler: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang
-- Check for working C compiler: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang++
-- Check for working CXX compiler: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Could NOT find Boost
error: cmake configuration failed!

I assume that the default install doesn't use any not-generic CPU features such as AVX? I'm asking because the wheel should not depend on the CPU features of the host that it was built on.

@matthew-brett

Boost error finding the Python include path for Python 3:

Obscure error with no informative error message on Python 2.7:

  running bdist_wheel
  running build
  Detected Python architecture: 64bit
  Detected platform: linux2
  Configuring cmake ...
  -- The C compiler identification is GNU 4.8.2
  -- The CXX compiler identification is GNU 4.8.2
  -- Check for working C compiler: /opt/rh/devtoolset-2/root/usr/bin/cc
  -- Check for working C compiler: /opt/rh/devtoolset-2/root/usr/bin/cc -- works
  -- Detecting C compiler ABI info
  -- Detecting C compiler ABI info - done
  -- Check for working CXX compiler: /opt/rh/devtoolset-2/root/usr/bin/c++
  -- Check for working CXX compiler: /opt/rh/devtoolset-2/root/usr/bin/c++ -- works
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Boost version: 1.61.0
  error: cmake configuration failed!

@matthew-brett

Sadly CMake is an ugly beast for building Python extensions.

@AbdealiLoKo
Contributor Author

@matthew-brett The boost building issue with py34 was something I had run into a while back too: travis-ci/travis-ci#6099

Is it possible to use a conda environment? That simplified things quite a bit for me in travis, and that is what I finally did (and am still doing - https://github.com/AbdealiJK/file-metadata/blob/master/.travis.yml#L49)

@AbdealiLoKo
Contributor Author

@matthew-brett Also, about the dlib issue - it seems the next thing is to find the Boost python library. My output is:

-- The C compiler identification is GNU 6.1.1
-- The CXX compiler identification is GNU 6.1.1
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Boost version: 1.60.0
-- Found the following Boost libraries:
--   python
-- Found PythonLibs: /home/ajk/miniconda3/envs/py27/lib/libpython2.7.so (found suitable version "2.7.11", minimum required is "2.6")
etc etc etc...

@davisking
Owner

I didn't do anything non-portable to interface with numpy as far as I'm aware. It should work with any version.

The python bindings will use SSE4 by default, but not AVX.

There is a lot of variation in how boost.python is installed. It's a frequent user complaint that they get errors from mixing boost.python versions with the wrong version of python, or it's installed in some weird place. I don't really think this is a CMake problem. How else would we compile these modules? You need some reliable way to find the version of boost.python and python libs that match the version of python the user is going to be running. If that could be solved then the setup.py could pass that information to cmake and it should be fine.

For a point of comparison, I made MITIE and it uses CMake to build python extensions and no one ever has any problems with it. But it doesn't use boost.python. Instead I wrote a C API which uses ctypes from python. So there is no dependency on python libs or boost or anything when compiling the shared object. However, writing a C API and binding it through ctypes is a huge pain in the ass. It's only reasonable to do with a small API.

Is there some way to find boost.python and the python libs in setup.py and verify that they are compiled together correctly? I think that's the heart of the issue.
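One half of that verification is at least pinning down which Python the build should target. A sketch of what setup.py could query, using only the standard sysconfig module (the helper name and the idea of forwarding these values to CMake as cache variables are illustrative, not dlib's actual mechanism):

```python
import sysconfig

def python_cmake_hints():
    """Collect build paths for the interpreter running setup.py.

    Handing these to CMake (e.g. as -DPYTHON_INCLUDE_DIR=... style cache
    variables; exact flag names are illustrative) ensures the Python that
    CMake finds is the same one the user will run, which is half of the
    boost.python / python version mismatch problem described above.
    """
    return {
        "include_dir": sysconfig.get_paths()["include"],   # Python.h lives here
        "lib_dir": sysconfig.get_config_var("LIBDIR"),     # may be None on some platforms
        "version": sysconfig.get_python_version(),         # e.g. "2.7"
    }
```

The other half - confirming which Python a given boost.python binary was compiled against - has no portable query, which is why mismatches keep showing up as user complaints.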

@matthew-brett

matthew-brett commented Jul 2, 2016

The maximum guaranteed on a 64-bit install is SSE2, so, for a binary wheel on OSX or Linux, I guess that should be the setting - otherwise the wheels will segfault on older systems.

Sorry - yes - I should have said only that cmake and Python distutils are two very different systems. I think that's why the bug with finding the Python3 include files has not been fixed for a while in the boost-python cmake configuration.

It's not too hard to work around that bug; I guess just adding the Python include files manually to the CFLAGS would probably work.

I'm afraid I'm really inexperienced with cmake, and very experienced (sadly) with distutils, so I wasn't sure how to debug the failure to find or use the boost-python installation. I suppose I'd have to somehow find the implementation of FindPackage for boost-python in cmake, but I wasn't sure where to look.

@davisking
Owner

davisking commented Jul 2, 2016 via email

@matthew-brett

matthew-brett commented Jul 18, 2016

I added both of you as collaborators on the repo, in case you'd be interested in fixing up the builds somehow.

@y0ast
Contributor

y0ast commented Jun 7, 2017

@matthew-brett do you have an update on this?

Otherwise I'll take a stab at it.

@matthew-brett

No, sorry, I got lost in the wilds of cmake / distutils. Please do go ahead and have a stab - let me know if you think I can help. I know a lot about distutils and the manylinux / OSX wheel building machinery.

@y0ast
Contributor

y0ast commented Jun 12, 2017

So I got cmake 2.8.12 and boost 1.64.0 installed from source on a manylinux docker image.

However, now I'm quite stuck on missing: PYTHON_LIBRARIES. Gathering from the manylinux GitHub repo, I don't think that libpython.so exists there.

@matthew-brett any thoughts?

For reference:

running build
Detected Python architecture: 64bit
Detected platform: linux2
Removing build directory /root/dlib/./tools/python/build
Configuring cmake ...
-- The C compiler identification is GNU 4.8.2
-- The CXX compiler identification is GNU 4.8.2
-- Check for working C compiler: /opt/rh/devtoolset-2/root/usr/bin/cc
-- Check for working C compiler: /opt/rh/devtoolset-2/root/usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /opt/rh/devtoolset-2/root/usr/bin/c++
-- Check for working CXX compiler: /opt/rh/devtoolset-2/root/usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Boost version: 1.64.0
-- Found the following Boost libraries:
--   python
CMake Error at /usr/local/share/cmake-2.8/Modules/FindPackageHandleStandardArgs.cmake:108 (message):
  Could NOT find PythonLibs (missing: PYTHON_LIBRARIES) (found suitable
  version "2.7.13", minimum required is "2.6")
Call Stack (most recent call first):
  /usr/local/share/cmake-2.8/Modules/FindPackageHandleStandardArgs.cmake:315 (_FPHSA_FAILURE_MESSAGE)
  /usr/local/share/cmake-2.8/Modules/FindPythonLibs.cmake:186 (FIND_PACKAGE_HANDLE_STANDARD_ARGS)
  /root/dlib/dlib/cmake_utils/add_python_module:75 (FIND_PACKAGE)
  CMakeLists.txt:6 (include)
-- Configuring incomplete, errors occurred!
See also "/root/dlib/tools/python/build/CMakeFiles/CMakeOutput.log".
error: cmake configuration failed!

@matthew-brett

The problem is that the manylinux docker images deliberately do not have the libpython libraries, because the default install of Python on Debian / Ubuntu does not have these libraries either. However, the default install does have all the necessary symbols linked, just not via the libpython library. So, to fix this, I'm afraid you somehow have to tell the building procedure not to look for libpython but assume the relevant symbols are present at runtime.
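Whether a given CPython build even ships a shared libpython can be read from its build configuration. A minimal sketch, assuming CPython's standard sysconfig variables (the helper name is hypothetical):

```python
import sysconfig

def interpreter_has_shared_libpython():
    """True if this CPython was built with --enable-shared.

    The manylinux images ship Python as a static binary with no
    libpython.so, so this returns False there; extension modules must
    then be linked with Python symbols left undefined, to be resolved
    by the interpreter binary at import time.
    """
    return bool(sysconfig.get_config_var("Py_ENABLE_SHARED"))
```

A build script could branch on this to decide whether to pass a real libpython path to CMake or to suppress the PythonLibs requirement entirely.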

@y0ast
Contributor

y0ast commented Jun 13, 2017

Hmm, so the linking of PYTHON_LIBRARIES happens in this file:

message(STATUS "USING PYTHON_LIBS: ${PYTHON_LIBRARIES}")

I have no clue how to tell that part not to look for libpython.

@davisking do you have an idea?

@davisking
Owner

davisking commented Jun 13, 2017 via email

@matthew-brett

You definitely do not need libpython at run time, because the Python binary has all the symbols you need. So the error here is the assumption by the build process that libpython is necessary at build time. I know that VTK does allow undefined Python symbols at build time - here's a suggestive reference: https://gitlab.kitware.com/vtk/vtk/commit/50d088ab9cbd4bc0e0215cbe6bdfdea9a392ca4b

@davisking
Owner

Huh, well, modify the cmake script to not link to it and see if it works out.

@malarinv

malarinv commented Jul 5, 2018

Are dlib wheels for linux available yet?

@davisking
Owner

No. Someone should set this up :)

@davisking
Owner

Although I should also mention that part of the reason I haven't done this, aside from being busy, is because the current setup will compile dlib appropriately for your system. Like do you have a GPU or AVX instructions? It will detect those and use them if available. That has to be done at compile time. A precompiled binary would have to disable all that stuff and so be slower for many users.
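A runtime check of the host CPU shows why this matters for a generic wheel: a build that enabled AVX would crash on any host where a check like the following comes back False. A Linux-only sketch (the helper name is hypothetical; on non-Linux systems it simply reports False):

```python
def host_cpu_has_flag(flag="avx"):
    """Check /proc/cpuinfo for a CPU feature flag (Linux only).

    A generic manylinux wheel has to target the lowest common
    denominator (SSE2 on 64-bit), while a local compile can enable
    whatever the host actually reports here - which is the speed
    trade-off being described above.
    """
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    return flag in line.split()
    except OSError:
        pass  # not Linux, or /proc unavailable
    return False
```

This is a build-time decision: once the wheel is compiled without AVX, no runtime check can win the performance back.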

@malarinv

malarinv commented Jul 5, 2018

BTW, tensorflow solves this problem by providing multiple wheels, like tensorflow with cpu-only support and tensorflow-gpu with gpu support. The user has to know there are multiple variants and decide between them, though.

@malarinv

malarinv commented Jul 5, 2018

I'm installing the face_recognition library, which depends on dlib, on a resource-constrained machine, and it is taking a whole day just to build dlib.

@davisking
Owner

Yes, that would be fine. Someone should set that up.

A whole day is kinda ridiculous. Where did it take most of the time? On the very last linking step?

@malarinv

malarinv commented Jul 5, 2018

python_dlib -> compiling image.cpp and image2.cpp. The instance has only 512 MB and it is hitting swap. Maybe I should try allocating more memory temporarily and just build it for now 😄

@davisking
Owner

davisking commented Jul 5, 2018

Ok. Well, you do need enough RAM; no getting around that. You should try splitting those files into 4 files. That would reduce the amount of RAM required per file.

You can also uncomment the set(PYBIND11_LTO_CXX_FLAGS "") line in dlib/tools/python/CMakeLists.txt to turn off link-time optimization, which makes the build a lot faster and doesn't seem to materially impact runtime speed. I will likely just disable LTO in the future because of how long it makes the compile process.

Anyway, if you want to try those things out and submit a PR if it fixes the compile times that would be sweet :)

@dlib-issue-bot
Collaborator

Warning: this issue has been inactive for 61 days and will be automatically closed on 2018-09-07 if there is no further activity.

If you are waiting for a response but haven't received one it's likely your question is somehow inappropriate. E.g. you didn't follow the issue submission instructions, or your question is easily answerable by reading the FAQ, dlib's documentation, or a Google search.

@dlib-issue-bot
Collaborator

Notice: this issue has been closed because it has been inactive for 65 days. You may reopen this issue if it has been closed in error.

@epignatelli

Is there any update on this?
