
macOS install not using brew installed libraries #439

Open · MyztikJenz opened this issue Jan 29, 2023 · 5 comments

@MyztikJenz

Let me start off by saying I'm trying to get MPF to run on an older build of macOS: 10.15.7. The hardware is older (a Late 2012 Mac mini) and is not going to benefit from a newer OS. And I know this is going to be more challenging (I've already fought through some brew issues).

That said, after installing all the necessary bits, I find myself with the following problem. When running the mc_demo machine, I get this error:

  File "/Users/pinball/.local/pipx/venvs/mpf/lib/python3.9/site-packages/mpfmc/assets/bitmap_font.py", line 3, in <module>
    from mpfmc.uix.bitmap_font.bitmap_font import BitmapFont
ImportError: dlopen(/Users/pinball/.local/pipx/venvs/mpf/lib/python3.9/site-packages/mpfmc/uix/bitmap_font/bitmap_font.cpython-39-darwin.so, 2): Symbol not found: _kIOMainPortDefault
  Referenced from: /Users/pinball/.local/pipx/venvs/mpf/lib/python3.9/site-packages/mpfmc/uix/bitmap_font/../../.dylibs/libSDL2-2.0.0.dylib (which was built for Mac OS X 12.0)
  Expected in: /System/Library/Frameworks/IOKit.framework/Versions/A/IOKit

Poking at /Users/pinball/.local/pipx/venvs/mpf/lib/python3.9/site-packages/mpfmc/uix/bitmap_font/../../.dylibs/libSDL2-2.0.0.dylib I indeed see it has a reference to _kIOMainPortDefault:

Pinballs-Mini:~ pinball$ nm /Users/pinball/.local/pipx/venvs/mpf/lib/python3.9/site-packages/mpfmc/uix/bitmap_font/../../.dylibs/libSDL2-2.0.0.dylib | grep -i portdefault
                 U _kIOMainPortDefault

This might be the core of the problem... 10.15's IOKit doesn't seem to have a kIOMainPortDefault, only Master:

Pinballs-Mini:mc_demo pinball$ nm /System/Library/Frameworks/IOKit.framework/Versions/A/IOKit | grep -i portdefault
000000000009cb20 S _kIOMasterPortDefault

But I'm curious why bitmap_font chose to use this dylib at all, when libSDL2 is installed already:

Pinballs-Mini:~ pinball$ brew list sdl2
/usr/local/Cellar/sdl2/2.26.2/bin/sdl2-config
/usr/local/Cellar/sdl2/2.26.2/include/SDL2/ (78 files)
/usr/local/Cellar/sdl2/2.26.2/lib/libSDL2-2.0.0.dylib
/usr/local/Cellar/sdl2/2.26.2/lib/cmake/ (2 files)
/usr/local/Cellar/sdl2/2.26.2/lib/pkgconfig/sdl2.pc
/usr/local/Cellar/sdl2/2.26.2/lib/ (4 other files)
/usr/local/Cellar/sdl2/2.26.2/share/aclocal/sdl2.m4

and brew's build seems to have noticed the symbol issue and used the right one:

Pinballs-Mini:~ pinball$ nm /usr/local/Cellar/sdl2/2.26.2/lib/libSDL2-2.0.0.dylib | grep -i portdefault
                 U _kIOMasterPortDefault

If I symlink brew's version of the dylib into the .dylibs folder bitmap_font is looking at, it gets past this error (and onto another missing symbol in a different library). The core of my problem seems to be that mpf-mc wants to use its own dylibs when legit ones are already installed. I went looking for the logic that determines how it chooses which to use, but pip-installed packages are a dark art I don't understand.
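For anyone else hitting this, the workaround looks roughly like this (a sketch, not a proper fix; the venv path matches my pipx install and the brew prefix is Intel's /usr/local, so adjust both for your setup):

# Back up the bundled dylib, then point the .dylibs entry at brew's copy.
DYLIBS=/Users/pinball/.local/pipx/venvs/mpf/lib/python3.9/site-packages/mpfmc/.dylibs
mv "$DYLIBS/libSDL2-2.0.0.dylib" "$DYLIBS/libSDL2-2.0.0.dylib.bak"
ln -s /usr/local/opt/sdl2/lib/libSDL2-2.0.0.dylib "$DYLIBS/libSDL2-2.0.0.dylib"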

Is there a way to force mpf-mc to use brew's library install instead of the shared libs that it comes with? It seems to have this capacity: I have MPF installed on an M1 Mac running macOS 13, and there it installed mpf-mc using the brew-installed components:

bash-3.2$ otool -L  bitmap_font.cpython-39-darwin.so 
bitmap_font.cpython-39-darwin.so (architecture x86_64):
        /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1311.100.3)
bitmap_font.cpython-39-darwin.so (architecture arm64):
        /opt/homebrew/opt/sdl2/lib/libSDL2-2.0.0.dylib (compatibility version 2401.0.0, current version 2401.0.0)
        /opt/homebrew/opt/sdl2_image/lib/libSDL2_image-2.0.0.dylib (compatibility version 601.0.0, current version 601.2.0)
        /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1311.100.3)

Thanks for reading this far!

@toomanybrians
Member

Hi!

First, I'm the guy who wrote the latest installers and the installer documentation, and based on your post I think you know more about this than me. Just so we're level set. 😀

Probably 99% of the complexity of installing MPF-MC is not related to the mpf-mc package itself, but rather the installation of Kivy. Kivy, in turn interfaces with all the SDL2 and Gstreamer libs and all that. If you look into Kivy, either in their docs, or their forums, or their repos, you'll see there are lots and lots of various complexities and ways to specify library paths, which libs are used, etc. I sorta picked the easiest defaults I could get working, but without 100% knowing what I was doing. So you may want to pick around in the Kivy world to see what you can figure out?
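For example (I'm going from memory of the Kivy docs here, so please verify against your Kivy version before trusting it), I believe Kivy's source build reads environment variables to decide which SDL2 to compile against, something like:

# Unverified sketch based on the Kivy install docs: force the SDL2 backend
# and point the build at brew's SDL2 headers, then build Kivy from source.
export USE_SDL2=1
export KIVY_SDL2_PATH=/usr/local/opt/sdl2/include/SDL2
pip install --no-binary kivy kivy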

Some of the libraries that MPF-MC uses don't matter. E.g., Kivy might support options A, B, or C for images (like SDL2_image, or PIL, or whatever), and it doesn't really matter to MPF-MC which one you use. (MPF-MC is just making Kivy calls, and Kivy is handling the rest, so if it works with Kivy it will work with MPF-MC.)

Audio is a bit different since MPF-MC has its own custom implementation that uses Gstreamer, so you need Gstreamer for audio, but the rest I think can work with anything?

Also, if you have any ideas, suggestions, fixes, recommendations, etc., we would love the feedback! I'm not a "real" developer, I just write some Python code to do pinball things, so all these multimedia libraries and installers and most of what you're writing is way over my head.

Thanks for these details though and digging in!

@MyztikJenz
Author

Thank you for being honest that all of this is a best effort... I totally understand; I'm a software engineer who's trying to learn how to build a pinball machine, and there's a lot to take in. I very much appreciate that you're all here and offering help.

I did get my install working... although I'm not sure yet how you could go about making it better for everyone.

It took me a few hours to dive into the Python build system and understand how pyproject.toml and setup.py work, and to sort out why gstreamer-1.0 kept failing to be found even though it was installed (libffi needed to be >3.0.0, but macOS installs a 2.2.1 version, which mucks things up). After that, python -m build in the mpf-mc source worked just fine and produced a .whl. Poking inside that, I found my compiled objects all linked the right things:

pinball:mpfmc pinball$ otool -L  uix/bitmap_font/bitmap_font.cpython-39-darwin.so 
uix/bitmap_font/bitmap_font.cpython-39-darwin.so:
	/usr/local/opt/sdl2/lib/libSDL2-2.0.0.dylib (compatibility version 2601.0.0, current version 2601.2.0)
	/usr/local/opt/sdl2_image/lib/libSDL2_image-2.0.0.dylib (compatibility version 601.0.0, current version 601.2.0)
	/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1281.100.1)
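For reference, the sequence that worked for me was roughly this (a sketch; the libffi path assumes brew's keg-only install under the Intel /usr/local prefix):

# Point pkg-config at brew's libffi so gstreamer-1.0 can be found (the
# system libffi is too old), then build the wheel and install it by hand.
export PKG_CONFIG_PATH="/usr/local/opt/libffi/lib/pkgconfig:$PKG_CONFIG_PATH"
cd mpf-mc
python -m build                  # needs the 'build' package installed
pip install dist/mpf_mc-*.whl    # run with the mpf virtualenv activated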

For comparison, here's what's in the x86_64 .whl available for download (from mpf_mc-0.56.1-cp38-cp38-macosx_10_9_x86_64.whl):

[jimt mpfmc]$ otool -L uix/bitmap_font/bitmap_font.cpython-38-darwin.so 
uix/bitmap_font/bitmap_font.cpython-38-darwin.so:
	@loader_path/../../.dylibs/libSDL2-2.0.0.dylib (compatibility version 2601.0.0, current version 2601.1.0)
	@loader_path/../../.dylibs/libSDL2_image-2.0.0.dylib (compatibility version 601.0.0, current version 601.2.0)
	/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1311.100.3)

(bitmap_font.cpython-38-darwin.so is referencing local libSDL files, not the ones installed in /usr/local/opt).

I had to manually install my .whl file in the mpf virtual environment, but after that both MPF and the demo machine fired right up. Sounds, video, animation... all worked (as far as I can tell).

I tried to make sense of why the .whl files you produce contain prebuilt libraries, and I'm not seeing how that happens. I wonder if you're building the macOS images from Windows somehow, using a version of the Kivy macOS packager that thinks it needs to include these libraries, or whether cpython on your machine has issues. Or sunspots... those guys are always suspect.

No idea if any of this is helpful, and I'm not sure how you could make this process better (other than making the universal .whl really universal... it's only producing arm64 slices, which is probably this bug in cpython). One package would be easier to maintain than two, in theory. As it stands, this issue could be closed since it's no longer blocking me, but if you'd like to continue sorting out what's going on here, I'm happy to help track it down.

@toomanybrians
Member

Again, thanks for your efforts here. This is all great. Here's some information on the Mac wheel building process(es) which might be helpful? I agree a single universal wheel would be great. So here's how this stuff works now.

First, everything is built in GitHub cloud runners. Unfortunately, GitHub does not offer ARM-based Mac runners (I think it's on the roadmap for 3Q23), so the GitHub workflow which builds everything only builds x86 Mac wheels. This is the main file that controls everything that happens on GitHub on check-in:

https://github.com/missionpinball/mpf-mc/blob/dev/.github/workflows/build_wheels.yml

I think you can kinda figure out what that's doing? It uses the cibuildwheel project, which hides some of the complexity, but you can see in lines 62-83 what setup runs on Mac and what the build commands are, as well as the matrix entries which drive which Mac builds run.
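If it helps, I believe you can also run the same build locally with cibuildwheel itself (an unverified sketch; it should pick up the same config the workflow uses):

# From the mpf-mc source tree: build the Mac wheels the same way CI does.
pipx run cibuildwheel --platform macos --output-dir wheelhouse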

Then for ARM Macs, I just manually build those locally using this build script from the repo.

So if there's any value in updating the local build script I run (which is probably all we can do for now), I'd love another set of eyes on this. If we can get a real universal build happening locally, then I'll disable the cloud builds for x86 Mac for now, and we can flip over to using whatever new technique we figure out when GitHub gets ARM Mac runners in the future.

@MyztikJenz
Author

This makes more sense... I missed the GitHub workflow (wasn't even aware it was a thing, to be honest. TIL). GitHub's docs are pretty good for how this works, which is very nice.

Can I ask why you build for three versions of Python? Is that supporting different customer needs, or does each version do something different? The universal wheel only builds for 3.9 (which is what I would expect).

Give me some time and I'll poke at this. I would think that a universal build should be possible, but I'll need to go down the python build system rabbit hole again to be sure.

@MyztikJenz
Author

OK... I think I understand what's going on now.

My original issue is due to the "delocate" process that runs on wheels targeting macOS. Delocating copies a wheel's dependencies into the wheel itself and fixes up their loader paths, so the Python library being built doesn't need any other dependencies installed on the host system (wheels are supposed to be portable, and this makes them so). Clever, but confusing the first time you encounter it. In my case, it copied a library referencing a symbol that doesn't exist in my version of IOKit because my OS is super old.
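You can watch this happen with the delocate tools themselves (a sketch; delocate-listdeps and delocate-wheel come from the delocate package on PyPI):

pip install delocate
# Show which external libraries a wheel links against...
delocate-listdeps dist/mpf_mc-*.whl
# ...then copy those libraries into the wheel and rewrite the loader paths
# to @loader_path-relative entries, like the .dylibs folder I found above:
delocate-wheel -w fixed_wheels dist/mpf_mc-*.whl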

I guess that cibuildwheel runs this delocate step as a matter of course, but I didn't go down that rabbit hole far enough to find out. Instead I worked on figuring out how to build a universal wheel.

The issue I referenced before about cpython having issues was out of date (it's been fixed) and pertained to building universal versions of the Python framework itself, not wheels. But understanding that issue got me to a page that talked about being able to cross-compile for arm64 from a GitHub workflow (apparently cibuildwheel supports it; see grpc/grpc#29262 and https://github.com/pietrodn/grpcio-mac-arm-build/blob/main/.github/workflows/wheels.yml).
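Going by that grpcio workflow, the relevant knob appears to be an environment variable (a sketch; I haven't confirmed this beyond reading their YAML):

# Ask cibuildwheel to target arm64 from an x86_64 runner.
export CIBW_ARCHS_MACOS="arm64"
pipx run cibuildwheel --platform macos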

So I cloned the MPF repos, edited the build_wheels.yml file to cross-compile and... neglected to realize until it failed that dependencies on this runner would still be x86_64, not arm64. So while it tries to compile for arm64, you get fun errors like this when compiling against the SDL2 libraries:

/Applications/Xcode_14.2.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/lib/clang/14.0.0/include/immintrin.h:14:2: error: "This header is only meant to be used on x86 and x64 architecture"
    #error "This header is only meant to be used on x86 and x64 architecture"
     ^

I tried forcing brew to install the arm64 libraries on GitHub's builder, and it didn't care for that at all (it failed immediately, which is not unexpected). The reason grpcio can do this and MPF cannot is that grpcio has no dependencies; it's just building itself. If you wanted to build your entire dependency graph from source, I'm betting you could make this work too... but that way lies madness.

I also tried building the universal wheel on my own machine. I installed all the x86_64 libraries via brew, and compiling still worked, but -undefined dynamic_lookup gets cranky when two library paths are found leading to the same library. In the end, the linker just picks the first one it finds (and then complains that it can't use the library due to the mismatched architecture).

Given all the constraints, I think you're actually building your distributions as well as you can right now. And even when GitHub supports arm64 builds, I don't think you want to create universal2 binaries: they'd be twice the size (assuming you delocate the dependencies into the wheel) and have no benefit over per-architecture wheels, since pip prefers matching-architecture wheels over universal ones anyway.

There are a couple of things worth pointing out. First, the universal2 wheel you're building isn't actually universal (it's lacking the proper references to x86_64 libraries). You probably aren't getting complaints because x86_64 macOS users get the x86_64 wheel, but if they were to try the universal version, they'd get errors.
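If you want to check this yourself, something like the following should show it (assuming you've unpacked the universal wheel and are sitting next to the extension):

# List the architecture slices the extension actually contains...
lipo -archs bitmap_font.cpython-39-darwin.so
# ...and inspect only the x86_64 slice's library references:
otool -arch x86_64 -L bitmap_font.cpython-39-darwin.so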

You could stop building the universal variant (which I'm sure is possible, but I'm not sure how; I think which architectures get compiled comes from the required Python.framework, and that prefers universal binaries by default), or you could try to fix up the paths to point to the right locations for x86_64, which would be gross and a waste of time anyway. Or you can ignore this problem entirely; it's unlikely arm64 users will copy their MPF installs to x86_64 machines rather than just re-installing.

Second, the universal version isn't "delocated"; it expects to find its dependencies installed on the host machine. This really isn't a problem, since you have us install the dependencies as part of the MPF install process, but it does make MPF less portable and susceptible to libraries changing out from underneath it. It's also a behavior difference between the x86_64 and arm64 versions, which could be a weird support issue to track down later.

If you do happen to run delocation on the universal binary, I think you could simplify the install process: if all of your dependencies are included in the wheel, there is no reason to have MPF users install them too. This is why you get the "classes implemented in multiple locations" errors at run-time: the dynamic loader is seeing both the brew-installed libraries and the copies baked into the wheel at build-time. I didn't test this, but assuming both MPF and MPF-MC delocate their dependencies, it should work.

I'm sorry and disappointed I wasn't able to figure out a good way for you to build a universal wheel, but I understand a whole lot more about the Python library/wheel-building process now. Maybe I'll be able to help with something else in the future. For now, though, I think we can close this issue, and I'll build my own MPF for my old 10.15 machine.

Thank you again for the help and the pointers. This was a fun diversion, getting to understand the internals.
