
M1 Apple Silicon support #330

Closed
antranttu opened this issue Apr 1, 2022 · 21 comments
Comments

@antranttu

antranttu commented Apr 1, 2022

Hello,

I was trying to install and use the EBM classifier on my M1 computer but ran into the following error:

dlopen(/Users/antran/miniforge3/envs/sk-env/lib/python3.9/site-packages/interpret/glassbox/ebm/../../lib/lib_ebm_native_mac_x64.dylib, 0x0006): tried:
'/Users/antran/miniforge3/envs/sk-env/lib/python3.9/site-packages/interpret/glassbox/ebm/../../lib/lib_ebm_native_mac_x64.dylib'
(mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e')),
'/Users/antran/miniforge3/envs/sk-env/lib/python3.9/site-packages/interpret/lib/lib_ebm_native_mac_x64.dylib'
(mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e'))
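The "incompatible architecture" message means an x86_64 shared library was loaded into an arm64 process. A quick way to confirm the mismatch on your own machine (a sketch; the dylib path below is illustrative, substitute your own site-packages location):

```shell
# Report the architecture the running Python interpreter was built for;
# if this says arm64 but the bundled dylib is x86_64-only, dlopen fails.
python3 -c "import platform; print(platform.machine())"

# On macOS, `file` lists the architectures inside a Mach-O library
# (illustrative path; adjust to your environment):
# file .../site-packages/interpret/lib/lib_ebm_native_mac_x64.dylib
```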

I was wondering if interpret is supported on M1 chip yet? Is there any work-around for the error?

Thank you!

@interpret-ml
Collaborator

Hi @antranttu -- Not yet, but if you're willing to work with us and try a few experimental builds, we could probably get this working. We don't have an M1 Mac to test on, but we were able to build an M1 version on an Intel Mac. If you're willing to try it, run the following and let us know how it goes:

pip uninstall interpret-core
pip install -U https://dev.azure.com/ms/_apis/resources/Containers/16386841/wheel?itemPath=wheel%2Finterpret_core-0.2.7-py3-none-any.whl

-InterpretML team

@antranttu
Author

antranttu commented Apr 4, 2022

Hi,

Yes, I'm willing to try the experimental builds to get M1 support going. However, I wasn't able to install with that command: I don't have access, and the request requires authentication.

@interpret-ml
Collaborator

interpret-ml commented Apr 4, 2022

Ok then, let's try this. Go to:

https://dev.azure.com/ms/interpret/_build/results?buildId=299543&view=artifacts&pathAsName=false&type=publishedArtifacts

Download the bottom item, labeled "wheel".

Extract interpret_core-0.2.7-py3-none-any.whl from wheel.zip onto your computer. Then, run:

pip uninstall interpret-core
pip install -U interpret_core-0.2.7-py3-none-any.whl
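After installing, it may help to confirm the wheel actually shipped the native library before training anything. A small sketch, assuming the package layout shown in the tracebacks above (an interpret/lib directory bundling the lib_ebm_native_* shared libraries):

```python
import importlib.util
from pathlib import Path

# Locate the installed interpret package and list the native libraries it
# bundles; the dlopen error above occurs when the one for your CPU is absent.
spec = importlib.util.find_spec("interpret")
if spec is None or spec.origin is None:
    print("interpret is not installed")
else:
    lib_dir = Path(spec.origin).parent / "lib"
    for lib in sorted(lib_dir.glob("lib_ebm_native_*")):
        print(lib.name)
```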

@interpret-ml
Collaborator

Sorry, that's the wrong link. Will post the right one in a sec

@interpret-ml
Collaborator

Here is the correct download. I will also update the message above.

https://dev.azure.com/ms/interpret/_build/results?buildId=299543&view=artifacts&pathAsName=false&type=publishedArtifacts

@antranttu
Author

antranttu commented Apr 4, 2022

It's working! Thank you very much for the help.

One last question: is performance optimized compared to other platforms? I ran the example code on the adult income dataset at https://interpret.ml/docs/ebm.html; it took about a second on my Intel machine but consistently ~3 seconds on the M1 Mac. I'm not sure whether the computation would scale linearly on a larger dataset.

@interpret-ml
Collaborator

Great! Wow, when does stuff like that ever work on the first try? :)

This build was a quick hack just to test things out, so I'll refine the solution to work on both M1 and Intel Macs. Once that's working, hopefully you can help us again by testing the finished product. In the meantime, it should work fine for you as long as you aren't seeing crashes.

On the speed thing: I'd try it on bigger datasets before speculating about which chipset is faster. 1 sec vs 3 secs could easily be due to one-time costs like loading and initializing libraries. The build above runs native ARM instructions and is not emulated.
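The load-time point can be made concrete by timing the same work twice in one process. A minimal sketch; the workload below is a stand-in, since a real comparison would time the EBM fit itself on your data:

```python
import time

def timed(fn, *args):
    """Run fn(*args) once and return (result, elapsed seconds)."""
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start

def workload(n):
    # Stand-in for a training call such as ExplainableBoostingClassifier.fit.
    return sum(i * i for i in range(n))

# The first call includes any one-time costs (imports, library loading);
# later calls reflect steady-state speed, the fairer cross-machine number.
_, first = timed(workload, 500_000)
_, second = timed(workload, 500_000)
print(f"first: {first:.4f}s  second: {second:.4f}s")
```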

-InterpretML

@antranttu
Author

Thank you, I will try it on bigger datasets to gauge its efficiency. It's working great so far. Please don't hesitate to reach out if you need me for any testing in the future. Closing this for now.

Thanks again for the help.

@interpret-ml
Collaborator

Hi @antranttu -- The fully integrated M1 build is ready to be tested. If it works, it will be in our next PyPI release. Can you please try out the wheel here:

https://dev.azure.com/ms/interpret/_build/results?buildId=300647&view=artifacts&pathAsName=false&type=publishedArtifacts

@antranttu
Author

antranttu commented Apr 7, 2022

Hello, I got this error this time around:

dlopen(/Users/antran/miniforge3/envs/boosting/lib/python3.8/site-packages/interpret/glassbox/ebm/../../lib/lib_ebm_native_mac_arm.dylib, 0x0006): 
tried: '/Users/antran/miniforge3/envs/boosting/lib/python3.8/site-
packages/interpret/glassbox/ebm/../../lib/lib_ebm_native_mac_arm.dylib' (no such file), 
'/Users/antran/miniforge3/envs/boosting/lib/python3.8/site-packages/interpret/lib/lib_ebm_native_mac_arm.dylib' 
(no such file)

@interpret-ml
Collaborator

Thanks @antranttu . I see the problem and will post a new build shortly.

@antranttu
Author

Yup, working great this time! Do you mind me asking what the error was about?

@interpret-ml
Collaborator

Great to hear. Thanks for your help in testing this!

It was an issue in our build pipeline; see dfd773e. There is a new ARM-specific shared library that Python calls when it's running on an M1. That library was being built properly, but in the last step it wasn't being copied into the final wheel.

-InterpretML team

@armrib

armrib commented May 9, 2022

Hello @interpret-ml!
I am glad this is fixed!
Would it be possible to get this artifact?
I am running on an M1 as well ;)
Thanks

@davidutassy-seon

Hello @interpret-ml!
I would like to try out EBM on M1 as well. I cannot see anything at the links shared above. Could you please share a working version with us again?
Thanks in advance.

@antranttu
Author

antranttu commented May 17, 2022

Hello,

I am not sure whether the artifacts for M1 support have been pushed to the official interpret package yet, but I can share the working version from the earlier discussion with the @interpret-ml team. Please follow the instructions above to install it.

wheel.zip

@markustoivonen

Hi @interpret-ml, any ETA for an official release that supports M1?

@albertusmagnus3000

Hello @antranttu ,

thanks a lot for sharing the ZIP. I tried it on my M1 but got the same error you saw before:

dlopen(/Users/antran/miniforge3/envs/boosting/lib/python3.8/site-packages/interpret/glassbox/ebm/../../lib/lib_ebm_native_mac_arm.dylib, 0x0006):
tried: '/Users/antran/miniforge3/envs/boosting/lib/python3.8/site-
packages/interpret/glassbox/ebm/../../lib/lib_ebm_native_mac_arm.dylib' (no such file),
'/Users/antran/miniforge3/envs/boosting/lib/python3.8/site-packages/interpret/lib/lib_ebm_native_mac_arm.dylib'
(no such file)

I tried installing with -U and also with --force-reinstall.

Any hint on how to identify which version is installed?
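One way to check which version is installed, using only the standard library (a sketch; the distribution name interpret-core is taken from the pip commands earlier in this thread):

```python
from importlib import metadata

def installed_version(dist_name):
    """Return the installed version string, or None if not installed."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return None

print(installed_version("interpret-core"))
```

Running pip show interpret-core reports the same information from the command line.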

Thanks
Erwin

@interpret-ml: I would like to second @markustoivonen's question: is there an ETA?

@albertusmagnus3000

Hello @antranttu,

I installed it in a fresh venv and now it is working. Cool! Thanks again for sharing the ZIP.
It would be great to have it on PyPI! Thanks @interpret-ml

Erwin

@romeroraa

Also leaving a comment here to express interest in having this more easily available on M1 Macs. @interpret-ml


7 participants