
Enable building jna for Darwin arm64 #1238

Closed
wants to merge 11 commits

Conversation

@fkistner (Contributor) commented Jul 21, 2020

We are interested in getting JNA running on Darwin arm64.

I was able to get JNA compiling and working for the use cases most important to us with these changes and
env JDK_HOME=… SDKROOT=… ant -DCC=clang -Dmake.OPTS="DYNAMIC_LIBFFI=true DARWIN_USE_SDK_LIBFFI=true".
Nevertheless, I am not too familiar with JNA and would love to hear your feedback and whether you are interested in the changes in their current state.

Summary of the changes:

  • Introduced checks in the Makefile to determine which architectures should be built on Darwin and added a flag to allow linking against the system libffi.
  • Big Sur uses a shared dylib cache to look up system libraries and frameworks. Checking for their existence in the file system will fail, but loading them will nevertheless succeed.
    Adapted LibraryLoadTest accordingly.
  • The arm64 calling convention requires varargs to always be passed on the stack. JNA's and ffi's argument handling logic seems to handle this case correctly, but it is important that the Java definitions match their native counterparts exactly.
    Adapted VarArgsTest accordingly.
Unfortunately, some tests still fail…
Testsuite: com.sun.jna.AnnotatedLibraryTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,818 sec

Testsuite: com.sun.jna.ArgumentsMarshalNullableTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,303 sec

Testsuite: com.sun.jna.ArgumentsMarshalTest
Tests run: 41, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,432 sec

Testsuite: com.sun.jna.ArgumentsWrappersMarshalTest
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,293 sec

Testsuite: com.sun.jna.BufferArgumentsMarshalTest
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,369 sec

Testsuite: com.sun.jna.ByReferenceArgumentsTest
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,296 sec

Testsuite: com.sun.jna.ByReferenceToStringTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,262 sec

Testsuite: com.sun.jna.CallbacksTest
Exception in thread "Thread-0" JNA: error while handling callback exception, continuing
Exception in thread "main" JNA: error while handling callback exception, continuing
Tests run: 53, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9,849 sec

------------- Standard Error -----------------
java.lang.StackOverflowError
Warning: JVM did not GC Thread mapping after native thread terminated
------------- ---------------- ---------------
Testsuite: com.sun.jna.DefaultMethodInvocationTest
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,341 sec

Testsuite: com.sun.jna.DirectArgumentsMarshalNullableTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,315 sec

Testsuite: com.sun.jna.DirectArgumentsMarshalTest
Tests run: 41, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,384 sec

Testsuite: com.sun.jna.DirectArgumentsWrappersMarshalTest
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,395 sec

Testsuite: com.sun.jna.DirectBufferArgumentsMarshalTest
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,379 sec

Testsuite: com.sun.jna.DirectByReferenceArgumentsTest
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,329 sec

Testsuite: com.sun.jna.DirectCallbacksTest
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  Internal Error (os_bsd_zero.cpp:241), pid=75646, tid=5635
#  fatal error: caught unhandled signal 11 at address 0x0400000001000009
#
# JRE version: OpenJDK Runtime Environment (14.0.1) (build 14.0.1-internal+0)
# Java VM: OpenJDK 64-Bit Zero VM (14.0.1-internal+0, interpreted mode, serial gc, bsd-aarch64)
# No core dump will be written. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# hs_err_pid75646.log
#
# If you would like to submit a bug report, please visit:
#   https://bugreport.java.com/bugreport/crash.jsp
#
Testsuite: com.sun.jna.DirectCallbacksTest
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0 sec

Testcase: com.sun.jna.DirectCallbacksTest:testUnionByValueCallbackArgument:	Caused an ERROR
Forked Java VM exited abnormally. Please note the time in the report does not reflect the time until the VM exit.
junit.framework.AssertionFailedError: Forked Java VM exited abnormally. Please note the time in the report does not reflect the time until the VM exit.
	at jdk.internal.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)


Test com.sun.jna.DirectCallbacksTest FAILED (crashed)
Testsuite: com.sun.jna.DirectReturnTypesTest
Tests run: 19, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,337 sec

Testsuite: com.sun.jna.DirectStructureByValueTest
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,317 sec

Testsuite: com.sun.jna.DirectTest
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,335 sec

Testsuite: com.sun.jna.DirectTypeMapperTest
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,332 sec

Testsuite: com.sun.jna.ELFAnalyserTest
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,199 sec

Testsuite: com.sun.jna.FunctionTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,283 sec

Testsuite: com.sun.jna.HeadlessLoadLibraryTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,265 sec

Testsuite: com.sun.jna.IntegerTypeTest
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,288 sec

Testsuite: com.sun.jna.JNALoadTest
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,58 sec

Testsuite: com.sun.jna.LastErrorTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,345 sec

Testsuite: com.sun.jna.LibraryLoadTest
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,787 sec

Testsuite: com.sun.jna.MemoryTest
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13,652 sec

Testsuite: com.sun.jna.NativeLibraryTest
Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2,393 sec

Testsuite: com.sun.jna.NativeTest
Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2,162 sec

------------- Standard Error -----------------
Jul 21, 2020 3:07:10 PM com.sun.jna.Native getCharset
WARNING: JNA Warning: Encoding 'unsupported' is unsupported (unsupported)
Jul 21, 2020 3:07:11 PM com.sun.jna.Native getCharset
WARNING: JNA Warning: Using fallback encoding UTF-8
------------- ---------------- ---------------
Testsuite: com.sun.jna.NativedMappedTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,303 sec

Testsuite: com.sun.jna.PerformanceTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,153 sec

Testsuite: com.sun.jna.PlatformTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,164 sec

Testsuite: com.sun.jna.PointerBufferTest
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,255 sec

Testsuite: com.sun.jna.PointerTest
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,273 sec

Testsuite: com.sun.jna.PrematureGCTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,623 sec

Testsuite: com.sun.jna.ReturnTypesTest
Tests run: 19, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,338 sec

Testsuite: com.sun.jna.StructureBufferFieldTest
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,276 sec

Testsuite: com.sun.jna.StructureByValueTest
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,307 sec

Testsuite: com.sun.jna.StructureFieldOrderInspectorTest
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,986 sec

Testsuite: com.sun.jna.StructureTest
Tests run: 90, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,578 sec

Testsuite: com.sun.jna.TypeMapperTest
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,314 sec

Testsuite: com.sun.jna.UnionTest
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,327 sec

Testsuite: com.sun.jna.VMCrashProtectionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,278 sec

Testsuite: com.sun.jna.VarArgsCheckerTest
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,159 sec

Testsuite: com.sun.jna.VarArgsTest
Tests run: 9, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0,309 sec

Testcase: testShortVarArgs(com.sun.jna.VarArgsTest):	FAILED
16-bit integer varargs not added correctly expected:<3> but was:<-697827325>
junit.framework.AssertionFailedError: 16-bit integer varargs not added correctly expected:<3> but was:<-697827325>
	at com.sun.jna.VarArgsTest.testShortVarArgs(VarArgsTest.java:66)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)


Test com.sun.jna.VarArgsTest FAILED

@matthiasblaesing (Member) left a comment:

Is there already a JDK available for macOS on arm64?

native/Makefile (outdated):
# JAWT linkage handled by -framework JavaVM
LIBS=
LIBS+=-framework JavaVM
endif
Member:

Is linking against JavaVM not required anymore?

@fkistner (Contributor, Author) Jul 21, 2020:

Starting with Big Sur, Apple no longer ships a JavaVM framework.

Member:

So do we need the JavaVM framework? Or could we drop it from the build entirely?

@fkistner (Contributor, Author):

It seems to me this was only required for JAWT. Looks to me like it could be dropped for non-arm64 as well.

@fkistner (Contributor, Author):

It is possible to build the Zero variant yourself, if you have access to the hardware: https://github.com/apple/openjdk/

@matthiasblaesing (Member):

As my questions might have hinted, I don't own Apple hardware, and I don't see myself owning any. For the error in the callback test, you can try to narrow it down by adding printfs in the test library and/or the native dispatch library to see where it fails and how far it gets.

@dkocher (Contributor) commented Aug 10, 2020

I can confirm this builds a universal libjnidispatch.jnilib binary on macOS 10.15.5 with Xcode 12.0:

CC=clang SDKROOT=/Applications/Xcode-beta.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX11.0.sdk DEVELOPER_DIR=/Applications/Xcode-beta.app/Contents/Developer/ ant test -Dmake.OPTS="DYNAMIC_LIBFFI=true DARWIN_USE_SDK_LIBFFI=true"

@matthiasblaesing (Member):

So we need a strategy for how to proceed:

If I remember correctly, Apple supports fat binaries only very selectively. In principle it should be possible to build a fat binary that supports x86, x86_64 and arm64. This will need support from Xcode to build all three binaries. The alternative would be to build darwin-x86, darwin-x86-64 and darwin-aarch64 separately, and merge them only at build time.

We could then also provide a way to drop the darwin special handling (fat binaries). We could bundle the mentioned individual darwin libraries and the fat binary.

What I notice in the invocation above is the usage of the SDK libffi: can we build from the embedded libffi source/can we update it?

@tresf (Contributor) commented Nov 11, 2020

can we build from the embedded libffi source/can we update it?

@matthiasblaesing this is the strategy I've used in another aarch64 PR (2bf1593 in #1264).

Outdated:

Note, embedding the libffi project's source code has some snags. Notably, the .gitignore files cause some critical files to not be uploaded (since they're not yet tracked in JNA; worse, the .gitignore is preferred in principle to make tracking source changes easier). So I've made the assumption in #1264 to include everything except native/libffi/.git. If this is incorrect, guidance is needed.

As an aside, a submodule (or perhaps a subtree) is also viable, but requires upstream changes to be accepted first -- such as downstream changes required for msvcc.sh to continue working. There are other disadvantages, such as the annoying --recursive flag when cloning. I don't want to go too off-topic, but wanted to at least mention it in case it's something JNA is interested in.

Edit: Sorry, I had missed native/README.libffi, which has explicit instructions for bumping the subtree. According to the conversation in #1259, there are divergence issues, so I'm still a bit confused as to how best to tackle this.

@tresf (Contributor) commented Nov 11, 2020

In principle it should be possible to build a fat binary that supports x86, x86_64 and arm64.

It appears Xcode 10 dropped 32-bit support, so the following is not possible using Xcode 12, which is required for arm64 support.

clang \
-arch i386 \
-arch x86_64 \
-arch arm64 \
hello.c -o hello
# ld: warning: missing required architecture i386 in file [...]/MacOSX.sdk/usr/lib/libSystem.tbd (3 slices)
# Undefined symbols for architecture i386:

Xcode 12 can successfully build both x86_64 and arm64 if -arch i386 is removed.

If I remember correctly, Apple supports fat binaries only very selectively. This will need support from Xcode to build all three binaries. The alternative would be to build darwin-x86, darwin-x86-64 and darwin-aarch64 separately, and merge them only at build time.

Edit: Just realized Makefile already uses lipo:

jna/native/Makefile

Lines 436 to 444 in 816fdd2

ifneq ($(SDKROOT),)
$(CC) $(LOC_CC_OPTS) -arch $(ARCH) $(CFLAGS) -c $< -o $@.$(ARCH)
for arch in $(ALT_ARCHS); do \
$(CC) $(LOC_CC_OPTS) -arch $$arch -I$(BUILD)/libffi.$$arch/include $(CFLAGS) -c $< -o $@.$$arch; \
done
lipo -create -output $@ $@.*
else
$(CC) $(CFLAGS) $(LOC_CC_OPTS) -c $< $(COUT)
endif


lipo seems to be suited for this task. A small test shows promise:

clang -arch x86_64 hello.c -o hello_x86_64
file hello_x86_64
# hello_x86_64 (for architecture x86_64):	Mach-O 64-bit executable x86_64
#                                               ^--- 64-bit Intel.  Good, expected.

clang -arch arm64 hello.c -o hello_aarch64
file hello_aarch64
# hello_aarch64 (for architecture arm64):	Mach-O 64-bit executable arm64
#                                               ^--- 64-bit ARM.  Good, expected.

lipo hello_x86_64 hello_aarch64 -create -output hello
file hello
# hello: Mach-O universal binary with 2 architectures: [x86_64:Mach-O 64-bit executable x86_64] [arm64:Mach-O 64-bit executable arm64]
# hello (for architecture x86_64):	Mach-O 64-bit executable x86_64
# hello (for architecture arm64):	Mach-O 64-bit executable arm64
#                                       ^--- Successfully combined.

@matthiasblaesing (Member):

So the option would be: we build the architecture-dependent libraries individually on a matching Xcode version and update darwin.jar instead of replacing it. That way we should get a fat binary that supports x86, x64 and aarch64. We could then think about also providing an "apple" mode that drops the fat binary and splits darwin.jar into darwin-x86.jar, darwin-x86-64.jar and darwin-aarch64.jar.

In case of an incompatible API change the whole file will be cleared, and rebuilds for macOS need to be done incrementally.

@fkistner (Contributor, Author) commented Nov 20, 2020

What I notice in the invocation above is the usage of the SDK libffi - can we build from the embedded libffi source/can we update it?

The embedded libffi compiles fine, but the resulting binary does not work correctly for me. Until libffi is updated, using the system libffi seems to be the best option.

We could then think about also providing a "apple" mode, that drops the fat binary and splits darwin.jar into darwin-x86.jar, darwin-x86-64.jar and darwin-aarch64.jar.

Seems to me the easiest way would be to split off a darwin-aarch64.jar for now. The decision whether the Intel binary should be split as well can be made separately.

What do you think about this?

@fkistner (Contributor, Author):

darwin-aarch64.jar can be built successfully on x86_64 using ant native -Ddynlink.native=true -Dbuild.os.arch=aarch64.

@fkistner (Contributor, Author):

Added darwin-aarch64.jar from one of our build agents to give the CI a chance to run. Please advise, if you'd like me to remove it.

@matthiasblaesing (Member):

What I notice in the invocation above is the usage of the SDK libffi - can we build from the embedded libffi source/can we update it?

The embedded libffi compiles fine, but the resulting binary does not work correctly for me. Until libffi is updated, using the system libffi seems to be the best option.

Please see if rebasing this work onto the work done in #1264 helps. That branch holds upstream libffi. I have some hope that Apple did the sane thing and upstreamed its changes. For the system libffi: is it dynamically linked, or statically built into the jnidispatch library? What are the licensing terms?

JNA already allows linking against a system libffi; it is the mode that is used in the Debian builds. However, that will utterly fail on non-Apple platforms (though I don't know if there are any serious Darwin users outside Apple).

We could then think about also providing an "apple" mode that drops the fat binary and splits darwin.jar into darwin-x86.jar, darwin-x86-64.jar and darwin-aarch64.jar.

Seems to me the easiest way would be to split off a darwin-aarch64.jar for now. The decision whether the Intel binary should be split as well can be made separately.

What do you think about this?

I would not do it right now. For me this is either-or: Darwin was historically the only OS that was built with fat binaries for JNA. I don't know what triggered that, but it is a fact. A hybrid where only darwin-aarch64 is split off would IMHO be an error that we cannot cleanly recover from. We should either split completely or not at all. Splitting the darwin binary also has consequences for downstream users and needs to be planned.

Added darwin-aarch64.jar from one of our build agents to give the CI a chance to run. Please advise, if you'd like me to remove it.

In the long run we should see that the CI infrastructure gets a branch that is executed on Xcode 12, so the binary can be built from there. What I envision is this (only for Apple, and only while fat binaries are targeted):

  • the build process is first run on Xcode 12, then the x64 + aarch64 binaries in the fat jar are updated
  • the build process is then run on Xcode 9, then the x64 + x86 binaries in the fat jar are updated

We end up with a darwin binary that supports all three architectures. So could you merge the darwin-aarch64 binary into the darwin binary?

@tresf (Contributor) commented Nov 23, 2020

the build process is first run on Xcode 12, then the x64 + aarch64 binaries in the fat jar are updated

Hi, @Vzor- and I are working on a modification of this PR that does each arch separately, using unzip|lipo|jar via ant to place only the target architecture back in. It keeps the fat binary (for now), but takes lipo out of the Makefile and places it directly in ant.

I think this is slightly more intuitive than the proposal above, which:

  • Creates a fat binary with x86_64 + aarch64 using a newer Xcode
  • Creates a fat binary with x86 + x86_64 using an older Xcode
    • First it must lipo the newer x86_64 back out!

Instead we're focusing on this:

We build the architecture-dependent libraries individually on a matching Xcode version and update darwin.jar instead of replacing it.

Will share the code soon, probably as a GitHub compare link.

@tresf (Contributor) commented Nov 23, 2020

It is possible to build the Zero variant yourself, if you have access to the hardware: https://github.com/apple/openjdk/

Just an update on this comment: since this was posted, Zulu and Microsoft (yes, Microsoft 😄) both have Apple Silicon builds available. At the time of writing, Microsoft's is broken, so Azul's is the only precompiled version. Microsoft has informed me they'll have a new build out soon.

@catap commented Nov 23, 2020

Let me share a few commands from a real machine with an M1 and macOS 11:

catap@Kirills-mini-m1 ~ % uname -m
arm64
catap@Kirills-mini-m1 ~ % uname -a
Darwin Kirills-mini-m1.sa31-cbt.catap.net 20.1.0 Darwin Kernel Version 20.1.0: Sat Oct 31 00:07:10 PDT 2020; root:xnu-7195.50.7~2/RELEASE_ARM64_T8101 arm64
catap@Kirills-mini-m1 ~ % 

It calls itself arm64 and not aarch64, which was a big surprise for me.

@catap commented Nov 23, 2020

@tresf Microsoft's build doesn't work. It crashed on start. :(

Zulu is more stable, but it is 16-ea, and the original branch of the OpenJDK port doesn't pass all tests.

@tresf (Contributor) commented Nov 23, 2020

It calls itself arm64 and not aarch64, which was a big surprise for me.

OpenJDK as well as JNA call it aarch64. Let's not get too tied up with semantics. :)

@catap commented Nov 23, 2020

@tresf indeed.

Welcome to Scala 2.13.4 (OpenJDK 64-Bit Server VM, Java 16-ea).
Type in expressions for evaluation. Or try :help.

scala> System.getProperty("os.arch")
val res0: String = aarch64

scala> 

@tresf (Contributor) commented Nov 23, 2020

@tresf Microsoft's build doesn't work. It crashed on start. :(

I've already posted this information and a link to the details.

Arm64 calling convention requires varargs to always be passed on the stack. JNA's and ffi's argument handling logic seems to handle this case correctly, but it is important that the Java definitions match their native counterparts exactly.
Big Sur uses a shared dylib cache to lookup system libraries and frameworks. Checking existence in the file system will fail, but loading them will succeed nevertheless.
@fkistner fkistner force-pushed the darwin_arm64 branch 4 times, most recently from b802350 to 9e6fd33 Compare January 27, 2021 00:07
@fkistner (Contributor, Author):

Merged as #1297.

@fkistner fkistner closed this Feb 11, 2021
askoog added a commit to AvanzaBank/xap that referenced this pull request Sep 8, 2022
…es tests using testcontainers on apple mac M1

The 5.6.0 version of jna doesn't work on Apple Mac M1; this is fixed in 5.7.0, but this commit suggests upgrading to the latest version, which currently is 5.13.0.

jna has no transitive dependencies, and the jna release notes do not mention any problems upgrading to 5.13.0.

for more info:

https://github.com/java-native-access/jna/blob/master/CHANGES.md
java-native-access/jna#1238
https://www.testcontainers.org/
kenoir added a commit to wellcomecollection/scala-libs that referenced this pull request Jan 19, 2024