
Apple Silicon toolchain support #88

Closed
jez opened this issue Sep 2, 2021 · 64 comments · Fixed by #174
Labels
enhancement New feature or request

Comments

@jez
Contributor

jez commented Sep 2, 2021

I just got an M1 MacBook Pro today, and am looking into how to use this project to generate arm64 binaries (for the record: everything works fine using this project to generate x86_64 binaries, which then run under Rosetta).

In the Apple Developer Documentation, they make it out to be as simple as passing a -target flag to the C compiler, though I'm sure it'll be more work to do the same thing in Bazel.

https://developer.apple.com/documentation/apple-silicon/building-a-universal-macos-binary
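For concreteness, a rough sketch of what that flag could look like at the Bazel level (purely illustrative; the triple comes from the Apple docs above, and the target name is made up, so this isn't something this project supports today):

# Illustrative only: forward clang's -target flag on a single target,
# assuming the toolchain's clang and sysroot can actually emit arm64 code.
cc_binary(
    name = "hello_arm64",
    srcs = ["hello.cc"],
    copts = ["-target", "arm64-apple-macos11"],
    linkopts = ["-target", "arm64-apple-macos11"],
)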

Has anyone put thought or time into how this project might be extended to support generating arm64 binaries on M1 Macs? I'm probably going to be spending some time getting this working, and I'd love any tips, ideas, or places to start.

@rrbutani
Collaborator

rrbutani commented Sep 2, 2021

I thought about this a bit while working on #85.

I think there are actually two separate things:

  • using clang compiled for x86_64 to produce binaries for arm64 (i.e. cross compiling)
  • using clang compiled for arm64 to produce binaries for arm64

I think the first can be done with the feature #85 adds but I also think it's really the second thing that you'd want when running macOS on ARM (so that clang doesn't needlessly have to run through Rosetta).


To add support for arm64 darwin host platforms, we'd need to:

  • update parts of configure.bzl that hardcode the CPU constraints to x86_64
    • we can either parameterize these over the current CPU or just add entries for every host CPU that we want to support (as we do for OSes – we always have macOS and linux toolchains regardless of the OS of the host platform; we let toolchain resolution pick the right one)
    • I'm inclined to just add entries for all the host CPUs we want to support since I don't know of a good way to determine the CPU type of the host from a repo rule (an alias on a select on the CPU or just running arch are the two ways that come to mind, both of which seem bad); users who aren't using toolchain resolution yet won't have their setups broken since we can add the appropriate entries to the toolchain_suite
      • this has the (odd but not bad) side-effect of letting you "choose" to use x86_64 clang on an M1 mac by running bazel under Rosetta, I think
  • update cc_toolchain_config to take a host_cpu arg
    • at a glance, I think this only really affects the host_system_name, and the default target_system_name, target_cpu, abi_version, and abi_libc_version
  • sysroot.bzl might need some changes depending on whether the sysroots for arm64 macOS installs are any different
  • have llvm_distributions.bzl grow entries for arm64 macOS (more on this below)
  • change BUILD.tpl to match configure.bzl

And I think that should be about it.
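To make the first two bullets concrete, the generated BUILD could grow entries along these lines (a rough sketch only; the darwin_arm64 keys and the cc-clang-darwin-arm64 targets are hypothetical names, not what configure.bzl actually emits):

cc_toolchain_suite(
    name = "toolchain",
    toolchains = {
        "k8|clang": ":cc-clang-linux",
        "darwin|clang": ":cc-clang-darwin",
        # hypothetical new entries for arm64 macOS hosts
        "darwin_arm64|clang": ":cc-clang-darwin-arm64",
        "k8": ":cc-clang-linux",
        "darwin": ":cc-clang-darwin",
        "darwin_arm64": ":cc-clang-darwin-arm64",
    },
)

toolchain(
    name = "cc-toolchain-darwin-arm64",
    exec_compatible_with = [
        "@platforms//os:osx",
        "@platforms//cpu:arm64",
    ],
    target_compatible_with = [
        "@platforms//os:osx",
        "@platforms//cpu:arm64",
    ],
    # hypothetical cc_toolchain target for arm64 clang on arm64 macOS
    toolchain = ":cc-clang-darwin-arm64",
    toolchain_type = "@bazel_tools//tools/cpp:toolchain_type",
)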

I think the bigger obstacle right now is that LLVM doesn't publish binaries for arm64 macOS (even though it's definitely possible to build LLVM for arm64 macOS). This effectively means that you'd have to provide your own download URL to llvm_toolchain.

Adding tests for this in CI will also be tricky since (afaik) GitHub Actions doesn't yet have ARM worker machines, let alone macOS ARM worker machines, but that's okay.

Another question is whether we'd want to support creating universal binaries on macOS; I think it makes sense not to default to doing so (users can opt into this by setting some --copts or adding some extra_compile_flags with the features in #85 for now; eventually we can make it an option on llvm_toolchain if there's interest).
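If someone wants to experiment with that opt-in today, it would look roughly like the following; this is untested on my end (I'm not sure Bazel's compile actions tolerate multiple -arch flags), so treat it purely as an illustration:

# Untested sketch: ask clang for a fat (universal) build by passing both
# -arch flags; the target name and flags are my own, not from this repo.
cc_binary(
    name = "hello_universal",
    srcs = ["hello.cc"],
    copts = ["-arch", "x86_64", "-arch", "arm64"],
    linkopts = ["-arch", "x86_64", "-arch", "arm64"],
)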


@jez I'm happy to try to put together these changes if you're willing to test (I don't have access to an arm64 macOS machine).

@jez
Contributor Author

jez commented Sep 2, 2021

I’m more than happy to test! Do you have a sense of when you’d have time to work on this?

Also: I think that even being able to produce arm64 binaries would be an improvement, even if they were built by x86_64 clang. I see that #85 is a draft—is that something you’d like me to test? Or is there something else holding it back from landing?

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

Okay! I put together a thing that uses #85 to set up a toolchain for arm64 (still x86_64 based so it's cross-compiling).

It took a little doing; I'm not on macOS 11 so my sysroot didn't have the right stuff and it took me a while to realize that the ld that ships in LLVM releases doesn't have support for the tbd (TAPI) files that newer macOS SDKs ship. I ended up having to use lld as the linker; I'm not sure this is totally right:

        # newer macOS SDKs use `.tbd` files in their sysroots; `lld` has support for this
        # (on recent versions) but `ld` (that ships in the LLVM releases) does not
        #
        # so, we need to specify that we want to use `lld`
        #
        # historically, we have not done this for macOS by default because of incomplete
        # support for Mach-O in `lld`, but newer versions seem to have good support.
        "extra_linker_flags": ["-fuse-ld=lld"],

I'd love to know what Apple clang does (just the output of g++ -v -xc++ - <<<"int main() { }" on an M1 machine should give it away I think).

But anyways, with the above, it does get all the way through linking on my x86_64 macOS machine. I can't actually run the generated binaries but hopefully they do actually work 🤞.

I've attached the workspace I put together to this comment.

  • bazel run //:test --config=x86 to check that the regular old x86_64 -> x86_64 toolchain works, etc.
  • bazel build //:test --config=arm64 to make sure the x86_64 -> arm64 toolchain can generate binaries
  • bazel run //:test --config=arm64 to make sure the binaries emitted can actually run on arm64

In theory running bazel run //:test (without any --config to manually set the target platform) should pick up the arm64 toolchain and use it, depending on what constraints Bazel gives M1 machines. I have no idea how this interacts with Rosetta though; bazel build //:test --toolchain_resolution_debug should give us some hints.
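(For context: --config=arm64 in the attached workspace points --platforms at an //:apple-silicon platform which, judging from the cquery output later in this thread, is defined roughly as follows; this is a reconstruction, not the literal file contents.)

platform(
    name = "apple-silicon",
    constraint_values = [
        "@platforms//os:osx",
        "@platforms//cpu:arm64",
    ],
)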

workspace.zip

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

@jez re: your other questions:

I should have time to try to get the arm64 -> arm64 toolchain working over the weekend; assuming ^ works, that should be fairly straightforward. The trickiest part will probably be finding/building an arm64 LLVM toolchain to use.

#85 is still a draft because it builds on #75 which hasn't yet been merged and because it's missing some docs and polish; I wanted to solicit some feedback about some of the design choices in that PR before cleaning it up. It is more or less functional though.

@jez
Contributor Author

jez commented Sep 3, 2021

I'd love to know what Apple clang does (just the output of g++ -v -xc++ - <<<"int main() { }" on an M1 machine should give it away I think).

Here's the output:

output.log

I'll test that workspace out now and see what happens.

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

Here's the output:

output.log

Thanks! Can you run /Library/Developer/CommandLineTools/usr/bin/ld -v also? I'm pretty sure it's just ld64 but just in case.

@jez
Contributor Author

jez commented Sep 3, 2021

❯ /Library/Developer/CommandLineTools/usr/bin/ld -v
@(#)PROGRAM:ld  PROJECT:ld64-650.9
BUILD 13:09:13 May 28 2021
configured to support archs: armv6 armv7 armv7s arm64 arm64e arm64_32 i386 x86_64 x86_64h armv6m armv7k armv7m armv7em
LTO support using: LLVM version 12.0.5, (clang-1205.0.22.11) (static support for 27, runtime is 27)
TAPI support using: Apple TAPI version 12.0.5 (tapi-1205.0.7.1)

@jez
Contributor Author

jez commented Sep 3, 2021

is the @macos-11.3-sdk// repo basically the same set of files I have at /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk, judging from the output.log above?

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

yup, exactly

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

I had to grab it externally because I'm on an older version of macOS that doesn't have an SDK with the right stuff to build for arm64

but for host arm64 toolchains (as in arm64 -> arm64) we shouldn't actually need to grab it; like we do with the other host toolchains, we can just assume what's on the host system actually works when you're targeting the host system

@jez
Contributor Author

jez commented Sep 3, 2021

~/stripe/sandbox/rrbutani-workspace   19s
❯ ./bazel run //:test --config=x86
Starting local Bazel server and connecting to it...
ERROR: /private/var/tmp/_bazel/e9cb4153e0861999826e8879b02ae2cc/external/local_config_cc/BUILD:48:19: in cc_toolchain_suite rule @local_config_cc//:toolchain: cc_toolchain_suite '@local_config_cc//:toolchain' does not contain a toolchain for cpu 'darwin'
ERROR: Analysis of target '//:test' failed; build aborted: Analysis of target '@local_config_cc//:toolchain' failed
INFO: Elapsed time: 69.390s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (16 packages loaded, 61 targets configured)
FAILED: Build did NOT complete successfully (16 packages loaded, 61 targets configured)

~/stripe/sandbox/rrbutani-workspace
❯ ./bazel --version
bazel 4.2.1

(I had to make a slight change to the repo, which was to make it use a script whose contents are identical to this:

https://github.com/jez/ragel-bison-parser-sandbox/blob/master/bazel

because our company laptops prevent us from installing bazelisk) but otherwise the above is the result of running things.

Interestingly enough, that's the same error I get when trying to build a normal bazel project on my macbook. For example this tiny project shows the same problems.

bazel-test.zip

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

hmm

Does /private/var/tmp/_bazel/e9cb4153e0861999826e8879b02ae2cc/external/local_config_cc/BUILD:48 look like:

cc_toolchain_suite(
    name = "toolchain",
    toolchains = {
        "k8|clang": ":cc-clang-linux",
        "darwin|clang": ":cc-clang-darwin",
        "k8": ":cc-clang-linux",
        "darwin": ":cc-clang-darwin",
    },
)

for you?

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

oh whoops, nvm; I totally missed that that's @local_config_cc

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

It's not clear to me why it's even analyzing @local_config_cc//:toolchain; can you post what bazel build //:test --toolchain_resolution_debug prints out?

@jez
Contributor Author

jez commented Sep 3, 2021

It's not clear to me why it's even analyzing @local_config_cc//:toolchain; can you post what bazel build //:test --toolchain_resolution_debug prints out?

~/stripe/sandbox/rrbutani-workspace
❯ bazel build //:test --toolchain_resolution_debug
INFO: Build options --platforms and --toolchain_resolution_debug have changed, discarding analysis cache.
INFO: ToolchainResolution:     Type @bazel_tools//tools/cpp:toolchain_type: target platform @local_config_platform//:host: Rejected toolchain @llvm_toolchain//:cc-clang-linux; mismatching values: linux
INFO: ToolchainResolution:   Type @bazel_tools//tools/cpp:toolchain_type: target platform @local_config_platform//:host: execution @local_config_platform//:host: Selected toolchain @llvm_toolchain//:cc-clang-darwin
INFO: ToolchainResolution:     Type @bazel_tools//tools/cpp:toolchain_type: target platform @local_config_platform//:host: Rejected toolchain //:clang-darwin-arm64-toolchain; mismatching values: arm64
INFO: ToolchainResolution:     Type @bazel_tools//tools/cpp:toolchain_type: target platform @local_config_platform//:host: Rejected toolchain @local_config_cc//:cc-compiler-armeabi-v7a; mismatching values: arm, android
INFO: ToolchainResolution: Target platform @local_config_platform//:host: Selected execution platform @local_config_platform//:host, type @bazel_tools//tools/cpp:toolchain_type -> toolchain @llvm_toolchain//:cc-clang-darwin
INFO: ToolchainResolution: Target platform @local_config_platform//:host: Selected execution platform @local_config_platform//:host,
INFO: ToolchainResolution: Target platform @local_config_platform//:host: Selected execution platform @local_config_platform//:host,
INFO: ToolchainResolution: Target platform @local_config_platform//:host: Selected execution platform @local_config_platform//:host,
ERROR: /private/var/tmp/_bazel/e9cb4153e0861999826e8879b02ae2cc/external/local_config_cc/BUILD:48:19: in cc_toolchain_suite rule @local_config_cc//:toolchain: cc_toolchain_suite '@local_config_cc//:toolchain' does not contain a toolchain for cpu 'darwin'
ERROR: Analysis of target '//:test' failed; build aborted: Analysis of target '@local_config_cc//:toolchain' failed
INFO: Elapsed time: 0.924s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded, 60 targets configured)

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

Also in your original post you mentioned that you have an x86_64 -> x86_64 setup working; did you have to manually set your host platform or change anything toolchain related to get that to work?

@jez
Contributor Author

jez commented Sep 3, 2021

Also in your original post you mentioned that you have an x86_64 -> x86_64 setup working; did you have to manually set your host platform or change anything toolchain related to get that to work?

Yeah, overnight that seems to have stopped working. I can't explain that.

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

It's not clear to me why it's even analyzing @local_config_cc//:toolchain; can you post what bazel build //:test --toolchain_resolution_debug prints out?

~/stripe/sandbox/rrbutani-workspace
❯ bazel build //:test --toolchain_resolution_debug
INFO: Build options --platforms and --toolchain_resolution_debug have changed, discarding analysis cache.
INFO: ToolchainResolution:     Type @bazel_tools//tools/cpp:toolchain_type: target platform @local_config_platform//:host: Rejected toolchain @llvm_toolchain//:cc-clang-linux; mismatching values: linux
INFO: ToolchainResolution:   Type @bazel_tools//tools/cpp:toolchain_type: target platform @local_config_platform//:host: execution @local_config_platform//:host: Selected toolchain @llvm_toolchain//:cc-clang-darwin
INFO: ToolchainResolution:     Type @bazel_tools//tools/cpp:toolchain_type: target platform @local_config_platform//:host: Rejected toolchain //:clang-darwin-arm64-toolchain; mismatching values: arm64
INFO: ToolchainResolution:     Type @bazel_tools//tools/cpp:toolchain_type: target platform @local_config_platform//:host: Rejected toolchain @local_config_cc//:cc-compiler-armeabi-v7a; mismatching values: arm, android
INFO: ToolchainResolution: Target platform @local_config_platform//:host: Selected execution platform @local_config_platform//:host, type @bazel_tools//tools/cpp:toolchain_type -> toolchain @llvm_toolchain//:cc-clang-darwin
INFO: ToolchainResolution: Target platform @local_config_platform//:host: Selected execution platform @local_config_platform//:host,
INFO: ToolchainResolution: Target platform @local_config_platform//:host: Selected execution platform @local_config_platform//:host,
INFO: ToolchainResolution: Target platform @local_config_platform//:host: Selected execution platform @local_config_platform//:host,
ERROR: /private/var/tmp/_bazel/e9cb4153e0861999826e8879b02ae2cc/external/local_config_cc/BUILD:48:19: in cc_toolchain_suite rule @local_config_cc//:toolchain: cc_toolchain_suite '@local_config_cc//:toolchain' does not contain a toolchain for cpu 'darwin'
ERROR: Analysis of target '//:test' failed; build aborted: Analysis of target '@local_config_cc//:toolchain' failed
INFO: Elapsed time: 0.924s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded, 60 targets configured)

That's odd; toolchain resolution happens exactly as we'd expect (it picks the x86 clang toolchain for macOS) but it still pulls in @local_config_cc 😕.

Can you try running bazel cquery 'deps(//:test)' --output=graph --config=x86? It should at least tell us what's pulling it in.

@local_config_cc//:toolchain shouldn't be broken though :-/

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

(actually the contents of /private/var/tmp/_bazel/e9cb4153e0861999826e8879b02ae2cc/external/local_config_cc/BUILD would also be interesting to look at)

@jez
Contributor Author

jez commented Sep 3, 2021

local_config_cc/BUILD
# Copyright 2016 The Bazel Authors. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# This becomes the BUILD file for @local_config_cc// under non-BSD unixes.

package(default_visibility = ["//visibility:public"])

load(":cc_toolchain_config.bzl", "cc_toolchain_config")
load(":armeabi_cc_toolchain_config.bzl", "armeabi_cc_toolchain_config")
load("@rules_cc//cc:defs.bzl", "cc_toolchain", "cc_toolchain_suite")

licenses(["notice"])  # Apache 2.0

cc_library(
    name = "malloc",
)

filegroup(
    name = "empty",
    srcs = [],
)

filegroup(
    name = "cc_wrapper",
    srcs = ["cc_wrapper.sh"],
)

filegroup(
    name = "compiler_deps",
    srcs = glob(["extra_tools/**"], allow_empty = True) + [":builtin_include_directory_paths",
    ":cc_wrapper"],
)

# This is the entry point for --crosstool_top.  Toolchains are found
# by lopping off the name of --crosstool_top and searching for
# the "${CPU}" entry in the toolchains attribute.
cc_toolchain_suite(
    name = "toolchain",
    toolchains = {
        "darwin_arm64|clang": ":cc-compiler-darwin_arm64",
        "darwin_arm64": ":cc-compiler-darwin_arm64",
        "armeabi-v7a|compiler": ":cc-compiler-armeabi-v7a",
        "armeabi-v7a": ":cc-compiler-armeabi-v7a",
    },
)

cc_toolchain(
    name = "cc-compiler-darwin_arm64",
    toolchain_identifier = "local",
    toolchain_config = ":local",
    all_files = ":compiler_deps",
    ar_files = ":compiler_deps",
    as_files = ":compiler_deps",
    compiler_files = ":compiler_deps",
    dwp_files = ":empty",
    linker_files = ":compiler_deps",
    objcopy_files = ":empty",
    strip_files = ":empty",
    supports_param_files = 1,
    module_map = ":module.modulemap",
)

cc_toolchain_config(
    name = "local",
    cpu = "darwin_arm64",
    compiler = "clang",
    toolchain_identifier = "local",
    host_system_name = "local",
    target_system_name = "local",
    target_libc = "macosx",
    abi_version = "local",
    abi_libc_version = "local",
    cxx_builtin_include_directories = ["/Library/Developer/CommandLineTools/usr/lib/clang/12.0.5/include",
    "/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include",
    "/Library/Developer/CommandLineTools/usr/include",
    "/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/System/Library/Frameworks",
    "/Library/Developer/CommandLineTools/usr/lib/clang/12.0.5/share",
    "/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include/c++/v1"],
    tool_paths = {"ar": "/usr/bin/libtool",
        "ld": "/usr/bin/ld",
        "llvm-cov": "None",
        "cpp": "/usr/bin/cpp",
        "gcc": "cc_wrapper.sh",
        "dwp": "/usr/bin/dwp",
        "gcov": "/usr/bin/gcov",
        "nm": "/usr/bin/nm",
        "objcopy": "/usr/bin/objcopy",
        "objdump": "/usr/bin/objdump",
        "strip": "/usr/bin/strip"},
    compile_flags = ["-U_FORTIFY_SOURCE",
    "-fstack-protector",
    "-Wall",
    "-Wthread-safety",
    "-Wself-assign",
    "-fcolor-diagnostics",
    "-fno-omit-frame-pointer"],
    opt_compile_flags = ["-g0",
    "-O2",
    "-D_FORTIFY_SOURCE=1",
    "-DNDEBUG",
    "-ffunction-sections",
    "-fdata-sections"],
    dbg_compile_flags = ["-g"],
    cxx_flags = ["-std=c++0x"],
    link_flags = ["-undefined",
    "dynamic_lookup",
    "-headerpad_max_install_names"],
    link_libs = ["-lstdc++",
    "-lm"],
    opt_link_flags = [],
    unfiltered_compile_flags = ["-no-canonical-prefixes",
    "-Wno-builtin-macro-redefined",
    "-D__DATE__=\"redacted\"",
    "-D__TIMESTAMP__=\"redacted\"",
    "-D__TIME__=\"redacted\""],
    coverage_compile_flags = ["-fprofile-instr-generate",  "-fcoverage-mapping"],
    coverage_link_flags = ["-fprofile-instr-generate"],
    supports_start_end_lib = False,
)

# Android tooling requires a default toolchain for the armeabi-v7a cpu.
cc_toolchain(
    name = "cc-compiler-armeabi-v7a",
    toolchain_identifier = "stub_armeabi-v7a",
    toolchain_config = ":stub_armeabi-v7a",
    all_files = ":empty",
    ar_files = ":empty",
    as_files = ":empty",
    compiler_files = ":empty",
    dwp_files = ":empty",
    linker_files = ":empty",
    objcopy_files = ":empty",
    strip_files = ":empty",
    supports_param_files = 1,
)

armeabi_cc_toolchain_config(name = "stub_armeabi-v7a")

and the other one:

❯ bazel cquery 'deps(//:test)' --output=graph --config=x86
INFO: Build options --platforms and --toolchain_resolution_debug have changed, discarding analysis cache.
ERROR: /private/var/tmp/_bazel/e9cb4153e0861999826e8879b02ae2cc/external/local_config_cc/BUILD:48:19: in cc_toolchain_suite rule @local_config_cc//:toolchain: cc_toolchain_suite '@local_config_cc//:toolchain' does not contain a toolchain for cpu 'darwin'
ERROR: Analysis of target '//:test' failed; build aborted: Analysis of target '@local_config_cc//:toolchain' failed
INFO: Elapsed time: 0.791s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded, 61 targets configured)

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

❯ bazel cquery 'deps(//:test)' --output=graph --config=x86
INFO: Build options --platforms and --toolchain_resolution_debug have changed, discarding analysis cache.
ERROR: /private/var/tmp/_bazel/e9cb4153e0861999826e8879b02ae2cc/external/local_config_cc/BUILD:48:19: in cc_toolchain_suite rule @local_config_cc//:toolchain: cc_toolchain_suite '@local_config_cc//:toolchain' does not contain a toolchain for cpu 'darwin'
ERROR: Analysis of target '//:test' failed; build aborted: Analysis of target '@local_config_cc//:toolchain' failed
INFO: Elapsed time: 0.791s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded, 61 targets configured)

oh whoops; does it get any further if you add --keep_going?

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

wait ^ makes sense actually; there isn't an x86_64 darwin toolchain in @local_config_cc
for some reason I was under the impression that Bazel was running under Rosetta, in which case there would be, I think

does the error go away if you don't specify --config=x86?

@jez
Contributor Author

jez commented Sep 3, 2021

oh whoops; does it get any further if you add --keep_going?

does the error go away if you don't specify --config=x86?

I get the same output for all combinations of --keep_going and --config={x86,arm64}

❯ bazel cquery 'deps(//:test)' --output=graph --config=arm64 --keep_going
INFO: Build option --platforms has changed, discarding analysis cache.
ERROR: /private/var/tmp/_bazel/e9cb4153e0861999826e8879b02ae2cc/external/local_config_cc/BUILD:48:19: in cc_toolchain_suite rule @local_config_cc//:toolchain: cc_toolchain_suite '@local_config_cc//:toolchain' does not contain a toolchain for cpu 'darwin'
WARNING: errors encountered while analyzing target '//:test': it will not be built
INFO: Analyzed target //:test (1 packages loaded, 4140 targets configured).
INFO: Found 0 targets...
INFO: Empty query results
digraph mygraph {
  node [shape=box];
}
ERROR: command succeeded, but not all targets were analyzed
INFO: Elapsed time: 16.547s
INFO: 0 processes.
FAILED: Build did NOT complete successfully

is the output when --keep_going is present

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

😕

how about with --host_cpu=darwin_arm64 and/or --cpu=darwin_arm64?

@jez
Contributor Author

jez commented Sep 3, 2021

Looks like that works?

~/stripe/sandbox/rrbutani-workspace
❯ bazel build //:test --cpu=darwin_arm64 --config=arm64
INFO: Build option --cpu has changed, discarding analysis cache.
INFO: Analyzed target //:test (0 packages loaded, 4144 targets configured).
INFO: Found 1 target...
INFO: From Linking test:
ld64.lld: warning: ignoring unknown argument: -headerpad_max_install_names
ld64.lld: warning: -sdk_version is required when emitting min version load command.  Setting sdk version to match provided min version
Target //:test up-to-date:
  bazel-bin/test
INFO: Elapsed time: 15.586s, Critical Path: 11.68s
INFO: 5 processes: 3 internal, 2 darwin-sandbox.
INFO: Build completed successfully, 5 total actions

~/stripe/sandbox/rrbutani-workspace   15s
❯ file bazel-bin/test
bazel-bin/test: Mach-O 64-bit executable arm64

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

oh neat

that's super weird though; I thought ^ wouldn't be necessary anymore 😕

Does just --host_cpu=darwin_arm64 also work?

@jez
Contributor Author

jez commented Sep 3, 2021

Neither of those works

~/stripe/sandbox/rrbutani-workspace
❯ bazel build //:test --host_cpu=darwin_arm64
INFO: Build options --cpu, --host_cpu, and --platforms have changed, discarding analysis cache.
ERROR: /private/var/tmp/_bazel/e9cb4153e0861999826e8879b02ae2cc/external/local_config_cc/BUILD:48:19: in cc_toolchain_suite rule @local_config_cc//:toolchain: cc_toolchain_suite '@local_config_cc//:toolchain' does not contain a toolchain for cpu 'darwin'
ERROR: Analysis of target '//:test' failed; build aborted: Analysis of target '@local_config_cc//:toolchain' failed
INFO: Elapsed time: 1.699s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded, 60 targets configured)

~/stripe/sandbox/rrbutani-workspace
❯ bazel build //:test --host_cpu=darwin_arm64 --config=arm64
INFO: Build option --platforms has changed, discarding analysis cache.
ERROR: /private/var/tmp/_bazel/e9cb4153e0861999826e8879b02ae2cc/external/local_config_cc/BUILD:48:19: in cc_toolchain_suite rule @local_config_cc//:toolchain: cc_toolchain_suite '@local_config_cc//:toolchain' does not contain a toolchain for cpu 'darwin'
ERROR: Analysis of target '//:test' failed; build aborted: Analysis of target '@local_config_cc//:toolchain' failed
INFO: Elapsed time: 1.270s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded, 4133 targets configured)

@jez
Contributor Author

jez commented Sep 3, 2021

You might also be curious to see the cquery output with --cpu=darwin_arm64

❯ bazel cquery 'deps(//:test)' --output=graph --config=arm64 --keep_going --cpu=darwin_arm64
INFO: Build options --cpu and --host_cpu have changed, discarding analysis cache.
INFO: Analyzed target //:test (0 packages loaded, 4144 targets configured).
INFO: Found 1 target...
digraph mygraph {
  node [shape=box];
  "//:test (f7c5632)"
  "//:test (f7c5632)" -> "//:apple-silicon (f7c5632)"
  "//:test (f7c5632)" -> "//:clang-darwin-arm64-toolchain (HOST)"
  "//:test (f7c5632)" -> "//:test.cc (null)"
  "//:test (f7c5632)" -> "@bazel_tools//tools/cpp:grep-includes (HOST)"
  "//:test (f7c5632)" -> "@bazel_tools//tools/cpp:malloc (f7c5632)"
  "//:test (f7c5632)" -> "@bazel_tools//tools/cpp:toolchain (f7c5632)"
  "//:test (f7c5632)" -> "@bazel_tools//tools/cpp:toolchain_type (f7c5632)"
  "//:test (f7c5632)" -> "@bazel_tools//tools/def_parser:def_parser (HOST)"
  "//:test (f7c5632)" -> "@local_config_platform//:host (f7c5632)"
  "@bazel_tools//tools/def_parser:def_parser (HOST)"
  "@bazel_tools//tools/def_parser:def_parser (HOST)" -> "@bazel_tools//src/conditions:host_windows (HOST)"
  "@bazel_tools//tools/def_parser:def_parser (HOST)" -> "@bazel_tools//tools/def_parser:no_op.bat (null)"
  "@bazel_tools//tools/def_parser:def_parser (HOST)" -> "@local_config_platform//:host (HOST)"
  "@bazel_tools//tools/def_parser:no_op.bat (null)"
  "@bazel_tools//src/conditions:host_windows (HOST)"
  "@bazel_tools//src/conditions:host_windows (HOST)" -> "@local_config_platform//:host (HOST)"
  "@bazel_tools//tools/cpp:malloc (f7c5632)"
  "@bazel_tools//tools/cpp:malloc (f7c5632)" -> "//:apple-silicon (f7c5632)"
  "@bazel_tools//tools/cpp:malloc (f7c5632)" -> "//:clang-darwin-arm64-toolchain (HOST)"
  "@bazel_tools//tools/cpp:malloc (f7c5632)" -> "@bazel_tools//tools/cpp:grep-includes (HOST)"
  "@bazel_tools//tools/cpp:malloc (f7c5632)" -> "@bazel_tools//tools/cpp:toolchain (f7c5632)"
  "@bazel_tools//tools/cpp:malloc (f7c5632)" -> "@bazel_tools//tools/cpp:toolchain_type (f7c5632)"
  "@bazel_tools//tools/cpp:malloc (f7c5632)" -> "@local_config_platform//:host (f7c5632)"
  "@bazel_tools//tools/cpp:toolchain_type (f7c5632)"
  "@bazel_tools//tools/cpp:toolchain (f7c5632)"
  "@bazel_tools//tools/cpp:toolchain (f7c5632)" -> "@local_config_cc//:toolchain (f7c5632)"
  "@local_config_cc//:toolchain (f7c5632)"
  "@local_config_cc//:toolchain (f7c5632)" -> "//:apple-silicon (f7c5632)"
  "@local_config_cc//:toolchain (f7c5632)" -> "@local_config_cc//:cc-compiler-armeabi-v7a (f7c5632)"
  "@local_config_cc//:toolchain (f7c5632)" -> "@local_config_cc//:cc-compiler-darwin_arm64 (f7c5632)"
  "@local_config_cc//:toolchain (f7c5632)" -> "@local_config_platform//:host (f7c5632)"
  "@local_config_cc//:cc-compiler-darwin_arm64 (f7c5632)"
  "@local_config_cc//:cc-compiler-darwin_arm64 (f7c5632)" -> "//:apple-silicon (f7c5632)"
  "@local_config_cc//:cc-compiler-darwin_arm64 (f7c5632)" -> "@bazel_tools//tools/build_defs/cc/whitelists/parse_headers_and_layering_check:disabling_parse_headers_and_layering_check_allowed (null)\n@bazel_tools//tools/build_defs/cc/whitelists/starlark_hdrs_check:loose_header_check_allowed_in_toolchain (null)"
  "@local_config_cc//:cc-compiler-darwin_arm64 (f7c5632)" -> "@bazel_tools//tools/cpp:interface_library_builder (49db493)"
  "@local_config_cc//:cc-compiler-darwin_arm64 (f7c5632)" -> "@bazel_tools//tools/cpp:link_dynamic_library (49db493)"
  "@local_config_cc//:cc-compiler-darwin_arm64 (f7c5632)" -> "@local_config_cc//:compiler_deps (49db493)"
  "@local_config_cc//:cc-compiler-darwin_arm64 (f7c5632)" -> "@local_config_cc//:empty (49db493)"
  "@local_config_cc//:cc-compiler-darwin_arm64 (f7c5632)" -> "@local_config_cc//:local (f7c5632)"
  "@local_config_cc//:cc-compiler-darwin_arm64 (f7c5632)" -> "@local_config_cc//:module.modulemap (null)"
  "@local_config_cc//:cc-compiler-darwin_arm64 (f7c5632)" -> "@local_config_platform//:host (f7c5632)"
  "@local_config_cc//:module.modulemap (null)"
  "@local_config_cc//:local (f7c5632)"
  "@local_config_cc//:local (f7c5632)" -> "//:apple-silicon (f7c5632)"
  "@local_config_cc//:local (f7c5632)" -> "@local_config_platform//:host (f7c5632)"
  "@local_config_cc//:compiler_deps (49db493)"
  "@local_config_cc//:compiler_deps (49db493)" -> "@local_config_cc//:builtin_include_directory_paths (null)"
  "@local_config_cc//:compiler_deps (49db493)" -> "@local_config_cc//:cc_wrapper (49db493)"
  "@local_config_cc//:compiler_deps (49db493)" -> "@local_config_platform//:host (49db493)"
  "@local_config_cc//:cc_wrapper (49db493)"
  "@local_config_cc//:cc_wrapper (49db493)" -> "@local_config_cc//:cc_wrapper.sh (null)"
  "@local_config_cc//:cc_wrapper (49db493)" -> "@local_config_platform//:host (49db493)"
  "@local_config_cc//:cc_wrapper.sh (null)"
  "@local_config_cc//:builtin_include_directory_paths (null)"
  "@local_config_cc//:cc-compiler-armeabi-v7a (f7c5632)"
  "@local_config_cc//:cc-compiler-armeabi-v7a (f7c5632)" -> "//:apple-silicon (f7c5632)"
  "@local_config_cc//:cc-compiler-armeabi-v7a (f7c5632)" -> "@bazel_tools//tools/build_defs/cc/whitelists/parse_headers_and_layering_check:disabling_parse_headers_and_layering_check_allowed (null)\n@bazel_tools//tools/build_defs/cc/whitelists/starlark_hdrs_check:loose_header_check_allowed_in_toolchain (null)"
  "@local_config_cc//:cc-compiler-armeabi-v7a (f7c5632)" -> "@bazel_tools//tools/cpp:interface_library_builder (49db493)"
  "@local_config_cc//:cc-compiler-armeabi-v7a (f7c5632)" -> "@bazel_tools//tools/cpp:link_dynamic_library (49db493)"
  "@local_config_cc//:cc-compiler-armeabi-v7a (f7c5632)" -> "@local_config_cc//:empty (49db493)"
  "@local_config_cc//:cc-compiler-armeabi-v7a (f7c5632)" -> "@local_config_cc//:stub_armeabi-v7a (f7c5632)"
  "@local_config_cc//:cc-compiler-armeabi-v7a (f7c5632)" -> "@local_config_platform//:host (f7c5632)"
  "@local_config_cc//:stub_armeabi-v7a (f7c5632)"
  "@local_config_cc//:stub_armeabi-v7a (f7c5632)" -> "//:apple-silicon (f7c5632)"
  "@local_config_cc//:stub_armeabi-v7a (f7c5632)" -> "@local_config_platform//:host (f7c5632)"
  "@local_config_platform//:host (f7c5632)"
  "@local_config_platform//:host (f7c5632)" -> "@platforms//cpu:x86_64 (f7c5632)"
  "@local_config_platform//:host (f7c5632)" -> "@platforms//os:osx (f7c5632)"
  "@platforms//cpu:x86_64 (f7c5632)"
  "@platforms//cpu:x86_64 (f7c5632)" -> "@platforms//cpu:cpu (f7c5632)"
  "@local_config_cc//:empty (49db493)"
  "@local_config_cc//:empty (49db493)" -> "@local_config_platform//:host (49db493)"
  "@bazel_tools//tools/cpp:link_dynamic_library (49db493)"
  "@bazel_tools//tools/cpp:link_dynamic_library (49db493)" -> "@bazel_tools//tools/cpp:link_dynamic_library.sh (null)"
  "@bazel_tools//tools/cpp:link_dynamic_library (49db493)" -> "@local_config_platform//:host (49db493)"
  "@bazel_tools//tools/cpp:interface_library_builder (49db493)"
  "@bazel_tools//tools/cpp:interface_library_builder (49db493)" -> "@bazel_tools//tools/cpp:build_interface_so (null)"
  "@bazel_tools//tools/cpp:interface_library_builder (49db493)" -> "@local_config_platform//:host (49db493)"
  "@local_config_platform//:host (49db493)"
  "@local_config_platform//:host (49db493)" -> "@platforms//cpu:x86_64 (49db493)"
  "@local_config_platform//:host (49db493)" -> "@platforms//os:osx (49db493)"
  "@platforms//os:osx (49db493)"
  "@platforms//os:osx (49db493)" -> "@platforms//os:os (49db493)"
  "@platforms//os:os (49db493)"
  "@platforms//cpu:x86_64 (49db493)"
  "@platforms//cpu:x86_64 (49db493)" -> "@platforms//cpu:cpu (49db493)"
  "@platforms//cpu:cpu (49db493)"
  "@bazel_tools//tools/cpp:grep-includes (HOST)"
  "@bazel_tools//tools/cpp:grep-includes (HOST)" -> "@bazel_tools//tools/cpp:grep-includes.sh (null)"
  "@bazel_tools//tools/cpp:grep-includes (HOST)" -> "@local_config_platform//:host (HOST)"
  "@bazel_tools//tools/cpp:grep-includes.sh (null)"
  "//:test.cc (null)"
  "//:clang-darwin-arm64-toolchain (HOST)"
  "//:clang-darwin-arm64-toolchain (HOST)" -> "//:clang-darwin-arm64-config (HOST)\n@llvm_toolchain//:empty (HOST)"
  "//:clang-darwin-arm64-toolchain (HOST)" -> "//:clang-darwin-arm64-toolchain-all-files (HOST)"
  "//:clang-darwin-arm64-toolchain (HOST)" -> "//:clang-darwin-arm64-toolchain-archiver-files (HOST)"
  "//:clang-darwin-arm64-toolchain (HOST)" -> "//:clang-darwin-arm64-toolchain-assembler-files (HOST)"
  "//:clang-darwin-arm64-toolchain (HOST)" -> "//:clang-darwin-arm64-toolchain-compiler-files (HOST)"
  "//:clang-darwin-arm64-toolchain (HOST)" -> "//:clang-darwin-arm64-toolchain-linker-files (HOST)"
  "//:clang-darwin-arm64-toolchain (HOST)" -> "@bazel_tools//tools/build_defs/cc/whitelists/parse_headers_and_layering_check:disabling_parse_headers_and_layering_check_allowed (null)\n@bazel_tools//tools/build_defs/cc/whitelists/starlark_hdrs_check:loose_header_check_allowed_in_toolchain (null)"
  "//:clang-darwin-arm64-toolchain (HOST)" -> "@bazel_tools//tools/cpp:interface_library_builder (HOST)"
  "//:clang-darwin-arm64-toolchain (HOST)" -> "@bazel_tools//tools/cpp:link_dynamic_library (HOST)"
  "//:clang-darwin-arm64-toolchain (HOST)" -> "@llvm_toolchain//:objcopy (HOST)"
  "//:clang-darwin-arm64-toolchain (HOST)" -> "@local_config_platform//:host (HOST)"
  "@llvm_toolchain//:objcopy (HOST)"
  "@llvm_toolchain//:objcopy (HOST)" -> "@llvm_toolchain//:bin/llvm-objcopy (null)"
  "@llvm_toolchain//:objcopy (HOST)" -> "@local_config_platform//:host (HOST)"
  "@bazel_tools//tools/cpp:link_dynamic_library (HOST)"
  "@bazel_tools//tools/cpp:link_dynamic_library (HOST)" -> "@bazel_tools//tools/cpp:link_dynamic_library.sh (null)"
  "@bazel_tools//tools/cpp:link_dynamic_library (HOST)" -> "@local_config_platform//:host (HOST)"
  "@bazel_tools//tools/cpp:link_dynamic_library.sh (null)"
  "@bazel_tools//tools/cpp:interface_library_builder (HOST)"
  "@bazel_tools//tools/cpp:interface_library_builder (HOST)" -> "@bazel_tools//tools/cpp:build_interface_so (null)"
  "@bazel_tools//tools/cpp:interface_library_builder (HOST)" -> "@local_config_platform//:host (HOST)"
  "@bazel_tools//tools/cpp:build_interface_so (null)"
  "@bazel_tools//tools/build_defs/cc/whitelists/parse_headers_and_layering_check:disabling_parse_headers_and_layering_check_allowed (null)\n@bazel_tools//tools/build_defs/cc/whitelists/starlark_hdrs_check:loose_header_check_allowed_in_toolchain (null)"
  "//:clang-darwin-arm64-toolchain-linker-files (HOST)"
  "//:clang-darwin-arm64-toolchain-linker-files (HOST)" -> "//:clang-darwin-arm64-toolchain-linker_components (HOST)"
  "//:clang-darwin-arm64-toolchain-linker-files (HOST)" -> "@llvm_toolchain//:cc_wrapper (HOST)"
  "//:clang-darwin-arm64-toolchain-linker-files (HOST)" -> "@local_config_platform//:host (HOST)"
  "//:clang-darwin-arm64-toolchain-compiler-files (HOST)"
  "//:clang-darwin-arm64-toolchain-compiler-files (HOST)" -> "//:clang-darwin-arm64-toolchain-compiler_components (HOST)"
  "//:clang-darwin-arm64-toolchain-compiler-files (HOST)" -> "@llvm_toolchain//:cc_wrapper (HOST)"
  "//:clang-darwin-arm64-toolchain-compiler-files (HOST)" -> "@local_config_platform//:host (HOST)"
  "//:clang-darwin-arm64-toolchain-assembler-files (HOST)"
  "//:clang-darwin-arm64-toolchain-assembler-files (HOST)" -> "@llvm_toolchain//:as (HOST)"
  "//:clang-darwin-arm64-toolchain-assembler-files (HOST)" -> "@llvm_toolchain//:cc_wrapper (HOST)"
  "//:clang-darwin-arm64-toolchain-assembler-files (HOST)" -> "@local_config_platform//:host (HOST)"
  "@llvm_toolchain//:as (HOST)"
  "@llvm_toolchain//:as (HOST)" -> "@llvm_toolchain//:bin/clang (null)"
  "@llvm_toolchain//:as (HOST)" -> "@llvm_toolchain//:bin/llvm-as (null)"
  "@llvm_toolchain//:as (HOST)" -> "@local_config_platform//:host (HOST)"
  "//:clang-darwin-arm64-toolchain-archiver-files (HOST)"
  "//:clang-darwin-arm64-toolchain-archiver-files (HOST)" -> "@llvm_toolchain//:ar (HOST)"
  "//:clang-darwin-arm64-toolchain-archiver-files (HOST)" -> "@llvm_toolchain//:cc_wrapper (HOST)"
  "//:clang-darwin-arm64-toolchain-archiver-files (HOST)" -> "@local_config_platform//:host (HOST)"
  "//:clang-darwin-arm64-toolchain-all-files (HOST)"
  "//:clang-darwin-arm64-toolchain-all-files (HOST)" -> "//:clang-darwin-arm64-toolchain-all_components (HOST)"
  "//:clang-darwin-arm64-toolchain-all-files (HOST)" -> "@llvm_toolchain//:cc_wrapper (HOST)"
  "//:clang-darwin-arm64-toolchain-all-files (HOST)" -> "@local_config_platform//:host (HOST)"
  "@llvm_toolchain//:cc_wrapper (HOST)"
  "@llvm_toolchain//:cc_wrapper (HOST)" -> "@llvm_toolchain//:bin/cc_wrapper.sh (null)"
  "@llvm_toolchain//:cc_wrapper (HOST)" -> "@local_config_platform//:host (HOST)"
  "//:clang-darwin-arm64-toolchain-all_components (HOST)"
  "//:clang-darwin-arm64-toolchain-all_components (HOST)" -> "//:clang-darwin-arm64-toolchain-compiler_components (HOST)"
  "//:clang-darwin-arm64-toolchain-all_components (HOST)" -> "//:clang-darwin-arm64-toolchain-linker_components (HOST)"
  "//:clang-darwin-arm64-toolchain-all_components (HOST)" -> "@llvm_toolchain//:binutils_components (HOST)"
  "//:clang-darwin-arm64-toolchain-all_components (HOST)" -> "@local_config_platform//:host (HOST)"
  "@llvm_toolchain//:binutils_components (HOST)"
  "@llvm_toolchain//:binutils_components (HOST)" -> "@llvm_toolchain//:bin/ar (null)\n@llvm_toolchain//:bin/bugpoint (null)\n@llvm_toolchain//:bin/c-index-test (null)\n@llvm_toolchain//:bin/clang-12 (null)\n@llvm_toolchain//:bin/clang-apply-replacements (null)\n@llvm_toolchain//:bin/clang-change-namespace (null)\n@llvm_toolchain//:bin/clang-check (null)\n@llvm_toolchain//:bin/clang-cl (null)\n@llvm_toolchain//:bin/clang-doc (null)\n@llvm_toolchain//:bin/clang-extdef-mapping (null)\n@llvm_toolchain//:bin/clang-format (null)\n...and 96 more items"
  "@llvm_toolchain//:binutils_components (HOST)" -> "@llvm_toolchain//:bin/cc_wrapper.sh (null)"
  "@llvm_toolchain//:binutils_components (HOST)" -> "@llvm_toolchain//:bin/clang (null)"
  "@llvm_toolchain//:binutils_components (HOST)" -> "@llvm_toolchain//:bin/clang++ (null)\n@llvm_toolchain//:bin/clang-cpp (null)"
  "@llvm_toolchain//:binutils_components (HOST)" -> "@llvm_toolchain//:bin/ld (null)\n@llvm_toolchain//:bin/ld.gold (null)\n@llvm_toolchain//:bin/ld.lld (null)"
  "@llvm_toolchain//:binutils_components (HOST)" -> "@llvm_toolchain//:bin/llvm-ar (null)"
  "@llvm_toolchain//:binutils_components (HOST)" -> "@llvm_toolchain//:bin/llvm-as (null)"
  "@llvm_toolchain//:binutils_components (HOST)" -> "@llvm_toolchain//:bin/llvm-objcopy (null)"
  "@llvm_toolchain//:binutils_components (HOST)" -> "@local_config_platform//:host (HOST)"
  "@llvm_toolchain//:bin/llvm-objcopy (null)"
  "@llvm_toolchain//:bin/llvm-as (null)"
  "@llvm_toolchain//:bin/cc_wrapper.sh (null)"
  "@llvm_toolchain//:bin/ar (null)\n@llvm_toolchain//:bin/bugpoint (null)\n@llvm_toolchain//:bin/c-index-test (null)\n@llvm_toolchain//:bin/clang-12 (null)\n@llvm_toolchain//:bin/clang-apply-replacements (null)\n@llvm_toolchain//:bin/clang-change-namespace (null)\n@llvm_toolchain//:bin/clang-check (null)\n@llvm_toolchain//:bin/clang-cl (null)\n@llvm_toolchain//:bin/clang-doc (null)\n@llvm_toolchain//:bin/clang-extdef-mapping (null)\n@llvm_toolchain//:bin/clang-format (null)\n...and 96 more items"
  "//:clang-darwin-arm64-toolchain-linker_components (HOST)"
  "//:clang-darwin-arm64-toolchain-linker_components (HOST)" -> "@llvm_toolchain//:ar (HOST)"
  "//:clang-darwin-arm64-toolchain-linker_components (HOST)" -> "@llvm_toolchain//:clang (HOST)"
  "//:clang-darwin-arm64-toolchain-linker_components (HOST)" -> "@llvm_toolchain//:ld (HOST)"
  "//:clang-darwin-arm64-toolchain-linker_components (HOST)" -> "@llvm_toolchain//:lib (HOST)"
  "//:clang-darwin-arm64-toolchain-linker_components (HOST)" -> "@local_config_platform//:host (HOST)"
  "//:clang-darwin-arm64-toolchain-linker_components (HOST)" -> "@macos-11.3-sdk//:sysroot (HOST)"
  "@llvm_toolchain//:lib (HOST)"
  "@llvm_toolchain//:lib (HOST)" -> "@llvm_toolchain//:lib/clang/12.0.0/lib/darwin/libclang_rt.cc_kext.a (null)\n@llvm_toolchain//:lib/clang/12.0.0/lib/darwin/libclang_rt.cc_kext_ios.a (null)\n@llvm_toolchain//:lib/clang/12.0.0/lib/darwin/libclang_rt.fuzzer_interceptors_ios.a (null)\n@llvm_toolchain//:lib/clang/12.0.0/lib/darwin/libclang_rt.fuzzer_interceptors_iossim.a (null)\n@llvm_toolchain//:lib/clang/12.0.0/lib/darwin/libclang_rt.fuzzer_interceptors_osx.a (null)\n...and 171 more items"
  "@llvm_toolchain//:lib (HOST)" -> "@local_config_platform//:host (HOST)"
  "@llvm_toolchain//:lib/clang/12.0.0/lib/darwin/libclang_rt.cc_kext.a (null)\n@llvm_toolchain//:lib/clang/12.0.0/lib/darwin/libclang_rt.cc_kext_ios.a (null)\n@llvm_toolchain//:lib/clang/12.0.0/lib/darwin/libclang_rt.fuzzer_interceptors_ios.a (null)\n@llvm_toolchain//:lib/clang/12.0.0/lib/darwin/libclang_rt.fuzzer_interceptors_iossim.a (null)\n@llvm_toolchain//:lib/clang/12.0.0/lib/darwin/libclang_rt.fuzzer_interceptors_osx.a (null)\n...and 171 more items"
  "@llvm_toolchain//:ld (HOST)"
  "@llvm_toolchain//:ld (HOST)" -> "@llvm_toolchain//:bin/ld (null)\n@llvm_toolchain//:bin/ld.gold (null)\n@llvm_toolchain//:bin/ld.lld (null)"
  "@llvm_toolchain//:ld (HOST)" -> "@local_config_platform//:host (HOST)"
  "@llvm_toolchain//:bin/ld (null)\n@llvm_toolchain//:bin/ld.gold (null)\n@llvm_toolchain//:bin/ld.lld (null)"
  "@llvm_toolchain//:ar (HOST)"
  "@llvm_toolchain//:ar (HOST)" -> "@llvm_toolchain//:bin/llvm-ar (null)"
  "@llvm_toolchain//:ar (HOST)" -> "@local_config_platform//:host (HOST)"
  "@llvm_toolchain//:bin/llvm-ar (null)"
  "//:clang-darwin-arm64-toolchain-compiler_components (HOST)"
  "//:clang-darwin-arm64-toolchain-compiler_components (HOST)" -> "@llvm_toolchain//:clang (HOST)"
  "//:clang-darwin-arm64-toolchain-compiler_components (HOST)" -> "@llvm_toolchain//:include (HOST)"
  "//:clang-darwin-arm64-toolchain-compiler_components (HOST)" -> "@local_config_platform//:host (HOST)"
  "//:clang-darwin-arm64-toolchain-compiler_components (HOST)" -> "@macos-11.3-sdk//:sysroot (HOST)"
  "@macos-11.3-sdk//:sysroot (HOST)"
  "@macos-11.3-sdk//:sysroot (HOST)" -> "@local_config_platform//:host (HOST)"
  "@macos-11.3-sdk//:sysroot (HOST)" -> "@macos-11.3-sdk//:usr/bin/cups-config (null)\n@macos-11.3-sdk//:usr/bin/curl-config (null)\n@macos-11.3-sdk//:usr/bin/krb5-config (null)\n@macos-11.3-sdk//:usr/bin/ncurses5.4-config (null)\n@macos-11.3-sdk//:usr/bin/net-snmp-config (null)\n@macos-11.3-sdk//:usr/bin/pcap-config (null)\n@macos-11.3-sdk//:usr/bin/php-config (null)\n@macos-11.3-sdk//:usr/bin/xml2-config (null)\n@macos-11.3-sdk//:usr/bin/xslt-config (null)\n...and 3388 more items"
  "@macos-11.3-sdk//:usr/bin/cups-config (null)\n@macos-11.3-sdk//:usr/bin/curl-config (null)\n@macos-11.3-sdk//:usr/bin/krb5-config (null)\n@macos-11.3-sdk//:usr/bin/ncurses5.4-config (null)\n@macos-11.3-sdk//:usr/bin/net-snmp-config (null)\n@macos-11.3-sdk//:usr/bin/pcap-config (null)\n@macos-11.3-sdk//:usr/bin/php-config (null)\n@macos-11.3-sdk//:usr/bin/xml2-config (null)\n@macos-11.3-sdk//:usr/bin/xslt-config (null)\n...and 3388 more items"
  "@llvm_toolchain//:include (HOST)"
  "@llvm_toolchain//:include (HOST)" -> "@llvm_toolchain//:include/c++/v1/__availability (null)\n@llvm_toolchain//:include/c++/v1/__bit_reference (null)\n@llvm_toolchain//:include/c++/v1/__bits (null)\n@llvm_toolchain//:include/c++/v1/__bsd_locale_defaults.h (null)\n@llvm_toolchain//:include/c++/v1/__bsd_locale_fallbacks.h (null)\n@llvm_toolchain//:include/c++/v1/__config (null)\n@llvm_toolchain//:include/c++/v1/__cxxabi_config.h (null)\n@llvm_toolchain//:include/c++/v1/__debug (null)\n...and 363 more items"
  "@llvm_toolchain//:include (HOST)" -> "@local_config_platform//:host (HOST)"
  "@llvm_toolchain//:include/c++/v1/__availability (null)\n@llvm_toolchain//:include/c++/v1/__bit_reference (null)\n@llvm_toolchain//:include/c++/v1/__bits (null)\n@llvm_toolchain//:include/c++/v1/__bsd_locale_defaults.h (null)\n@llvm_toolchain//:include/c++/v1/__bsd_locale_fallbacks.h (null)\n@llvm_toolchain//:include/c++/v1/__config (null)\n@llvm_toolchain//:include/c++/v1/__cxxabi_config.h (null)\n@llvm_toolchain//:include/c++/v1/__debug (null)\n...and 363 more items"
  "@llvm_toolchain//:clang (HOST)"
  "@llvm_toolchain//:clang (HOST)" -> "@llvm_toolchain//:bin/clang (null)"
  "@llvm_toolchain//:clang (HOST)" -> "@llvm_toolchain//:bin/clang++ (null)\n@llvm_toolchain//:bin/clang-cpp (null)"
  "@llvm_toolchain//:clang (HOST)" -> "@local_config_platform//:host (HOST)"
  "@llvm_toolchain//:bin/clang++ (null)\n@llvm_toolchain//:bin/clang-cpp (null)"
  "@llvm_toolchain//:bin/clang (null)"
  "//:clang-darwin-arm64-config (HOST)\n@llvm_toolchain//:empty (HOST)"
  "//:clang-darwin-arm64-config (HOST)\n@llvm_toolchain//:empty (HOST)" -> "@local_config_platform//:host (HOST)"
  "@local_config_platform//:host (HOST)"
  "@local_config_platform//:host (HOST)" -> "@platforms//cpu:x86_64 (HOST)"
  "@local_config_platform//:host (HOST)" -> "@platforms//os:osx (HOST)"
  "@platforms//os:osx (HOST)"
  "@platforms//os:osx (HOST)" -> "@platforms//os:os (HOST)"
  "@platforms//os:os (HOST)"
  "@platforms//cpu:x86_64 (HOST)"
  "@platforms//cpu:x86_64 (HOST)" -> "@platforms//cpu:cpu (HOST)"
  "@platforms//cpu:cpu (HOST)"
  "//:apple-silicon (f7c5632)"
  "//:apple-silicon (f7c5632)" -> "@platforms//cpu:arm64 (f7c5632)"
  "//:apple-silicon (f7c5632)" -> "@platforms//os:osx (f7c5632)"
  "@platforms//os:osx (f7c5632)"
  "@platforms//os:osx (f7c5632)" -> "@platforms//os:os (f7c5632)"
  "@platforms//os:os (f7c5632)"
  "@platforms//cpu:arm64 (f7c5632)"
  "@platforms//cpu:arm64 (f7c5632)" -> "@platforms//cpu:cpu (f7c5632)"
  "@platforms//cpu:cpu (f7c5632)"
}
INFO: Elapsed time: 1.749s
INFO: 0 processes.
INFO: Build completed successfully, 0 total actions

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

thanks, I was just about to ask 😛

@rrbutani
Collaborator

rrbutani commented Sep 3, 2021

that's super weird; maybe something leftover from the _toolchain/CROSSTOOL era makes cc_binary targets have an implicit dep on @local_config_cc//:toolchain or something

regardless – I'm glad it works! even though it really does seem like you shouldn't have to pass in --cpu=darwin_arm64

Just to be sure: does actually running the binary that's produced work?

@vinistock

vinistock commented Jan 14, 2022

I've been working on compiling the Sorbet project using the provided ARM64 toolchains. It seems that the playground workspace changes work as intended, but there is one thing missing before I can fully verify that the build completes and works.

The only change I had to make to the playground code was setting custom_target_triple = "arm64-apple-macosx12.1.0" in the cc_toolchain_config. If I don't do that, the target triple ends up pointing at iOS for some reason (which doesn't happen in the playground, but does happen in Sorbet).

Also, despite being on the most recent macOS, I still have to use -mlinker-version=450 or else linking fails because the platform_version flag is not recognized.

The last remaining error happens at link time: unable to find framework Foundation. This framework is used by one of Sorbet's dependencies (abseil). I tried a few things to fix this, but haven't had any luck so far. Here is what I tried:

  • Adding the framework paths using -F/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/System/Library/Frameworks does not work (the search paths are correct, but linking still can't find it)
  • Same thing for -F/System/Library/Frameworks
  • Tried compiling with the -framework Foundation flag using LLVM 12 and 13 standalone, outside of bazel-toolchain. My understanding based on the release notes for LLD 13 was that support for ARM64 was added in that version. However, I was able to compile with the framework flag using standalone LLVM with both versions 12 and 13. I can't test LLVM 13 with the playground changes though, because the version of bazel-toolchain the playground is based on doesn't support LLVM 13

Would it be possible to rebase that playground version onto the current LLVM 13 support that is already in main? Do you believe that is the likely cause of the frameworks not working, or could it be something else that I am missing?

I did try using the current version in main to upgrade to LLVM 13, but doing so fails with this error, so maybe we need something else too.

Error in fail: Unknown LLVM release: clang+llvm-13.0.0-arm64-apple-darwin.tar.xz

One more thing worth noting: even when using the playground, we must sign the resulting binary with codesign or else M1 machines will not execute it. I noticed I don't have to do that when using standalone LLVM, so I wonder whether this is within the scope of bazel-toolchain or not (or whether LLVM 13 fixes it).

@sluongng

@vinistock which playground workspace are you referring to? Do you have a fork/branch that you could share?

@rrbutani
Collaborator

@vinistock which playground workspace are you referring to? Do you have a fork/branch that you could share?

I think they're referring to the workspace attached in this comment.

@rrbutani
Collaborator

The only change I had to make to the playground code was setting custom_target_triple = "arm64-apple-macosx12.1.0" in the cc_toolchain_config. If I don't do that, we end up having the triple as iOS for some reason (which doesn't happen in the playground, but does happen in Sorbet).

Hmm. Are there maybe other toolchains registered by the Sorbet workspace? Can you verify that the aarch64-apple-darwin toolchain actually gets used (--toolchain_resolution_debug)? The constraints are generated from the target triple, so it's definitely possible something shifted or is broken and arm64-apple-* generates constraints that cause that toolchain to be selected instead of aarch64-apple-darwin (iirc arm64 and aarch64 map to the same CPU constraint, though). I'll have to look into it more later.

Also, despite being on the most recent macos, I still have to use -mlinker-version=450 or else linking fails because the flag platform_version is not recognized.

This is odd; lld has definitely had support for -platform_version for several releases. 😕

The last remaining error is when linking we get the error unable to find framework Foundation. This framework is used by one of Sorbet's dependencies (abseil). I tried a few things to fix this, but have been unable so far. Here is what I tried:

  • Adding the framework paths using -F/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/System/Library/Frameworks does not work (the search paths are correct, but linking still can't find it)

  • Same thing for -F/System/Library/Frameworks

  • Tried compiling with the -framework Foundation flag using LLVM 12 and 13 standalone, outside of bazel-toolchain. My understanding based on the release notes for LLD 13 (https://releases.llvm.org/13.0.0/tools/lld/docs/ReleaseNotes.html#id9) was that support for ARM64 was added in that version. However, I was able to compile with the framework flag using standalone LLVM with both versions 12 and 13. I can't test LLVM 13 with the playground changes though, because the version of bazel-toolchain the playground is based on doesn't support LLVM 13

Not sure what's going on here; bazel-toolchain should add that path to the search directories anyways (through the sysroot which is set to be /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk in the playground workspace).

Maybe this has finally become an issue? It'd be helpful to compare against Sorbet on regular x86_64 macOS with the same toolchain (if that builds successfully).

Would it be possible to rebase that playground version with the current LLVM 13 support that is already in main?

The playground is based on #85; main has grown some significant changes in the meantime that make rebasing that PR not entirely trivial.

Do you believe that is likely the cause of frameworks not working or could it be something else that I am missing?

I'm not really sure what's going on but I don't think it's an LLVM version issue.

I did try to use the current version in main to try to upgrade to LLVM 13, but doing so fails with this error, so maybe we need something else too.

Error in fail: Unknown LLVM release: clang+llvm-13.0.0-arm64-apple-darwin.tar.xz

The version of this repo currently in main is trying to grab arm64 binaries for LLVM on macOS for which there aren't official LLVM releases. The playground workspace just uses the x86_64 macOS LLVM binaries through Rosetta IIRC. (main also doesn't have the custom toolchain stuff #85 has).

Note: another thing worth noting is that, even when using the playground, we must sign the resulting binary with codesign or else M1 machines will not execute it. I noticed I don't have to do that when using LLVM standalone, so I wonder if this is within the scope of bazel-toolchain or not (or if LLVM 13 fixes it).

My understanding is that lld does sign the binaries it produces but that this only landed in LLVM 13.

@jez do you remember if you needed to explicitly sign the binaries you got out of Bazel?

Regardless, I think this is very much in scope for bazel-toolchain.


@vinistock Thanks for the notes! I now have access to an Apple Silicon machine; I'm hoping to get back to working on this issue early next week.

If possible, can you post your modified workspace/the commands you are running?

@vinistock

@rrbutani thank you so much for the detailed and quick response. I really appreciate your assistance.

Debugging the toolchains

I'm not exactly sure how I should be reading this output. Please let me know if you need more information. But running the build with the --toolchain_resolution_debug flag prints these statements related to the ARM64 toolchain (plus other information related to other things).

It seems to reject the toolchain, but then selects it afterwards. Not sure if there's an issue in toolchain resolution here.

INFO: ToolchainResolution:   Type @bazel_tools//tools/cpp:toolchain_type: target platform //:apple-silicon: execution @local_config_platform//:host: Selected toolchain //:clang-darwin-arm64-toolchain

...

INFO: ToolchainResolution: Target platform //:apple-silicon: Selected execution platform @local_config_platform//:host, type @bazel_tools//tools/cpp:toolchain_type -> toolchain //:clang-darwin-arm64-toolchain

...

INFO: ToolchainResolution: Target platform @local_config_platform//:host: Selected execution platform @local_config_platform//:host, type @bazel_tools//tools/cpp:toolchain_type -> toolchain @llvm_toolchain_12_0_0//:cc-clang-darwin

Comparing with x86

With the same style of configuration as the playground, if I use --platforms=@//:apple-x86 I can compile Sorbet and do not get the error related to the Foundation framework. So, for some reason, this does seem to be an ARM64-related issue.

Playground example

The playground example I meant is rrbutani-workspace. I actually forgot to upload it earlier, but I modified that workspace to reproduce the framework issue. It's the same workspace, but I changed the triple and added a linkopt for -framework Foundation, which is exactly what abseil does. It will probably be easier to debug this in the playground, since it has a much simpler configuration.
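
The relevant target is essentially this (the source file name is just illustrative; the real workspace is attached below):

# Illustrative only; the actual target lives in the attached playground zip.
cc_binary(
    name = "test",
    srcs = ["main.cc"],
    linkopts = [
        "-framework",
        "Foundation",
    ],
)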

In this playground repo, if the link option for framework is present I cannot compile with the error unable to find framework Foundation (despite the search paths being correct).

If I remove the -framework Foundation flag, I can compile, but cannot run the executable without force signing it with codesign.

Notice that in this playground repo, I can also compile the x86 version successfully even with the framework flag (exactly as in Sorbet).

The command I'm using to compile is

# Successfully compiles even with the -framework Foundation linkopt
bazel build //:test --config=x86

# Fails with `unable to find framework Foundation`
bazel build //:test --config=arm64

plaground-framework.zip

@vinistock

Since we determined that we'll need LLVM 13 for signing binaries, I have begun looking into what we'll need to upgrade Sorbet to LLVM 13, using the latest version of bazel-toolchain.

The upgrade is relatively smooth, but I bumped into another issue that I'm having trouble figuring out. Please, let me know if this is not within the scope of bazel-toolchain.

What the upgrade consisted of

Basically, there were two steps

  • upgrading the LLVM version to 13.0.0
  • replacing the crosstool_top option with the incompatible_enable_cc_toolchain_resolution flag, as instructed in the README

The bug

After the upgrade, I was able to determine that this invocation to find_cpp_toolchain started returning the wrong toolchain.

In version 12, the toolchain object we get has all the LLVM paths in it (includes, cc_wrapper and so on), all pointing to (bazel info output_base)/external/llvm_toolchain/....

With version 13, the latest bazel-toolchain and the incompatible_enable_cc_toolchain_resolution flag, the same call returns a different toolchain, where the compiler path is different and points to external/local_config_cc/wrapped_clang. The object includes none of the LLVM paths.

In addition to returning the wrong toolchain, I also noticed that all paths are now relative, whereas when using version 12 they are all absolute.

Attempts to fix it

In the Bazel issue linked in the README describing the migration for incompatible_enable_cc_toolchain_resolution, they mention that find_cpp_toolchain has been deprecated and that we should instead use find_cc_toolchain.

I tried doing the migration, making sure that the rules depended on the right toolchains and that we were now using the new rules_cc dependency. Unfortunately, this did not fix the problem and the new find_cc_toolchain still returns the same incorrect response.

I believe this might be related to the incompatible_enable_cc_toolchain_resolution flag and not LLVM 13 itself.

Any ideas on why the LLVM toolchain is not being returned by find_cc_toolchain? Also, please let me know if I can provide more information to be more helpful in the investigations.

@rrbutani
Collaborator

rrbutani commented Jan 21, 2022

Please, let me know if this is not within the scope of bazel-toolchain.

This is a little out of scope for this project but that's fine.

After the upgrade, I was able to determine that this invocation to find_cpp_toolchain started returning the wrong toolchain.

In version 12, the toolchain object we get has all the LLVM paths in it (includes, cc_wrapper and so on), all pointing to (bazel info output_base)/external/llvm_toolchain/....

With version 13, the latest bazel-toolchain and the incompatible_enable_cc_toolchain_resolution flag, the same call returns a different toolchain, where the compiler path is different and points to external/local_config_cc/wrapped_clang. The object includes none of the LLVM paths.

In addition to returning the wrong toolchain, I also noticed that all paths are now relative, whereas when using version 12 they are all absolute.

It sounds like toolchain resolution is giving you back the toolchain installed on your machine instead of the one from bazel-toolchain.

Did you also remember to add a call to llvm_register_toolchains() in your WORKSPACE (or add --extra_toolchains=... to your .bazelrc)? These don't seem to be in the upstream version of sorbet.
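
For reference, the WORKSPACE wiring from the README looks roughly like this (a sketch; adjust the repository name and pinned version to whatever your workspace actually uses):

# Sketch of the README setup; the repo name and version are placeholders.
load("@com_grail_bazel_toolchain//toolchain:deps.bzl", "bazel_toolchain_dependencies")

bazel_toolchain_dependencies()

load("@com_grail_bazel_toolchain//toolchain:rules.bzl", "llvm_toolchain")

llvm_toolchain(
    name = "llvm_toolchain",
    llvm_version = "13.0.0",
)

load("@llvm_toolchain//:toolchains.bzl", "llvm_register_toolchains")

# Registers the generated toolchains so toolchain resolution can pick them up.
llvm_register_toolchains()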

In case you haven't already come across it, these docs do a good job explaining toolchains, toolchain resolution, and how rulesets should use toolchains. For C/C++ toolchains the actual toolchain lookup (ctx.toolchains) is handled for you by find_cpp_toolchain in a way that's compatible with both workspaces that are and are not using toolchain resolution.

Making the changes described in ^ (i.e. using ctx.toolchains instead of the legacy hidden attribute) isn't required to use toolchain resolution for C/C++ but if you do end up making those changes you may also want to add incompatible_use_toolchain_transition = True to your rule definition.
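
If you do go that route, the rule definition ends up looking something like the sketch below (the rule name and implementation here are placeholders, not anything from the Sorbet workspace):

# Placeholder rule; only the toolchain plumbing is the point here.
load("@rules_cc//cc:find_cc_toolchain.bzl", "find_cc_toolchain")

def _my_cc_consuming_rule_impl(ctx):
    # Returns CcToolchainInfo, via ctx.toolchains when C/C++ toolchain
    # resolution is enabled and via the hidden attribute otherwise.
    cc_toolchain = find_cc_toolchain(ctx)
    # ... use cc_toolchain to drive compile/link actions ...
    return []

my_cc_consuming_rule = rule(
    implementation = _my_cc_consuming_rule_impl,
    attrs = {
        # Kept for workspaces that don't enable C/C++ toolchain resolution.
        "_cc_toolchain": attr.label(
            default = Label("@bazel_tools//tools/cpp:current_cc_toolchain"),
        ),
    },
    toolchains = ["@bazel_tools//tools/cpp:toolchain_type"],
    incompatible_use_toolchain_transition = True,
    fragments = ["cpp"],
)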

Building a target that uses that rule with --toolchain_resolution_debug can also be a good way to figure out exactly why Bazel is picking another toolchain to give to that rule.

In the Bazel issue linked in the README describing the migration for incompatible_enable_cc_toolchain_resolution, they mention that find_cpp_toolchain has been deprecated and that we should instead use find_cc_toolchain.

I tried doing the migration, making sure that the rules depended on the right toolchains and that we were now using the new rules_cc dependency. Unfortunately, this did not fix the problem and the new find_cc_toolchain still returns the same incorrect response.

I'm fairly confident this is unrelated; find_cpp_toolchain in @rules_cc is "deprecated" but just calls find_cc_toolchain anyways and @rules_cc's find_cc_toolchain is essentially a verbatim copy of @bazel_tools's. Both have the logic to use ctx.toolchains when C/C++ toolchain resolution is enabled.

There used to be a bug in the @rules_cc impl caused by a bug in how Bazel handles aliases in string form labels when used with ctx.toolchains but it's since been "fixed" with this workaround; just make sure you're using a version of @rules_cc newer than that commit if you're planning to keep your @rules_cc changes.

@vinistock

@rrbutani once again, thank you for the quick and detailed response. I had indeed forgotten to invoke llvm_register_toolchains in the branch where I'm working on the LLVM upgrade. I apologize for the confusion, I'm working on multiple branches and got lost. Invoking llvm_register_toolchains fixes the issues with finding the right toolchains.

I'm now hitting one last error before successfully compiling the custom Ruby build. One of Ruby's arguments during compilation is -install_name @executable_path/../lib/libruby.2.7.dylib. The error is

.../llvm_toolchain_13_0_0/bin/cc_wrapper.sh: line 54: executable_path/../lib/libruby.2.7.dylib: No such file or directory

The reason this happens is because the cc_wrapper tries to read the paths for arguments beginning with @ here. However, Ruby adds the install_name flag when compiling libruby.2.7.dylib itself, which means the file indeed doesn't exist at that step. The command has this form

.../cc_wrapper ... -install_name @executable_path/../lib/libruby.2.7.dylib -o libruby.2.7.dylib

Notice that the file not existing only fails because cc_wrapper tries to read it. Invoking clang directly from the llvm_toolchain_llvm/bin folder works. Commenting out the part of cc_wrapper that attempts to read paths starting with @ also makes the build succeed. The step that tries to read the paths was added in #97.

Do you have any context as to why that is necessary or if we can workaround it somehow?

@rrbutani
Collaborator

rrbutani commented Jan 26, 2022

I apologize for the confusion, I'm working on multiple branches and got lost. Invoking llvm_register_toolchains fixes the issues with finding the right toolchains.

No worries! Glad to hear it was a simple fix.

I'm now hitting one last error before successfully compiling the custom Ruby build. One of Ruby's arguments during compilation is -install_name @executable_path/../lib/libruby.2.7.dylib. The error is

.../llvm_toolchain_13_0_0/bin/cc_wrapper.sh: line 54: executable_path/../lib/libruby.2.7.dylib: No such file or directory

The reason this happens is because the cc_wrapper tries to read the paths for arguments beginning with @ here. However, Ruby adds the install_name flag when compiling libruby.2.7.dylib itself, which means the file indeed doesn't exist at that step. The command has this form

.../cc_wrapper ... -install_name @executable_path/../lib/libruby.2.7.dylib -o libruby.2.7.dylib

Notice that the file not existing only fails because cc_wrapper tries to read it. Invoking clang directly from the llvm_toolchain_llvm/bin folder works. Commenting out the part of cc_wrapper that attempts to read paths starting with @ also makes the build succeed. The step that tries to read the paths was added in #97.

Do you have any context as to why that is necessary or if we can workaround it somehow?

This is a good catch!

That snippet was added to support parameter files.

The macOS cc wrapper script inspects the full command line in order to remap libraries that are being linked against to their fully resolved paths, taking into account the rpaths added to the binary. I don't have first-hand experience with this but this is allegedly because of some oddness having to do with runpaths added to binaries (this has some context; I think it's that the paths are relative to the build dir and not the workspace that's causing the issue but I have no idea what's introducing the -Wl,-rpaths in the first place).

Anyways, for that reason we need to actually read what's in the parameter file. The PR in this repo you linked to was essentially copied from upstream (this commit); in general the logic in the macOS wrapper mostly comes from upstream.

The issue here, of course, is that the @ in -install_name @executable_path/... does not signify a parameter file!

What's peculiar to me is that upstream Bazel seems to fail on this in the exact same way (here's a minimal test case). Perhaps it's simply not common for users to want to generate dylibs with an install_name from Bazel and it hasn't come up? Not sure.


I think extending the logic in the macOS wrapper to skip processing args starting with @ (like @executable_path/..., @loader_path/.., @rpath/..., etc.) when the preceding arg is -install_name or -rpath would fix the error you're running into.

I have a few concerns though:

  • is this genuinely an error with the default Bazel toolchain? does the workspace you're building work without bazel-toolchain or with any other toolchains?
  • are -install_name and -rpath the only args that accept @ form args?
  • what kind of path remapping should we be doing for -install_name and -rpath?
    • I think the answer is "none" for -install_name but what about -rpath? Don't we have the same issues as with -Wl,-rpath? -rpath certainly does seem to just expand out into -rpath to the linker, experimentally. Are we just banking on users using the not-macOS specific -Wl form?
  • why does the logic in the wrapper only use @loader_path?

@vinistock If possible, it'd be super helpful if you could point me to where in your workspace that flag is getting added.

@rrbutani
Collaborator

rrbutani commented Jan 26, 2022

Actually, I'm going to open a new issue for this.

#135

@vinistock

I did some more investigation on this and thought I'd bring up another thing I'm seeing. I'm trying to change a dependency of a package by matching on a config_setting that uses platform constraint values. Basically, we need a different dependency on darwin-arm64.

The code I have on a Sorbet branch is equivalent to this:

# BUILD file

platform(
    name = "darwin_arm",
    constraint_values = [
        "@platforms//cpu:arm64",
        "@platforms//os:osx",
    ]
)

platform(
    name = "darwin_x86",
    constraint_values = [
        "@platforms//cpu:x86_64",
        "@platforms//os:osx",
    ],
)

config_setting(
    name = "darwin",
    constraint_values = [
        "@platforms//cpu:x86_64",
        "@platforms//os:osx",
    ]
)

config_setting(
    name = "darwin_arm64",
    constraint_values = [
        "@platforms//cpu:arm64",
        "@platforms//os:osx",
    ]
)

# Library BUILD file
cc_library(
  ...
  deps = select({
    "//:darwin": ["x86_specific_dependency"],
    "//:darwin_arm64": ["portable_version_dependency"],
    "//conditions:default": ["portable_version_dependency"]
  })
)

However, we always match //:darwin even when selecting the platform to be --platforms=@//:darwin_arm. If I remove the entry for darwin from the select, it falls under default and compiles successfully. Could this be related to the new toolchains?

This is the Sorbet branch I'm currently playing with if that helps.

@sluongng

sluongng commented Feb 9, 2022

@vinistock make sure that you are using an arm64 Bazel release and not an amd64 Bazel running under Rosetta. I think there is also a flag that helps debug toolchain resolution that can be passed to bazel.

@vinistock

@sluongng you are right! I was indeed running an x86 version of bazel. Upgrading to 5.0.0 with an ARM64 executable seems to fix not only the issue with conditionally compiling the dependency, but also the -framework flag problem!!

I will spend some time validating it, but it looks like that was the issue all along.

@vinistock

I can confirm that using the ARM64 version of Bazel fixes the -framework issue and now the config_settings match properly. In the branch I'm working on, it's already possible to compile Sorbet in debug and release mode, as well as running some tests.

I am, however, having the same issue I was having before, where the custom Ruby build selects the wrong toolchain when using find_cpp_toolchain. This time, I'm sure I am invoking llvm_register_toolchains.

While debugging with the --toolchain_resolution_debug flag, I noticed that the arm64 toolchain is rejected.

INFO: ToolchainResolution:     Type @bazel_tools//tools/cpp:toolchain_type: execution platform @local_config_platform//:host: Rejected toolchain //:clang-darwin-arm64-toolchain; mismatching values: x86_64

I was also able to confirm that changing the exec_compatible_with configuration in the toolchain from x86 to arm64 fixes the resolution and selects the right toolchain for the custom Ruby build. However, that breaks the regular Sorbet build.

toolchain(
    name = "clang-darwin-arm64",
    exec_compatible_with = [
        "@platforms//cpu:x86_64", # If this value is x86_64, the toolchain is rejected. If we use arm64, the toolchain is selected
        "@platforms//os:osx",
    ],
    target_compatible_with = [
        "@platforms//cpu:arm64",
        "@platforms//os:osx",
    ],
    toolchain = ":clang-darwin-arm64-toolchain",
    toolchain_type = "@bazel_tools//tools/cpp:toolchain_type",
)

I'm not sure why, when using exec_compatible_with x86, the right toolchain is selected for Sorbet, but not for the Ruby build. Any ideas on what could be happening?

@Kernald
Contributor

Kernald commented Aug 16, 2022

Hi! Just wondering if any progress has been made on that since February?

@totnine

totnine commented Sep 13, 2022

LLVM 15.0.0 was released, which contains 'clang+llvm-15.0.0-arm64-apple-darwin21.0.tar.xz'.

@omerbenamram

omerbenamram commented Sep 20, 2022

I checked it out and it works really nice! (https://github.com/omerbenamram/bazel-toolchain/tree/llvm-15.0.0)
Unfortunately there are still no binaries for x86 linux (which is quite strange) for llvm 15 - so I guess we can't upstream this yet.

binji added a commit to figma/bazel-toolchain that referenced this issue Sep 28, 2022
We've been using the bazel-toolchain library so we can use our custom LLVM
toolchain. This was working perfectly on CI because we compile on Linux there.

On our local M1 machines this was falling back on our xcode compiler instead of
the custom toolchain because bazel-toolchain doesn't have support for the
darwin-arm64 configuration. There is an existing GitHub issue about this
(bazel-contrib#88) with a lot of activity
but nothing landed. There also seemed to be some outstanding PRs, but they are
complete rewrites; and it's not clear when they would land. My guess is that we
may be able to drop our patches at some point, but we shouldn't wait for it.

The main changes here are as follows:

* We add support for a darwin-arm64 toolchain to bazel-toolchain.

* This requires some additional configuration to handle the various ridiculous
goop that macOS builds need; bazel already has toolchain support for this, so
most of it is copied from the bazelbuild repo. It would be great to upstream
this, but that might be a significant effort.

* It appears that bazel-toolchain handled everything using the
`unix_cc_toolchain_config`, which only had limited support for Objective C.
Since we need to be able to fully compile Objective C applications, we need to
include `osx_cc_toolchain_config` as well.

* The `osx_cc_toolchain_config` uses a clang wrapper called `wrapped_clang` and
`wrapped_clang_pp`. This is written as a C++ source file for some reason, so it
needs to be compiled as part of the toolchain build. This wrapper forwards to
xcode clang/clang++ by default, but we want to forward to our own toolchain
instead. So we have to modify the wrapper slightly to look at some environment
variables to determine where the real toolchain is.

* The clang compiler includes some directories by default, specifically ones
that are included with the compiler. This works if you run the toolchain
manually, but bazel doesn't like this because it wants to make sure that it
knows about all included headers. Normally you should be able to use the
`cxx_builtin_include_directories` attribute to notify bazel about these
directories, but it doesn't seem to work. It seems that the problem may be that
the included paths are absolute instead of relative, so they don't appear to be
the same path to bazel. Using clang's `-no-canonical-prefixes` flag is meant to
fix this, but it didn't seem to work for me. As an alternate workaround, I made
sure to include all the built-in directories again via `-isystem` so that the
paths match exactly.
@iamricard

iamricard commented Oct 18, 2022

I checked it out and it works really nice! (https://github.com/omerbenamram/bazel-toolchain/tree/llvm-15.0.0) Unfortunately there are still no binaries for x86 linux (which is quite strange) for llvm 15 - so I guess we can't upstream this yet.

15.0.2 does have a couple of x86_64 Linux builds. I tested the RHEL build on an Ubuntu machine and it worked fine, so maybe we could just use that? More artifacts are listed in the Discourse thread for the release.

On top of version 0.7.2 of this repo, a patch like this seems to work (props to @omerbenamram for basically putting this diff together):

diff --git a/toolchain/cc_toolchain_config.bzl b/toolchain/cc_toolchain_config.bzl
index 8785a8e..1ac65a5 100644
--- a/toolchain/cc_toolchain_config.bzl
+++ b/toolchain/cc_toolchain_config.bzl
@@ -68,6 +68,15 @@ def cc_toolchain_config(
             "clang",
             "darwin_x86_64",
             "darwin_x86_64",
+        ),
+         "darwin-aarch64": (
+            "clang-aarch64-darwin",
+            "aarch64-apple-macosx",
+            "darwin",
+            "macosx",
+            "clang",
+            "darwin_aarch64",
+            "darwin_aarch64",
         ),
         "linux-aarch64": (
             "clang-aarch64-linux",
diff --git a/toolchain/internal/common.bzl b/toolchain/internal/common.bzl
index 7493c64..53b3b53 100644
--- a/toolchain/internal/common.bzl
+++ b/toolchain/internal/common.bzl
@@ -12,7 +12,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-SUPPORTED_TARGETS = [("linux", "x86_64"), ("linux", "aarch64"), ("darwin", "x86_64")]
+SUPPORTED_TARGETS = [("linux", "x86_64"), ("linux", "aarch64"), ("darwin", "x86_64"), ("darwin", "aarch64")]

 host_tool_features = struct(
     SUPPORTS_ARG_FILE = "supports_arg_file",
@@ -68,7 +68,10 @@ def arch(rctx):
     ])
     if exec_result.return_code:
         fail("Failed to detect machine architecture: \n%s\n%s" % (exec_result.stdout, exec_result.stderr))
-    return exec_result.stdout.strip()
+    arch = exec_result.stdout.strip()
+    if arch == "arm64":
+        return "aarch64"
+    return arch

 def os_arch_pair(os, arch):
     return "{}-{}".format(os, arch)
diff --git a/toolchain/internal/llvm_distributions.bzl b/toolchain/internal/llvm_distributions.bzl
index 074ed84..43473ae 100644
--- a/toolchain/internal/llvm_distributions.bzl
+++ b/toolchain/internal/llvm_distributions.bzl
@@ -207,6 +207,12 @@ _llvm_distributions = {
     "clang+llvm-14.0.0-x86_64-apple-darwin.tar.xz": "cf5af0f32d78dcf4413ef6966abbfd5b1445fe80bba57f2ff8a08f77e672b9b3",
     "clang+llvm-14.0.0-x86_64-linux-gnu-ubuntu-18.04.tar.xz": "61582215dafafb7b576ea30cc136be92c877ba1f1c31ddbbd372d6d65622fef5",
     "clang+llvm-14.0.0-x86_64-linux-sles12.4.tar.xz": "78f70cc94c3b6f562455b15cebb63e75571d50c3d488d53d9aa4cd9dded30627",
+
+    # 15.0.2
+    "clang+llvm-15.0.2-arm64-apple-darwin21.0.tar.xz": "8c33f807bca56568b7060d0474daf63c8c10ec521d8188ac76362354d313ec58",
+    "clang+llvm-15.0.2-x86_64-apple-darwin.tar.xz": "a37ec6204f555605fa11e9c0e139a251402590ead6e227fc72da193e03883882",
+    "clang+llvm-15.0.2-aarch64-linux-gnu.tar.xz": "527ed550784681f95ec7a1be8fbf5a24bd03d7da9bf31afb6523996f45670be3",
+    "clang+llvm-15.0.2-x86_64-unknown-linux-gnu-rhel86.tar.xz": "f48f479e91ee7297ed8306c9d4495015691237cd91cc5330d3e1ee057b0548bd",
 }

 # Note: Unlike the user-specified llvm_mirror attribute, the URL prefixes in
@@ -229,6 +235,7 @@ _llvm_distributions_base_url = {
     "13.0.0": "https://github.com/llvm/llvm-project/releases/download/llvmorg-",
     "13.0.1": "https://github.com/llvm/llvm-project/releases/download/llvmorg-",
     "14.0.0": "https://github.com/llvm/llvm-project/releases/download/llvmorg-",
+    "15.0.2": "https://github.com/llvm/llvm-project/releases/download/llvmorg-",
 }

 def _get_auth(ctx, urls):
diff --git a/toolchain/tools/llvm_release_name.py b/toolchain/tools/llvm_release_name.py
index 39505cc..3485d61 100755
--- a/toolchain/tools/llvm_release_name.py
+++ b/toolchain/tools/llvm_release_name.py
@@ -30,7 +30,12 @@ def _patch_llvm_version(llvm_version):

 def _darwin(llvm_version, arch):
     major_llvm_version = _major_llvm_version(llvm_version)
-    suffix = "darwin-apple" if major_llvm_version == 9 else "apple-darwin"
+    if major_llvm_version == 9:
+        suffix = "darwin-apple"
+    elif arch == "arm64":
+        suffix = "apple-darwin21.0"
+    else:
+        suffix = "apple-darwin"
     return "clang+llvm-{llvm_version}-{arch}-{suffix}.tar.xz".format(
         llvm_version=llvm_version, arch=arch, suffix=suffix)

@@ -86,6 +91,8 @@ def _linux(llvm_version, distname, version, arch):
     # If you find this mapping wrong, please send a Pull Request on Github.
     if arch in ["aarch64", "armv7a", "mips", "mipsel"]:
         os_name = "linux-gnu"
+    elif major_llvm_version == 15:
+        os_name = "unknown-linux-gnu-rhel86"
     elif distname == "freebsd":
         os_name = "unknown-freebsd-%s" % version
     elif distname == "suse":

@jbott

jbott commented Oct 29, 2022

It looks like 15.0.3 was recently released, but unfortunately still does not include any x86_64 linux binaries.

For now, I'm using the patch provided by @iamricard and @omerbenamram. It seems to work well, with the addition of an extra patch to fix the strip_prefix for the clang+llvm-15.0.2-x86_64-unknown-linux-gnu-rhel86.tar.xz file: the archive's top-level directory is only clang+llvm-15.0.2-x86_64-unknown-linux-gnu, missing the -rhel86 suffix. I've only checked the darwin aarch64 and linux x86_64 builds though, so this may also be the case for others.

I have an example standalone project using this toolchain here. It's focused on building cross-platform py3_image targets, so the use of this toolchain is really orthogonal to the purpose of the demo, but it may be useful to someone.

Full patch, in case that repo gets removed at some point:

diff --git a/toolchain/cc_toolchain_config.bzl b/toolchain/cc_toolchain_config.bzl
index 8785a8e..1ac65a5 100644
--- a/toolchain/cc_toolchain_config.bzl
+++ b/toolchain/cc_toolchain_config.bzl
@@ -68,6 +68,15 @@ def cc_toolchain_config(
             "clang",
             "darwin_x86_64",
             "darwin_x86_64",
+        ),
+         "darwin-aarch64": (
+            "clang-aarch64-darwin",
+            "aarch64-apple-macosx",
+            "darwin",
+            "macosx",
+            "clang",
+            "darwin_aarch64",
+            "darwin_aarch64",
         ),
         "linux-aarch64": (
             "clang-aarch64-linux",
diff --git a/toolchain/internal/common.bzl b/toolchain/internal/common.bzl
index 7493c64..53b3b53 100644
--- a/toolchain/internal/common.bzl
+++ b/toolchain/internal/common.bzl
@@ -12,7 +12,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-SUPPORTED_TARGETS = [("linux", "x86_64"), ("linux", "aarch64"), ("darwin", "x86_64")]
+SUPPORTED_TARGETS = [("linux", "x86_64"), ("linux", "aarch64"), ("darwin", "x86_64"), ("darwin", "aarch64")]

 host_tool_features = struct(
     SUPPORTS_ARG_FILE = "supports_arg_file",
@@ -68,7 +68,10 @@ def arch(rctx):
     ])
     if exec_result.return_code:
         fail("Failed to detect machine architecture: \n%s\n%s" % (exec_result.stdout, exec_result.stderr))
-    return exec_result.stdout.strip()
+    arch = exec_result.stdout.strip()
+    if arch == "arm64":
+        return "aarch64"
+    return arch

 def os_arch_pair(os, arch):
     return "{}-{}".format(os, arch)
diff --git a/toolchain/internal/llvm_distributions.bzl b/toolchain/internal/llvm_distributions.bzl
index 074ed84..62011f3 100644
--- a/toolchain/internal/llvm_distributions.bzl
+++ b/toolchain/internal/llvm_distributions.bzl
@@ -207,6 +207,12 @@ _llvm_distributions = {
     "clang+llvm-14.0.0-x86_64-apple-darwin.tar.xz": "cf5af0f32d78dcf4413ef6966abbfd5b1445fe80bba57f2ff8a08f77e672b9b3",
     "clang+llvm-14.0.0-x86_64-linux-gnu-ubuntu-18.04.tar.xz": "61582215dafafb7b576ea30cc136be92c877ba1f1c31ddbbd372d6d65622fef5",
     "clang+llvm-14.0.0-x86_64-linux-sles12.4.tar.xz": "78f70cc94c3b6f562455b15cebb63e75571d50c3d488d53d9aa4cd9dded30627",
+
+    # 15.0.2
+    "clang+llvm-15.0.2-arm64-apple-darwin21.0.tar.xz": "8c33f807bca56568b7060d0474daf63c8c10ec521d8188ac76362354d313ec58",
+    "clang+llvm-15.0.2-x86_64-apple-darwin.tar.xz": "a37ec6204f555605fa11e9c0e139a251402590ead6e227fc72da193e03883882",
+    "clang+llvm-15.0.2-aarch64-linux-gnu.tar.xz": "527ed550784681f95ec7a1be8fbf5a24bd03d7da9bf31afb6523996f45670be3",
+    "clang+llvm-15.0.2-x86_64-unknown-linux-gnu-rhel86.tar.xz": "f48f479e91ee7297ed8306c9d4495015691237cd91cc5330d3e1ee057b0548bd",
 }

 # Note: Unlike the user-specified llvm_mirror attribute, the URL prefixes in
@@ -229,6 +235,7 @@ _llvm_distributions_base_url = {
     "13.0.0": "https://github.com/llvm/llvm-project/releases/download/llvmorg-",
     "13.0.1": "https://github.com/llvm/llvm-project/releases/download/llvmorg-",
     "14.0.0": "https://github.com/llvm/llvm-project/releases/download/llvmorg-",
+    "15.0.2": "https://github.com/llvm/llvm-project/releases/download/llvmorg-",
 }

 def _get_auth(ctx, urls):
@@ -310,6 +317,9 @@ def _distribution_urls(rctx):

     strip_prefix = basename[:(len(basename) - len(".tar.xz"))]

+    if strip_prefix == "clang+llvm-15.0.2-x86_64-unknown-linux-gnu-rhel86":
+      strip_prefix = "clang+llvm-15.0.2-x86_64-unknown-linux-gnu"
+
     return urls, sha256, strip_prefix

 def _host_os_key(rctx):
diff --git a/toolchain/tools/llvm_release_name.py b/toolchain/tools/llvm_release_name.py
index 39505cc..3485d61 100755
--- a/toolchain/tools/llvm_release_name.py
+++ b/toolchain/tools/llvm_release_name.py
@@ -30,7 +30,12 @@ def _patch_llvm_version(llvm_version):

 def _darwin(llvm_version, arch):
     major_llvm_version = _major_llvm_version(llvm_version)
-    suffix = "darwin-apple" if major_llvm_version == 9 else "apple-darwin"
+    if major_llvm_version == 9:
+        suffix = "darwin-apple"
+    elif arch == "arm64":
+        suffix = "apple-darwin21.0"
+    else:
+        suffix = "apple-darwin"
     return "clang+llvm-{llvm_version}-{arch}-{suffix}.tar.xz".format(
         llvm_version=llvm_version, arch=arch, suffix=suffix)

@@ -86,6 +91,8 @@ def _linux(llvm_version, distname, version, arch):
     # If you find this mapping wrong, please send a Pull Request on Github.
     if arch in ["aarch64", "armv7a", "mips", "mipsel"]:
         os_name = "linux-gnu"
+    elif major_llvm_version == 15:
+        os_name = "unknown-linux-gnu-rhel86"
     elif distname == "freebsd":
         os_name = "unknown-freebsd-%s" % version
     elif distname == "suse":

@junhyeokahn

@jbott It looks like 15.0.5, with Linux binaries, has been released.

@meastham
Contributor

I rebased @omerbenamram's change on top of the recent commits from this repo supporting 15.0.6, in case anybody wants to use it: https://github.com/meastham/bazel-toolchain/tree/m1_support

Seems to be working fine

@garymm
Contributor

garymm commented Jan 13, 2023

@meastham can you open a PR?

@meastham
Contributor

Sure: #174
