[clang-repl] Support wasm execution #86402

Open · wants to merge 1 commit into base: main
Conversation

vgvassilev
Contributor

This commit introduces support for running clang-repl and executing C++ code interactively inside a JavaScript engine using WebAssembly when built with Emscripten. This is achieved by producing WASM "shared libraries" that can be loaded by the Emscripten runtime using dlopen().

More discussion is available in https://reviews.llvm.org/D158140
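At the JavaScript level, the mechanism this relies on is ordinary WebAssembly instantiation: a linked wasm binary is compiled and instantiated in the running engine, and its exports become callable. A minimal, self-contained sketch; the hand-assembled module and its `answer` export are made up for illustration, standing in for the wasm-ld output the interpreter would actually produce:

```javascript
// A minimal wasm module, hand-assembled: it exports one function,
// "answer", that returns the i32 constant 42. In the clang-repl flow
// the bytes would instead come from the wasm-ld-linked output file.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00, // magic + version
  0x01, 0x05, 0x01, 0x60, 0x00, 0x01, 0x7f,       // type section: () -> i32
  0x03, 0x02, 0x01, 0x00,                         // function section: one func, type 0
  0x07, 0x0a, 0x01, 0x06,                         // export section header
  0x61, 0x6e, 0x73, 0x77, 0x65, 0x72, 0x00, 0x00, // export "answer" = func 0
  0x0a, 0x06, 0x01, 0x04, 0x00, 0x41, 0x2a, 0x0b, // code: i32.const 42, end
]);

// Synchronous compile + instantiate; roughly speaking, Emscripten's
// dlopen() performs this kind of instantiation for a side module and
// additionally wires its imports/exports into the main module.
const module = new WebAssembly.Module(bytes);
const instance = new WebAssembly.Instance(module);
console.log(instance.exports.answer()); // prints 42
```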

@llvmbot added the clang (Clang issues not falling into any other category) and clang:frontend (Language frontend issues, e.g. anything involving "Sema") labels on Mar 23, 2024
@llvmbot
Collaborator

llvmbot commented Mar 23, 2024

@llvm/pr-subscribers-clang

Author: Vassil Vassilev (vgvassilev)


Full diff: https://github.com/llvm/llvm-project/pull/86402.diff

6 Files Affected:

  • (modified) clang/lib/Interpreter/CMakeLists.txt (+1)
  • (modified) clang/lib/Interpreter/IncrementalExecutor.cpp (+2)
  • (modified) clang/lib/Interpreter/IncrementalExecutor.h (+7-4)
  • (modified) clang/lib/Interpreter/Interpreter.cpp (+13-2)
  • (added) clang/lib/Interpreter/Wasm.cpp (+111)
  • (added) clang/lib/Interpreter/Wasm.h (+33)
diff --git a/clang/lib/Interpreter/CMakeLists.txt b/clang/lib/Interpreter/CMakeLists.txt
index 9065f998f73c47..9d302d995801d5 100644
--- a/clang/lib/Interpreter/CMakeLists.txt
+++ b/clang/lib/Interpreter/CMakeLists.txt
@@ -20,6 +20,7 @@ add_clang_library(clangInterpreter
   Interpreter.cpp
   InterpreterUtils.cpp
   Value.cpp
+  Wasm.cpp
 
   DEPENDS
   intrinsics_gen
diff --git a/clang/lib/Interpreter/IncrementalExecutor.cpp b/clang/lib/Interpreter/IncrementalExecutor.cpp
index 40bcef94797d43..5180905c9e6d24 100644
--- a/clang/lib/Interpreter/IncrementalExecutor.cpp
+++ b/clang/lib/Interpreter/IncrementalExecutor.cpp
@@ -35,6 +35,8 @@ LLVM_ATTRIBUTE_USED void linkComponents() {
 }
 
 namespace clang {
+IncrementalExecutor::IncrementalExecutor(llvm::orc::ThreadSafeContext &TSC)
+    : TSCtx(TSC) {}
 
 IncrementalExecutor::IncrementalExecutor(llvm::orc::ThreadSafeContext &TSC,
                                          llvm::Error &Err,
diff --git a/clang/lib/Interpreter/IncrementalExecutor.h b/clang/lib/Interpreter/IncrementalExecutor.h
index dd0a210a061415..958be66efc3281 100644
--- a/clang/lib/Interpreter/IncrementalExecutor.h
+++ b/clang/lib/Interpreter/IncrementalExecutor.h
@@ -41,16 +41,19 @@ class IncrementalExecutor {
   llvm::DenseMap<const PartialTranslationUnit *, llvm::orc::ResourceTrackerSP>
       ResourceTrackers;
 
+protected:
+  IncrementalExecutor(llvm::orc::ThreadSafeContext &TSC);
+
 public:
   enum SymbolNameKind { IRName, LinkerName };
 
   IncrementalExecutor(llvm::orc::ThreadSafeContext &TSC, llvm::Error &Err,
                       const clang::TargetInfo &TI);
-  ~IncrementalExecutor();
+  virtual ~IncrementalExecutor();
 
-  llvm::Error addModule(PartialTranslationUnit &PTU);
-  llvm::Error removeModule(PartialTranslationUnit &PTU);
-  llvm::Error runCtors() const;
+  virtual llvm::Error addModule(PartialTranslationUnit &PTU);
+  virtual llvm::Error removeModule(PartialTranslationUnit &PTU);
+  virtual llvm::Error runCtors() const;
   llvm::Error cleanUp();
   llvm::Expected<llvm::orc::ExecutorAddr>
   getSymbolAddress(llvm::StringRef Name, SymbolNameKind NameKind) const;
diff --git a/clang/lib/Interpreter/Interpreter.cpp b/clang/lib/Interpreter/Interpreter.cpp
index 7fa52f2f15fc49..18c76685636855 100644
--- a/clang/lib/Interpreter/Interpreter.cpp
+++ b/clang/lib/Interpreter/Interpreter.cpp
@@ -15,6 +15,7 @@
 #include "IncrementalExecutor.h"
 #include "IncrementalParser.h"
 #include "InterpreterUtils.h"
+#include "Wasm.h"
 
 #include "clang/AST/ASTContext.h"
 #include "clang/AST/Mangle.h"
@@ -183,6 +184,12 @@ IncrementalCompilerBuilder::CreateCpp() {
   std::vector<const char *> Argv;
   Argv.reserve(5 + 1 + UserArgs.size());
   Argv.push_back("-xc++");
+#ifdef __EMSCRIPTEN__
+  Argv.push_back("-target");
+  Argv.push_back("wasm32-unknown-emscripten");
+  Argv.push_back("-pie");
+  Argv.push_back("-shared");
+#endif
   Argv.insert(Argv.end(), UserArgs.begin(), UserArgs.end());
 
   std::string TT = TargetTriple ? *TargetTriple : llvm::sys::getProcessTriple();
@@ -373,14 +380,18 @@ Interpreter::Parse(llvm::StringRef Code) {
 }
 
 llvm::Error Interpreter::CreateExecutor() {
-  const clang::TargetInfo &TI =
-      getCompilerInstance()->getASTContext().getTargetInfo();
   if (IncrExecutor)
     return llvm::make_error<llvm::StringError>("Operation failed. "
                                                "Execution engine exists",
                                                std::error_code());
   llvm::Error Err = llvm::Error::success();
+  const clang::TargetInfo &TI =
+      getCompilerInstance()->getASTContext().getTargetInfo();
+#ifdef __EMSCRIPTEN__
+  auto Executor = std::make_unique<WasmIncrementalExecutor>(*TSCtx, Err, TI);
+#else
   auto Executor = std::make_unique<IncrementalExecutor>(*TSCtx, Err, TI);
+#endif
   if (!Err)
     IncrExecutor = std::move(Executor);
 
diff --git a/clang/lib/Interpreter/Wasm.cpp b/clang/lib/Interpreter/Wasm.cpp
new file mode 100644
index 00000000000000..cb455f111ea888
--- /dev/null
+++ b/clang/lib/Interpreter/Wasm.cpp
@@ -0,0 +1,111 @@
+//===----------------- Wasm.cpp - Wasm Interpreter --------------*- C++ -*-===//
+//
+// Part of the LLVM Project, under the Apache License v2.0 with LLVM Exceptions.
+// See https://llvm.org/LICENSE.txt for license information.
+// SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
+//
+//===----------------------------------------------------------------------===//
+//
+// This file implements interpreter support for code execution in WebAssembly.
+//
+//===----------------------------------------------------------------------===//
+
+#ifdef __EMSCRIPTEN__
+
+#include "Wasm.h"
+#include "IncrementalExecutor.h"
+
+#include <llvm/IR/LegacyPassManager.h>
+#include <llvm/IR/Module.h>
+#include <llvm/MC/TargetRegistry.h>
+#include <llvm/Target/TargetMachine.h>
+
+#include <clang/Interpreter/Interpreter.h>
+
+#include <dlfcn.h>
+
+namespace clang {
+
+WasmIncrementalExecutor::WasmIncrementalExecutor(
+    llvm::orc::ThreadSafeContext &TSC)
+    : IncrementalExecutor(TSC) {}
+
+llvm::Error WasmIncrementalExecutor::addModule(PartialTranslationUnit &PTU) {
+  PTU.TheModule->dump();
+
+  std::string ErrorString;
+
+  const llvm::Target *Target = llvm::TargetRegistry::lookupTarget(
+      PTU.TheModule->getTargetTriple(), ErrorString);
+  if (!Target) {
+    return llvm::make_error<llvm::StringError>(
+        "Failed to create Wasm Target: " + ErrorString,
+        llvm::inconvertibleErrorCode());
+  }
+
+  llvm::TargetOptions TO = llvm::TargetOptions();
+  llvm::TargetMachine *TargetMachine = Target->createTargetMachine(
+      PTU.TheModule->getTargetTriple(), "", "", TO, llvm::Reloc::Model::PIC_);
+  PTU.TheModule->setDataLayout(TargetMachine->createDataLayout());
+  std::string OutputFileName = PTU.TheModule->getName().str() + ".wasm";
+
+  std::error_code Error;
+  llvm::raw_fd_ostream OutputFile(llvm::StringRef(OutputFileName), Error);
+
+  llvm::legacy::PassManager PM;
+  if (TargetMachine->addPassesToEmitFile(PM, OutputFile, nullptr,
+                                         llvm::CGFT_ObjectFile)) {
+    return llvm::make_error<llvm::StringError>(
+        "Wasm backend cannot produce object.", llvm::inconvertibleErrorCode());
+  }
+
+  if (!PM.run(*PTU.TheModule)) {
+    return llvm::make_error<llvm::StringError>("Failed to emit Wasm object.",
+                                               llvm::inconvertibleErrorCode());
+  }
+
+  OutputFile.close();
+
+  std::vector<const char *> LinkerArgs = {"wasm-ld",
+                                          "-pie",
+                                          "--import-memory",
+                                          "--no-entry",
+                                          "--export-all",
+                                          "--experimental-pic",
+                                          "--no-export-dynamic",
+                                          "--stack-first",
+                                          OutputFileName.c_str(),
+                                          "-o",
+                                          OutputFileName.c_str()};
+  int Result =
+      lld::wasm::link(LinkerArgs, llvm::outs(), llvm::errs(), false, false);
+  if (!Result)
+    return llvm::make_error<llvm::StringError>(
+        "Failed to link incremental module", llvm::inconvertibleErrorCode());
+
+  void *LoadedLibModule =
+      dlopen(OutputFileName.c_str(), RTLD_NOW | RTLD_GLOBAL);
+  if (LoadedLibModule == nullptr) {
+    llvm::errs() << dlerror() << '\n';
+    return llvm::make_error<llvm::StringError>(
+        "Failed to load incremental module", llvm::inconvertibleErrorCode());
+  }
+
+  return llvm::Error::success();
+}
+
+llvm::Error WasmIncrementalExecutor::removeModule(PartialTranslationUnit &PTU) {
+  return llvm::make_error<llvm::StringError>("Not implemented yet",
+                                             llvm::inconvertibleErrorCode());
+}
+
+llvm::Error WasmIncrementalExecutor::runCtors() const {
+  // This seems to be automatically done when using dlopen()
+  return llvm::Error::success();
+}
+
+WasmIncrementalExecutor::~WasmIncrementalExecutor() = default;
+
+} // namespace clang
+
+#endif // __EMSCRIPTEN__
diff --git a/clang/lib/Interpreter/Wasm.h b/clang/lib/Interpreter/Wasm.h
new file mode 100644
index 00000000000000..98acd4c14e240b
--- /dev/null
+++ b/clang/lib/Interpreter/Wasm.h
@@ -0,0 +1,33 @@
+//===------------------ Wasm.h - Wasm Interpreter ---------------*- C++ -*-===//
+//
+// Part of the LLVM Project, under the Apache License v2.0 with LLVM Exceptions.
+// See https://llvm.org/LICENSE.txt for license information.
+// SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
+//
+//===----------------------------------------------------------------------===//
+//
+// This file implements interpreter support for code execution in WebAssembly.
+//
+//===----------------------------------------------------------------------===//
+
+#ifndef LLVM_CLANG_LIB_INTERPRETER_WASM_H
+#define LLVM_CLANG_LIB_INTERPRETER_WASM_H
+
+#include "IncrementalExecutor.h"
+
+namespace clang {
+
+class WasmIncrementalExecutor : public IncrementalExecutor {
+public:
+  WasmIncrementalExecutor(llvm::orc::ThreadSafeContext &TSC);
+
+  llvm::Error addModule(PartialTranslationUnit &PTU) override;
+  llvm::Error removeModule(PartialTranslationUnit &PTU) override;
+  llvm::Error runCtors() const override;
+
+  ~WasmIncrementalExecutor() override;
+};
+
+} // namespace clang
+
+#endif // LLVM_CLANG_LIB_INTERPRETER_WASM_H

✅ With the latest revision this PR passed the C/C++ code formatter.


✅ With the latest revision this PR passed the Python code formatter.

@vgvassilev
Contributor Author

Hi @AaronBallman, we need some leadership here.

This pull request teaches clang-repl to work inside a browser. It enables the WebAssembly build of xeus-cpp (through clang-repl), which connects to the JupyterLite infrastructure. This change (actually an older version of it) already enables execution of C++ in a browser. Here is an example: https://wasmdemo.argentite.me/

This way we can enable systems programming with C/C++ in JupyterLite, which can be transformative for education in systems programming, as it only requires a browser rather than a cloud service.

Unfortunately, this PR cannot be tested in the current testing infrastructure in llvm because it needs a browser to provide a proper execution environment. If we move forward here that would unblock work on the other end of the infrastructure.

Collaborator

So are we in a chicken-and-egg situation where we need some support in-tree in order to have a reason to set up a postcommit test bot but we don't want to land untested changes?

At the end of the day, if we're claiming support for something, we need it to be tested, and this seems like a situation where we'd need an actual buildbot specific to the need. So we may want to proceed here in a few stages:

1. Push up these changes without testing.
2. Get a bot set up and have it test these changes; once it's green and we're happy, we publicly claim support.
3. If we don't get a bot set up within some reasonable timeframe, we agree to remove these changes so we don't leave untested code in the repo.

WDYT?

@vgvassilev
Contributor Author

That would make sense. I am not sure if we can set up a post-commit bot though. @argentite what do you think?

@vgvassilev
Contributor Author

@AaronBallman, to be fair, clang is testing the wasm features in terms of output. So this is wiring up a bunch of tested features that will allow execution. Clang generally does not test execution but output, so we are not creating a precedent here, since this PR can be considered plumbing for downstream consumers.

@AaronBallman
Collaborator

> @AaronBallman, to be fair, clang is testing the wasm features in terms of output. So this is wiring up a bunch of tested features that will allow execution. Clang generally does not test execution but output, so we are not creating a precedent here since that PR can be considered plumbing for downstream consumers.

If we don't have community test coverage, we'll regress that plumbing for downstream consumers. In general, we shouldn't claim we support something we don't test. However, if there is a downstream consumer that agrees to be actively responsible for repairing breakages, we sometimes allow it (e.g., https://discourse.llvm.org/t/rfc-building-llvm-for-webassembly/79073)

@vgvassilev
Contributor Author

> If we don't have community test coverage, we'll regress that plumbing for downstream consumers. In general, we shouldn't claim we support something we don't test. However, if there is a downstream consumer that agrees to be actively responsible for repairing breakages, we sometimes allow it (e.g., https://discourse.llvm.org/t/rfc-building-llvm-for-webassembly/79073)

I am not sure if we have the same definition for "claim". FWIW I am not saying we should put any of this on the website or elsewhere. We have a downstream consumer which integrates the emscripten version of wasm here. @DerThorsten is my go-to person when it comes to emscripten and llvm. I believe they are quite sensitive about breakages. In any case we will develop tests in the CppInterOp project, too.

@AaronBallman
Collaborator

> I am not sure if we have the same definition for "claim". FWIW I am not saying we should put any of this on the website or elsewhere. We have a downstream consumer which integrates the emscripten version of wasm here. @DerThorsten is my go-to person when it comes to emscripten and llvm. I believe they are quite sensitive about breakages. In any case we will develop tests in the CppInterOp project, too.

I wouldn't focus on "claim" too heavily. We should not have code in-tree that's not tested except in exceptional situations and it's not clear to me that this is (or isn't) such an exceptional situation.

@vgvassilev
Contributor Author

> I wouldn't focus on "claim" too heavily.

Ok.

> We should not have code in-tree that's not tested except in exceptional situations and it's not clear to me that this is (or isn't) such an exceptional situation.

Well, as of today I do not see how we can test execution of jit-based wasm in tree. I am not a wasm expert but IIUC, testing would require a browser and compiling llvm for emscripten and somehow running some remote execution service...

My worry is that we are missing the forest for the trees: this work has existed as a patch for maybe more than a year. It'd be a pity to wait longer, because that blocks downstream consumers in practice...

@DerThorsten

> I am not sure if we have the same definition for "claim". FWIW I am not saying we should put any of this on the website or elsewhere. We have a downstream consumer which integrates the emscripten version of wasm here. @DerThorsten is my go-to person when it comes to emscripten and llvm. I believe they are quite sensitive about breakages. In any case we will develop tests in the CppInterOp project, too.

For emscripten-forge recipes of xeus-cpp/clang you can break whatever you want. I am sure there are no deployments / users yet.

@vgvassilev
Contributor Author

> For emscripten-forge recipes of xeus-cpp/clang you can break whatever you want. I am sure there are no deployments / users yet

I think Aaron's question was whether, if we land this PR and it regresses over time, our downstream clients would be able to catch it in a timely manner.

@AaronBallman
Collaborator

> Well, as of today I do not see how we can test execution of jit-based wasm in tree. I am not a wasm expert but IIUC, testing would require a browser and compiling llvm for emscripten and somehow running some remote execution service...

Yeah, we don't have any existing test infrastructure we could lean on for it. However, this doesn't seem impossible to test; surely other projects have figured out a test harness for WASM?

> My worry is that we are losing the forest for the trees because this work has stayed as a patch since maybe more than a year. It'd be a pity to wait more because that blocks downstream consumers in practice...

Yeah, it's a tough tradeoff. The two things I'm specifically worried about are: 1) users of the downstream project see a break and (eventually) report it to Clang rather than to the downstream, so now we have to triage it without a way to reproduce; 2) we change internal APIs and need to update this code but have no way to determine whether those updates will or won't break the functionality.

If the downstream that needs this functionality can commit to be responsible for maintaining it within Clang, that would help alleviate the concerns somewhat.

@argentite
Contributor

> Unfortunately, this PR cannot be tested in the current testing infrastructure in llvm because it needs a browser to provide a proper execution environment.

Just to clarify, we don't need an actual full browser to test it. Node.js should probably be sufficient for execution.

> That would make sense. I am not sure if we can set a post commit bot though. @argentite what do you think?

I think it is possible. If we use a buildbot, we currently only need to build llvm, clang, and lld for WASM, targeting WASM. Other subprojects are not required for this PR and I am not even sure if they can be built to run in WASM.

Also, most of the existing tests in the project (and maybe even the whole test infrastructure) probably won't work and/or do not make sense in WASM, so it would probably have to run WASM-specific tests.

@vgvassilev
Contributor Author

@AaronBallman do we have access to some existing bot to set it up as @argentite suggests?

Collaborator

AFAIK, none of the bots allow arbitrary changes; they're all owned or managed by someone you'd have to get permission from. You could look through https://lab.llvm.org/buildbot/#/workers to see what kinds of machines are already in the lab and who manages them, then reach out to the owner to see if you can coordinate putting a new builder on their worker.

@gkistanova might have better ideas, too.

Labels
clang:frontend (Language frontend issues, e.g. anything involving "Sema"), clang (Clang issues not falling into any other category)
5 participants