SSH remote builds assume an sh-based shell is the default #665
Comments
When I wrote this, I needed a way to get the env vars. I'm not particularly attached to my solution; there might be a better way that doesn't force a user to e.g. specify a shell or other parameters to the remote build in order to get the env vars.
I agree there needs to be a way to do this; it is still unclear what the best way is. While SSH can directly export environment variables, this needs to be specifically supported and I believe it isn't by default. The likely best solution is to allow specifying an environment setup script of some sort directly, which can be transferred and executed to set up the environment and provide a standardized run environment to Amaranth's executor. That can likely be used to mostly solve this problem itself, regardless of potential API changes. The SSH executor could generate a script on the fly to do this, with the appropriate shebang line, and then directly execute that file. I might draft a PR to do so if this is unclear.
The
You can't detect that, as far as I know.
I might be missing something, but I do not see how generating a setup script on the fly would fail to provide this. It would be generated when executing a remote build (or even when just transferring the files), and it could include a line to source a user-specified file. The difference is that it would have its own shebang line at the top to invoke the correct interpreter automatically, rather than assuming that SSH sessions are based around a specific shell. Executing a single command is possible directly via SSH, so we just use that instead. This change should be implementable with no meaningful change in behavior over the current implementation. The script could even be uploaded only as part of executing a remote build (although that likely brings little benefit). What we should discuss here is whether this approach is viable and whether we also want to provide a new or changed API surface to better serve this functionality.
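For concreteness, here is a minimal sketch of how such a generated setup script could be created, transferred, and executed, assuming paramiko (which the current SSH executor already uses). The wrapper filename, the sourcing of ~/.profile, and the helper name are all illustrative placeholders, not an actual or proposed Amaranth API:

```python
import io
import posixpath

import paramiko


def run_with_generated_wrapper(client: paramiko.SSHClient, root: str, build_script: str):
    # Generate a small wrapper script on the fly. The shebang line selects the
    # interpreter explicitly instead of relying on the user's login shell.
    wrapper = "\n".join([
        "#!/bin/sh",
        ". ~/.profile 2>/dev/null || true",   # or source a user-specified env file
        f"cd {root}",
        f"exec sh {build_script}",
        "",
    ])
    wrapper_path = posixpath.join(root, "run_build.sh")   # invented name

    # Transfer the generated script alongside the other build products.
    sftp = client.open_sftp()
    try:
        sftp.putfo(io.BytesIO(wrapper.encode("utf-8")), wrapper_path)
        sftp.chmod(wrapper_path, 0o755)
    finally:
        sftp.close()

    # Executing a single command is possible over any SSH session; the kernel
    # honors the shebang, so the login shell never has to parse sh syntax.
    _stdin, stdout, _stderr = client.exec_command(wrapper_path)
    print(stdout.read().decode("utf-8"), end="")
```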
I misunderstood your previous comments in multiple ways (the "how I misunderstood" isn't that important). After your newest comment, I realized your solution works fine and is a better version of what I proposed.
I was under the impression that builds needed to be self-contained. I still think the setup script should be part of the input artifacts that are sent, even when not executing the build.
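If the generated wrapper is registered as one of the plan's files, it travels with the rest of the input artifacts whether or not the build is executed remotely. A rough sketch of that, using the existing BuildPlan.add_file API with invented file and script names:

```python
from amaranth.build.run import BuildPlan

# Illustrative only: "top" and "run_build.sh" are made-up names.
plan = BuildPlan(script="top")
wrapper = "\n".join([
    "#!/bin/sh",
    ". ~/.profile 2>/dev/null || true",   # optional user environment setup
    f"exec sh {plan.script}.sh",
    "",
])
# add_file() records content to be written out with the other build artifacts,
# so the wrapper is transferred even when the build itself is not executed.
plan.add_file("run_build.sh", wrapper)
```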
I'm not sure why we need any of the above solutions. I propose the following, which ensures that the build script is always run with sh:

diff --git a/amaranth/build/run.py b/amaranth/build/run.py
index 1e42417..72eb635 100644
--- a/amaranth/build/run.py
+++ b/amaranth/build/run.py
@@ -169,8 +169,9 @@ class BuildPlan:
                 channel = transport.open_session()
                 channel.set_combine_stderr(True)
 
-                cmd = "if [ -f ~/.profile ]; then . ~/.profile; fi && cd {} && sh {}.sh".format(root, self.script)
-                channel.exec_command(cmd)
+                cmd = (f"if [ -f ~/.profile ]; then . ~/.profile; fi && "
+                       f"cd {root} && exec $0 {self.script}.sh")
+                channel.exec_command(f"sh -c '{cmd}'")
 
                 # Show the output from the server while products are built.
                 buf = channel.recv(1024)
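To make the intent of that diff concrete, this is the command string the proposal would pass to exec_command, shown with made-up values in place of root and self.script:

```python
# Illustrative values only; in the patch these come from `root` and `self.script`.
root, script = "/home/user/build", "top"
cmd = (f"if [ -f ~/.profile ]; then . ~/.profile; fi && "
       f"cd {root} && exec $0 {script}.sh")
print(f"sh -c '{cmd}'")
# sh -c 'if [ -f ~/.profile ]; then . ~/.profile; fi && cd /home/user/build && exec $0 top.sh'
#
# The remote login shell only has to pass a single quoted string through to sh.
# Inside `sh -c`, $0 normally expands to the name of the invoking shell ("sh"),
# so `exec $0 top.sh` re-runs the build script under that same POSIX shell
# rather than under whatever the user's login shell happens to be.
```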
Remote builds currently execute as if connecting to a session over SSH will always give a relatively sh-style shell. This is not an assumption that should be made, and it should likely be solved by allowing the path to such a shell to be specified, while assuming one exists at /bin/sh (this latter point is up for debate).

The current behavior most likely to trigger this is that it attempts to execute if [ -f ~/.profile ]; then . ~/.profile; fi && cd {} && sh {}.sh. This could also likely be addressed by giving users control over the script used to launch into the copied build script, but since build scripts basically assume sh-compatible shells exist anyway, this is probably not that useful.
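As a sketch of what "allowing the path to such a shell to be specified" could look like, here is a small helper with a hypothetical shell parameter defaulting to /bin/sh; this is not an existing or proposed Amaranth API, only an illustration of the shape of the change:

```python
import shlex


def remote_build_command(root: str, script: str, shell: str = "/bin/sh") -> str:
    """Assemble the command string sent over SSH for a remote build.

    `shell` is a hypothetical user-specified option; defaulting it to /bin/sh
    encodes the "assume one exists at /bin/sh" idea discussed above.
    """
    inner = (f"if [ -f ~/.profile ]; then . ~/.profile; fi && "
             f"cd {root} && {shell} {script}.sh")
    # Quote the inner command so it survives whatever login shell sshd hands it to.
    return f"{shell} -c {shlex.quote(inner)}"


# Example:
#   remote_build_command("/home/user/build", "top")
#   -> "/bin/sh -c 'if [ -f ~/.profile ]; then . ~/.profile; fi && cd /home/user/build && /bin/sh top.sh'"
```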