
SSH remote builds assume sh based shell is the default #665

Open
modwizcode opened this issue Dec 17, 2021 · 6 comments

Comments

@modwizcode (Contributor) commented Dec 17, 2021

Remote builds currently assume that connecting to a session over SSH will always land in a roughly sh-compatible shell. This assumption should not be made; it could be addressed by allowing the path to such a shell to be specified explicitly, falling back to assuming one exists at `/bin/sh` (this latter point is up for debate).

The current behavior most likely to trigger this is that it attempts to execute `if [ -f ~/.profile ]; then . ~/.profile; fi && cd {} && sh {}.sh`. This could also be addressed by giving users control over the script used to launch the copied build script, but since build scripts basically assume an sh-compatible shell exists anyway, this is probably not that useful.
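To make the assumption concrete, here is a paraphrase of the command construction (illustrative names, not Amaranth's actual source):

```python
def remote_command(build_dir: str, script_name: str) -> str:
    """Approximate the single command the SSH executor currently sends.

    This mirrors the sh-specific snippet quoted above. A non-sh login
    shell on the remote host (e.g. fish) rejects `if [ ... ]; then` as
    a syntax error, which is the failure described in this issue.
    """
    return (f"if [ -f ~/.profile ]; then . ~/.profile; fi && "
            f"cd {build_dir} && sh {script_name}.sh")
```

Because SSH hands this whole string to the remote user's login shell, the syntax of that shell is what matters, regardless of the `sh {}.sh` at the end.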

@cr1901 (Contributor) commented Dec 18, 2021

When I wrote this, I needed a way to make the Amaranth environment variables available to the SSH session. The `~/.profile` dance was my attempt at a portable way to get those variables into the session.

I'm not particularly attached to my solution. There might be a better way that doesn't force a user to e.g. specify a shell or other parameters for the remote build just to get the environment variables.

@modwizcode (Contributor, Author) commented Dec 18, 2021

I agree there needs to be a way to do this, though it is still unclear what the best way is. While SSH can export environment variables directly (the client's `SendEnv` must be matched by `AcceptEnv` on the server), this has to be specifically enabled, and I believe it is not by default.

The best solution is likely to allow specifying an environment setup script of some sort directly, which can be transferred and executed to set up the environment and provide a standardized run environment for Amaranth's executor.

That approach could mostly solve this problem by itself, regardless of potential API changes: the SSH executor could generate a script on the fly, with an appropriate shebang line, and then directly execute that file. I might draft a PR to do so if this is unclear.
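A minimal sketch of the generated launcher idea, assuming an optional user-supplied environment script (`env_script` and the filenames here are hypothetical, not an existing Amaranth API):

```python
import shlex
from typing import Optional

def make_launcher(build_dir: str, script_name: str,
                  env_script: Optional[str] = None) -> str:
    """Generate a launcher script with an explicit shebang.

    Because the interpreter is named in the shebang, the remote user's
    login shell no longer matters: the kernel picks /bin/sh when the
    file itself is executed, even if the login shell is fish or zsh.
    """
    lines = ["#!/bin/sh", "set -e"]
    if env_script is not None:
        # Hypothetical hook: source a user-specified environment file.
        lines.append(f". {shlex.quote(env_script)}")
    lines.append(f"cd {shlex.quote(build_dir)}")
    lines.append(f"exec sh {shlex.quote(script_name)}.sh")
    return "\n".join(lines) + "\n"
```

The executor would upload this file, mark it executable, and run it by path, reproducing the current behavior (sourcing an environment file, then launching the build script) without depending on the login shell's syntax.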

@cr1901 (Contributor) commented Dec 21, 2021

> The SSH executor could generate a script on the fly to do this, with the appropriate shebang line and then directly execute that file.

The `.profile` dance lets us defer setting the environment variables until we start running code on the remote host. Maybe a `remote-build_*.{sh,fish,zsh}` script could be unconditionally generated as part of the input artifacts if the remote-build feature is enabled? This would be analogous to how `build_*.{sh,bat}` are both generated but only one is (typically) used per host.
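The per-shell variant idea could be sketched like this (the shell list and templates are illustrative only; real templates would also need per-shell environment sourcing):

```python
# Hypothetical templates keyed by shell name; analogous to how
# build_*.sh and build_*.bat are both emitted but only one is used
# per host. Note fish uses `; and` where sh uses `&&`.
TEMPLATES = {
    "sh":   "cd {dir} && sh {name}.sh\n",
    "fish": "cd {dir}; and sh {name}.sh\n",
}

def emit_remote_scripts(build_dir: str, name: str) -> dict:
    """Generate one remote-build script per supported shell."""
    return {f"remote-build_{name}.{shell}": tmpl.format(dir=build_dir, name=name)
            for shell, tmpl in TEMPLATES.items()}
```

The downside, as noted below, is that generation would have to be unconditional, since the build system cannot know at artifact-generation time which shell (if any remote build at all) will be used.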

@whitequark (Member) commented Dec 22, 2021

> if the remote-build feature is enabled

You can't detect that, as far as I know.

@modwizcode (Contributor, Author) commented Dec 22, 2021

> The SSH executor could generate a script on the fly to do this, with the appropriate shebang line and then directly execute that file.
>
> The `.profile` dance lets us defer setting the environment variables until we start running code on the remote host. Maybe a `remote-build_*.{sh,fish,zsh}` script could be unconditionally generated as part of the input artifacts if the remote-build feature is enabled? This would be analogous to how `build_*.{sh,bat}` are both generated but only one is (typically) used per host.

I might be missing something, but I do not see how generating a setup script on the fly would not still provide this. It would be generated when executing a remote build (or even just when transferring the files), and it could include a line to source a user-specified file.

The difference is that it would have its own shebang line at the top to invoke the correct interpreter automatically, rather than assuming that SSH sessions are based around a specific shell. Executing a single command directly via SSH is possible, so we would just use that instead.

This change should be implementable with no meaningful change in behavior over the current implementation. The script could even be uploaded only as part of executing a remote build (although that likely brings little benefit). What we should discuss here is whether this approach is viable, and whether we also want to provide a new or changed API surface to better serve this functionality.
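The "single command" invocation mentioned above could be as small as this (hypothetical helper; the uploaded launcher's path is assumed known):

```python
from typing import List

def ssh_invocation(host: str, launcher_path: str) -> List[str]:
    """Build the argv for executing an uploaded launcher over SSH.

    Because the remote command is a single path with no shell syntax,
    any login shell runs it identically, and the launcher's own shebang
    selects the interpreter on the remote side.
    """
    return ["ssh", host, launcher_path]
```

The returned list would be handed to something like `subprocess.run(...)`; the key point is that no sh-specific operators (`&&`, `if [ ... ]`) appear in the remote command string.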

@cr1901 (Contributor) commented Dec 22, 2021

> I might be missing something, but I do not see how generating a setup script on the fly would not still provide this.

I misunderstood your previous comments in multiple ways (how I misunderstood isn't that important). After your newest comment, I realize your solution works fine and is a better version of what I proposed.

> The script could even be uploaded only as part of executing a remote build (although it likely brings little benefit).

I was under the impression that builds needed to be self-contained, so I still think the setup script should be part of the input artifacts that are sent, even when not executing the build.
