[Documentation] Running commands that do not terminate #137
Comments
Hi jiashenC, my suggestion is to use `screen` for that: it creates a terminal session that you can detach from and reattach to later if you wish.
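For illustration, a minimal `screen` sketch (the session name `stats` and the `sleep` stand-in are assumptions; in this thread the real command would be `python server.py`):

```shell
# Skip gracefully if GNU screen is not installed
command -v screen >/dev/null || { echo "screen not installed"; exit 0; }

# -d -m starts the session already detached; -S names it so it can be found later
screen -dmS stats sh -c 'sleep 1'   # stand-in for: python server.py
sleep 0.2                           # give screen a moment to register the session

screen -ls | grep stats || true     # list sessions; reattach with: screen -r stats
echo "session started"
```

Once reattached, Ctrl-a d detaches again and the command keeps running inside screen.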
I see. I use multithreading in Python to work around the issue, but I wonder whether there is a built-in non-blocking option.
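The multithreading workaround mentioned above can be sketched like this (the `blocking_call` stand-in is hypothetical; in practice it would be the blocking read of the remote command's output):

```python
import threading
import time

def blocking_call():
    """Stand-in for a call that blocks, e.g. reading a run-forever command's output."""
    time.sleep(0.2)
    return "done"

results = []
worker = threading.Thread(target=lambda: results.append(blocking_call()))
worker.start()

# The main program is free to do other work here while the call blocks.
worker.join()
print(results)  # ['done']
```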
The problem is managing the stdin/stdout/stderr pipes. At first I just appended `" &"` to the command, but execution crashed because the pipes were already closed.
Hi there, thanks for the interest and report. Can you clarify what you mean by blocks? From what you are describing, there is not anything the library can do unless the remote command itself is made into a daemon. There are two options:

- Keep the client alive and read the command's output as it is produced.
- Make the remote command a daemon so it keeps running after the client exits.

The latter can be done by running the command under `nohup` with its output redirected and a trailing `&`. There is no capability in the SSH protocol to manage shell redirection; it all has to be done within the shell. As a rule of thumb, if something happens when using the library, it will also happen when running the same command in a standard `ssh` session.
Hi, I want to use parallel-ssh to start a server on a remote device from my local machine. The server runs forever and constantly prints some stats. From what I observe, if I try to run the command with `run_command`, it blocks. I have switched to using threads as a workaround. For non-blocking use, I wonder if it is possible to send the command to the remote device but keep the client alive in the background, so the local program is able to execute other code. I think your option two also satisfies my need.
Can you show code that reproduces this behaviour? The call to `run_command` itself should not block; reading its output is what blocks. Making the command a daemon is out of scope for an SSH library. Try the command with a standard `ssh` client and see whether the behaviour is the same.
Sure, I can post a simpler version of my code.

server.py:

```python
from BaseHTTPServer import BaseHTTPRequestHandler, HTTPServer
from threading import Thread

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-type', 'text/html')
        self.end_headers()
        self.wfile.write("Hello World !")

def main():
    server = HTTPServer(('0.0.0.0', 12345), Handler)
    server.allow_reuse_address = True
    Thread(target=server.serve_forever, args=()).start()

if __name__ == '__main__':
    main()
```

run_system.sh:

```shell
python server.py
```

ssh.py:

```python
from pssh.pssh_client import ParallelSSHClient
from pssh.utils import load_private_key

pkey = load_private_key('...')
server_hosts = ParallelSSHClient(['111.111.111.111'], user='test', pkey=pkey)
output = server_hosts.run_command('bash run_system.sh')
```
Thanks @jiashenC, will try to replicate with the above and get back to you.
Hi @jiashenC, I do not see an issue with the library; the remote command is not a daemon and will therefore terminate when the client exits. Here is a test case with a simple run-forever remote process:
Client code:
Output:
Reading from output will run forever until interrupted - this is expected. When the client exits, the remote command will end - this is also expected. For it to keep running after the client exits, the command needs to be a daemon, eg with nohup:
Output:
Reading output returns immediately with the output of nohup, not the server. Then, when logging in a second time to the remote host, the server process will still be found running. The library does not know whether a command is a daemon or not; it only interacts with a remote shell via SSH. It is up to the user to make those commands into daemons appropriately if the intention is for the client to terminate while the command is still running. This is the case whether the command is run via the library or any other SSH client. Commands that are already daemons may not require nohup. Will leave open to add an example to the documentation as this has come up a couple of times now.
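The same idea can be sketched locally with only the Python standard library (a `sleep` placeholder stands in for the server): detaching the standard streams and starting a new session is roughly what `nohup command </dev/null >log 2>&1 &` does on the remote side.

```python
import subprocess

# Start a child with all standard streams detached and its own session,
# roughly what `nohup command </dev/null >log 2>&1 &` does in the shell.
proc = subprocess.Popen(
    ["sleep", "5"],                # placeholder for the real server command
    stdin=subprocess.DEVNULL,
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
    start_new_session=True,        # like setsid: detach from the controlling terminal
)
print("detached pid:", proc.pid)
```

Because the child holds no pipes back to the parent, the parent can exit without triggering the closed-pipe crash described earlier in the thread.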
@pkittenis Thanks for your detailed explanation! That's very helpful! |
I am using the `run_command()` function to run a bash command remotely. My current issue is that I want to start a server on a remote device. The server runs forever, which blocks `run_command()`. Is there a way to run this method as a non-blocking function?