
How to send a command while ssh_exec_wait is not finished #18

Closed
JackHsueh opened this issue Dec 11, 2018 · 3 comments

Comments

@JackHsueh

I am using ssh to work on a remote platform. With Xshell, after connecting, the prompt is [user@ip]$, and when I input the command SCALAstart, it switches to the Scala platform with the prompt scala>, where I can then input Scala code to manipulate the data. But when I use ssh_exec_wait in RStudio, it also prints the scala> prompt back to the console, yet the function never returns, and I cannot send any command to the scala> line, neither with a new ssh_exec_wait call nor with console input. So, can I execute a command without waiting for it to return, and execute a new command while the previous one is still running? An additional function that returns the output or the error independently would also be preferable.
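Regarding the last point (returning output and error independently): the ssh package also provides ssh_exec_internal(), which captures the exit status, stdout, and stderr separately instead of streaming them to the console. A minimal sketch, assuming a session created with ssh_connect() as elsewhere in this thread:

library(ssh)
session <- ssh_connect("user@ip", keyfile = "e:\\user.pem")

# ssh_exec_internal() blocks like ssh_exec_wait(), but returns a list
# with $status, $stdout and $stderr as raw vectors instead of printing
# to the console, so output and error can be handled independently.
out <- ssh_exec_internal(session, command = "date")
cat(rawToChar(out$stdout))   # standard output
cat(rawToChar(out$stderr))   # standard error
out$status                   # exit code of the remote command

ssh_disconnect(session)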


@jeroen
Member

jeroen commented Feb 18, 2019

I don't understand what you are asking. Can you please include example R code?

@jeroen jeroen closed this as completed Feb 18, 2019
@JackHsueh
Author

JackHsueh commented Feb 27, 2019

Sorry if I wasn't clear. What I want is to work with the remote data on EMR through the Spark shell.

Typically, I use Xshell to connect to the EMR server. After connecting, the prompt is [user@ip]$, where I can execute Linux commands interactively. When I input the command SCALAstart (like typing python at the Windows cmd prompt to start the Python interpreter), it switches to the Spark shell with the prompt scala>, and then I can input Scala code interactively, such as var i=1.

With the ssh package, I can connect to the server successfully.

library(ssh)
session <- ssh_connect("user@ip", keyfile = "e:\\user.pem")

Then I can execute Linux commands successfully.

ssh_exec_wait(session, command = "date")

But when I execute the start command of the Spark shell,

ssh_exec_wait(session, command = "SCALAstart")

it starts up successfully and ends at the scala> prompt, waiting for Scala code.

But I cannot send any new command to the server interactively. I guess this is because ssh_exec_wait is still waiting for the SCALAstart command to finish (the red stop button stays in the top-right corner of the RStudio console window), so no new Scala code can be entered.

# invalid after ssh_exec_wait(session, command = "SCALAstart")
ssh_exec_wait(session, command = "var i=1")
var i=1
ssh_exec_wait(session, command = ":quit")

# in Xshell, the lines below are valid after running SCALAstart
var i=1
:quit

My question is: can I input Scala code interactively while the SCALAstart command has not yet finished, the way Xshell allows, i.e. working at the Spark shell prompt scala> while SCALAstart is still running at the Linux prompt [user@ip]$? (The same as running Python code at the Python prompt >>> while the python interpreter command is still running at the Windows CMD prompt >.)

@jeroen
Member

jeroen commented Feb 27, 2019

Currently you can only run scripts; there is no way to set up an interactive shell. I'm actually surprised SCALAstart doesn't recognize that it is in a non-interactive shell and hence freezes waiting for user input. I'll open a new issue for this: #23, but I won't be able to look at it any time soon.
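Given that only scripts are supported, one workaround is to phrase the whole Spark job as a single non-interactive batch. A minimal sketch: the heredoc below assumes that SCALAstart reads Scala statements from stdin when it is not attached to a terminal, which is an assumption about this particular launcher, not something the ssh package guarantees.

library(ssh)
session <- ssh_connect("user@ip", keyfile = "e:\\user.pem")

# Send the Scala statements on stdin via a shell heredoc, so the whole
# job runs as one non-interactive command.  This only works if the
# SCALAstart launcher consumes stdin when no TTY is attached.
ssh_exec_wait(session, command = "SCALAstart <<'EOF'
var i = 1
:quit
EOF")

ssh_disconnect(session)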
