
System cannot find the path specified. #399

Closed
dustindall opened this issue Jan 2, 2017 · 13 comments

Comments

@dustindall
I'm getting the following error using both v0.5.1 on CRAN and v0.5.2 on github. Any help will be appreciated.

Thanks.

Error in force(code) :
Failed while connecting to sparklyr to port (8880) for sessionid (3794): Gateway in port (8880) did not respond.
Path: C:\Users\xxxx xxxx\AppData\Local\rstudio\spark\Cache\spark-1.6.2-bin-hadoop2.6\bin\spark-submit2.cmd
Parameters: --class, sparklyr.Backend, --jars, "C:/Users/xxxx xxxx/Documents/R/win-library/3.3/sparklyr/java/spark-csv_2.11-1.3.0.jar","C:/Users/xxxx xxxx/Documents/R/win-library/3.3/sparklyr/java/commons-csv-1.1.jar","C:/Users/xxxx xxxx/Documents/R/win-library/3.3/sparklyr/java/univocity-parsers-1.5.1.jar", "C:\Users\xxxx xxxx\Documents\R\win-library\3.3\sparklyr\java\sparklyr-1.6-2.10.jar", 8880, 3794

---- Output Log ----
The system cannot find the path specified.

---- Error Log ----

@gbortz27 commented Jan 3, 2017

I'm getting the same errors.

Thanks
Graham

@kevinushey
Contributor

I wasn't able to reproduce this. Do you by any chance have any accented / non-ASCII characters in your username? Do you have an antivirus / firewall that could be blocking connections on port 8880?
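(A quick way to run these checks from the R console; the regular expression and the `USERNAME` environment variable are assumptions based on a standard Windows setup:)

```r
# Inspect the Windows login name for characters that can break Spark paths.
user <- Sys.getenv("USERNAME")

# TRUE if the username contains any non-ASCII (e.g. accented) characters.
has_non_ascii <- grepl("[^\x01-\x7F]", user)

# TRUE if the username contains a space -- spaces in paths are another
# common cause of "system cannot find the path specified" on Windows.
has_space <- grepl(" ", user)

print(c(non_ascii = has_non_ascii, space = has_space))
```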

@gbortz27 commented Jan 4, 2017 via email

@gbortz27 commented Jan 4, 2017 via email

@dustindall
Author

I do have a space in my username, but it's a plain ASCII character. I'll look into my antivirus/firewall and report back.

@gbortz27 commented Jan 4, 2017 via email

@kevinushey
Contributor

I've seen these permission errors as well on a Windows VM, but only intermittently, and I haven't yet been able to ascertain a root cause. :/ I wonder if the issue is that multiple threads are attempting to read / write this file at the same time?

I'm also not quite sure what to make of the writable-permissions error:

Caused by: java.lang.RuntimeException: The root scratch dir:
C:/Users/grbortz/AppData/Local/rstudio/spark/Cache/spark-2.0.2-bin-hadoop2.7/tmp/hive
on HDFS should be writable. Current permissions are: rw-rw-rw-

Perhaps the directory needs execute permissions as well? (I'm not exactly sure how this translates into the Windows permissions model, though.)

What's the output of file.info("C:\\Users\\grbortz\\AppData\\Local\\rstudio\\spark\\Cache\\spark-2.0.2-bin-hadoop2.7\\tmp\\hadoop")?
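(A commonly suggested workaround for the "root scratch dir should be writable" Hive error on Windows is to widen the directory's permissions with the `winutils.exe` tool from a Hadoop Windows build. This is a sketch, not a confirmed fix for this issue; it assumes `winutils.exe` is on your `PATH` and that the scratch-dir path below matches your install:)

```r
# Path from the error message above -- adjust to your own Spark cache.
hive_tmp <- file.path(
  "C:/Users/grbortz/AppData/Local/rstudio/spark/Cache",
  "spark-2.0.2-bin-hadoop2.7/tmp/hive"
)

# Inspect the current mode bits R reports for the directory.
file.info(hive_tmp)$mode

# Grant full permissions via winutils (assumes winutils.exe is on PATH).
system2("winutils.exe", c("chmod", "777", hive_tmp))
```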

@gbortz27 commented Jan 4, 2017 via email

@gbortz27 commented Jan 4, 2017 via email

@dustindall
Author

I was able to connect by moving my Spark install to C:\spark. I'm thinking the error was caused by the space in my username, which ended up in my Spark paths.

Let me know if I should go ahead and close this issue.

@gbortz27

Did you actually install Spark again, or did you just redirect it through environment variables? Are your temp directories under the new Spark path or the old one?

What's happening for me is that the antivirus program is protecting the default Spark path, i.e. the one under AppData. Even after I pointed RStudio to the new Spark path by resetting the environment variables, it read the new Spark path when launching the .cmd files, but it still placed files in the default temp path, i.e. a Temp folder under AppData.

That's why I'm asking whether installing through Spark itself makes a difference, and whether you get to choose the install directory, or have to specify it as an option in the install command. If so, what is the format of that command? And if I've already downloaded the Spark .tar or .tgz (which one should I use?), how can I direct RStudio to install from that file rather than re-downloading it from the web?
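(For reference, sparklyr can install from a locally downloaded tarball rather than re-downloading. A minimal sketch, assuming a current sparklyr where `spark_install_tar()` is available; the file path is hypothetical:)

```r
library(sparklyr)

# Install Spark from a .tgz you downloaded yourself instead of fetching
# it from the web (path below is an example -- point it at your file).
spark_install_tar(tarfile = "C:/downloads/spark-2.0.2-bin-hadoop2.7.tgz")
```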

@dustindall
Author

I moved the Spark install from the AppData path to a folder directly under my C: drive, updated the path variables, deleted the old Spark versions, and everything worked.
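(A sketch of connecting against a relocated install by passing `spark_home` explicitly, which avoids the cached AppData path entirely; the `C:/spark/...` location is an assumption based on the move described above:)

```r
library(sparklyr)

# Point sparklyr at the relocated, space-free install directory.
spark_home <- "C:/spark/spark-2.0.2-bin-hadoop2.7"
Sys.setenv(SPARK_HOME = spark_home)

# spark_home overrides the default cache location under AppData.
sc <- spark_connect(master = "local", spark_home = spark_home)
```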

@gbortz27 commented Jan 28, 2017 via email
