
Commit

Update running-workflows.rst
jayantshekhar committed Jan 3, 2018
1 parent 7accd31 commit ae8cf27
Showing 1 changed file with 7 additions and 7 deletions.
14 changes: 7 additions & 7 deletions docs/troubleshooting/running-workflows.rst
@@ -39,23 +39,23 @@ When running on the Cluster, you are running into the exception below::
 
   org.apache.hadoop.security.AccessControlException: Permission denied: user=admin, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
 
-If the above exception is coming up when running the workflow, then it means that the logged in user does not exist on HDFS.
+* If the above exception is coming up when running the workflow, then it means that the logged in user does not exist on HDFS.
 
-In the above case, the user is logged into Fire as 'admin'. So the jobs submitted by Fire on the cluster is as the user 'admin'. But the user 'admin' does not exist on HDFS.
+* In the above case, the user is logged into Fire as 'admin'. So the jobs submitted by Fire on the cluster is as the user 'admin'. But the user 'admin' does not exist on HDFS.
 
-Please make sure to log into Fire as a user which exists on HDFS.
+* Please make sure to log into Fire as a user which exists on HDFS.
 
 
 When running the example workflows on the Spark Cluster it is not able to find the input files
 -----------------------------------------------------------------------------------------------
 
-The example workflows read in input files::
+The example workflows read in input files.
 
-They have to be on HDFS in the home directory of the logged in user.
+* They have to be on HDFS in the home directory of the logged in user.
 
-The data directory which comes with Sparkflows has to be uploaded onto HDFS.
+* The data directory which comes with Sparkflows has to be uploaded onto HDFS.
 
-For example, if the logged in user is 'john', then the data directory would be on HDFS in the directory /user/john
+* For example, if the logged in user is 'john', then the data directory would be on HDFS in the directory /user/john
 
 
 Getting Exception : Server returned HTTP response code: 405 for URL: http://10.125.221.72:8080/ messageFromSparkJob
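
For the first item changed above (the AccessControlException raised when the logged-in user, 'admin' in the example, has no home directory on HDFS), a minimal Scala sketch using the Hadoop FileSystem API can confirm whether the home directory exists. It assumes a Hadoop client configuration (core-site.xml and hdfs-site.xml) is on the classpath; the object name and the suggested shell commands are illustrative, and creating a missing home directory typically has to be done by an HDFS administrator::

  import org.apache.hadoop.conf.Configuration
  import org.apache.hadoop.fs.FileSystem
  import org.apache.hadoop.security.UserGroupInformation

  object CheckHdfsHome {
    def main(args: Array[String]): Unit = {
      // Picks up core-site.xml / hdfs-site.xml from the classpath.
      val conf = new Configuration()
      val fs   = FileSystem.get(conf)

      // The user the jobs are submitted as; 'admin' in the example above.
      val user = UserGroupInformation.getCurrentUser.getShortUserName
      // The user's home directory on HDFS, e.g. hdfs://namenode:8020/user/admin.
      val home = fs.getHomeDirectory

      if (!fs.exists(home)) {
        // With no home directory, the job has to create it and therefore needs
        // WRITE access on /user, which is owned by hdfs:supergroup, so the
        // exception above is raised.
        println(s"Home directory $home for user '$user' does not exist on HDFS.")
        println("Ask an HDFS administrator to create it, for example:")
        println(s"  hdfs dfs -mkdir -p /user/$user")
        println(s"  hdfs dfs -chown $user /user/$user")
      } else {
        println(s"Home directory $home exists; '$user' can run the workflows.")
      }
      fs.close()
    }
  }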
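For the second item changed above (the example workflows expect their input files in the logged-in user's HDFS home directory), a similar sketch copies the data directory that comes with Sparkflows onto HDFS with FileSystem.copyFromLocalFile. The local path /opt/sparkflows/data is only a placeholder for the actual installation location; uploading from the command line, for example hdfs dfs -put data /user/john, achieves the same thing::

  import org.apache.hadoop.conf.Configuration
  import org.apache.hadoop.fs.{FileSystem, Path}

  object UploadSparkflowsData {
    def main(args: Array[String]): Unit = {
      val conf = new Configuration()
      val fs   = FileSystem.get(conf)

      // Placeholder for the data directory that ships with Sparkflows;
      // adjust it to the actual installation path.
      val localData = new Path("file:///opt/sparkflows/data")

      // Destination inside the logged-in user's HDFS home directory,
      // e.g. /user/john/data when logged in as 'john'.
      val hdfsData = new Path(fs.getHomeDirectory, "data")

      if (!fs.exists(hdfsData)) {
        // delSrc = false, overwrite = false: keep the local copy, never clobber HDFS.
        fs.copyFromLocalFile(false, false, localData, hdfsData)
        println(s"Uploaded $localData to $hdfsData")
      } else {
        println(s"$hdfsData already exists; the example workflows can read it.")
      }
      fs.close()
    }
  }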
