
point at spark master? #54

Open · dotnwat opened this issue Feb 29, 2016 · 5 comments

Comments

dotnwat commented Feb 29, 2016

This is probably a really simple question. How can I point bin/run to a running Spark cluster?

@GuanhuaWang

I also have the same question. It seems that this can only run on localhost.

@gaohannk

Can someone provide a tutorial for running this on Spark? And how do you import TPC-DS?

@GuanhuaWang

Yeah, definitely

@hchawla1

Go to the directory spark-sql-perf-master/src/main/scala/com/databricks/spark/sql/perf and, in the file RunBenchmark.scala, change .setMaster("local[*]") to point to your master. Hopefully this should work.
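
For illustration, the edited lines would look roughly like this (a minimal sketch, not the project's actual code; spark://your-master-host:7077 is a placeholder for your cluster's master URL):

import org.apache.spark.{SparkConf, SparkContext}

// Before: .setMaster("local[*]") runs everything in-process on one machine.
// After: point at your standalone master (its URL is shown on the master web UI, port 8080).
val conf = new SparkConf()
  .setAppName("spark-sql-perf")
  .setMaster("spark://your-master-host:7077") // placeholder host and port
val sc = new SparkContext(conf)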

@madhugithub2014

Local:

val conf = new SparkConf().setAppName("appname").setMaster("local[*]")
val sc = new SparkContext(conf)

Cluster mode:

Start the Spark master and, if needed, the workers (e.g. with start-all.sh in SPARK_HOME/sbin).

By default the master web UI at http://localhost:8080 shows the Spark master URL.

val conf = new SparkConf().setAppName("appname").setMaster("spark://localhost:7077")
val sc = new SparkContext(conf)

I think this will help you run your example.
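
As an aside (standard Spark practice, not something stated in this thread): you can also leave the master out of the code entirely and pass it at launch time, so the same build runs locally or on a cluster:

// Master deliberately not hardcoded; supply it when submitting, e.g.:
//   spark-submit --master spark://localhost:7077 --class <MainClass> <app.jar>
val conf = new SparkConf().setAppName("appname")
val sc = new SparkContext(conf)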
