Spark

Experiments with Spark.

Install Spark

/!\ Spark only works on Linux / macOS.

  1. Download Spark. I used Spark 2.4.5.
  2. Extract the Spark archive on your computer.
  3. You'll need Java installed. Careful: Spark 2.4.5 runs on Java 8. If you have a newer version of Java installed, you can follow this thread to solve the problem.
  4. Go to your Spark folder and run ./spark-2.4.5-bin-hadoop2.7/bin/spark-submit --help to make sure Spark is installed.
  5. Install pyspark:
pip install pyspark
  6. Run the scripts and have fun (a minimal sketch of such a script is shown below):
./spark-2.4.5-bin-hadoop2.7/bin/spark-submit --master local[4] ./Spark/WordCount/wordcount.py
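The repository's wordcount.py is not reproduced here, but a classic PySpark word count along those lines could look like the sketch below. The INPUT_PATH value and the exact output format are illustrative assumptions, not the actual contents of ./Spark/WordCount/wordcount.py.

    # A minimal PySpark word count, runnable with the spark-submit command above.
    # INPUT_PATH is a placeholder assumption; the real script may read a different file.
    from operator import add

    from pyspark.sql import SparkSession

    INPUT_PATH = "./Spark/WordCount/input.txt"  # placeholder, adjust to your data

    if __name__ == "__main__":
        # spark-submit supplies the master configuration (e.g. --master local[4]).
        spark = SparkSession.builder.appName("WordCount").getOrCreate()

        # Read the text file as an RDD of lines.
        lines = spark.read.text(INPUT_PATH).rdd.map(lambda row: row[0])

        # Split each line into words, pair every word with 1, and sum per word.
        counts = (lines.flatMap(lambda line: line.split())
                       .map(lambda word: (word, 1))
                       .reduceByKey(add))

        # Print every (word, count) pair.
        for word, count in counts.collect():
            print("%s: %i" % (word, count))

        spark.stop()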
