Adding Spark 2.0.1 to the demo (#429)
Showing 15 changed files with 283 additions and 34 deletions.
genie-demo/src/main/docker/apache/files/applications/spark/2.0.1/setup.sh: 32 additions, 0 deletions
#!/bin/bash

set -o errexit -o nounset -o pipefail

start_dir=`pwd`
cd `dirname ${BASH_SOURCE[0]}`
SPARK_BASE=`pwd`
cd $start_dir

export SPARK_DAEMON_JAVA_OPTS="-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps"

SPARK_DEPS=${SPARK_BASE}/dependencies

export SPARK_VERSION="2.0.1"

tar xzf ${SPARK_DEPS}/spark-${SPARK_VERSION}-bin-hadoop2.7.tgz -C ${SPARK_DEPS}

# Set the required environment variables.
export SPARK_HOME=${SPARK_DEPS}/spark-${SPARK_VERSION}-bin-hadoop2.7
export SPARK_CONF_DIR=${SPARK_HOME}/conf
export SPARK_LOG_DIR=${GENIE_JOB_DIR}
export SPARK_LOG_FILE=spark.log
export SPARK_LOG_FILE_PATH=${GENIE_JOB_DIR}/${SPARK_LOG_FILE}
export CURRENT_JOB_WORKING_DIR=${GENIE_JOB_DIR}
export CURRENT_JOB_TMP_DIR=${CURRENT_JOB_WORKING_DIR}/tmp

# Make sure the Spark scripts are on the PATH
export PATH=$PATH:${SPARK_HOME}/bin

# Delete the tarball to save space
rm ${SPARK_DEPS}/spark-${SPARK_VERSION}-bin-hadoop2.7.tgz
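To see what this setup file accomplishes at job-launch time, here is a rough Python re-enactment of its extract-export-clean-up flow, using temp-dir stand-ins for `${SPARK_BASE}/dependencies` and `${GENIE_JOB_DIR}` and a fake tarball in place of the real Spark distribution. This is an illustration of the mechanics only, not part of the commit:

```python
import os
import shutil
import tarfile
import tempfile

SPARK_VERSION = "2.0.1"
dist = "spark-" + SPARK_VERSION + "-bin-hadoop2.7"

# Temp-dir stand-ins for ${SPARK_BASE}/dependencies and ${GENIE_JOB_DIR}.
spark_deps = tempfile.mkdtemp()
genie_job_dir = tempfile.mkdtemp()

# Fake the downloaded dependency tarball; the real one holds the Spark distro.
os.makedirs(os.path.join(spark_deps, dist, "bin"))
tgz = os.path.join(spark_deps, dist + ".tgz")
with tarfile.open(tgz, "w:gz") as tf:
    tf.add(os.path.join(spark_deps, dist), arcname=dist)
shutil.rmtree(os.path.join(spark_deps, dist))

# The same steps as setup.sh: extract, export, add to PATH, delete the tarball.
with tarfile.open(tgz, "r:gz") as tf:
    tf.extractall(spark_deps)
os.environ["SPARK_HOME"] = os.path.join(spark_deps, dist)
os.environ["SPARK_CONF_DIR"] = os.path.join(os.environ["SPARK_HOME"], "conf")
os.environ["SPARK_LOG_DIR"] = genie_job_dir
os.environ["PATH"] += os.pathsep + os.path.join(os.environ["SPARK_HOME"], "bin")
os.remove(tgz)

print(os.environ["SPARK_HOME"])
```

Because the setup file is sourced into the job's environment, everything it exports (notably `SPARK_HOME` and the extended `PATH`) is visible to the command executable that runs afterwards.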
genie-demo/src/main/docker/apache/files/commands/spark/2.0.1/setupShell.sh: 6 additions, 0 deletions
#!/bin/bash

set -o errexit -o nounset -o pipefail

# copy hive-site.xml configuration
#cp ${GENIE_COMMAND_DIR}/config/* ${SPARK_CONF_DIR}
genie-demo/src/main/docker/apache/files/commands/spark/2.0.1/setupSubmit.sh: 6 additions, 0 deletions
#!/bin/bash

set -o errexit -o nounset -o pipefail

# copy hive-site.xml configuration
#cp ${GENIE_COMMAND_DIR}/config/* ${SPARK_CONF_DIR}
genie-demo/src/main/docker/client/example/applications/spark201.yml: 11 additions, 0 deletions
id: spark201
name: spark
user: genieDemo
status: ACTIVE
description: Spark Application
setupFile: http://genie-apache/applications/spark/2.0.1/setup.sh
version: 2.0.1
type: spark
tags: ['type:spark', 'ver:2.0.1', 'ver:2.0']
dependencies:
  - http://genie-apache/applications/spark/2.0.1/spark-2.0.1-bin-hadoop2.7.tgz
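As a sketch of how a client would present this application metadata to Genie, the same fields can be built as a dict and serialized to JSON. Field names mirror the YAML above; the registration endpoint itself is not shown here, and this snippet is illustrative rather than part of the commit:

```python
import json

# Same metadata as spark201.yml, expressed as the JSON a client would send.
spark_app = {
    "id": "spark201",
    "name": "spark",
    "user": "genieDemo",
    "status": "ACTIVE",
    "description": "Spark Application",
    "setupFile": "http://genie-apache/applications/spark/2.0.1/setup.sh",
    "version": "2.0.1",
    "type": "spark",
    "tags": ["type:spark", "ver:2.0.1", "ver:2.0"],
    "dependencies": [
        "http://genie-apache/applications/spark/2.0.1/spark-2.0.1-bin-hadoop2.7.tgz"
    ],
}

payload = json.dumps(spark_app)
print(json.loads(payload)["id"])
```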
genie-demo/src/main/docker/client/example/commands/sparkShell201.yml: 11 additions, 0 deletions
id: sparkshell201
name: Spark Shell
user: genieDemo
description: Spark Shell Command
status: ACTIVE
setupFile: http://genie-apache/commands/spark/2.0.1/setupShell.sh
configs: []
executable: ${SPARK_HOME}/bin/spark-shell
version: 2.0.1
tags: ['type:spark-shell', 'ver:2.0.1']
checkDelay: 5000
genie-demo/src/main/docker/client/example/commands/sparkSubmit201.yml: 11 additions, 0 deletions
id: sparksubmit201
name: Spark Submit
user: genieDemo
description: Spark Submit Command
status: ACTIVE
setupFile: http://genie-apache/commands/spark/2.0.1/setupSubmit.sh
configs: []
executable: ${SPARK_HOME}/bin/spark-submit --master yarn --deploy-mode client
version: 2.0.1
tags: ['type:spark-submit', 'ver:2.0.1']
checkDelay: 5000
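The tags on these commands are what the demo jobs match against. Roughly, a job names criterion tags and a command (or cluster) qualifies when its tag set contains all of them; a minimal sketch of that subset matching, as an illustration rather than Genie's actual resolution code:

```python
def matches(criterion_tags, resource_tags):
    """A resource satisfies a criterion when it carries every requested tag."""
    return set(criterion_tags) <= set(resource_tags)

# Tag sets from the two command YAML files above.
commands = {
    "sparkshell201": ["type:spark-shell", "ver:2.0.1"],
    "sparksubmit201": ["type:spark-submit", "ver:2.0.1"],
}

# A job asking for ['type:spark-shell'] resolves to sparkshell201 only.
hits = [cid for cid, tags in commands.items()
        if matches(["type:spark-shell"], tags)]
print(hits)
```

This is why the demo can register several Spark versions side by side: a job can pin `ver:2.0.1` exactly or match the broader `type:` tag alone.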
genie-demo/src/main/docker/client/example/run_spark_shell_job.py: 64 additions, 0 deletions
#!/usr/bin/python2.7

# Copyright 2016 Netflix, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

##################################################################################
# This script assumes setup.py has already been run to configure Genie and that
# this script is executed on the host where Genie is running. If it's executed on
# another host, change the Genie URL below.
##################################################################################

from __future__ import absolute_import, division, print_function, unicode_literals

import logging
import sys

import pygenie

logging.basicConfig(level=logging.ERROR)

LOGGER = logging.getLogger(__name__)

pygenie.conf.DEFAULT_GENIE_URL = "http://genie:8080"

# Create a job instance and fill in the required parameters
job = pygenie.jobs.GenieJob() \
    .job_name('Genie Demo Spark Shell Job') \
    .genie_username('root') \
    .job_version('3.0.0')

# Set the cluster criteria, which determine the cluster to run the job on
job.cluster_tags(['sched:' + str(sys.argv[1]), 'type:yarn'])

# Set the command criteria, which determine what command Genie executes for the job
job.command_tags(['type:spark-shell'])

# Any command line arguments to run along with the command. These could also be
# supplied via an attachment or file dependency.
job.command_arguments(
    "--help"
)

# Submit the job to Genie
running_job = job.execute()

print('Job {} is {}'.format(running_job.job_id, running_job.status))
print(running_job.job_link)

# Block and wait until the job is done
running_job.wait()

print('Job {} finished with status {}'.format(running_job.job_id, running_job.status))
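The script's only positional argument is the scheduler type, which it folds into the cluster criterion tag. A minimal sketch of that mapping (the tag names follow the script above; the 'sla' value is just an example argument):

```python
def build_criteria(sched_arg):
    # Mirrors run_spark_shell_job.py: the CLI argument becomes part of the
    # cluster criterion, while the command criterion is fixed to spark-shell.
    cluster_tags = ['sched:' + str(sched_arg), 'type:yarn']
    command_tags = ['type:spark-shell']
    return cluster_tags, command_tags

cluster_tags, command_tags = build_criteria('sla')
print(cluster_tags)
```

Running the demo script itself would look like `python run_spark_shell_job.py sla` from the client container, assuming a cluster tagged `sched:sla` was registered during setup.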