
Add option to download Spark from a custom URL #101

Closed
rmessner opened this issue Apr 6, 2016 · 7 comments · Fixed by #125

Comments

@rmessner
Contributor

rmessner commented Apr 6, 2016

We are trying to set up a cluster with Spark 1.6.1, without HDFS, but it fails.
The reason is that the downloaded file is corrupted, so we can't untar it.

It would be a nice feature to allow users to specify an alternate mirror from which to download the pre-built Spark package. This would add a new key to the configuration, like:

services:
  spark:
    version: 1.6.0
    preferred-mirror: http://apache.crihan.fr/dist/spark/spark-${spark_version}/${file}

The default value would be https://s3.amazonaws.com/spark-related-packages/${file}. The available variables would be file (the file name of the pre-built Spark package, which is the same on every mirror), spark_version (e.g. 1.6.1), and distribution (e.g. hadoop2.6).
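The variable substitution described above could be sketched roughly like this. This is a minimal Python sketch, not Flintrock's actual implementation; the helper name, the file-name pattern, and the default mirror constant are assumptions based on the proposal:

```python
from string import Template

# Assumed default mirror, per the proposal above.
DEFAULT_MIRROR = "https://s3.amazonaws.com/spark-related-packages/${file}"


def build_spark_download_url(
        spark_version: str,
        distribution: str,
        preferred_mirror: str = DEFAULT_MIRROR) -> str:
    """Expand ${file}, ${spark_version}, and ${distribution} in the mirror URL."""
    # The pre-built package file name is the same on every mirror,
    # so it can be derived from the version and distribution.
    file_name = f"spark-{spark_version}-bin-{distribution}.tgz"
    return Template(preferred_mirror).substitute(
        file=file_name,
        spark_version=spark_version,
        distribution=distribution,
    )


# Using the mirror from the config snippet above:
print(build_spark_download_url(
    "1.6.0", "hadoop2.6",
    "http://apache.crihan.fr/dist/spark/spark-${spark_version}/${file}"))
```

With the preferred mirror set as in the config snippet, this would expand to http://apache.crihan.fr/dist/spark/spark-1.6.0/spark-1.6.0-bin-hadoop2.6.tgz.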

Related to #71.

@nchammas
Owner

nchammas commented Apr 6, 2016

Thanks for reporting this. I was just thinking about it earlier this week.

The problem you are seeing with Spark 1.6.1 is a known issue upstream that I've been bugging the core Spark maintainers about for a few weeks now. 😞

The Spark 1.6.1 package on S3 that Flintrock uses is corrupt, and unfortunately we can't fix it ourselves. That resource is controlled by the Spark maintainers.

For the record, your proposal here is the Spark analogue to #71, which is for Hadoop.

@rmessner
Contributor Author

rmessner commented Apr 6, 2016

My co-workers just told me they have the same issue, so I will make the same fix that I submitted in #104, if that's okay with you @nchammas.

@nchammas
Owner

nchammas commented Apr 6, 2016

You mean you'll open a PR for #71 that's similar to #104?

@rmessner
Contributor Author

rmessner commented Apr 6, 2016

Yes, or I can make it in the same PR, whichever you prefer.

@nchammas
Owner

nchammas commented Apr 6, 2016

Oh sure, it can be the same PR.

@nchammas nchammas changed the title [SPARK] - Spark file corrupt on S3 ( unable to untar ) Spark file corrupt on S3 (unable to untar) Apr 9, 2016
@JoshRosen

AFAIK the corrupt packages should be fixed now; let me know if they're still a problem.

@nchammas
Owner

Yes, I believe they're fixed now. Thanks for taking care of that @JoshRosen!

Retitling issue accordingly.

@nchammas nchammas changed the title Spark file corrupt on S3 (unable to untar) Add option to download Spark from a custom URL Apr 27, 2016