SPARK-1121: Include avro for yarn-alpha builds #49

Status: Closed · wants to merge 1 commit

Conversation

3 participants
pwendell (Contributor) commented Feb 28, 2014

This lets us explicitly include Avro based on a profile for 0.23.X
builds. It makes me sad how convoluted it is to express this logic
in Maven. @tgraves and @sryza curious if this works for you.

I'm also considering just reverting to how it was before. The only
real problem was that Spark advertised a dependency on Avro
even though it only really depends transitively on Avro through
other deps.
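For reference, the profile-based approach described above can be sketched as a Maven fragment. This is an illustrative sketch only; the exact module placement and whether the Avro version is pinned here or inherited from dependency management are assumptions, not details taken from the actual diff:

```xml
<!-- Illustrative sketch (not the actual patch): activate an explicit
     Avro dependency only for Hadoop 0.23.x (yarn-alpha) builds. -->
<profile>
  <id>yarn-alpha</id>
  <dependencies>
    <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro</artifactId>
      <!-- version assumed to come from <dependencyManagement> -->
    </dependency>
  </dependencies>
</profile>
```

Building with `-Pyarn-alpha` then pulls Avro onto the classpath explicitly, while other profiles continue to receive it only transitively.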

AmplabJenkins commented Feb 28, 2014

Merged build triggered.

AmplabJenkins commented Feb 28, 2014

Merged build started.

AmplabJenkins commented Feb 28, 2014

Merged build triggered.

AmplabJenkins commented Feb 28, 2014

Merged build finished.

AmplabJenkins commented Feb 28, 2014

All automated tests passed.
Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/12933/

tgravescs (Contributor) commented Feb 28, 2014

I'll give it a try. Any reason we don't just tie this to the yarn-alpha profile? Or does it not apply to the hadoop 2.0.2 type builds?

AmplabJenkins commented Mar 1, 2014

Merged build triggered.

AmplabJenkins commented Mar 1, 2014

Merged build started.

pwendell (Contributor) commented Mar 1, 2014

@tgravescs, your suggestion of tying this to yarn-alpha is strictly better, I think. Just updated the patch.

SPARK-1121: Add avro to yarn-alpha profile
This lets us explicitly include Avro based on a profile for 0.23.X
builds. It makes me sad how convoluted it is to express this logic
in Maven.
AmplabJenkins commented Mar 1, 2014

Merged build finished.

AmplabJenkins commented Mar 1, 2014

One or more automated tests failed.
Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/12939/

AmplabJenkins commented Mar 1, 2014

Merged build triggered.

AmplabJenkins commented Mar 1, 2014

Merged build started.

AmplabJenkins commented Mar 1, 2014

Merged build finished.

AmplabJenkins commented Mar 1, 2014

One or more automated tests failed.
Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/12940/

pwendell (Contributor) commented Mar 1, 2014

Jenkins, test this please.

AmplabJenkins commented Mar 1, 2014

Merged build triggered.

AmplabJenkins commented Mar 1, 2014

Merged build started.

AmplabJenkins commented Mar 1, 2014

Merged build finished.

AmplabJenkins commented Mar 1, 2014

All automated tests passed.
Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/12943/

pwendell (Contributor) commented Mar 2, 2014

Hey guys, I tried building for YARN with 2.2.0 and 0.23.9 and it seems to be working. I'd like to get this in as a build fix, but please re-open SPARK-1121 if you have any issues and we can patch it up.

```shell
# Yarn alpha build
mvn -Dhadoop.version=0.23.9 -Dyarn.version=0.23.9 -DskipTests clean package -Pyarn-alpha
# Yarn build
mvn -Dhadoop.version=2.2.0 -Dyarn.version=2.2.0 -DskipTests clean package -Pyarn
```

asfgit closed this in c3f5e07 on Mar 3, 2014

jhartlaub referenced this pull request in jhartlaub/spark May 27, 2014

Merge pull request #49 from mateiz/kryo-fix-2
Fix Chill serialization of Range objects

It used to write out each element one by one, creating very large objects.

(cherry picked from commit 320418f)
Signed-off-by: Reynold Xin <rxin@apache.org>

CrazyJvm added a commit to CrazyJvm/spark that referenced this pull request Jun 1, 2014

SPARK-1121: Include avro for yarn-alpha builds
This lets us explicitly include Avro based on a profile for 0.23.X
builds. It makes me sad how convoluted it is to express this logic
in Maven. @tgraves and @sryza curious if this works for you.

I'm also considering just reverting to how it was before. The only
real problem was that Spark advertised a dependency on Avro
even though it only really depends transitively on Avro through
other deps.

Author: Patrick Wendell <pwendell@gmail.com>

Closes #49 from pwendell/avro-build-fix and squashes the following commits:

8d6ee92 [Patrick Wendell] SPARK-1121: Add avro to yarn-alpha profile

gzm55 pushed a commit to MediaV/spark that referenced this pull request Jul 17, 2014

SPARK-1121: Include avro for yarn-alpha builds

JasonMWhite pushed a commit to JasonMWhite/spark that referenced this pull request Dec 2, 2015

Merge pull request #49 from Shopify/use_releases_path
use release path for uploading jar, not current

clockfly added a commit to clockfly/spark that referenced this pull request Aug 30, 2016

[SC-3909][SQL] Closure translation part 1: Byte code parser
## What changes were proposed in this pull request?

This PR adds the byte code parser for the closure translation optimization. It translates a Java closure, such as a MapFunction or FilterFunction, into an intermediate format: a `Node` tree.

For example, closure `(v: Int) => { v > 0 }` will be translated to `Node` tree:

```
Arithmetic[Z](>)
  Argument[I]
  Constant[I](0)
```

## How was this patch tested?

Unit test.

Author: Sean Zhong <seanzhong@databricks.com>

Closes #49 from clockfly/closure_parser_part1.

ash211 pushed a commit to ash211/spark that referenced this pull request Jan 31, 2017

Access the Driver Launcher Server over NodePort for app launch + submit jars (#30)

* Revamp ports and service setup for the driver.

- Expose the driver-submission service on NodePort and contact that as
opposed to going through the API server proxy
- Restrict the ports that are exposed on the service to only the driver
submission service when uploading content and then only the Spark UI
after the job has started

* Move service creation down and more thorough error handling

* Fix missed merge conflict

* Add braces

* Fix bad merge

* Address comments and refactor run() more.

Method nesting was getting confusing so pulled out the inner class and
removed the extra method indirection from createDriverPod()

* Remove unused method

* Support SSL configuration for the driver application submission (#49)

* Support SSL when setting up the driver.

The user can provide a keyStore to load onto the driver pod and the
driver pod will use that keyStore to set up SSL on its server.

* Clean up SSL secrets after finishing submission.

We don't need to persist these after the pod has them mounted and is
running already.

* Fix compilation error

* Revert image change

* Address comments

* Programmatically generate certificates for integration tests.

* Address comments

* Resolve merge conflicts

* Fix bad merge

* Remove unnecessary braces

* Fix compiler error

lins05 pushed a commit to lins05/spark that referenced this pull request Apr 23, 2017

Access the Driver Launcher Server over NodePort for app launch + submit jars (#30)

erikerlandson pushed a commit to erikerlandson/spark that referenced this pull request Jul 28, 2017

Access the Driver Launcher Server over NodePort for app launch + submit jars (#30)