Randomize ports and turn off Spark UI to reduce bind exceptions in tests #557

Merged
merged 1 commit into bigdatagenomics:master from massie/fix-bind-errors on Jan 25, 2015

@massie
Member

massie commented Jan 25, 2015

No description provided.
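The title describes a common fix for port clashes in Spark test suites: disable the web UI entirely and let each test JVM use its own ports. A minimal sketch of that idea using standard Spark configuration properties (spark.ui.enabled, spark.driver.port); whether the commit sets exactly these properties is an assumption, not a reading of the diff:

```scala
import scala.util.Random
import org.apache.spark.{SparkConf, SparkContext}

// Sketch of a test-only SparkContext factory. The property names are
// standard Spark settings; the exact values chosen here are illustrative.
object TestSparkContextFactory {
  def create(appName: String): SparkContext = {
    val conf = new SparkConf(loadDefaults = false)
      .setAppName(appName)
      .setMaster("local[4]")
      // Don't start the web UI at all, so nothing tries to bind port 4040.
      .set("spark.ui.enabled", "false")
      // Use a randomized driver port so concurrent test JVMs don't collide.
      .set("spark.driver.port", (30000 + Random.nextInt(10000)).toString)
    new SparkContext(conf)
  }
}
```

With spark.ui.enabled set to false, the Jetty server behind the UI is never started, which removes the port 4040 bind attempt visible in the Jenkins log later in this thread.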

@fnothaft
Member

fnothaft commented Jan 25, 2015

Nice!

@AmplabJenkins

AmplabJenkins commented Jan 25, 2015

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/ADAM-prb/556/
Test PASSed.

@fnothaft
Member

fnothaft commented Jan 25, 2015

Looking in the logs, it looks like we're still getting bind exceptions:


2015-01-25 11:49:51 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-01-25 11:49:52 WARN  AbstractLifeCycle:204 - FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:444)
    at sun.nio.ch.Net.bind(Net.java:436)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
    at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
    at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
    at org.eclipse.jetty.server.Server.doStart(Server.java:293)
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
    at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:194)
    at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:204)
    at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:204)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1676)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1667)

Thoughts?


@massie
Member

massie commented Jan 25, 2015

The Features2ADAMSuite test class needs to use SparkFunSuite to fix the error about multiple Spark contexts in a single JVM.

The ./scripts/jenkins-tests script needs to be updated to either (a) take an exclusive lock that prevents multiple scripts from running at once or (b) randomize the ports that are used for the tests.

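Option (a) is essentially what flock provides in a shell script. Since jenkins-tests itself is shell, the snippet below only illustrates the locking idea in JVM terms; the lock-file path is illustrative, not part of the actual proposal:

```scala
import java.nio.channels.FileChannel
import java.nio.file.{Paths, StandardOpenOption}

// Conceptual sketch of option (a): hold an OS-level exclusive lock so only
// one test run proceeds at a time on a given machine.
object ExclusiveTestLock {
  def withLock[T](body: => T): T = {
    val channel = FileChannel.open(
      Paths.get("/tmp/adam-jenkins-tests.lock"),
      StandardOpenOption.CREATE,
      StandardOpenOption.WRITE)
    val lock = channel.lock() // blocks until any other holder releases it
    try body
    finally {
      lock.release()
      channel.close()
    }
  }
}
```

A run would then wrap its suite execution in ExclusiveTestLock.withLock { ... }, serializing concurrent Jenkins jobs instead of letting them race for the same ports.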

@massie
Member

massie commented Jan 25, 2015

This PR is step one: it fixes the bind exceptions thrown in the SparkFunSuite. I'll update the Jenkins script in a subsequent PR.


@fnothaft
Member

fnothaft commented Jan 25, 2015

Sounds good, thanks for the clarification @massie!

> The Features2ADAMSuite test class needs to use SparkFunSuite to fix the error about multiple Spark contexts in a single JVM.

Ah, makes sense. And this is because the Features2ADAMSuite doesn't create a SparkContext, but Features2ADAM.run() does. I'll open a PR to fix that.

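The follow-up fnothaft describes would look roughly like the sketch below. It assumes ADAM's SparkFunSuite trait supplies a managed SparkContext through a sparkTest helper; the import path, helper name, and the Features2ADAM call shown in the comment are assumptions, not the actual patch:

```scala
// Hypothetical sketch only: assumes SparkFunSuite exposes `sparkTest`
// and a shared `sc` that the suite creates and tears down.
import org.bdgenomics.adam.util.SparkFunSuite

class Features2ADAMSuite extends SparkFunSuite {
  sparkTest("converts features without creating a second SparkContext") {
    // Run the conversion against the suite-managed `sc` instead of letting
    // Features2ADAM.run() build its own context in the same JVM, e.g.:
    //   new Features2ADAM(args).run(sc)   // illustrative call, not the real API
  }
}
```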

fnothaft added a commit that referenced this pull request Jan 25, 2015

Merge pull request #557 from massie/fix-bind-errors
Randomize ports and turn off Spark UI to reduce bind exceptions in tests

@fnothaft fnothaft merged commit 479d416 into bigdatagenomics:master Jan 25, 2015

1 check passed

default: Merged build finished.
@fnothaft
Member

fnothaft commented Jan 25, 2015

Merged! Thanks @massie!

