
[jvm-packages] update xgboost4j cross build script to be compatible with older glibc #3307

Merged

merged 11 commits into dmlc:master on May 10, 2018

Conversation

CodingCat
Member

No description provided.

@CodingCat
Member Author

@dbtsai you can check this; I have uploaded the version compiled with GLIBC 2.12 and tested it in a 2.19 environment.
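One way to verify this kind of claim independently is to list the versioned GLIBC symbol references inside the bundled native library; the highest tag is the minimum glibc the runtime must provide. A minimal sketch — the `.so` path is hypothetical, and the `printf` line simulates `objdump` output so the filtering step is visible:

```shell
# On a real jar you would run (path is hypothetical):
#   objdump -T libxgboost4j.so | grep -o 'GLIBC_[0-9.]*' | sort -Vu
# Here we simulate the symbol-version tags objdump would emit,
# then version-sort them and keep the highest one:
printf 'GLIBC_2.2.5\nGLIBC_2.12\nGLIBC_2.23\n' | sort -Vu | tail -n1
```

If the last line printed is at or below the cluster's glibc version, the library should load.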

@codecov-io

codecov-io commented May 10, 2018

Codecov Report

Merging #3307 into master will not change coverage.
The diff coverage is n/a.

Impacted file tree graph

@@            Coverage Diff            @@
##             master    #3307   +/-   ##
=========================================
  Coverage     45.52%   45.52%           
  Complexity      228      228           
=========================================
  Files           166      166           
  Lines         12970    12970           
  Branches        466      466           
=========================================
  Hits           5904     5904           
  Misses         6874     6874           
  Partials        192      192

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 9a8211f...7eee733. Read the comment docs.

@CodingCat CodingCat merged commit 49b9f39 into dmlc:master May 10, 2018
@dbtsai

dbtsai commented May 11, 2018

+@lindblombr

Thanks @CodingCat for this PR. How can I test whether it runs in our older Linux runtime? Will you cut a release or a snapshot?

Thanks.

@CodingCat
Member Author

CodingCat commented May 11, 2018

you just need to follow the README.md in jvm-packages; I have pushed the latest jar to the GitHub-based repo, so you should be able to download it @dbtsai

@CodingCat CodingCat deleted the static_lib_c branch May 15, 2018 15:56
@lev112

lev112 commented Jun 12, 2018

@CodingCat I'm using the published xgboost 0.72 and I'm getting the following error when running on my Hadoop cluster:

/lib64/libm.so.6: version 'GLIBC_2.23' not found

the version of glibc on my cluster is 2.17

Wasn't this PR supposed to solve this issue? Should 2.17 work?

thanks

the full stack trace:

Exception in thread "SparkListenerBus" java.lang.InterruptedException: ExecutorLost during XGBoost Training: java.lang.UnsatisfiedLinkError: /ssd/yarn/nm/usercache/it-research/appcache/application_1527022184221_1205144/container_e18_1527022184221_1205144_01_000010/tmp/libxgboost4j3688032372517259873.so: /lib64/libm.so.6: version `GLIBC_2.23' not found (required by /ssd/yarn/nm/usercache/it-research/appcache/application_1527022184221_1205144/container_e18_1527022184221_1205144_01_000010/tmp/libxgboost4j3688032372517259873.so)
	at java.lang.ClassLoader$NativeLibrary.load(Native Method)
	at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
	at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
	at java.lang.Runtime.load0(Runtime.java:809)
	at java.lang.System.load(System.java:1086)
	at ml.dmlc.xgboost4j.java.NativeLibLoader.loadLibraryFromJar(NativeLibLoader.java:66)
	at ml.dmlc.xgboost4j.java.NativeLibLoader.smartLoad(NativeLibLoader.java:152)
	at ml.dmlc.xgboost4j.java.NativeLibLoader.initXGBoost(NativeLibLoader.java:40)
	at ml.dmlc.xgboost4j.java.XGBoostJNI.<clinit>(XGBoostJNI.java:34)
	at ml.dmlc.xgboost4j.java.Rabit.init(Rabit.java:65)
	at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:131)
	at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:117)
	at org.apache.spark.rdd.ZippedPartitionsRDD2.compute(ZippedPartitionsRDD.scala:89)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
	at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
	at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1038)
	at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1029)
	at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:969)
	at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1029)
	at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:760)
	at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

	at org.apache.spark.TaskFailedListener.onTaskEnd(SparkParallelismTracker.scala:116)
	at org.apache.spark.scheduler.SparkListenerBus$class.doPostEvent(SparkListenerBus.scala:45)
	at org.apache.spark.scheduler.LiveListenerBus.doPostEvent(LiveListenerBus.scala:36)
	at org.apache.spark.scheduler.LiveListenerBus.doPostEvent(LiveListenerBus.scala:36)
	at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:63)
	at org.apache.spark.scheduler.LiveListenerBus.postToAll(LiveListenerBus.scala:36)
	at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(LiveListenerBus.scala:94)
	at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
	at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
	at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:78)
	at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1279)
	at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:77)
java.net.ConnectException: Connection refused (Connection refused)
	at java.net.PlainSocketImpl.socketConnect(Native Method)
	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
	at java.net.Socket.connect(Socket.java:589)
	at java.net.Socket.connect(Socket.java:538)
	at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
	at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
	at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
	at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
	at sun.net.www.http.HttpClient.New(HttpClient.java:308)
	at sun.net.www.http.HttpClient.New(HttpClient.java:326)
	at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1202)
	at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1138)
	at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:1032)
	at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:966)
	at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1546)
	at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1474)
	at java.net.URL.openStream(URL.java:1045)
	at org.codehaus.jackson.JsonFactory._optimizedStreamFromURL(JsonFactory.java:935)
	at org.codehaus.jackson.JsonFactory.createJsonParser(JsonFactory.java:530)
	at org.codehaus.jackson.map.ObjectMapper.readTree(ObjectMapper.java:1590)
	at org.apache.spark.SparkParallelismTracker.org$apache$spark$SparkParallelismTracker$$numAliveCores(SparkParallelismTracker.scala:53)
	at org.apache.spark.SparkParallelismTracker$$anonfun$execute$1.apply$mcZ$sp(SparkParallelismTracker.scala:102)
	at org.apache.spark.SparkParallelismTracker$$anonfun$1.apply$mcV$sp(SparkParallelismTracker.scala:71)
	at org.apache.spark.SparkParallelismTracker$$anonfun$1.apply(SparkParallelismTracker.scala:71)
	at org.apache.spark.SparkParallelismTracker$$anonfun$1.apply(SparkParallelismTracker.scala:71)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
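The failure above can be reasoned about mechanically: a native library loads only if every `GLIBC_x.y` version it references is at or below the runtime's glibc version, so a jar built against GLIBC 2.23 cannot load on a 2.17 cluster. A small sketch of that check — the helper names are hypothetical, not part of xgboost4j:

```python
def parse_glibc(tag: str) -> tuple:
    """Turn a symbol version tag like 'GLIBC_2.23' into a comparable tuple (2, 23)."""
    return tuple(int(part) for part in tag.split("_")[1].split("."))

def is_loadable(required_tags, runtime_glibc):
    """The library loads only if the highest GLIBC version it references
    does not exceed the runtime's glibc version."""
    return max(parse_glibc(t) for t in required_tags) <= runtime_glibc

# Tags matching this thread: the jar references GLIBC_2.23,
# while the cluster above runs glibc 2.17.
tags = ["GLIBC_2.12", "GLIBC_2.23"]
print(is_loadable(tags, (2, 17)))  # False: 'GLIBC_2.23' not found
print(is_loadable(tags, (2, 23)))  # True
```

This is why the error names `GLIBC_2.23` specifically: it is the highest versioned symbol the `.so` requires, and 2.17 falls short of it.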

@lock lock bot locked as resolved and limited conversation to collaborators Jan 18, 2019