Before adding the jars, my setup was the same as the original poster's. After adding the jars, however, I got the error below. Could you help me figure out what went wrong?
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/08/02 16:56:21 INFO SparkContext: Running Spark version 2.3.0-SNAPSHOT
17/08/02 16:56:22 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/08/02 16:56:23 WARN Utils: Your hostname, hjw-XiaoXin-700 resolves to a loopback address: 127.0.1.1; using 192.168.191.4 instead (on interface wlp2s0)
17/08/02 16:56:23 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
17/08/02 16:56:23 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: A master URL must be set in your configuration
at org.apache.spark.SparkContext.<init>(SparkContext.scala:368)
at org.apache.spark.examples.LogQuery$.main(LogQuery.scala:47)
at org.apache.spark.examples.LogQuery.main(LogQuery.scala)
17/08/02 16:56:23 ERROR Utils: Uncaught exception in thread main
java.lang.NullPointerException
at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$postApplicationEnd(SparkContext.scala:2401)
at org.apache.spark.SparkContext$$anonfun$stop$1.apply$mcV$sp(SparkContext.scala:1890)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1331)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1889)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:581)
at org.apache.spark.examples.LogQuery$.main(LogQuery.scala:47)
at org.apache.spark.examples.LogQuery.main(LogQuery.scala)
17/08/02 16:56:23 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
at org.apache.spark.SparkContext.<init>(SparkContext.scala:368)
at org.apache.spark.examples.LogQuery$.main(LogQuery.scala:47)
at org.apache.spark.examples.LogQuery.main(LogQuery.scala)
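For context on what the exception means: SparkContext fails if its SparkConf has no master URL, which typically happens when an example is launched straight from an IDE instead of through spark-submit (which normally supplies --master). Below is a minimal sketch of setting the master in code, assuming local mode is acceptable for testing; LogQueryLocal is a hypothetical stand-in class, not the actual LogQuery example.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object LogQueryLocal {
  def main(args: Array[String]): Unit = {
    // Setting a master explicitly avoids
    // "A master URL must be set in your configuration"
    // when running from an IDE without spark-submit.
    val conf = new SparkConf()
      .setAppName("Log Query")
      .setMaster("local[*]") // assumption: local mode is fine for testing
    val sc = new SparkContext(conf)
    try {
      println(s"Running against master: ${sc.master}")
      // ... job logic would go here ...
    } finally {
      sc.stop()
    }
  }
}
```

Alternatively, the master can be supplied without code changes, e.g. via `--master local[*]` on spark-submit or the JVM option `-Dspark.master=local[*]` in the IDE run configuration, since SparkConf picks up system properties prefixed with `spark.`.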