I am trying to use some Python libraries in my project.
The libraries depend on psutil, but importing it throws an error.
This is the full stack trace of the error:
18/09/17 14:02:40 INFO org.apache.spark.deploy.yarn.ApplicationMaster: Preparing Local resources
18/09/17 14:02:41 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/09/17 14:02:41 INFO org.apache.spark.deploy.yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1535027781207_39790_000001
18/09/17 14:02:41 INFO org.apache.spark.SecurityManager: Changing view acls to: hadoop
18/09/17 14:02:41 INFO org.apache.spark.SecurityManager: Changing modify acls to: hadoop
18/09/17 14:02:41 INFO org.apache.spark.SecurityManager: Changing view acls groups to:
18/09/17 14:02:41 INFO org.apache.spark.SecurityManager: Changing modify acls groups to:
18/09/17 14:02:41 INFO org.apache.spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); groups with view permissions: Set(); users with modify permissions: Set(hadoop); groups with modify permissions: Set()
18/09/17 14:02:41 INFO org.apache.spark.deploy.yarn.ApplicationMaster: Starting the user application in a separate Thread
18/09/17 14:02:41 INFO org.apache.spark.deploy.yarn.ApplicationMaster: Waiting for spark context initialization...
18/09/17 14:02:42 ERROR org.apache.spark.deploy.yarn.ApplicationMaster: User application exited with status 1
18/09/17 14:02:42 INFO org.apache.spark.deploy.yarn.ApplicationMaster: Final app status: FAILED, exitCode: 1, (reason: User application exited with status 1)
18/09/17 14:02:42 ERROR org.apache.spark.deploy.yarn.ApplicationMaster: Uncaught exception:
org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:401)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:254)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:764)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$2.run(SparkHadoopUtil.scala:67)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$2.run(SparkHadoopUtil.scala:66)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:762)
at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: org.apache.spark.SparkUserAppException: User application exited with 1
at org.apache.spark.deploy.PythonRunner$.main(PythonRunner.scala:97)
at org.apache.spark.deploy.PythonRunner.main(PythonRunner.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:635)
18/09/17 14:02:42 INFO org.apache.spark.deploy.yarn.ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: User application exited with status 1)
18/09/17 14:02:42 INFO org.apache.spark.deploy.yarn.ApplicationMaster: Deleting staging directory hdfs://hadoop2cluster/user/hadoop/.sparkStaging/application_1535027781207_39790
18/09/17 14:02:42 INFO org.apache.spark.util.ShutdownHookManager: Shutdown hook called
LogType:stdout
Log Upload Time:17-Sep-2018 14:02:43
LogLength:3335
Log Contents:
Traceback (most recent call last):
File "20180917112227___main__.py", line 4, in <module>
from src.main.python.com.lucky.ai.food.main import main
File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140126_src.zip/src/main/python/com/lucky/ai/food/main.py", line 8, in <module>
File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140126_src.zip/src/main/python/com/lucky/ai/food/feature_tools_data.py", line 3, in <module>
File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/__init__.py", line 7, in <module>
File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/synthesis/__init__.py", line 3, in <module>
File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/synthesis/api.py", line 5, in <module>
File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/synthesis/dfs.py", line 5, in <module>
File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/computational_backends/__init__.py", line 2, in <module>
File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/computational_backends/api.py", line 2, in <module>
File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/computational_backends/calculate_feature_matrix.py", line 17, in <module>
File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/computational_backends/utils.py", line 9, in <module>
File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/psutil/__init__.py", line 41, in <module>
ImportError: cannot import name _common
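For what it's worth, I suspect the problem is that psutil contains compiled extension modules (`_psutil_linux` and friends), and Python's zip importer can only load pure-Python code, so shipping psutil inside `depend.zip` via `--py-files` would explain the failure. A minimal sketch showing the pure-Python case that does work from a zip (module name and value are made up for illustration):

```python
import os
import sys
import tempfile
import zipfile

# Build a zip containing a pure-Python module and put it on sys.path.
# zipimport can load .py/.pyc files from a zip, but it cannot load
# compiled extension modules (.so/.pyd) such as those bundled in psutil,
# so a package like psutil shipped this way fails partway through import.
tmp = tempfile.mkdtemp()
zip_path = os.path.join(tmp, "depend_demo.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("pure_mod.py", "VALUE = 42\n")

sys.path.insert(0, zip_path)
import pure_mod  # works: pure-Python code imports fine from a zip

print(pure_mod.VALUE)  # 42
```

If that is the cause, psutil would need to be installed on the worker nodes (or shipped unpacked) rather than included in the zipped dependencies.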