[Linux] Cannot Import Name _common #1338

Closed
McRays opened this issue Sep 17, 2018 · 1 comment

McRays commented Sep 17, 2018

I am trying to use some Python libraries in my project.
The libraries depend on psutil, but importing them throws an error.
This is the full stack trace of the error:

18/09/17 14:02:40 INFO org.apache.spark.deploy.yarn.ApplicationMaster: Preparing Local resources
18/09/17 14:02:41 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/09/17 14:02:41 INFO org.apache.spark.deploy.yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1535027781207_39790_000001
18/09/17 14:02:41 INFO org.apache.spark.SecurityManager: Changing view acls to: hadoop
18/09/17 14:02:41 INFO org.apache.spark.SecurityManager: Changing modify acls to: hadoop
18/09/17 14:02:41 INFO org.apache.spark.SecurityManager: Changing view acls groups to: 
18/09/17 14:02:41 INFO org.apache.spark.SecurityManager: Changing modify acls groups to: 
18/09/17 14:02:41 INFO org.apache.spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(hadoop); groups with view permissions: Set(); users  with modify permissions: Set(hadoop); groups with modify permissions: Set()
18/09/17 14:02:41 INFO org.apache.spark.deploy.yarn.ApplicationMaster: Starting the user application in a separate Thread
18/09/17 14:02:41 INFO org.apache.spark.deploy.yarn.ApplicationMaster: Waiting for spark context initialization...
18/09/17 14:02:42 ERROR org.apache.spark.deploy.yarn.ApplicationMaster: User application exited with status 1
18/09/17 14:02:42 INFO org.apache.spark.deploy.yarn.ApplicationMaster: Final app status: FAILED, exitCode: 1, (reason: User application exited with status 1)
18/09/17 14:02:42 ERROR org.apache.spark.deploy.yarn.ApplicationMaster: Uncaught exception: 
org.apache.spark.SparkException: Exception thrown in awaitResult: 
	at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
	at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:401)
	at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:254)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:764)
	at org.apache.spark.deploy.SparkHadoopUtil$$anon$2.run(SparkHadoopUtil.scala:67)
	at org.apache.spark.deploy.SparkHadoopUtil$$anon$2.run(SparkHadoopUtil.scala:66)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
	at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
	at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:762)
	at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: org.apache.spark.SparkUserAppException: User application exited with 1
	at org.apache.spark.deploy.PythonRunner$.main(PythonRunner.scala:97)
	at org.apache.spark.deploy.PythonRunner.main(PythonRunner.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:635)
18/09/17 14:02:42 INFO org.apache.spark.deploy.yarn.ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: User application exited with status 1)
18/09/17 14:02:42 INFO org.apache.spark.deploy.yarn.ApplicationMaster: Deleting staging directory hdfs://hadoop2cluster/user/hadoop/.sparkStaging/application_1535027781207_39790
18/09/17 14:02:42 INFO org.apache.spark.util.ShutdownHookManager: Shutdown hook called

LogType:stdout
Log Upload Time:17-Sep-2018 14:02:43
LogLength:3335
Log Contents:
Traceback (most recent call last):
  File "20180917112227___main__.py", line 4, in <module>
    from src.main.python.com.lucky.ai.food.main import main
  File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140126_src.zip/src/main/python/com/lucky/ai/food/main.py", line 8, in <module>
  File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140126_src.zip/src/main/python/com/lucky/ai/food/feature_tools_data.py", line 3, in <module>
  File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/__init__.py", line 7, in <module>
  File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/synthesis/__init__.py", line 3, in <module>
  File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/synthesis/api.py", line 5, in <module>
  File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/synthesis/dfs.py", line 5, in <module>
  File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/computational_backends/__init__.py", line 2, in <module>
  File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/computational_backends/api.py", line 2, in <module>
  File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/computational_backends/calculate_feature_matrix.py", line 17, in <module>
  File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/featuretools/computational_backends/utils.py", line 9, in <module>
  File "/data/nm-local-dir/usercache/hadoop/appcache/application_1535027781207_39790/container_1535027781207_39790_01_000003/20180917140125_depend.zip/psutil/__init__.py", line 41, in <module>
ImportError: cannot import name _common
 Concurrent marking:
      0   init marks: total time =     0.00 s (avg =     0.00 ms).
      1      remarks: total time =     0.00 s (avg =     4.87 ms).
           [std. dev =     0.00 ms, max =     4.87 ms]
         1  final marks: total time =     0.00 s (avg =     0.89 ms).
              [std. dev =     0.00 ms, max =     0.89 ms]
         1    weak refs: total time =     0.00 s (avg =     3.98 ms).
              [std. dev =     0.00 ms, max =     3.98 ms]
      1     cleanups: total time =     0.00 s (avg =     2.98 ms).
           [std. dev =     0.00 ms, max =     2.98 ms]
    Final counting total time =     0.00 s (avg =     0.49 ms).
    RS scrub total time =     0.00 s (avg =     1.03 ms).
  Total stop_world time =     0.01 s.
  Total concurrent time =     0.01 s (    0.01 s marking).
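
For what it's worth, the failing import can be checked outside of Spark by pointing sys.path at the shipped dependency zip. This is only a sketch; the zip name comes from the paths in the traceback above, and having a local copy of that archive is an assumption:

# Hedged sketch: try to import psutil from the dependency zip that spark-submit
# ships to the executors, the same way zipimport would on the cluster.
# "20180917140125_depend.zip" is the archive named in the traceback; a local
# copy of it next to this script is assumed for the check.
import sys

sys.path.insert(0, "20180917140125_depend.zip")

try:
    import psutil  # psutil/__init__.py imports its _common submodule at line 41
    print("imported psutil %s from %s" % (psutil.__version__, psutil.__file__))
except ImportError as exc:
    print("import failed: %s" % exc)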
@giampaolo
Owner

I can't tell why that happens, but it doesn't look like a psutil bug. Closing.
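
Since the failure looks environment-related rather than a psutil defect, a quick probe inside a Spark task can show which Python and which psutil the executors actually see. A minimal sketch, assuming a live SparkContext named sc and the same --py-files setup as the failing job:

# Hedged sketch: run a probe on the executors to report their Python version
# and where (or whether) psutil resolves; "sc" is assumed to be an existing
# SparkContext created with the same job configuration.
def probe(_):
    import sys
    try:
        import psutil
        return (sys.version.split()[0], psutil.__file__)
    except ImportError as exc:
        return (sys.version.split()[0], "import failed: %s" % exc)

print(sc.parallelize(range(4), 4).map(probe).collect())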
