
'JavaPackage' object is not callable error in Python API #153

Closed
sryza opened this issue Jun 24, 2016 · 3 comments

sryza commented Jun 24, 2016

This was reported on the mailing list:
https://groups.google.com/forum/#!topic/spark-ts/4ZJ7t-w0s7o

<ipython-input-8-b40312bd4ec8> in <module>()
      1 # Create an daily DateTimeIndex over August and September 2015
----> 2 freq = BusinessDayFrequency(1, sc)

/databricks/python/local/lib/python2.7/site-packages/sparkts/datetimeindex.pyc in __init__(self, bdays, sc)
    115 
    116     def __init__(self, bdays, sc):
--> 117         self._jfreq = sc._jvm.com.cloudera.sparkts.BusinessDayFrequency(bdays)
    118 
    119     def __eq__(self, other):
@hajimupakura

Did anyone get a solution to this error?

@blpabhishek

If I am not wrong, the sparkts jar is not being added via PYSPARK_SUBMIT_ARGS, which is needed to use JVM objects from Python.
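For anyone hitting this, a minimal sketch of that workaround: set PYSPARK_SUBMIT_ARGS before launching pyspark so the sparkts jar lands on the driver classpath. The jar filename and path below are assumptions; adjust them to your install.

```shell
# Hypothetical jar location -- change to wherever your sparkts jar lives.
export SPARKTS_JAR="$HOME/jars/sparkts-0.4.0-jar-with-dependencies.jar"

# pyspark reads this variable at startup; --jars ships the jar to the JVM,
# and "pyspark-shell" must come last.
export PYSPARK_SUBMIT_ARGS="--jars $SPARKTS_JAR pyspark-shell"
```

After this, `sc._jvm.com.cloudera.sparkts.BusinessDayFrequency` should resolve to a real class instead of a JavaPackage placeholder.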

@diegorep

diegorep commented Aug 30, 2016

This happened to me when I had not properly set the --jars path in spark-defaults.conf.
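For reference, a sketch of what that setting might look like in spark-defaults.conf (the jar path and version here are assumptions; adjust to your install):

```
# spark-defaults.conf -- hypothetical jar path
spark.jars  /path/to/sparkts-0.4.0-jar-with-dependencies.jar
```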

My experience is that these "not callable" errors usually show up when an object lacks the attribute you are looking up, or when the attribute exists but is not actually callable. Here, the object is the com.cloudera.sparkts package, and the attribute being looked up is BusinessDayFrequency.

Usually, Python will throw a different error if it can't find sparkts (most likely at an import statement). But since sparkts is a Java package living in the JVM, nothing is checked until you actually try to use BusinessDayFrequency, and the call breaks if the sparkts jar was not properly added to the Spark conf.

@blpabhishek's solution will also work, though you'll have to make sure you pass in that argument each time you start up the shell.
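The lazy-failure behavior described above can be sketched without Spark at all. The class below is an illustrative mock (not the real py4j JavaPackage): attribute access on the gateway always succeeds and returns a package placeholder, so the error only surfaces when the placeholder is called as if it were a class.

```python
class FakeJavaPackage:
    """Illustrative stand-in for py4j's JavaPackage placeholder."""

    def __init__(self, name):
        self._name = name

    def __getattr__(self, attr):
        # Any unknown name is assumed to be a sub-package, so lookups
        # like jvm.com.cloudera.sparkts never fail on their own.
        return FakeJavaPackage(self._name + "." + attr)

    def __call__(self, *args):
        # A package is not a class, so calling it fails -- this is the
        # "'JavaPackage' object is not callable" moment.
        raise TypeError("'JavaPackage' object is not callable")


jvm = FakeJavaPackage("root")

# Attribute lookups succeed even though no sparkts jar is loaded...
pkg = jvm.com.cloudera.sparkts.BusinessDayFrequency

# ...and the error only appears at the call site:
try:
    pkg(1)
except TypeError as e:
    print(e)  # 'JavaPackage' object is not callable
```

This is why the failure in the original traceback happens inside `__init__` at the `sc._jvm...BusinessDayFrequency(bdays)` call rather than at import time.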

sryza added a commit that referenced this issue Nov 15, 2016
* Added 'SNAPSHOT' tag to jar references in python build configuration. Updated python version to match maven 0.3.0 -> 0.4.0. updated .gitignore to ignore .jar files

* Updated README.md to reflect changes