nlu support on Python 3.8 #11
It’s coming from pyspark 2.4, which does not run on Python 3.8... but yes, it makes sense to put a limit on the Python version.
Thanks for sharing this issue. We are looking to fix various versioning issues in the next release.
Maybe a similar issue on 3.8.0. Contents of ngram.py: class NGram:
Hi @phoebusg, thanks for sharing. This is an issue with nlu version 1.0.1, which is fixed in nlu 1.0.2, to be released today.
Thank you C-K, I appreciate the update... looking forward to the release! :)
python .\demo.py :( Too lazy to set up an environment with Python <3.8 -- I'll wait for the next update then.
@phoebusg NLU depends on Spark 2.4, which doesn't work with Python 3.8...
Do you guys have any idea when Python 3.8 will be supported by NLU? I think Spark >=3.0 should support Python 3.8. Trying to figure out whether I should set up another environment with an earlier Python version.
Hi @muhlbach, in the upcoming NLU healthcare release we will support Python 3.8 and Spark 3.0.
@C-K-Loan thanks for your answer! That’s great. Looking forward to being able to use NLU again on Python 3.8.
@muhlbach
Stay tuned for the full release this week :)
@C-K-Loan, thanks! I tried installing the pre-release and it installed perfectly. However, trying: I'm wondering whether this has something to do with my Java installation? Here's the version:
Btw. I'm running on an Apple M1 chip.
Hi @muhlbach
Just in case your M1 chip had an issue with Java, you can follow this as well: JohnSnowLabs/spark-nlp#2282
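To rule out a Java problem like the one suspected above, checking the JVM major version is a quick first step. Below is a small sketch (the function name is illustrative, not from NLU or spark-nlp) that parses the banner printed by java -version; Spark 2.4 requires Java 8, and Spark 3.x supports Java 8 or 11.

```python
import re

def java_major_version(version_output: str) -> int:
    # Parse the major version from the `java -version` banner.
    # Old scheme: 'java version "1.8.0_292"' means Java 8;
    # new scheme: 'openjdk version "11.0.11"' means Java 11.
    m = re.search(r'version "(\d+)\.(\d+)', version_output)
    if not m:
        raise ValueError("could not parse Java version from: %r" % version_output)
    major, minor = int(m.group(1)), int(m.group(2))
    return minor if major == 1 else major
```

On a real machine you would feed it the output of subprocess.run(["java", "-version"], capture_output=True, text=True).stderr, since java prints its version banner to stderr.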
NLU 3.0.0 is now released and supports Python 3.8, Spark 3.1.X, and Spark 3.0.X. Be aware: if you run with a Spark version below 3, you cannot use Python 3.8, since that is only supported in Spark 3+.
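The compatibility rule stated above can be expressed as a small runtime check. This is a sketch under the versions discussed in this thread (the function name is illustrative, not part of NLU's API):

```python
import sys

def spark_python_compatible(spark_version: str, python_version=None) -> bool:
    # Spark 2.x runs only on Python < 3.8, while Spark 3.0+ also supports 3.8.
    py = python_version or sys.version_info[:2]
    spark_major = int(spark_version.split(".")[0])
    if spark_major < 3:
        return py < (3, 8)
    return True
```

Calling it with no second argument checks the interpreter you are currently running on.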
On "import nlu", it looks like pyspark/cloudpickle.py is failing with: TypeError: an integer is required (got type bytes). On some research, I found this is an issue with running pyspark on Python 3.8. I am not sure if this is the only cause, but if it is, I recommend adding a requirement for Python <3.8.
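The version pin the reporter suggests would typically live in packaging metadata. Below is an illustrative setup.py config fragment (hypothetical package name, not NLU's actual setup script) where python_requires makes pip refuse installation on unsupported interpreters:

```python
from setuptools import setup, find_packages

# Illustrative fragment: python_requires blocks installation on Python 3.8+,
# matching the pyspark 2.4 constraint described above.
setup(
    name="example-nlu-wrapper",   # hypothetical name
    version="0.0.1",
    packages=find_packages(),
    install_requires=["pyspark>=2.4,<3.0"],
    python_requires=">=3.6,<3.8",
)
```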