
Add size to lru_cache #1

Open
gsasikiran opened this issue Nov 22, 2021 · 6 comments

gsasikiran commented Nov 22, 2021

/usr/local/lib/python3.7/dist-packages/zeroshot_topics/__init__.py in <module>()
      1 __version__ = '0.1.0'
      2 
----> 3 from .zeroshot_tm import ZeroShotTopicFinder

/usr/local/lib/python3.7/dist-packages/zeroshot_topics/zeroshot_tm.py in <module>()
      1 import attr
      2 from keybert import KeyBERT
----> 3 from .utils import load_zeroshot_model
      4 from nltk.corpus import wordnet as wn
      5 

/usr/local/lib/python3.7/dist-packages/zeroshot_topics/utils.py in <module>()
      4 
      5 @lru_cache
----> 6 def load_zeroshot_model(model_name="valhalla/distilbart-mnli-12-6"):
      7     classifier = pipeline("zero-shot-classification", model=model_name)
      8     return classifier

/usr/lib/python3.7/functools.py in lru_cache(maxsize, typed)
    488             maxsize = 0
    489     elif maxsize is not None:
--> 490         raise TypeError('Expected maxsize to be an integer or None')
    491 
    492     def decorating_function(user_function):

TypeError: Expected maxsize to be an integer or None

I assume you have to provide the maxsize parameter to lru_cache. It worked for me when I provided the parameter.
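A minimal sketch of that fix, assuming the shape of load_zeroshot_model from the traceback above: on Python 3.7, lru_cache must be called with parentheses, and @lru_cache(maxsize=None) works on 3.7 and 3.8+ alike. A dummy loader stands in for the real transformers pipeline call here so the snippet is self-contained.

```python
from functools import lru_cache

# On Python 3.7, a bare @lru_cache raises
# "TypeError: Expected maxsize to be an integer or None".
# Calling it with explicit parentheses avoids that on every version.
@lru_cache(maxsize=None)
def load_zeroshot_model(model_name="valhalla/distilbart-mnli-12-6"):
    # Placeholder for: pipeline("zero-shot-classification", model=model_name)
    return {"model": model_name}

first = load_zeroshot_model()
second = load_zeroshot_model()
assert first is second  # the second call is served from the cache
```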

@JosephGatto

Also getting this error (on Colab )

@gsasikiran
Author

@JosephGatto I also got that error only on Colab.

@AnjanaRita
Collaborator

@gsasikiran I was not able to replicate this locally, but we could definitely set maxsize to ensure this error does not occur again. If possible, could you open a PR with the maxsize fix you mentioned? I can merge it right away.

@gsasikiran
Author

@AnjanaRita I set 'maxsize' to 'None', since the model is large and I am not sure how much cache size is required. If you would prefer to push the same, I will.

@pascalhuszar

Also got this error, but after I changed the Python version to 3.8 the problem no longer occurred.
According to this Stack Overflow post, the @lru_cache decorator in utils.py needs parentheses, at least on Python 3.7.
On Python 3.8 it works fine without them.

But I ran into a different problem after searching for topics: it seems nltk.download('wordnet') is missing, because I get the warning after executing zsmodel.find_topic(text, n_topic=2).
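A hedged sketch of a workaround for the missing corpus: nltk raises LookupError when a corpus is not installed, so the download can be made conditional instead of running on every import. The snippet is guarded so it degrades gracefully if nltk itself is not installed.

```python
# Ensure the WordNet corpus is present before calling zsmodel.find_topic.
try:
    import nltk
    try:
        nltk.data.find("corpora/wordnet")
        wordnet_ready = True
    except LookupError:
        # One-time download, as suggested in this thread.
        wordnet_ready = bool(nltk.download("wordnet", quiet=True))
except ImportError:
    wordnet_ready = False

print("wordnet available:", wordnet_ready)
```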

@gsasikiran
Author

gsasikiran commented Nov 23, 2021

@AnjanaRita You would need to grant my username permission to push the files. Or you can add the parentheses and push the fix yourself, along with nltk.download("wordnet").
