
[BUG] Test Obsei in GPU environment #73

Closed
lalitpagaria opened this issue Apr 1, 2021 · 1 comment · Fixed by #87
Labels: bug (Something isn't working), high priority

Comments
lalitpagaria (Collaborator):

No description provided.

lalitpagaria added the "bug (Something isn't working)" label on Apr 1, 2021

lalitpagaria (Collaborator, Author) commented Apr 3, 2021

@GirishPatel
We need to pass a device parameter to the pipeline, and it will automatically leverage the GPU.
https://discuss.huggingface.co/t/pytorch-nlp-model-doesn-t-use-gpu-when-making-inference/1160/3

https://huggingface.co/transformers/main_classes/pipelines.html

So we have two choices. One option is to add a use_gpu variable in all transformer-based analyzer classes, so users can control whether or not to use the GPU. The default should be CPU so it will work on all platforms. Let me know WDYT?

Sample https://github.com/deepset-ai/haystack/blob/master/haystack/summarizer/transformers.py
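As a minimal sketch of the idea above: the Hugging Face pipeline API takes a device argument where -1 means CPU and 0 means the first GPU. The resolve_device helper and the use_gpu parameter name below are illustrative, not the actual Obsei implementation.

```python
def resolve_device(use_gpu: bool, gpu_available: bool) -> int:
    """Map a use_gpu flag to the device index convention used by
    transformers.pipeline: 0 = first GPU, -1 = CPU.

    Falls back to CPU when no GPU is available, so the default
    works on all platforms.
    """
    return 0 if (use_gpu and gpu_available) else -1


# Hypothetical usage inside an analyzer class (names are illustrative):
#
#   import torch
#   from transformers import pipeline
#
#   device = resolve_device(use_gpu=True, gpu_available=torch.cuda.is_available())
#   classifier = pipeline("zero-shot-classification", device=device)
```

With use_gpu defaulting to False, CPU-only environments are unaffected, while GPU users can opt in with a single flag.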
