
Add RoBERTa model for question answering using SQuADv2.0 dataset. #19

Open · wants to merge 3 commits into main

Conversation

QingtaoLi1

No description provided.

@dnfadmin

dnfadmin commented Apr 8, 2022

CLA assistant check
All CLA requirements met.

@GeorgeS2019

GeorgeS2019 commented Apr 20, 2022

@QingtaoLi1 thanks for submitting a PR related to TorchText or PyTorch deep NLP / PyTorch for HuggingFace.

I wonder if there is a need now to start thinking about how to organize TorchSharp so that NLP tutorials and use cases are more aligned with TorchText.

Just an unrelated question, have you looked into Huggingface GPT2?

@QingtaoLi1
Author

> Just an unrelated question, have you looked into Huggingface GPT2?

I have not used GPT2 yet. The main difference between the GPTs and BERTs is that the former is a generative model that outputs a sequence in an auto-regressive fashion. So I believe much of the code would be very similar, except for the generation-related parts.
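A minimal sketch of what "auto-regressive" means here: the model repeatedly predicts the next token and feeds the growing sequence back in as input, which is exactly the loop BERT-style encoders lack. The `toy_next_token` function below is a hypothetical stand-in for a real GPT2 forward pass, not part of any library discussed in this thread.

```python
def toy_next_token(tokens):
    """Hypothetical next-token predictor standing in for a GPT2 forward pass.
    Here it just returns the last token id plus one, so the loop is testable."""
    return tokens[-1] + 1


def generate(prompt, max_new_tokens, eos_token=None):
    """Greedy auto-regressive decoding: append each predicted token to the
    sequence and feed the whole sequence back in, until the budget is spent
    or an end-of-sequence token is produced."""
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        nxt = toy_next_token(tokens)
        tokens.append(nxt)
        if nxt == eos_token:
            break
    return tokens
```

With a real model, `toy_next_token` would be a logits computation followed by an argmax (or sampling) step; the surrounding loop is the "generation-related part" that a BERT-style question-answering model does not need.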

@GeorgeS2019

GeorgeS2019 commented Apr 20, 2022

@QingtaoLi1

This is an attempt to get GPT2 working using the BlingFire GPT2 tokenizer and a GPT2 ONNX model; the preliminary generation-related parts are the issue.

If you have the time and interest, I would love to get your perspective on the challenges of getting the generation parts working.

@QingtaoLi1
Author

@GeorgeS2019 Sorry, but what would you like to achieve with GPT2 or BlingFire/ONNX?

@GeorgeS2019

GeorgeS2019 commented Apr 20, 2022

RoBERTa is for question answering.
GPT2 is for text generation (NLG), which is rare in .NET.

I hope this answers your question.

Microsoft BlingFire... there should be a tokenizer for RoBERTa... do check it out; it may offer better performance.

@GeorgeS2019

@QingtaoLi1 updated the reply

@QingtaoLi1
Author

QingtaoLi1 commented Apr 20, 2022

I see. You want to include GPT2 in this repo, right? I currently have other work to do; I may find some time later.

And I know some people are attempting to create infrastructure for general tokenizers. I guess your idea of re-organizing TorchSharpExamples to align with torchtext would have a similar effect on NLP systems -- it would make it easier to build different NLP models for different tasks/datasets.

@GeorgeS2019

The spirit of TorchSharp is to empower .NET developers to do deep AI within .NET without the need to go back to Python.

@QingtaoLi1
Author

I don't mean going back to Python; I mean infrastructure in the .NET world.

@QingtaoLi1
Author

An unrelated question: what do you think are the main obstacles for developers to build a deep AI system?

@GeorgeS2019

@QingtaoLi1
Author

Thanks very much!

3 participants