Available Projects #1
I am interested in working on the "Overfitting ability of recurrent networks" project for a non-image classification task. However, I noticed on the website that someone is already working on it. Would it be possible to work on it in parallel, or would it be better to tackle another project? (I am also interested in "Dataset distillation for other tasks".) Thanks
Comments
Dear @Alfo5123, thanks for your interest. @tatevmejunts is doing an experiment on overfitting with LSTM networks on a sentence classification task as part of her undergrad thesis. I think it is possible to try more models on more tasks and compare the results. No one is yet working on dataset distillation.
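For readers unfamiliar with this line of experiments, below is a minimal sketch of one common way to probe a recurrent network's ability to overfit: fit a small LSTM classifier to randomly assigned labels and watch training accuracy climb toward 100%, in the spirit of the well-known random-label experiments. The dataset, model sizes, and hyperparameters are illustrative assumptions, not the thesis code.

```python
import torch
import torch.nn as nn

VOCAB, EMB, HID, SEQ, N, CLASSES = 1000, 32, 64, 20, 256, 2
torch.manual_seed(0)

# Random token ids with *randomly assigned* labels: there is no signal to
# learn, so any training accuracy above chance is pure memorization.
x = torch.randint(0, VOCAB, (N, SEQ))
y = torch.randint(0, CLASSES, (N,))

class LSTMClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.lstm = nn.LSTM(EMB, HID, batch_first=True)
        self.fc = nn.Linear(HID, CLASSES)

    def forward(self, tokens):
        _, (h, _) = self.lstm(self.emb(tokens))  # h: (1, batch, HID)
        return self.fc(h[-1])                    # classify from the last hidden state

model = LSTMClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(300):
    opt.zero_grad()
    logits = model(x)
    loss = nn.functional.cross_entropy(logits, y)
    loss.backward()
    opt.step()
    if epoch % 50 == 0:
        acc = (logits.argmax(dim=1) == y).float().mean().item()
        print(f"epoch {epoch}: loss {loss.item():.3f}, train acc {acc:.2f}")
# Training accuracy approaching 1.0 on random labels is the signature of
# the network's capacity to overfit (memorize) the training set.
```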
Thanks for the reply. I am interested in working on the Dataset Distillation project with @marinagomtsian.
Perfect. I'd suggest you read the paper and come up with a more detailed plan for the project; then I'll be able to comment on and discuss it.
@Hrant-Khachatrian, @marinagomtsian and I discussed it and came up with the following plan:
We created this repository to upload our progress: https://github.com/Alfo5123/Text-Distill
I am sorry for the late reply. Although we have not been working on this during the last few months, I would like to stress my interest in continuing and finishing this project. Marina was accepted into a master's program and won't be able to continue working on this; however, my current job leaves me enough time to work on the project. We have already included in our repository the code for the pre-trained CNN sentence classification model, which we will use to distill the dataset. Thanks for your time and consideration,
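For concreteness, here is a hedged sketch of how dataset distillation (in the spirit of Wang et al., 2018) could look for text with a small CNN classifier: because tokens are discrete, the distilled examples are optimized directly in embedding space, and the outer loss differentiates through a few inner training steps. All names, shapes, and hyperparameters below are illustrative assumptions, not the Text-Distill repository's code.

```python
import torch
import torch.nn.functional as F

EMB, SEQ, CLASSES, N_DISTILL = 50, 20, 2, 10
torch.manual_seed(0)

# Learnable synthetic "sentences" living in embedding space, with fixed labels.
distilled_x = torch.randn(N_DISTILL, SEQ, EMB, requires_grad=True)
distilled_y = torch.arange(N_DISTILL) % CLASSES
outer_opt = torch.optim.Adam([distilled_x], lr=1e-2)

# Stand-in for a real labeled dataset (random here, purely for illustration).
real_x = torch.randn(64, SEQ, EMB)
real_y = torch.randint(0, CLASSES, (64,))

def init_params():
    """Fresh random weights for a tiny 1-D CNN text classifier."""
    return [(torch.randn(32, EMB, 3) * 0.1).requires_grad_(True),
            torch.zeros(32, requires_grad=True),
            (torch.randn(CLASSES, 32) * 0.1).requires_grad_(True),
            torch.zeros(CLASSES, requires_grad=True)]

def forward(params, x):
    """Functional forward pass: conv over time, max-pool, linear head."""
    w_conv, b_conv, w_fc, b_fc = params
    h = F.relu(F.conv1d(x.transpose(1, 2), w_conv, b_conv, padding=1))
    return F.linear(h.max(dim=2).values, w_fc, b_fc)

for step in range(200):
    params = init_params()
    # Inner loop: a few SGD steps on the distilled data, keeping the graph
    # (create_graph=True) so the outer loss can differentiate through training.
    for _ in range(3):
        inner_loss = F.cross_entropy(forward(params, distilled_x), distilled_y)
        grads = torch.autograd.grad(inner_loss, params, create_graph=True)
        params = [p - 0.1 * g for p, g in zip(params, grads)]
    # Outer loop: update the distilled examples so that a model trained on
    # them performs well on the real data.
    outer_opt.zero_grad()
    outer_loss = F.cross_entropy(forward(params, real_x), real_y)
    outer_loss.backward()
    outer_opt.step()
```

Note that this sketch re-initializes the classifier randomly at each outer step, which is one variant from the paper; the plan in this thread uses a pre-trained model, which would fix the initialization instead.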
Sounds good!