About sent_splits.mat #14

Closed
NanAlbert opened this issue Dec 19, 2021 · 5 comments

Comments

@NanAlbert

Hello, I would like to know how to generate the file sent_splits.mat or where to get the file of other datasets, e.g., AwA2, SUN, APY.
Thank you so much!

@1maojian1

@NanAlbert Hello, I have the same problem. Have you solved it yet?

@NanAlbert
Author

> @NanAlbert Hello, I have the same problem. Have you solved it yet?

As far as I know, there are no CNN-RNN sentence embeddings for the AwA2, SUN, and APY datasets.

@1maojian1

1maojian1 commented Aug 13, 2022 via email

@rongtongxueya

Same question. How should such semantic datasets be generated? I want to build my own dataset.

@NanAlbert
Author

> Same question. How should such semantic datasets be generated? I want to build my own dataset.

As I mentioned before, there are no CNN-RNN sentence embeddings available for the AwA2, SUN, and APY datasets. However, you could consider using language models like BERT to generate semantic data for your own dataset.
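
As a starting point, here is a minimal sketch of that idea: encode one natural-language description per class with a pretrained sentence encoder and save the result in MATLAB format. This is not the original CNN-RNN pipeline; the use of the sentence-transformers and scipy packages, the model name, and the field names ("features", "classnames") are assumptions for illustration, and the actual layout of sent_splits.mat may differ.

```python
# Sketch: build class-level sentence embeddings with a pretrained
# BERT-style encoder and store them in a .mat file.
import numpy as np
from scipy.io import savemat
from sentence_transformers import SentenceTransformer

# One natural-language description per class of your own dataset.
class_descriptions = {
    "zebra": "A zebra is a horse-like animal with black and white stripes.",
    "dolphin": "A dolphin is a marine mammal with a streamlined body.",
    # ... one entry per class ...
}

# Any pretrained sentence encoder works; this model name is just an example.
model = SentenceTransformer("all-mpnet-base-v2")

classnames = list(class_descriptions.keys())
# encode() returns an array of shape (num_classes, embedding_dim).
embeddings = model.encode([class_descriptions[c] for c in classnames])

# Save in MATLAB format so it can be loaded like the provided .mat files.
# The field names here are placeholders, not the repo's exact format.
savemat(
    "my_sent_embeddings.mat",
    {
        "features": np.asarray(embeddings, dtype=np.float32),
        "classnames": np.array(classnames, dtype=object),
    },
)
```

You would then point the data-loading code at this file and adjust the field names to whatever the training script expects.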
