
Cannot find SWEM-hier #2

Open
chenghuige opened this issue Jun 5, 2018 · 16 comments

Comments

@chenghuige

Hi, it seems the hierarchical encoder mentioned in the paper isn't included here. Very interested to see it :)

@dinghanshen
Owner

Sure, I will merge the hierarchical pooling encoder into the model.py file soon.
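In the meantime, here is a minimal NumPy sketch of the idea as the paper describes it: average-pool the word embeddings within each local window, then max-pool element-wise over the per-window averages. The function name, the stride parameter, and the short-sequence fallback are illustrative assumptions, not the repository's actual model.py code.

```python
import numpy as np

def swem_hier(embeddings, window=5, stride=1):
    """Hierarchical pooling sketch: average-pool each local window of word
    vectors, then max-pool element-wise over the per-window averages.

    embeddings: (seq_len, emb_dim) array of word vectors for one text.
    window, stride: local window size and step (illustrative defaults).
    """
    seq_len, _ = embeddings.shape
    if seq_len < window:
        # Fallback assumption: very short texts just get a plain average.
        return embeddings.mean(axis=0)
    # Average-pool every local window of `window` consecutive word vectors.
    window_means = np.stack([
        embeddings[i:i + window].mean(axis=0)
        for i in range(0, seq_len - window + 1, stride)
    ])
    # Max-pool across the window averages to get the text representation.
    return window_means.max(axis=0)
```

The local window size of 5 mentioned later in this thread would plug in as window=5; the stride and edge handling here are guesses pending the official code.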

@cuteapi

cuteapi commented Jun 12, 2018

+1 Very interested to see it :)

@ariwaranosai

Is there any progress on this issue?

@hanhao0125

Any progress? Thanks @dinghanshen

@OliverKehl

Still looking forward to this. Thanks @dinghanshen

@pemywei

pemywei commented Jul 5, 2018

Still looking forward to this. Thanks @dinghanshen

@ericxsun

ericxsun commented Jul 5, 2018

Still looking forward to this. Thanks @dinghanshen

@qichaotang

Still looking forward to this. Thanks @dinghanshen

@LittleSummer114

Still looking forward to this. Thanks @dinghanshen

@hanxiao

hanxiao commented Sep 5, 2018

Please refer to the level-mean-max block for hierarchical pooling: https://github.com/hanxiao/tf-nlp-blocks/blob/8f14a864a66f976857adc04a5f3f0797dd877731/nlp/pool_blocks.py#L26

It's part of a bigger project called tf-nlp-blocks.

@windpls

windpls commented Sep 17, 2018

Still looking forward to this. Thanks @dinghanshen
Also, could you tell us what stride is used when the local window size is set to 5?
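The stride is not confirmed anywhere in this thread. For illustration only, assuming the swem_hier sketch posted above, a stride of 1 gives overlapping windows while a stride equal to the window size gives non-overlapping blocks:

```python
import numpy as np

# Illustration only: relies on the swem_hier sketch shown earlier in this thread.
tokens = np.random.rand(30, 300)                     # 30 tokens, 300-dim embeddings
overlapping = swem_hier(tokens, window=5, stride=1)  # 26 overlapping windows
blocked = swem_hier(tokens, window=5, stride=5)      # 6 non-overlapping windows
print(overlapping.shape, blocked.shape)              # (300,) (300,)
```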

@beyondguo

Reading through the paper, I didn't find which pretrained word embeddings the other models (such as LSTM and CNN) use. It is amazing that SWEM-ave can achieve better results than LSTM or CNN on some tasks, which frankly I don't believe! I have done a lot of NLP tasks and I know that simply averaging the word embeddings of a text usually performs very poorly.
I don't think the comparisons with the other models are fair. They don't even use the same pretrained embeddings. So maybe it's just that the GloVe vectors you used are better than the embeddings the other models used.

@LittleSummer114

LittleSummer114 commented Jul 2, 2019 via email

Hi, the author gave me the SWEM-hier embedding code, but I have not re-run it; I am also confused about why such a simple operation can achieve such good performance. However, our group recently finished some experiments, and simple operations can indeed achieve comparable performance. If you do not believe this result, you can disregard the paper, or re-run it to see whether you are right. Best regards,

@JayYip

JayYip commented Oct 23, 2019

Hi, could you share the code with me? Thanks.

@LLIKKE

LLIKKE commented Dec 17, 2023

Still looking forward to this. Thanks

@LittleSummer114

LittleSummer114 commented Dec 17, 2023 via email
