ConvLayer Filters #54
Comments
It is always 3-dimensional if you count the filter number in.
The depth is actually the filter number. I took a screenshot from cs231n; maybe this will help a bit:
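To illustrate the point (a minimal NumPy sketch, not the project's code): each filter spans the full depth of its input, and the number of output feature maps equals the number of filters.

```python
import numpy as np

def conv_valid(x, filters):
    """'valid' convolution.
    x: (depth, H, W); filters: (n_filters, depth, fh, fw)
    returns: (n_filters, H - fh + 1, W - fw + 1)
    """
    d, H, W = x.shape
    n, fd, fh, fw = filters.shape
    assert fd == d, "each filter must cover the full input depth"
    out = np.zeros((n, H - fh + 1, W - fw + 1))
    for k in range(n):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                # one filter x one window -> one scalar in one output map
                out[k, i, j] = np.sum(x[:, i:i + fh, j:j + fw] * filters[k])
    return out

x = np.random.randn(3, 11, 50)   # e.g. 3 input feature maps of 11x50
f = np.random.randn(2, 3, 1, 3)  # 2 filters, each of depth 3 and spatial size 1x3
print(conv_valid(x, f).shape)    # (2, 11, 48): 2 output feature maps
```

The filter bank is 4-dimensional overall, but each individual filter is 3-dimensional: spatial size plus the full input depth.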
Yes, it is actually 3-dimensional.
The number of feature maps is equal to the number of filters.
Thank you, this clears things up. From my understanding, there still appears to be a discrepancy with the paper in {"filter_shape":[1,2]}. The paper implies it should be {"filter_shape":[1,3]}, doesn't it?
You can freely play around with the filter size in the first ConvLayer; 2, 3, or 5 seem to give reasonable results.
Yes, as we mentioned in the README, the hyper-parameters are different from those listed in the article.
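For concreteness, the two ConvLayer variants under discussion could be written as follows (a sketch of just this one entry, not the full net_config.json; the first line is the shipped setting, the second is what the paper's figure implies):

```json
{"filter_shape": [1, 2], "filter_number": 3}
{"filter_shape": [1, 3], "filter_number": 2}
```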
Thank you for the information @ZhengyaoJiang. It is a great paper with well-written code; I really appreciate it. I am a CS master's student at Simon Fraser University, and I am basing my current projects on this paper.
Thanks for your compliment. |
Figure 2 in the paper (attached image):
Shouldn't the convolutional filters be 3-dimensional? I mean, in the original convolution, how do we go from 3 feature maps to 2 feature maps? I believe this would make sense if the filter were of dimension 2x1x3 (the same as described, but with an additional depth of 2). And then the second convolution would be 2x48 to get the 20 11x1 feature maps.
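The shape arithmetic in the question can be sketched as follows (a hypothetical walk-through, assuming 'valid' convolutions and an input of 3 feature maps of size 11x50, consistent with the 48-wide maps mentioned above):

```python
def conv_out_shape(in_shape, filter_hw, n_filters):
    """Output shape of a 'valid' convolution over a (depth, H, W) input."""
    depth, height, width = in_shape
    fh, fw = filter_hw
    # each filter implicitly spans the full input depth,
    # so the output depth is simply the number of filters
    return (n_filters, height - fh + 1, width - fw + 1)

s1 = conv_out_shape((3, 11, 50), (1, 3), 2)   # first conv: 2 filters of 1x3
s2 = conv_out_shape(s1, (1, 48), 20)          # second conv: 20 filters of 1x48
print(s1, s2)  # (2, 11, 48) (20, 11, 1)
```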
net_config.json:
In ConvLayer, I don't understand how {"filter_shape":[1,2],"filter_number":3} corresponds to the filters outlined in the paper, as described in my question above. (Excuse my ignorance of tflearn, but the parameters to conv2d() are not well explained in the documentation.)
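One way to see why the JSON never lists a depth (a hedged sketch, not the repo's actual code): in a 2-D convolution the full weight tensor has the layout [height, width, in_channels, out_channels] (TensorFlow's conv2d layout, which tflearn wraps), so only the spatial shape and the filter count need to be configured; the depth is inferred from the incoming feature maps.

```python
filter_shape = [1, 2]   # "filter_shape" from net_config.json (spatial size only)
filter_number = 3       # "filter_number": number of output feature maps
input_depth = 3         # inferred from the incoming layer (3 input features here)

# the full weight tensor the framework actually builds:
weight_shape = filter_shape + [input_depth, filter_number]
print(weight_shape)  # [1, 2, 3, 3]
```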