
how to use dropout layer? #1043

Closed
lixin7895123 opened this issue Sep 7, 2014 · 3 comments

@lixin7895123

To my understanding, dropout is applied to fully-connected layers.
How can I use a dropout layer in Caffe? Is there a demo of this?
Should I use it on top of an InnerProduct (ip) layer, or something else?

@netheril96
Contributor

You can simply read the definition files included in examples/imagenet. There are two dropout layers in them.

@shelhamer
Member

Right. Examples are your friends. Please ask on caffe-users. Issues are for development discussion. Thanks!

@yolanda93

yolanda93 commented Jun 4, 2016

The dropout layer reduces overfitting by preventing complex co-adaptations on the training data. Here is an example that takes the output of an InnerProduct layer (ip11) after a ReLU activation:

layer {
  name: "drop1"
  type: "Dropout"
  bottom: "ip11"
  top: "ip11"
  dropout_param {
    dropout_ratio: 0.5
  }
}

In this layer, each hidden unit is randomly omitted from the network with a probability of 0.5 (the dropout ratio).
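
For completeness, a minimal sketch of how such a Dropout layer typically sits between a fully-connected layer with a ReLU activation and the next InnerProduct layer. This is not from any shipped example; the surrounding layer names (pool2, relu1, ip2) and num_output values are illustrative.

layer {
  name: "ip11"
  type: "InnerProduct"
  bottom: "pool2"       # assumed input blob from an earlier layer
  top: "ip11"
  inner_product_param {
    num_output: 500
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "ip11"
  top: "ip11"           # in-place activation
}
layer {
  name: "drop1"
  type: "Dropout"
  bottom: "ip11"
  top: "ip11"           # in-place dropout on the activated outputs
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "ip2"
  type: "InnerProduct"
  bottom: "ip11"
  top: "ip2"
  inner_product_param {
    num_output: 10      # e.g. number of classes
  }
}

Note that Caffe's Dropout layer only drops units while training; at test time it passes activations through unchanged.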
