Hello,
I don't understand your YOLO prototxt.
You have removed all the batch normalization and scale layers that follow each convolution.
For me, a convolution block looks like this:
layer {
  name: "conv2d_1"
  type: "Convolution"
  bottom: "data"
  top: "conv2d_1"
  convolution_param {
    num_output: 32
    kernel_size: 3
    stride: 1
    pad: 1
  }
}
layer {
  name: "batch_normalization_1"
  type: "BatchNorm"
  bottom: "conv2d_1"
  top: "conv2d_1"
  batch_norm_param {
    eps: 0.000001
  }
}
layer {
  name: "scale_1"
  type: "Scale"
  bottom: "conv2d_1"
  top: "conv2d_1"
  scale_param {
    bias_term: true
  }
}
layer {
  name: "relu_1"
  type: "ReLU"
  bottom: "conv2d_1"
  top: "conv2d_1"
  relu_param {
    negative_slope: 0.1
  }
}
The BatchNorm layer carries trained parameters; if I remove it, the network doesn't work for me.
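One common reason BatchNorm and Scale layers are absent from a deploy prototxt is that, at inference time, they reduce to a per-channel affine transform that can be folded into the preceding convolution's weights and bias. A minimal NumPy sketch of that folding, assuming the usual BatchNorm inference formula (the function name and tensor shapes here are illustrative, not from this repository):

```python
import numpy as np

def fold_bn_into_conv(W, b, mean, var, gamma, beta, eps=1e-6):
    """Fold BatchNorm + Scale into conv weights/bias.

    At inference, BN computes y = gamma * (x - mean) / sqrt(var + eps) + beta,
    which is linear per output channel and can be absorbed into the conv.
    W: conv weights, shape (out_ch, in_ch, kh, kw); b: conv bias, shape (out_ch,).
    eps matches the eps: 0.000001 in the prototxt above.
    """
    factor = gamma / np.sqrt(var + eps)            # per-channel multiplier
    W_folded = W * factor.reshape(-1, 1, 1, 1)     # scale each output filter
    b_folded = (b - mean) * factor + beta          # shift the bias accordingly
    return W_folded, b_folded
```

Applying the folded convolution then gives the same output as conv followed by BatchNorm and Scale, so the deploy graph can drop those layers entirely.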
In the Darknet config, the corresponding layer has batch_normalize=1:
[convolutional]
batch_normalize=1
filters=64
size=7
stride=2
pad=1
activation=leaky