
Add switch order layer for FCN model #2788

Merged 16 commits into PaddlePaddle:develop on Sep 7, 2017

Conversation

wanghaoshuang (Contributor) commented on Jul 10, 2017

Add a switch order layer for switching the image dimension order.

Fixes #2787

1. Add a switch function for switching the image dimension order
2. Add the CpuMatrix::backwardSoftmax function
3. Add a pixel softmax layer, its Python wrapper, and a gradient test
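
For orientation, here is a minimal NumPy sketch of what the two pieces compute. This illustrates the semantics only, not the Paddle implementation, and the function names are mine:

```python
import numpy as np

def nchw2nhwc(x):
    """Switch dimension order: NCHW -> NHWC."""
    return x.transpose(0, 2, 3, 1)

def pixel_softmax(x):
    """Softmax over the channel axis of an NCHW tensor,
    computed independently at every pixel."""
    e = np.exp(x - x.max(axis=1, keepdims=True))  # subtract max for stability
    return e / e.sum(axis=1, keepdims=True)

x = np.random.randn(2, 3, 4, 5)                   # N, C, H, W
assert nchw2nhwc(x).shape == (2, 4, 5, 3)
assert np.allclose(pixel_softmax(x).sum(axis=1), 1.0)
```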
* how many zeros to add before and after the input in channel
* dimension. And the heightStart and heightEnd indicate padding
* in height dimension. The widthStart and widthEnd indicate the
* padding in width dimension.
Contributor:
The comment above is not correct for NCHW2NHWCFunc; it still describes PadFunc's padding arguments.

Author:
Thx.

size_t inW = inputs[0].shape()[3];
typename Tensor<real, Device>::Vector vec(outputs[0].shape().getElements(),
                                          outputs[0].data<real>());
vec.zero();
Contributor:
There is no need to set outputs to zero, since the assignment (=) is used in NCHW2NHWC.

Author:
Thx.
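
For intuition, in NumPy terms: the switch writes every output element by plain assignment, so pre-zeroing the buffer does no useful work; zeroing only matters for kernels that accumulate with +=. A small sketch, not the Paddle code:

```python
import numpy as np

x = np.random.randn(2, 3, 4, 5)        # NCHW input
out = np.empty((2, 4, 5, 3))           # uninitialized buffer is fine here
out[...] = x.transpose(0, 2, 3, 1)     # assignment overwrites every element
```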

* Argument in this Function:
* \param pad_ The same meaning as it in PadFunc.
* \param inputs The gradient with respect to the output value of PadFunc.
* \param outputs The gradient with respect to the input value of PadFunc.
Contributor:
The comments above are not correct either; they describe the gradients of PadFunc, not this function.

Author:
Thx.

Layer::forward(passType);
MatrixPtr input = inputLayers_[0]->getOutputValue();
size_t batchSize = input->getHeight();
// cout<<"useGpu:"<<useGpu(deviceId_)<<endl;
Contributor:
Remove this line.

Author:
Thx.

inW_ = img_conf.img_size();
inC_ = img_conf.channels();
createFunction(forward_, "NCHW2NHWC", FuncConfig());
createFunction(backward_, "NHWC2NCHW", FuncConfig());
Contributor:
Maybe the names forward_ and backward_ are not ideal. How about nchw2nhwc_ and nhwc2nchw_?

Author:
I can't rename forward_ and backward_ directly, because they are defined in Layer.h. So I added std::vector<std::shared_ptr<FunctionBase>> nchw2nhwc_; and std::vector<std::shared_ptr<FunctionBase>> nhwc2nchw_; to PixelSoftmaxLayer.h instead. Thanks for your suggestion.
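
For context on why the two functions pair up this way: the backward pass of a pure dimension switch is just the inverse permutation applied to the output gradient. A minimal NumPy sketch, my own illustration:

```python
import numpy as np

grad_out = np.random.randn(2, 4, 5, 3)    # upstream gradient, NHWC
grad_in = grad_out.transpose(0, 3, 1, 2)  # NHWC2NCHW: inverse of NCHW2NHWC
assert grad_in.shape == (2, 3, 4, 5)      # matches the NCHW input shape
```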

@wrap_name_default('pixel_softmax')
def pixel_softmax_layer(input, name=None, layer_attr=None):
    """
    This layer calculate softmax in image channel dimension
Contributor:
Need more detailed comments.

:param name: Name of this layer.
:type name: basestring
:param input: The input layer.
:type input: LayerOutput
Contributor:
It's better to document input before name, so the docstring keeps the same order as the function arguments.

Author:
Fixed.
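
After the fix, the docstring would read roughly as follows; this is a sketch of the reordered parameter list, not the exact merged text:

```python
def pixel_softmax_layer(input, name=None, layer_attr=None):
    """
    This layer calculates softmax over the image channel dimension.

    :param input: The input layer.
    :type input: LayerOutput
    :param name: Name of this layer.
    :type name: basestring
    """
```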

if isinstance(input, LayerOutput):
    input = [input]
elif isinstance(input, Projection):
    input = [input]
Contributor:
This layer cannot support Projection; only mixed_layer can take a Projection as input.

Author:
Fixed.

elif isinstance(input, Projection):
    input = [input]
else:
    assert isinstance(input, collections.Sequence)
Contributor:
This layer only supports a single input. Lines 5902 to 5907 should reduce to:

assert isinstance(input, LayerOutput)

Author:
Fixed.

assert isinstance(input, collections.Sequence)
l = Layer(
    name=name,
    inputs=[x.name for x in input],
Contributor:
inputs=input,

Author:
Fixed.
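
Putting the last three review fixes together, the wrapper's input handling would reduce to roughly the following. This is a sketch of the reviewed shape, not the merged code; Layer and LayerOutput come from the surrounding PaddlePaddle config helpers:

```python
def pixel_softmax_layer(input, name=None, layer_attr=None):
    # Single input only: Projection is reserved for mixed_layer, per the review.
    assert isinstance(input, LayerOutput)
    l = Layer(
        name=name,
        inputs=input,  # pass the LayerOutput directly, not [x.name for x in input]
        # remaining config (type, attrs) elided, as in the quoted diff
    )
```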

wanghaoshuang changed the title from "Add pixel softmax layer for FCN model" to "Add switch order layer for FCN model" on Jul 19, 2017
wanghaoshuang merged commit 1cf9800 into PaddlePaddle:develop on Sep 7, 2017
chengyuz:
What does has_depth() mean in the SwitchOrderLayer(LayerBase) function in config_parser.py? I encounter an error: LayerConfig object has no attribute 'has_depth()'. Thanks.

heavengate pushed a commit to heavengate/Paddle that referenced this pull request on Aug 16, 2021
wanghaoshuang deleted the pixel_softmax_layer branch on May 20, 2022