
any format works for all layers now? #39

Closed
tensor-tang opened this issue Mar 18, 2017 · 1 comment
tensor-tang (Contributor) commented Mar 18, 2017

As far as I know, for the conv layer, in both forward and backward, I can get the best format by using any, right?
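For reference, this is the pattern I mean, a minimal sketch against the current C++ API (the shapes and names such as src_tz are made up for illustration):

```cpp
#include "mkldnn.hpp"
using namespace mkldnn;

int main() {
    auto cpu_engine = engine(engine::cpu, 0);

    // Hypothetical CONV1: batch 1, 3 input channels, 64 output channels,
    // 3x3 kernel, unit stride, no padding.
    memory::dims src_tz     = {1, 3, 224, 224};
    memory::dims weights_tz = {64, 3, 3, 3};
    memory::dims bias_tz    = {64};
    memory::dims dst_tz     = {1, 64, 222, 222};
    memory::dims strides    = {1, 1};
    memory::dims padding    = {0, 0};

    // format::any defers the layout choice to the convolution itself.
    auto src_md     = memory::desc({src_tz}, memory::data_type::f32,
                                   memory::format::any);
    auto weights_md = memory::desc({weights_tz}, memory::data_type::f32,
                                   memory::format::any);
    auto bias_md    = memory::desc({bias_tz}, memory::data_type::f32,
                                   memory::format::any);
    auto dst_md     = memory::desc({dst_tz}, memory::data_type::f32,
                                   memory::format::any);

    auto conv_desc = convolution_forward::desc(prop_kind::forward,
            convolution_direct, src_md, weights_md, bias_md, dst_md,
            strides, padding, padding, padding_kind::zero);
    auto conv_pd = convolution_forward::primitive_desc(conv_desc, cpu_engine);

    // conv_pd.dst_primitive_desc() now describes whatever layout the
    // convolution picked, e.g. nChw8c on AVX2 hardware.
    return 0;
}
```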

How about other layers, like pool?

Take the pooling layer as an example:

Suppose that we have a simple net: CONV1->POOL1->CONV2.

For forward: if the output format of CONV1 is nchw, then POOL1 will directly use nchw as its input format, since the pooling layer does not seem to handle a src format of any (maybe we can add it?).

Likewise for backward: CONV2 may produce its botdiff (the gradient with respect to its input) in nchw format, so POOL1 will directly use it.

We do not want to see these cases, right? Users may hope that nChw8c/nChw16c can be used as the input format where appropriate.

That covers the pooling layer; how about other layers, like batch norm, fc, and concat?

Thanks very much.

vpirogov (Member) commented
Format any is designed to allow compute-intensive layers like convolution and inner product to choose the layout that results in the best performance. The rest of the layers, including relu, lrn, batch norm, and concat/split, will work with all the layouts that come out of the convolutions or inner products. When creating these primitives you should use the previous layer's output descriptor directly. Please refer to the examples or documentation.
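For example, a minimal sketch of that pattern, assuming a convolution primitive descriptor conv_pd created with format any (as in the snippet above; the pooling shapes here are illustrative):

```cpp
// POOL1: 2x2 max pooling with stride 2 over CONV1's 1x64x222x222 output.
memory::dims pool_dst_tz  = {1, 64, 111, 111};
memory::dims pool_kernel  = {2, 2};
memory::dims pool_strides = {2, 2};
memory::dims pool_padding = {0, 0};

// dst may still be format::any, but src reuses the layout CONV1 chose,
// so no reorder is needed between the two primitives.
auto pool_dst_md = memory::desc({pool_dst_tz}, memory::data_type::f32,
                                memory::format::any);

auto pool_desc = pooling_forward::desc(prop_kind::forward, pooling_max,
        conv_pd.dst_primitive_desc().desc(),  // CONV1's output descriptor
        pool_dst_md, pool_strides, pool_kernel,
        pool_padding, pool_padding, padding_kind::zero);
auto pool_pd = pooling_forward::primitive_desc(pool_desc, cpu_engine);
```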
