Add support for one-padding #252
Conversation
As discussed with @lgeiger, I'm fine with using
Nice!
Yes, I think I would slightly prefer naming it padding="SAME_ONE", though padding="ONE" is also fine with me.
I see in larq/larq#438 that @lgeiger uses an additional argument to pass the padding values, which defaults to 0. I think we should use the same interface for bconv as well. Basically, the padding type stays the same, but the default padding value for padding=SAME will be set according to our needs. We can check whether the default value is zero; if it is not, we do all the extra work that needs to be done.
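A minimal sketch of the idea above, assuming the interface proposed in larq/larq#438: the padding mode stays "same" and a separate pad_values argument (defaulting to 0) decides whether the extra correction work is needed. The function name build_bconv and the returned strings are hypothetical, purely for illustration.

```python
def build_bconv(padding="same", pad_values=0):
    """Hypothetical sketch: decide whether an output correction is needed
    based on the pad_values argument, keeping the padding mode unchanged."""
    if padding.lower() != "same" or pad_values == 0:
        # Zero padding: padded entries are 0, which is neither +1 nor -1
        # in the binarized tensor, so an extra correction step is required.
        return "zero-padded, correction needed"
    # Non-zero padding (e.g. 1): padded entries binarize to +/-1, so the
    # plain binary convolution is already correct without extra work.
    return "one-padded, no correction needed"
```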
I don't have a strong opinion on that, but I think I'd prefer to stick with
My problem with
Rebased on master to be able to do the MLIR one-padding PR on top of this.
@lgeiger @Tombana After lots of discussion I finally understand why we are doing it this way. I think it would have been a much better design if we had structured Larq/LCE in the following form:
- The Larq layer QuantConv accepts any float padding value.
- During training we pad the values, then sign the tensor (so the entire padded tensor goes to +1/-1 in float space) and compute the convolution.
- With the MLIR converter we detect which pad value is passed, then pass that value to the BConv op.
- The BConv op can decide, based on the padding value, how to do the correction (if needed at all, because a correction is only needed if the padding in float space was 0; every other padding value would have worked out of the box without any correction).
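The pad-then-sign step described above can be sketched with NumPy. This is an assumed illustration of the semantics, not larq's actual implementation: padding with a non-zero value before binarization keeps the whole tensor in {-1, +1}, while zero padding introduces a third value that later needs correcting.

```python
import numpy as np

def pad_and_sign(x, pad, value):
    """Pad with a constant value, then binarize with sign()."""
    padded = np.pad(x, pad, constant_values=value)
    # sign() maps positives to +1 and negatives to -1; zeros stay 0,
    # which is exactly why zero padding needs a correction afterwards.
    return np.sign(padded)

x = np.array([[0.5, -1.2], [2.0, -0.3]])
one_padded = pad_and_sign(x, 1, 1.0)   # every entry is +1 or -1
zero_padded = pad_and_sign(x, 1, 0.0)  # border entries are 0
```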
* Implement one padding transformation
* Check spatial padding size before fusing
* Prefer ConstantOp over TF_ConstOp
* Correctly convert strided convolutions
This adds support for setting the attribute pad_values = 1.