Micro: port op EXPAND_DIMS from Lite #46258
Comments
Hey @rsun-bdti, how is it going with PR 2? Am I wrong, or is there no header available in reference_ops.h?
@stephanboner, good catch. There is no function for EXPAND_DIMS in lite/kernels/internal/reference/reference_ops.h, and the TFLM op does not depend on a reference implementation, so PR 2 was skipped. I have updated the header above accordingly.
@rsun-bdti cool, thanks for updating. But does it build for you, and can you use the operator? I was struggling to use it today...
@stephanboner: Are you trying to use the Lite kernel EXPAND_DIMS, or the micro kernel? The latter is still under review and has not been merged into master yet.
@rsun-bdti I'm trying to use it in micro, and I implemented both your PRs in my project's TensorFlow source
@stephanboner I see. After PR 3, the micro kernel is not fully functional, nor is it part of the build. There is a PR 4 pending review right now. Hopefully you can use it once that PR is approved and merged into master. As it is, this micro kernel cannot be used yet; that's why this issue is still open. I updated the header above with PR 4.
Oh thanks @rsun-bdti. Sorry for causing such a discussion; I just copied the lite kernel instead of taking the adjusted version from your PR 4, as well as added the op in
@stephanboner My pleasure. The discussion is helpful to me as well; I realize that I need to keep the information for my issues updated. (PS: thanks for approving PR 4)
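As context for why the kernel "cannot be used yet" even when its source is present: TFLM only dispatches to kernels that have been registered with an op resolver. The following is a toy, self-contained sketch of that lookup pattern — the names and types here are illustrative, not TFLM's actual API.

```cpp
#include <cstring>

// Toy illustration of the op-resolver pattern TFLM uses: a kernel that is
// compiled into the binary but never registered cannot be looked up.
struct Registration {
  const char* name;
  int (*invoke)(int);
};

static int ExpandDimsInvoke(int x) { return x; }  // placeholder kernel body

struct ToyResolver {
  Registration regs[8];
  int count = 0;
  void Add(const char* name, int (*fn)(int)) { regs[count++] = {name, fn}; }
  const Registration* Find(const char* name) const {
    for (int i = 0; i < count; ++i)
      if (std::strcmp(regs[i].name, name) == 0) return &regs[i];
    return nullptr;  // op not registered, so the interpreter cannot use it
  }
};
```

In real TFLM the analogous step is adding the op to `MicroMutableOpResolver` (or the all-ops resolver) once the kernel is merged and part of the build.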
@tensorflow/micro
System information
Host OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 18.04
TensorFlow installed from (source or binary): source
TensorFlow version (commit SHA if source): master
Target platform (e.g. Arm Mbed OS, Arduino Nano 33 etc.): Sparkfun Edge
Describe the problem
I am about to port the TF Lite op EXPAND_DIMS to TF Lite Micro.
Please provide the exact sequence of commands/steps when you ran into the problem
PR 1: refactor flatbuffer_conversions parsing function
PR 2: refactor reference implementation from lite/kernels/internal/reference/reference_ops.h into its own header without making any changes
PR 3: copy the EXPAND_DIMS kernel from lite to micro without making any changes. At this point the kernel is in micro but it is not part of the build
PR 4: make the EXPAND_DIMS micro kernel fully functional. Also finish the testing code.
(Update 2021-02-26: There is no function for EXPAND_DIMS in lite/kernels/internal/reference/reference_ops.h, and the TFLM op does not depend on a reference implementation, so PR 2 was skipped)
(Update 2021-02-26: Added PR4 in the sequence)
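For reference, EXPAND_DIMS inserts a dimension of size 1 at a given axis (negative axes count from the end of the output shape). A minimal, self-contained sketch of just the shape rule — not the TFLM kernel itself — might look like:

```cpp
#include <cassert>
#include <vector>

// Illustrative shape logic for EXPAND_DIMS: insert a dimension of size 1
// at `axis`. Negative axes index from the end of the output rank, so
// axis = -1 appends a trailing 1. This is a sketch, not the TFLM kernel.
std::vector<int> ExpandDimsShape(const std::vector<int>& in_shape, int axis) {
  const int out_rank = static_cast<int>(in_shape.size()) + 1;
  if (axis < 0) axis += out_rank;
  assert(axis >= 0 && axis < out_rank);
  std::vector<int> out = in_shape;
  out.insert(out.begin() + axis, 1);
  return out;
}
```

Since only the shape changes and the element count does not, the kernel body can copy (or alias) the input buffer unchanged.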