
Move pruning-preserving quantization scheme out of the experimental API package #738

Merged
copybara-service[bot] merged 1 commit into master from test_379976115 on Jul 2, 2021

Conversation

@copybara-service copybara-service bot commented Jun 17, 2021

Move pruning-preserving quantization scheme out of the experimental API package

The pruning-preserving quantization scheme is now available in tfmot.sparsity.keras.quantization.collaborative_optimizations.
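As a rough usage sketch of the relocated scheme: the class name Default8BitPrunePreserveQuantizeScheme is assumed here for illustration, and the module path follows the description above; both may differ between TFMOT releases.

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Prune a small Keras model first (standard tfmot.sparsity.keras flow).
base_model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1),
])
pruned_model = tfmot.sparsity.keras.prune_low_magnitude(base_model)
# ... fine-tune pruned_model with the tfmot.sparsity.keras.UpdatePruningStep callback ...
stripped_model = tfmot.sparsity.keras.strip_pruning(pruned_model)

# Annotate for quantization-aware training, then apply the pruning-preserving
# scheme so the sparsity mask is kept during quantization.
annotated_model = tfmot.quantization.keras.quantize_annotate_model(stripped_model)

# Assumed class name; module path taken from this PR's description and may
# differ between TFMOT releases.
pqat_scheme = (tfmot.sparsity.keras.quantization.collaborative_optimizations
               .Default8BitPrunePreserveQuantizeScheme())
pqat_model = tfmot.quantization.keras.quantize_apply(annotated_model, pqat_scheme)
```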

@google-cla google-cla bot added the cla: yes (PR contributor has signed CLA) label on Jun 17, 2021
@github-actions github-actions bot added the api-review (API review needed) and technique:qat (regarding tfmot.quantization.keras APIs and docs, for quantization-aware training) labels on Jun 17, 2021
@copybara-service copybara-service bot changed the title from "Move collaborative optimizations out of the experimental API package" to "Move pruning-preserving quantization scheme out of the experimental API package" on Jul 2, 2021
@copybara-service copybara-service bot merged commit 8582015 (PiperOrigin-RevId: 382675092) into master on Jul 2, 2021
@copybara-service copybara-service bot deleted the test_379976115 branch on Jul 2, 2021 at 05:48
Labels
api-review (API review needed), cla: yes (PR contributor has signed CLA), technique:qat (regarding tfmot.quantization.keras APIs and docs, for quantization-aware training)
Projects
None yet
1 participant