
Improve and robustify normalisation #12

Merged 11 commits from ft-improved_normalisation into master on Dec 31, 2017.
Conversation

tspooner (Owner)

Thus far, the projection structs have each implemented their own versions of the various normalisation schemes, such as the l1 or l2 norm. This PR introduces a new geometry::norms module containing methods for computing normalisation constants.

In addition, the projection structs have been refactored to use these new methods, and their tests have been updated accordingly.

Add a new submodule, norms, to the geometry module with utility functions
(a minimal sketch follows this list) for computing the:
- l1 (Taxicab/Manhattan) norm,
- l2 (Euclidean) norm,
- general lp norm,
- max (infinity/uniform/supremum) norm.
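As a rough illustration of what such a module might contain (the function names and signatures here are assumptions, not the crate's actual API), a minimal Rust sketch:

```rust
// Minimal sketch of a geometry::norms-style module; names and signatures
// are illustrative assumptions, not the crate's actual API.

/// l1 (Taxicab/Manhattan) norm: sum of absolute values.
pub fn l1(xs: &[f64]) -> f64 {
    xs.iter().map(|x| x.abs()).sum()
}

/// l2 (Euclidean) norm: square root of the sum of squares.
pub fn l2(xs: &[f64]) -> f64 {
    xs.iter().map(|x| x * x).sum::<f64>().sqrt()
}

/// General lp norm for p >= 1.
pub fn lp(xs: &[f64], p: i32) -> f64 {
    xs.iter()
        .map(|x| x.abs().powi(p))
        .sum::<f64>()
        .powf(1.0 / p as f64)
}

/// Max (infinity/uniform/supremum) norm: largest absolute value.
pub fn linf(xs: &[f64]) -> f64 {
    xs.iter().fold(0.0_f64, |acc, x| acc.max(x.abs()))
}
```

A projection struct can then, for example, divide its feature vector by `l1(&phi)` rather than re-implementing the summation itself.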
As in other implementations found across the web, we now scale the
coefficients using the l2 norm and filter out duplicates. An l1 norm is
then applied to the final feature vector to ensure that sum(phi) = 1.
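A hedged sketch of how those two steps might fit together, reusing the l1 and l2 helpers from the sketch above (the function and variable names are illustrative, not the crate's identifiers):

```rust
/// Scale each coefficient vector by its l2 norm, then drop duplicates.
fn normalise_coefficients(mut coefficients: Vec<Vec<f64>>) -> Vec<Vec<f64>> {
    for c in coefficients.iter_mut() {
        let norm = l2(c.as_slice());
        if norm > 0.0 {
            for v in c.iter_mut() {
                *v /= norm;
            }
        }
    }

    // Illustrative duplicate filter: `dedup` removes adjacent repeats, which
    // is enough when the coefficient vectors are generated in sorted order.
    coefficients.dedup();
    coefficients
}

/// l1-normalise the feature vector so that sum(phi) = 1 (for non-negative phi).
fn normalise_features(phi: &mut [f64]) {
    let z = l1(phi);
    if z > 0.0 {
        for v in phi.iter_mut() {
            *v /= z;
        }
    }
}
```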
@coveralls

Coverage Status

Coverage increased (+2.3%) to 73.086% when pulling fb66db1 on ft-improved_normalisation into cb4b36e on master.

@tspooner tspooner merged commit 311695a into master Dec 31, 2017
@tspooner tspooner deleted the ft-improved_normalisation branch January 1, 2018 15:15