Turn Feature abstract class into a Protocol and lazy load feature modules #307
I noticed that the Preprocessing classes were implemented as a Protocol instead of an abstract class, so I thought it would be a good idea to do the same with the Feature class.
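For reference, a minimal sketch of what the Protocol version could look like. The names `NMFeature`, `calc_feature`, and `MeanFeature` are illustrative assumptions, not necessarily the signatures in this codebase:

```python
from typing import Protocol, runtime_checkable

import numpy as np


@runtime_checkable
class NMFeature(Protocol):
    """Structural interface for feature calculators (hypothetical names)."""

    def calc_feature(self, data: np.ndarray) -> dict[str, float]:
        """Compute the feature values for one batch of data."""
        ...


# Any class with a matching calc_feature method satisfies the protocol;
# no inheritance from NMFeature is required.
class MeanFeature:
    def calc_feature(self, data: np.ndarray) -> dict[str, float]:
        return {"mean": float(np.mean(data))}
```

The upside over an abstract base class is that feature implementations don't need to import (or even know about) the interface module, which also plays well with lazy loading.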
I also renamed the class to NMFeature for clarity; Feature seemed like a very generic word that is repeated many times across the codebase.
I also took the opportunity to change the initialization of the Features class (maybe rename that one too?) so that each feature module is only imported when that feature is actually going to be calculated. That way there are no unnecessary imports, and the module already takes far too long to initialize (>3 seconds just for startup).
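The lazy-import pattern I mean is roughly the following sketch. The registry contents are hypothetical (here a stdlib class stands in for a feature class); the real module paths and class names in the codebase will differ:

```python
import importlib
from functools import cache

# Hypothetical registry: feature name -> (module path, class name).
# A stdlib class is used as a stand-in so the sketch is runnable.
FEATURE_REGISTRY: dict[str, tuple[str, str]] = {
    "decoder": ("json.decoder", "JSONDecoder"),
}


@cache
def get_feature_class(name: str) -> type:
    """Import the feature's module only on first request, then cache it.

    Features that are never requested never trigger their module import,
    which keeps startup time proportional to the features actually used.
    """
    module_path, class_name = FEATURE_REGISTRY[name]
    module = importlib.import_module(module_path)
    return getattr(module, class_name)
```

Combined with a Protocol-based interface, the dispatcher never has to import every concrete feature module up front just to subclass a common base.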
One issue with these changes is that NumPy is now complaining about "division by zero" during feature normalization.
I checked, and the zeros were already present before; for some reason the changes just triggered the warning to be printed. I'll check whether the output is the same, and if it is I'll try to suppress the warning.
But are there supposed to be divisions by zero in the first place? Ending up with NaNs, or with a standard deviation of 0, seems suspicious.
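Rather than suppressing the warning globally, the division itself could be guarded. A sketch of a z-score normalization that leaves constant (zero-std) channels at 0 instead of producing NaN/inf; the function name and axis convention are assumptions, not the repo's actual normalization code:

```python
import numpy as np


def zscore_safe(x: np.ndarray) -> np.ndarray:
    """Z-score normalize along axis 0 (hypothetical helper).

    Channels with zero standard deviation (constant signal) are mapped
    to 0 instead of triggering a division-by-zero RuntimeWarning.
    """
    mean = x.mean(axis=0)
    std = x.std(axis=0)
    # Replace zero stds with 1 before dividing; the numerator is 0 for
    # those channels anyway, so the result is exactly 0, not NaN.
    safe_std = np.where(std == 0, 1.0, std)
    return (x - mean) / safe_std
```

This makes the "constant channel" case an explicit, documented decision instead of a silenced warning, which might also make it easier to spot whether those zero-std channels are expected at all.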