Syncing various fixes
GilesStrong committed Nov 3, 2020
1 parent 5f35ba8 commit b645b61
Showing 4 changed files with 18 additions and 3 deletions.
10 changes: 9 additions & 1 deletion .vscode/settings.json
@@ -1,6 +1,5 @@
@@ -1,6 +1,5 @@
 { "editor.autoIndent": true,
     "python.linting.pylintEnabled": false,
-    "python.linting.flake8Enabled": true,
     "python.linting.enabled": true,
     "python.linting.banditEnabled": false,
     "python.linting.mypyEnabled": false,
@@ -19,4 +18,13 @@
         "MD034": false
     },
     "restructuredtext.confPath": "${workspaceFolder}/docs",
+    "spellright.language": [
+        "en"
+    ],
+    "spellright.documentTypes": [
+        "markdown",
+        "latex",
+        "plaintext"
+    ],
+    "python.linting.flake8Enabled": true,
 }
4 changes: 3 additions & 1 deletion CHANGES.md
@@ -6,13 +6,15 @@

## Additions

- `__del__` method to `FowardHook` class

## Removals

## Fixes

- Potential bug in convolutional models where checking the out size of the head would affect the batchnorm averaging
- Potential bug in `plot_sample_pred` to do with bin ranges

- `ForwardHook` not working with passed hook functions

## Changes

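The batchnorm fix listed above concerns probing a convolutional head's output size while the model is still in training mode, which lets the throwaway probe pass contaminate the batchnorm running statistics. The following is not lumin's code, just a minimal pure-Python sketch of the failure mode and the usual remedy (switching to eval mode for the probe); `MiniBatchNorm` and `probe_out_size` are hypothetical names introduced only for illustration.

```python
class MiniBatchNorm:
    """Toy stand-in for a batchnorm layer that tracks a running mean."""
    def __init__(self, momentum=0.1):
        self.momentum = momentum
        self.running_mean = 0.0
        self.training = True

    def forward(self, batch):
        if self.training:
            # Running stats update on EVERY training-mode forward pass,
            # including a throwaway pass used only to check output shape
            batch_mean = sum(batch) / len(batch)
            self.running_mean = ((1 - self.momentum) * self.running_mean
                                 + self.momentum * batch_mean)
        return [x - self.running_mean for x in batch]

def probe_out_size(layer, dummy_batch, safe=True):
    """Return the output size; safe=True mimics the fix of probing in eval mode."""
    was_training = layer.training
    if safe:
        layer.training = False  # analogous to calling model.eval() before the probe
    try:
        out = layer.forward(dummy_batch)
    finally:
        layer.training = was_training  # restore the original mode
    return len(out)

bn = MiniBatchNorm()
probe_out_size(bn, [10.0, 20.0], safe=False)
print(bn.running_mean)  # the unsafe probe has shifted the running mean to 1.5
```

With `safe=True` the running mean stays untouched by the probe, which is the behaviour the fix restores.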
2 changes: 2 additions & 0 deletions lumin/nn/models/model.py
@@ -204,6 +204,8 @@ def evaluate_from_by(self, by:BatchYielder, callbacks:Optional[List[AbsCallback]
             (weighted) loss of model predictions on provided data
         '''
 
+        # TODO: Fix this to work for incomplete batch
+
         loss = 0
         for x, y, w in by: loss += self.evaluate(x, y, w, callbacks)*by.bs
         return loss/(len(by)*by.bs)
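The TODO flags that `evaluate_from_by` weights every batch by the fixed `by.bs` and divides by `len(by)*by.bs`, which is only unbiased when every batch is full. A hedged sketch of a size-weighted alternative in plain Python (`weighted_mean_loss` is a hypothetical name, and it assumes each batch exposes its true sample count via `len()`):

```python
def weighted_mean_loss(evaluate, batches):
    """Size-weighted mean loss over batches of possibly unequal length.

    `evaluate` returns the mean loss on a single batch. Weighting each
    batch by its actual length keeps the result unbiased even when the
    final batch is incomplete, unlike multiplying by a fixed batch size.
    """
    total_loss, total_n = 0.0, 0
    for batch in batches:
        n = len(batch)
        total_loss += evaluate(batch) * n  # un-average, then re-average once at the end
        total_n += n
    return total_loss / total_n

# A full batch of 4 and an incomplete batch of 2: the size-weighted mean
# equals the plain mean over all 6 samples
mean = lambda b: sum(b) / len(b)
print(weighted_mean_loss(mean, [[1, 2, 3, 4], [5, 6]]))  # 3.5
```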
5 changes: 4 additions & 1 deletion lumin/utils/misc.py
@@ -2,6 +2,7 @@
 from typing import Union, List, Tuple, Optional
 import pandas as pd
 import sympy
+from functools import partial
 
 from sklearn.utils import resample

@@ -128,8 +129,10 @@ class FowardHook():
     '''
     def __init__(self, module:nn.Module, hook_fn:Optional=None):
         self.input,self.output = None,None
-        if hook_fn is not None: self.hook_fn = hook_fn
+        if hook_fn is not None: self.hook_fn = partial(hook_fn, self)
         self.hook = module.register_forward_hook(self.hook_fn)
 
+    def __del__(self): self.remove()
 
     def hook_fn(self, module:nn.Module, input:Union[Tensor,Tuple[Tensor]], output:Union[Tensor,Tuple[Tensor]]) -> None:
         r'''
r'''
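The `FowardHook` change replaces a stored bare `hook_fn` with `partial(hook_fn, self)`, so a user-supplied hook receives the hook object as its first argument, just as the default bound method does. A minimal pure-Python sketch of the pattern (`MiniModule` and `MiniForwardHook` are stand-ins invented for this example, not lumin or PyTorch classes):

```python
from functools import partial

class MiniModule:
    """Stand-in for nn.Module: invokes its registered hook after forward."""
    def __init__(self):
        self._hook = None

    def register_forward_hook(self, fn):
        self._hook = fn

    def forward(self, x):
        out = x * 2
        if self._hook is not None:
            self._hook(self, x, out)  # hooks are called as fn(module, input, output)
        return out

class MiniForwardHook:
    """Sketch of the fix: a passed hook_fn is partially applied to the
    hook instance, so it can store results on the hook like the default."""
    def __init__(self, module, hook_fn=None):
        self.input, self.output = None, None
        if hook_fn is not None:
            # Before the fix the bare function was stored, so it was called
            # as fn(module, input, output) and never saw the hook instance
            self.hook_fn = partial(hook_fn, self)
        module.register_forward_hook(self.hook_fn)

    def hook_fn(self, module, input, output):
        self.input, self.output = input, output

def store_squared(hook, module, input, output):
    hook.output = output ** 2  # custom hook that writes to the hook object

m = MiniModule()
h = MiniForwardHook(m, store_squared)
m.forward(3)
print(h.output)  # 36
```

Without the `partial` binding, `store_squared` would be invoked with only `(module, input, output)` and could never reach the hook object, which matches the "`ForwardHook` not working with passed hook functions" fix in CHANGES.md.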
