Error in build_convolution targeting Brian2 #44

Closed
wilkieolin opened this issue Oct 1, 2019 · 1 comment

wilkieolin commented Oct 1, 2019

I'm building the sample Keras LeNet ANN into an SNN targeting the Brian2 backend. However, there is an error when running the build_convolution step of the conversion, where conn is referenced before being defined:

def build_convolution(self, layer, input_weight=None):
        from snntoolbox.simulation.utils import build_convolution

        delay = self.config.getfloat('cell', 'delay')
        transpose_kernel = \
            self.config.get('simulation', 'keras_backend') == 'tensorflow'
        self._conns, self._biases = build_convolution(layer, delay,
                                                      transpose_kernel)
        self.set_biases()

        print("Connecting layer...")
        for conn in self._conns:
            i = conn[0]
            j = conn[1]
            self.connections[-1].connect(i=i, j=j)
        if input_weight is not None:
            self.connections[-1].w = input_weight.flatten()
        else:
            # --- here conn is referenced without being defined ---
            self.connections[-1].w[i, j] = conn[2]

I believe the following code is appropriate to fix the issue, but wanted to confirm:

def build_convolution(self, layer, input_weight=None):
        from snntoolbox.simulation.utils import build_convolution
        import numpy as np  # for np.array below; may already be imported at module level

        delay = self.config.getfloat('cell', 'delay')
        transpose_kernel = \
            self.config.get('simulation', 'keras_backend') == 'tensorflow'
        self._conns, self._biases = build_convolution(layer, delay,
                                                      transpose_kernel)
        self.set_biases()

        print("Connecting layer...")
        np_conns = np.array(self._conns)

        self.connections[-1].connect(i=np_conns[:,0].astype('int64'), j=np_conns[:,1].astype('int64'))
        if input_weight is None:
            self.connections[-1].w = np_conns[:,2]
        else:
            self.connections[-1].w = input_weight.flatten()
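
For context, here is a minimal standalone sketch (the neuron groups and the conns values are invented for illustration, not taken from the toolbox) of the Brian2 behaviour the fix relies on: Synapses.connect accepts arrays of pre- and post-synaptic indices, and the synaptic weight variable can then be assigned for all created synapses in one vectorized step, matching the (i, j, weight) layout of self._conns.

import numpy as np
from brian2 import NeuronGroup, Synapses

# Toy connection list in the same (i, j, weight) layout as self._conns.
conns = np.array([[0, 1, 0.5],
                  [1, 2, -0.3],
                  [2, 3, 0.8]])

pre = NeuronGroup(4, 'v : 1')
post = NeuronGroup(4, 'v : 1')
syn = Synapses(pre, post, model='w : 1')

# Create all synapses at once from index arrays, then set every weight in a
# single assignment instead of looping over the connections in Python.
syn.connect(i=conns[:, 0].astype('int64'), j=conns[:, 1].astype('int64'))
syn.w = conns[:, 2]

This is the same vectorized pattern as in the proposed fix above, just isolated from the toolbox internals.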

rbodo commented Oct 4, 2019

This indentation error was introduced by a recent pull request and I unfortunately did not catch it in review. Thanks for noticing and proposing the parallelized approach. If you make two cosmetic changes (spaces after the commas, and a line break before j=...), I'd be happy to pull this fix. Thank you!

rbodo pushed a commit that referenced this issue Oct 6, 2019