Compatibility for nengo_lasagne #979

Merged
merged 4 commits into from
Mar 23, 2016

Conversation

@drasmuss (Member) commented Mar 9, 2016:

Some changes I made for compatibility with the nengo_lasagne back end.

The first commit just separates out the code for computing gains and biases into its own function (so that I can reuse that code in nengo_lasagne). Doesn't change anything in reference Nengo.

The second commit is more significant: it adds the option to pass a Distribution for the transform parameter. This isn't a strictly necessary feature (you could always manually sample a correctly sized matrix from the desired distribution and pass that instead), but it is useful, and has been desired in other cases as well (#639, #942).

I would say that this is a post-2.1 change, unless someone feels strongly about having nengo_lasagne be compatible with 2.1.
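As a sketch of the equivalence described above, the snippet below uses a minimal stand-in for a nengo-style distribution (the `Gaussian` class and its `sample` signature here are illustrative, modelled on nengo's `Distribution` interface, not nengo's actual code): passing the distribution lets the builder sample a correctly sized matrix for you, which matches what you'd get by sampling it manually.

```python
import numpy as np

class Gaussian:
    """Minimal stand-in for a nengo-style Distribution."""
    def __init__(self, mean, std):
        self.mean, self.std = mean, std

    def sample(self, n, d=None, rng=np.random):
        shape = (n,) if d is None else (n, d)
        return rng.normal(self.mean, self.std, size=shape)

# With this PR: pass the distribution itself as the transform, and the
# builder samples a (size_out, size_mid) matrix during the build.
dist = Gaussian(0.0, 0.1)
transform = dist.sample(3, 5, rng=np.random.RandomState(0))

# Without the PR: manually sample a correctly sized matrix and pass that.
manual = np.random.RandomState(0).normal(0.0, 0.1, size=(3, 5))

assert transform.shape == (3, 5)
assert np.allclose(transform, manual)
```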

```python
    gain, bias = ens.neuron_type.gain_bias(max_rates, intercepts)

    return gain, bias, max_rates, intercepts
```
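For context on what `gain_bias` computes: in reference Nengo each neuron type solves for the gain and bias that hit the requested max rates and intercepts. The sketch below works the math for the simplest case, a rectified-linear unit where rate(x) = max(0, gain*x + bias); the function name is hypothetical, and real neuron types (e.g. LIF) solve a different pair of equations.

```python
import numpy as np

def gain_bias_rectified_linear(max_rates, intercepts):
    """Solve for gain and bias so that rate(x) = max(0, gain*x + bias)
    is zero at x = intercept and equals max_rate at x = 1.
    From gain*intercept + bias = 0 and gain + bias = max_rate:
        gain = max_rate / (1 - intercept),  bias = -gain * intercept.
    """
    max_rates = np.asarray(max_rates, dtype=float)
    intercepts = np.asarray(intercepts, dtype=float)
    gain = max_rates / (1.0 - intercepts)
    bias = -gain * intercepts
    return gain, bias

gain, bias = gain_bias_rectified_linear([100.0, 200.0], [0.5, -0.5])
assert np.allclose(gain * 1.0 + bias, [100.0, 200.0])          # max rate at x = 1
assert np.allclose(gain * np.array([0.5, -0.5]) + bias, 0.0)   # zero at intercept
```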

@Builder.register(Ensemble) # noqa: C901
Contributor:

Need two blank lines (see flake8 fail).

@arvoelke (Contributor):

The code looks good to me (assuming my question is right), but it's missing unit tests!

@drasmuss (Member, Author):

Yeah it's still a work in progress, not ready to merge yet.

@Seanny123 (Contributor):

I have a vested interest in getting this merged to master because I'll soon be using Nengo with Lasagne. Are unit tests, formatting and rebasing all this requires?

@drasmuss (Member, Author):

Assuming no one objects to the transform-distribution initialization part, yes. That part hasn't really been battle-tested; I just threw it in and tried a few examples.

```diff
@@ -49,6 +49,23 @@ def get_activities(model, ens, eval_points):
         x, model.params[ens].gain, model.params[ens].bias)


+def get_gain_bias(ens, rng=np.random):
```
Contributor:

Why bother extracting this into a function? This refactoring doesn't seem to be used anywhere else in the pull request. Maybe move it to a separate PR?

Contributor:

According to the OP,

(so that I can reuse that code in nengo_lasagne)

As long as there's a unit test for this new function, and for the new use case in general, I'm happy with the rest.

Contributor:

Derp. Don't know how I missed that.

@drasmuss force-pushed the nengo_lasagne_compat branch 2 times, most recently from 4dbbdbc to 78087e3 on March 21, 2016 at 16:58.
@drasmuss (Member, Author):

Rebased and added some unit tests. I also added a changelog entry in the 2.1 section, but as I said I don't think this is critical to have in that release.

@Seanny123 (Contributor):

LGTM

@hunse (Collaborator) commented Mar 22, 2016:

So currently I think this will break if the transform is a distribution and you use a weight solver. That code was already pretty messy, and the distribution handling makes it even more so, so I'm going to look at whether it can be refactored easily.

@hunse (Collaborator) commented Mar 22, 2016:

Okay, I fixed this so that transform distributions work with weight solvers.

I also added a commit to make the transform accessible in BuiltConnection. This makes sense to me because transform is now something that might be sampled, so users might want to know what it is independently of the weights. I think the meaning of transform in BuiltConnection is the same as it was in 2.0, so we don't have any problem there, though we might want to modify the changelog, since we removed transform from BuiltConnection and then added it back in the course of 2.1 development.

```diff
@@ -169,7 +171,7 @@ def get_prepost_signal(is_pre):
     # Sample transform if given a distribution
     transform = (conn.transform.sample(conn.size_out, conn.size_mid, rng=rng)
                  if isinstance(conn.transform, Distribution) else
-                 conn.transform)
+                 np.array(conn.transform))
```
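The dispatch in this diff can be sketched standalone (the `Distribution` and `Uniform` classes below are illustrative stand-ins for nengo's, and `build_transform` is a hypothetical helper, not nengo code): sample if given a distribution, otherwise copy, so the built transform is never a view on the user's array.

```python
import numpy as np

class Distribution:
    """Stand-in base class for nengo-style distributions."""
    def sample(self, n, d=None, rng=np.random):
        raise NotImplementedError

class Uniform(Distribution):
    def __init__(self, low, high):
        self.low, self.high = low, high

    def sample(self, n, d=None, rng=np.random):
        shape = (n,) if d is None else (n, d)
        return rng.uniform(self.low, self.high, size=shape)

def build_transform(transform, size_out, size_mid, rng=np.random):
    # Sample if given a distribution; otherwise np.array() makes a copy,
    # so the built transform never aliases the user's array.
    return (transform.sample(size_out, size_mid, rng=rng)
            if isinstance(transform, Distribution) else
            np.array(transform))

user_matrix = np.ones((2, 3))
built = build_transform(user_matrix, 2, 3)
user_matrix[0, 0] = 5.0          # mutating the user's array...
assert built[0, 0] == 1.0        # ...does not affect the built copy

sampled = build_transform(Uniform(-1, 1), 2, 3, np.random.RandomState(1))
assert sampled.shape == (2, 3)
```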
Member (Author):

The only reason to make a copy here is so that if someone accesses BuiltConnection.transform, it isn't aliased with Connection.transform, right? Just wondering if we could mark it as readonly or something instead, to cut down on memory usage (since these could be large arrays).

Collaborator:

So that's the side benefit. The main reason we want the copy is that if we're building something that doesn't have an ensemble as the pre, the transform simply becomes the weights, and we don't want our weights signal value to be a view on conn.transform.

Collaborator:

The code used to try to be super smart about this, and only made the copy when we were not solving for decoders. But that made things a bit more complicated to read, and I also wanted to avoid copying the sampled distribution, since that's extra computation (though arguably memory savings are more important). Then I realized that if we're storing it in BuiltConnection, it would be nice to have a copy anyway, and that convinced me it's fine to just make a copy right off the bat.

A readonly view is a possibility, though the thing I always worry about is a user creating an array, setting it as the transform, then modifying their original array. There's no way we can make the user's view on their own array readonly, so they'll always be able to do this. But maybe that's their own damn fault?

There are definitely arguments for saving memory, especially when these transforms are in neuron->neuron connections and therefore possibly very large. Could we have an rc setting that copies by default, but lets the user advise not to copy if they need the memory?
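The readonly-view idea discussed here can be sketched with plain NumPy (this shows only the mechanism being debated, not what the PR actually does; the PR keeps the copy). A readonly view shares memory with the original and blocks writes through the view, but, exactly as noted above, the user can still mutate their original array and the view will reflect it.

```python
import numpy as np

transform = np.zeros((3, 3))

# A readonly view shares memory with the original (no copy is made)...
view = transform.view()
view.flags.writeable = False
assert np.shares_memory(view, transform)

# ...and writes through the view raise a ValueError:
try:
    view[0, 0] = 1.0
except ValueError:
    pass
else:
    raise AssertionError("expected readonly view to reject writes")

# But the user can still mutate their original array, and the
# readonly view silently reflects that change:
transform[0, 0] = 2.0
assert view[0, 0] == 2.0
```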

Member (Author):

Ah yeah, I see: you just moved the copy from line 204 to here. So we'll end up with an extra copy of the transform in the pre == Ensemble case, but in that case the transform is probably small, so it doesn't really matter.

Member (Author):

Sorry, I commented on your first post before reading the second. Yeah, I think it's easiest and clearest to just do the copy here. But that would be a nice wishlist item: a config setting that acts like a compiler flag to optimize for memory usage.

@drasmuss (Member, Author):

Those changes all look good to me.

@hunse (Collaborator) commented Mar 23, 2016:

All looks good to me, too. I checked the changelog, and even though we've added transform back into BuiltConnection, that wasn't referenced specifically so it reads fine as-is.

@tbekolay, do you want to do the honours, since my hands are on this now too? I figured we would get this into 2.1, because it might be nice to have it and the changelog entry is already under 2.1 :)

@tbekolay (Member):

Yep, I'll take a last look and then merge :)

@tbekolay tbekolay added this to the 2.1.0 release milestone Mar 23, 2016
drasmuss and others added 4 commits March 23, 2016 17:43
Using Distributions for transforms was not being handled correctly
with weight solvers. Also did some minor refactoring for clarity.
@tbekolay tbekolay merged commit 86ee038 into master Mar 23, 2016
@tbekolay tbekolay deleted the nengo_lasagne_compat branch March 23, 2016 22:13
mundya added a commit to project-rig/nengo_spinnaker that referenced this pull request Mar 24, 2016
Changes to `build_decoders` in nengo/nengo#979 were breaking for
nengo_spinnaker.

This change simply copies an older implementation of `build_decoders` into nengo_spinnaker, which is probably not an ideal solution. Some discussion about which APIs backends should and shouldn't use might be a good idea, to avoid this kind of thing happening regularly.