
removed working memory and product config #814

Merged — merged 1 commit into master from remove-config, Sep 19, 2015
Conversation

Seanny123
Contributor

As discussed in #790, configs should not be passed as parameters into networks anymore, so I removed them from the Product network and the Working Memory network.

I would have changed the Action Selection network too, but that depends on #810 .

I'll squash the commits together once everyone agrees on how I proceeded.

    mem_config = nengo.Config(nengo.Connection)
    mem_config[nengo.Connection].synapse = nengo.Lowpass(0.1)
Contributor

If the config parameter is removed (and I completely agree it should be), this synapse parameter should be exposed. I'd recommend an argument like recurrent_synapse with a default of 0.1. Hmm, although it looks like the network also uses that same value for the connection from the difference output to the memory input. Should those be the same value? Should that be a separate parameter? If they should be the same value, then is there a better parameter name than recurrent_synapse?

Contributor Author

What are common use-cases for changing the recurrent_synapse and the difference_synapse? Are they usually changed by the same amount?

Member

You'd definitely want to be able to change the synapse of the recurrent connection. 0.1 s is a pretty large value, and you can reduce it for a more responsive (but less accurate) system.

As for the difference synapse, you'd usually match the two values. But I can see making the difference synapse bigger than the recurrent synapse (you'd want a slower / delayed approach to the input value)

Contributor

Cool. How about having recurrent_synapse=0.1, difference_synapse=None and if difference_synapse is None then it defaults to whatever recurrent_synapse is?
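A minimal sketch of the defaulting pattern proposed here (the function name is hypothetical and nengo itself isn't needed to illustrate the logic):

```python
def synapse_defaults(recurrent_synapse=0.1, difference_synapse=None):
    """Resolve the two synapse arguments as proposed in this thread."""
    # If the difference synapse isn't given explicitly, fall back to the
    # recurrent value so the two connections stay matched by default.
    if difference_synapse is None:
        difference_synapse = recurrent_synapse
    return recurrent_synapse, difference_synapse
```

Callers who want a slower approach to the input value can then pass a larger difference_synapse explicitly, while everyone else gets matched values for free.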

Member

Yeah. That works.

Contributor Author

So, have we decided to remove the reset_gain and gate_gain, but keep the synapse parameters as they currently are? Alternatively, should this just be marked as a discussion to continue at the next Nengo-dev meeting?

Contributor

I'm +1 on removing reset_gain and gate_gain. I'm also +1 on removing difference_synapse, but I'm willing to let @xchoo 's vote count stronger than mine.

Also, speaking of parameters, fdbk_scale should be changed to just `feedback` (or whatever it's being called in the equivalent case in `spa.State`), for consistency.

As for discussing at the next Nengo-dev meeting, I think most of the main interested parties are already involved, so hopefully we can sort it out here... although we might want to move it to the main discussion thread rather than in this comment on an outdated commit.... :)

Member

I'd take out reset_gain and gate_gain but leave difference_synapse in there.

Contributor

Cool... I'm fine with @xchoo outvoting us 1 to 2.... especially since we're getting fewer parameters anyway, so having one weird one isn't too bad. :)

Member

👍

On 20 August 2015 at 12:48, tcstewar notifications@github.com wrote:

In nengo/networks/workingmemory.py, #814 (comment):

        """Stores a given vector in memory, with input controlled by a gate."""
        if net is None:
            net = nengo.Network(label="Input Gated Memory")
    -   if mem_config is None:
    -       mem_config = nengo.Config(nengo.Connection)
    -       mem_config[nengo.Connection].synapse = nengo.Lowpass(0.1)
    +   mem_config = nengo.Config(nengo.Connection)
    +   mem_config[nengo.Connection].synapse = nengo.Lowpass(0.1)

    if difference_synapse is None:
        difference_synapse = recurrent_synapse
    mem_config = nengo.Config(nengo.Connection)
    mem_config[nengo.Connection].synapse = nengo.Lowpass(recurrent_synapse)
Contributor

I don't think this should be done this way. Get rid of the Config object completely and just do

    nengo.Connection(net.mem.output, net.mem.input,
                     transform=fdbk_scale, synapse=recurrent_synapse)

@Seanny123
Contributor Author

Does this look good now? If so, I'll squash up the commits and mark for it to be merged.

        neuron_nodes=True, label="mem")
    nengo.Connection(net.mem.output, net.mem.input,
                     transform=fdbk_scale,
                     synapse=nengo.Lowpass(recurrent_synapse))
Contributor

This should be synapse=recurrent_synapse

@Seanny123
Contributor Author

Fixed my derps. Thanks @tbekolay and @tcstewar for explaining to me the synapse thing.

        neuron_nodes=True, label="mem")
    nengo.Connection(net.mem.output, net.mem.input,
                     transform=feedback,
                     synapse=nengo.Lowpass(recurrent_synapse))
Member

This still needs to be changed to synapse=recurrent_synapse.

Contributor Author

Okay, I get it now. From now on, whenever the synapse is a parameter, you don't cast it to nengo.Lowpass.

Member

Well, you can. You just don't need to. 😄
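The point being made: Nengo accepts either a bare number or a synapse object for `synapse`, coercing a number into a Lowpass filter with that time constant, so the two spellings are equivalent. A rough sketch of that coercion idea (not Nengo's actual implementation; class and function names here are illustrative stand-ins):

```python
class Lowpass:
    """Stand-in for nengo.Lowpass: a first-order low-pass synapse."""
    def __init__(self, tau):
        self.tau = tau

    def __eq__(self, other):
        return isinstance(other, Lowpass) and self.tau == other.tau


def coerce_synapse(synapse):
    # A bare number is wrapped in a Lowpass with that time constant;
    # synapse objects pass through unchanged.
    if isinstance(synapse, (int, float)):
        return Lowpass(float(synapse))
    return synapse
```

Under this convention, `synapse=0.1` and `synapse=Lowpass(0.1)` resolve to the same filter, which is why the explicit cast in the network code above is redundant.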

@Seanny123
Contributor Author

Squashed and ready to merge.

@tbekolay
Member

Should probably make a note in the changelog about the parameter changes... no need to keep backward compatibility though (I doubt anyone's using these parameters).

@Seanny123
Contributor Author

@tbekolay updated the changelog as requested. <3

@Seanny123
Contributor Author

@tbekolay I rebased, so this is ready to merge.

@tbekolay tbekolay merged commit 70a46e8 into master Sep 19, 2015
@tbekolay tbekolay deleted the remove-config branch September 19, 2015 18:23
@tbekolay tbekolay mentioned this pull request May 10, 2017