
Provide SP expressions as keys to the SPA associative memory #982

Merged Jun 16, 2016 (3 commits)

Conversation

@ikajic (Contributor) commented Mar 14, 2016

I needed this scenario for my model, so I decided to submit a PR in case it is of interest to others. Currently, only single keys can be passed as input and output keys to the associative memory module. This PR slightly changes the existing code so that it also supports expressions in both input and output keys. Then something like this can be done:

import nengo
from nengo import spa

with spa.SPA() as model:
    D = 16

    voc1 = spa.Vocabulary(D)
    voc2 = spa.Vocabulary(D)

    voc1.parse('A+B+C')
    voc2.parse('D+E+F')

    in_keys = ['A', 'B+C']
    out_keys = ['D-E', '0.4*F']

    model.am = spa.AssociativeMemory(input_vocab=voc1,
                                     output_vocab=voc2,
                                     input_keys=in_keys,
                                     output_keys=out_keys
                                    )
    model.inp = spa.State(D, vocab=voc1)
    nengo.Connection(model.inp.output, model.am.input)
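For intuition, the lookup this enables can be sketched in plain NumPy (a hypothetical winner-take-all simplification with made-up random vectors; the real module uses thresholded neural ensembles, so this is an idealization, not Nengo code):

```python
import numpy as np

rng = np.random.RandomState(1)
dim = 64

def unit(v):
    # normalize to a unit-length "semantic pointer"
    return v / np.linalg.norm(v)

A, B, C = (unit(rng.randn(dim)) for _ in range(3))
Dp, E, F = (unit(rng.randn(dim)) for _ in range(3))

in_vecs = np.array([A, B + C])          # parsed input keys: 'A', 'B+C'
out_vecs = np.array([Dp - E, 0.4 * F])  # parsed output keys: 'D-E', '0.4*F'

def am(x):
    # score each input key by dot product, recall the best match's output
    return out_vecs[np.argmax(in_vecs @ x)]

# presenting 'A' recalls (roughly) 'D-E' in this idealized version
assert np.allclose(am(A), Dp - E)
```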

@jgosmann (Collaborator): Seems like it could be useful. Can you add unit tests for this?


# If output vocabulary is not specified, use input vocabulary
# (i.e., autoassociative memory)
if output_vocab is None:
    output_vocab = input_vocab
    output_vectors = input_vectors
    output_keys = input_keys
Collaborator: This line seems unnecessary?

Member: Don't think so...

Collaborator: It only gets used in the else block, which initializes it itself if necessary.

Collaborator: And it wasn't there before...

Contributor (author): Yeah, looking at it now I think that line shouldn't be there. Unless @xchoo wants to share a bit more of his thoughts :)

I added it because I thought the keys would be used as class attributes/passed to network.AssociativeMemory, but that doesn't seem to be the case. It seems like those keys are only used to create the input and output vectors, if I'm not mistaken...

Member: 👍

@pblouw (Contributor) commented Mar 14, 2016

+1 for this PR - I ran into a similar scenario when I put together my CogSci submission a while ago.

@ikajic (Contributor, author) commented Mar 15, 2016

Would this be the right place to add tests: https://github.com/nengo/nengo/blob/master/nengo/spa/tests/test_assoc_mem.py?

@jgosmann (Collaborator): Exactly that place. 🎯

@ikajic (Contributor, author) commented Mar 23, 2016

I've added a few tests, but I would like to discuss how useful they are, since the associative memory output seems quite noisy when doing this sort of mapping. So I allowed a pretty big error margin -- does this make sense at all? @xchoo @jgosmann

Also, did I just mess up stuff in this branch by merging with the master? 😕

@jgosmann (Collaborator): It's the right idea, but the error margin is too large. The proper fix would probably be to improve the associative memory to produce a more precise output instead of overshooting, but I'm not sure how easily that can be achieved.

We could also say that we don't support exact outputs like 0.3 * A (do you need those?) and thus not test for those. We could still support expressions like 0.5 * A + 0.7 * B or A * POS1 and compare the relative strengths of the vectors.

There are a few assertions in there that I don't understand, but I've mostly just skimmed it so far.

Also, did I just mess up stuff in this branch by merging with the master?

Yes (but it's fixable). Never do merges in the Nengo development model; always do a rebase instead. (It might also make sense to configure git pull to do a rebase instead of a merge, but that would not have helped here.)

@xchoo (Member) commented Mar 23, 2016

Hmmmm. I think we might want to document that the associative memory works by computing the dot product between the input vector and the stored key vectors (i.e., it expects the input vector magnitudes to be nominally 1).

There might be some confusion if this is not stated, because a user might put in, as an input SP phrase, something like 0.5*A, and expect the output of the associative memory to be maximal when the input vector is 0.5*A. In reality, though, the output would be 0.25*A.
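A small NumPy check illustrates this scaling behaviour (an arbitrary random pointer, not Nengo code):

```python
import numpy as np

rng = np.random.RandomState(0)
A = rng.randn(64)
A /= np.linalg.norm(A)  # unit-magnitude semantic pointer

# The similarity is a plain dot product, so a scaled input scores
# linearly against the unit-length key...
assert np.isclose(np.dot(0.5 * A, A), 0.5)
# ...and quadratically against a key that is itself scaled to 0.5*A
assert np.isclose(np.dot(0.5 * A, 0.5 * A), 0.25)
```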

vec_sim = sim.data[prob]

final_vecA = vec_sim[100:490].mean(axis=0)
simA = np.dot(voc2.vectors, final_vecA)
Member: I think in this case we want to use the vector norm of the vector difference (or the cosine angle) as the comparison here. The dot product is not accurate when the input vectors do not have a magnitude of 1 (-C+D and 0.3*E are not guaranteed to have a magnitude of 1).
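The difference between the two similarity measures can be seen in a few lines of NumPy (illustrative only; the vector name is arbitrary):

```python
import numpy as np

def cosine(u, v):
    # cosine similarity: dot product of the normalized vectors
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

rng = np.random.RandomState(0)
E = rng.randn(64)
E /= np.linalg.norm(E)

# the raw dot product shrinks with the magnitude of the scaled vector...
assert np.isclose(np.dot(0.3 * E, E), 0.3)
# ...while the cosine angle ignores magnitude entirely
assert np.isclose(cosine(0.3 * E, E), 1.0)
```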

@ikajic (Contributor, author) commented Mar 24, 2016

[...] exact outputs like 0.3 * A (do you need those?) [...] and compare the relative strengths of the vectors.

What do you mean by "compare the relative strengths of the vectors"? I actually had a case where I needed a mapping such as X -> 0.3*Y, but an approximation/noisy version was fine for me.


@jgosmann (Collaborator):

What do you mean by "compare the relative strengths of the vectors"?

Normalizing the vectors to length 1 for comparison (i.e., computing the cosine angle; cf. @xchoo's comment).

@ikajic (Contributor, author) commented Mar 24, 2016

I've updated the tests to use cosine similarity and simplified them by presenting just one input. What the current tests are doing is:

  • testing whether an SP expression can be provided in the input keys
  • testing whether an SP expression can be provided in the output keys
  • testing for the right output values with a 0.2 error margin

I agree with Xuan that it might not be clear what the expected outputs are, so I think documenting how the AM works would fit well in the docstring for the AM module (which is currently a one-liner; expanding it would also contribute to closing #885). But I think it'd make sense to make another PR for the documentation.

@jgosmann (Collaborator):

testing whether an SP expression can be provided in the input keys

What would be the use case of providing an expression like 0.9 * A (vs. just A)? It is not clear to me how this would change the operation of the AM, and unit tests are also, to some degree, a usage example. So I might go with an expression like A + B or A * B where I can clearly see a use case.

You're relying on the order of the pointers in the vocab. It would be nicer (for readability) to access them by name instead. For selecting the probe values, sim.trange() > 0.15 might be more expressive (it doesn't require me to recall what dt is).
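The suggested selection can be sketched like this (with hypothetical probe data standing in for the simulator output):

```python
import numpy as np

dt = 0.001
t = np.arange(1, 501) * dt   # what sim.trange() would return
data = np.ones((500, 64))    # hypothetical probed 64-D vectors

# "times after 150 ms", with no need to remember what dt is
mask = t > 0.15
mean_vec = data[mask].mean(axis=0)
assert mean_vec.shape == (64,)
```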

@jgosmann (Collaborator):

It might also make sense to configure git pull to do rebase instead of merge, but that would not have helped here.

I guess it would have helped, because there is now an additional merge commit which looks exactly like it was produced by a git pull doing a merge. To change that setting, add

[pull]
    rebase = true

to your ~/.gitconfig. I'll see if I can get this branch fixed.

@jgosmann (Collaborator):

That was pretty easy to fix, actually. Just a git rebase -i master (and a git rebase --skip at one point) did it. Note that I had to force-push this branch. To get your local branch in sync, don't do a git pull, but the following:

  1. git fetch
  2. git checkout spaassocmem-keys-as-expr (if not already on that branch)
  3. git reset --hard origin/spaassocmem-keys-as-expr (WARNING: You lose all local changes and commits on this branch. If you have any local changes/commits you want to keep, the process is slightly more complicated. In case you already lost them because you read this too late, they can still be recovered.)

@xchoo (Member) commented Mar 24, 2016

I've updated the tests. 😄

@ikajic (Contributor, author) commented Mar 25, 2016

Thanks for the update @xchoo, looks good to me! Btw., is there a way to get plots when only running these tests? This line: python -m py.test test_assoc_mem.py --plot doesn't work for me.

Regarding use cases: I had an example where I needed keys such as 0.9*A+0.3*B+0.1*C in the input. I had examples where the mappings to the output keys depended on the scaling of A, B, and C.

@tbekolay (Member):

To get plots, the command line option is --plots (note the plural), and the plots get saved to a directory like nengo.simulator.plots in whatever directory you ran py.test from.

@jgosmann (Collaborator):

Is there a way to get plots when only running these tests? This line: python -m py.test test_assoc_mem.py --plot doesn't work for me.

It will save the plots in nengo.simulator.plots.

I had an example where I needed keys such as 0.9*A+0.3*B+0.1*C in the input. I had examples where the mappings to the output keys depended on the scaling of A, B, and C.

Does this only depend on the relative scaling of A, B, and C? Or is the actual length of the vector important?

@ikajic (Contributor, author) commented Mar 25, 2016

Does this only depend on the relative scaling of A, B, and C?

Yeah, it was just the scaling.


# Specify t ranges
t = sim.trange()
t_item1 = (t > 0.075) & (t < 0.1)
Collaborator: I didn't know that you can use the & operator; I always used np.logical_and. This is so much more convenient! :)
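For reference, the two spellings are equivalent on boolean arrays (note that the parentheses in the test code above are required, since & binds more tightly than the comparison operators):

```python
import numpy as np

t = np.linspace(0, 0.2, 201)

m_op = (t > 0.075) & (t < 0.1)             # element-wise & on bool arrays
m_fn = np.logical_and(t > 0.075, t < 0.1)  # the long-hand equivalent
assert np.array_equal(m_op, m_fn)
```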

@jgosmann (Collaborator):

Yeah, it was just the scaling.

In that case I would argue that it is different from 0.9 * A.

Made a few comments in the code, but apart from that it LGTM. 🍰

@pblouw (Contributor) commented Jun 15, 2016

It would be great to get this into master :)

@jgosmann jgosmann self-assigned this Jun 15, 2016
@jgosmann jgosmann removed their assignment Jun 15, 2016
@tcstewar (Contributor):

It would be great to get this into master :)

I also ran into a situation today where this would have been handy... so I'm kicking myself for not reviewing this earlier.

@Seanny123 (Contributor):

LGTM.

@jgosmann (Collaborator) commented Jun 16, 2016

  • This should get a changelog entry.

@jgosmann jgosmann self-assigned this Jun 16, 2016
@jgosmann (Collaborator):

I'll add the changelog entry and clean up the history.

@jgosmann jgosmann added this to the 2.1.1 release milestone Jun 16, 2016
@Seanny123 (Contributor):

Looks even better to me.

ikajic and others added 3 commits June 16, 2016 09:53

This allows using Semantic Pointer expressions like '0.5*A + 0.3*B' as input_keys and output_keys to the AssociativeMemory. Before, it was only possible to give specific Semantic Pointers like 'A'.

Co-authored-by: Xuan Choo <xchoo.mainframe@gmail.com>
@jgosmann jgosmann removed their assignment Jun 16, 2016
@jgosmann (Collaborator):

I cleaned up the history, added the changelog entry, and added another fixup commit. If I get an OK on this, I can merge it.

@Seanny123 (Contributor):

I am now utterly convinced that this is the best this commit can possibly be.

@jgosmann jgosmann self-assigned this Jun 16, 2016
@jgosmann jgosmann merged commit ea15d64 into master Jun 16, 2016
@jgosmann jgosmann removed their assignment Jun 16, 2016