
HTM attention module #2648

Open
breznak opened this issue Oct 8, 2015 · 2 comments

Comments

@breznak
Member

breznak commented Oct 8, 2015

An interesting thought on HTM: the SP's active columns get boosted by anomalous input, thereby creating a "short-term attention memory".

@subutai @cogmission

@cogmission
Contributor

Doesn't boosting affect affinity for all previously seen patterns that are just beneath the connected threshold, globally, without regard for or bias toward any particular pattern? I'm just wondering how that could act as an attention "focusing" mechanism.

@breznak
Member Author

breznak commented Oct 8, 2015

> Doesn't boosting affect affinity for all previously seen patterns that are just beneath the connected threshold, globally, without regard for or bias toward any particular pattern? I'm just wondering how that could act as an attention "focusing" mechanism.

Yes, boosting is the mechanism, but I meant "boost" in the loose sense of the word: amplify the synaptic input (or maybe slightly increase sensitivity, even globally) in situations where "something unexpected happens". That would lead to auto-focusing on the "wrong" details, or to higher sensitivity in risky situations (or at least I believe so).
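A minimal numpy sketch of how I imagine it (all names here are hypothetical, not the nupic `SpatialPooler` API): decay a per-column attention factor toward neutral each step, bump it for columns active during anomalous input, and scale the SP overlaps by it before inhibition.

```python
import numpy as np

class AttentionBoost:
    """Hypothetical short-term, anomaly-driven overlap boost."""

    def __init__(self, num_columns, gain=2.0, decay=0.9):
        self.gain = gain                       # how strongly anomaly amplifies columns
        self.decay = decay                     # per-step decay back toward neutral (1.0)
        self.attention = np.ones(num_columns)  # per-column attention factor

    def update(self, active_columns, anomaly_score):
        # Decay every attention factor toward the neutral value 1.0.
        self.attention = 1.0 + (self.attention - 1.0) * self.decay
        # Bump the columns active during this input, scaled by how
        # unexpected it was (anomaly_score in [0, 1]).
        self.attention[active_columns] += self.gain * anomaly_score

    def apply(self, overlaps):
        # Scale the raw SP overlaps; the usual k-winners inhibition
        # would then run on the boosted values.
        return overlaps * self.attention

# Usage sketch: after an anomalous input, the active columns stay
# easier to activate for a few timesteps.
att = AttentionBoost(num_columns=2048)
overlaps = np.random.randint(0, 40, size=2048).astype(float)
active = np.argsort(overlaps)[-40:]      # stand-in for the SP's winning columns
att.update(active, anomaly_score=0.8)    # "something unexpected happened"
boosted = att.apply(overlaps)            # input to the next inhibition step
```

Because the attention factors decay back to 1.0, the effect is temporary, which is what would make it a short-term attention memory rather than the usual long-term duty-cycle boosting.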
