forked from numenta/nupic.core-legacy
Temporal Memory Examples (Python) #497
Merged
Commits (10)
1dcb9c5  ctrl-z-9000-times  Merge TM Examples from nupic.py. Original code, no changes.
3c5e775  ctrl-z-9000-times  TM Examples (Python): used "2to3" to convert to Python 3
2456f80  ctrl-z-9000-times  TM Bindings, TM Documentation
fed24ed  ctrl-z-9000-times  Revised python example hello_tm
cd86f33  ctrl-z-9000-times  Revised tm_high_order.py - TM Python Example
6f019a3  ctrl-z-9000-times  TM bindings & printParameter cleanup
e938ecd  ctrl-z-9000-times  Merge branch 'master' into tm-examples
ae6b3bd  ctrl-z-9000-times  TM Examples Review Feedback *Thanks D. Keeney*
12eb10a  ctrl-z-9000-times  TM Python Examples - Documentation fixes
250ca73  ctrl-z-9000-times  Merge branch 'master' into tm-examples
@@ -1,2 +1,4 @@
 from nupic.bindings.algorithms import __doc__

+from nupic.bindings.algorithms import SpatialPooler
+from nupic.bindings.algorithms import TemporalMemory
@@ -0,0 +1,11 @@
Temporal Memory Sample Code
=====

This directory contains a number of files that demonstrate how to use the
temporal memory algorithm directly.

The best place to start is hello_tm.py

WARNING: understanding these files requires building up a very detailed
knowledge of how the temporal memory works in HTMs. The documentation is not
great at this level of detail - any suggestions or help appreciated!
Empty file.
@@ -0,0 +1,135 @@
# ----------------------------------------------------------------------
# Numenta Platform for Intelligent Computing (NuPIC)
# Copyright (C) 2013, Numenta, Inc.
#               2019, David McDougall
#
# Unless you have an agreement with Numenta, Inc., for a separate license for
# this software code, the following terms and conditions apply:
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero Public License version 3 as
# published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU Affero Public License for more details.
#
# You should have received a copy of the GNU Affero Public License
# along with this program.  If not, see http://www.gnu.org/licenses.
#
# http://numenta.org/licenses/
# ----------------------------------------------------------------------

__doc__ = """
This program shows how to access the Temporal Memory algorithm directly. This
program demonstrates how to create a TM instance, train it, get predictions and
anomalies, and inspect the state.

The code here runs a very simple version of sequence learning, with one
cell per column. The TM is trained with the simple sequence A->B->C->D->E
"""

from nupic.bindings.sdr import SDR
from nupic.algorithms import TemporalMemory as TM

# Utility routine for printing an SDR in a particular way.
def formatBits(sdr):
  s = ''
  for c in range(sdr.size):
    if c > 0 and c % 10 == 0:
      s += ' '
    s += str(sdr.dense.flatten()[c])
  s += ' '
  return s
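Since SDR comes from the nupic bindings, here is a minimal pure-Python sketch of what formatBits produces, using a plain list of 0/1 ints in place of sdr.dense (the name format_bits is ours, not part of the bindings):

```python
def format_bits(bits, group=10):
    # Mimics the formatBits helper above on a plain list of 0/1 ints,
    # grouping the digits in blocks of `group` for readability.
    s = ''
    for i, b in enumerate(bits):
        if i > 0 and i % group == 0:
            s += ' '
        s += str(b)
    return s + ' '

# Ten set bits followed by ten clear bits, as in the "A" encoding below.
print(format_bits([1] * 10 + [0] * 10))
```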

def printStateTM( tm ):
  # Useful for tracing internal states
  print("Active cells " + formatBits(tm.getActiveCells()))
  print("Winner cells " + formatBits(tm.getWinnerCells()))
  tm.activateDendrites(True)
  print("Predictive cells " + formatBits(tm.getPredictiveCells()))
  print("Anomaly", tm.anomaly * 100, "%")
  print("")
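The tm.anomaly value printed above is the TM's raw anomaly score. As a rough sketch (our own function, assuming the standard HTM definition rather than quoting the library's code), it is the fraction of currently active columns that were not predicted on the previous step:

```python
def raw_anomaly(active_columns, predicted_columns):
    # Assumed definition of the raw anomaly score: the fraction of
    # active columns that were NOT among the predicted columns.
    # 0.0 = fully predicted input, 1.0 = fully surprising input.
    active = set(active_columns)
    if not active:
        return 0.0
    return 1.0 - len(active & set(predicted_columns)) / len(active)

print(raw_anomaly({1, 2, 3, 4}, {1, 2}))  # half the input was predicted
```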

print("################################################################################")
print(__doc__)
print("################################################################################")
print("")
print("Creating the Temporal Memory")
tm = TM(columnDimensions = (50,),
        cellsPerColumn=1,
        initialPermanence=0.5,
        connectedPermanence=0.5,
        minThreshold=8,
        maxNewSynapseCount=20,
        permanenceIncrement=0.1,
        permanenceDecrement=0.0,
        activationThreshold=8,
        )
tm.printParameters()
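As a hedged illustration of two of the parameters above (a pure-Python sketch of the concept, not the TM implementation; the example itself uses 8 for both thresholds, the values below are ours): a segment whose overlap with the input reaches activationThreshold becomes active and drives predictions, while minThreshold is the lower bar for a "matching" segment that is merely a candidate for learning.

```python
# Illustrative values only; hello_tm.py sets both thresholds to 8.
ACTIVATION_THRESHOLD = 8
MIN_THRESHOLD = 6

def segment_state(overlap):
    # overlap = number of active connected synapses on a dendritic
    # segment. Active segments put their cell into the predictive
    # state; matching segments are eligible for synapse learning.
    if overlap >= ACTIVATION_THRESHOLD:
        return "active"
    if overlap >= MIN_THRESHOLD:
        return "matching"
    return "inactive"

print(segment_state(9), segment_state(7), segment_state(3))
```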
print("""
Creating inputs to feed to the temporal memory. Each input is an SDR
representing the active mini-columns. Here we create a simple sequence of 5
SDRs representing the sequence A -> B -> C -> D -> E """)
dataset = { inp : SDR( tm.numberOfColumns() ) for inp in "ABCDE" }
dataset['A'].dense[0:10]  = 1   # Input SDR representing "A", corresponding to mini-columns 0-9
dataset['B'].dense[10:20] = 1   # Input SDR representing "B", corresponding to mini-columns 10-19
dataset['C'].dense[20:30] = 1   # Input SDR representing "C", corresponding to mini-columns 20-29
dataset['D'].dense[30:40] = 1   # Input SDR representing "D", corresponding to mini-columns 30-39
dataset['E'].dense[40:50] = 1   # Input SDR representing "E", corresponding to mini-columns 40-49
# Notify the SDR object that we've updated its dense data in-place.
for z in dataset.values():
  z.dense = z.dense
for inp in "ABCDE":
  print("Input:", inp, " Bits:", formatBits( dataset[inp]) )
print("")
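For readers without the bindings installed, the encoding built above can be sketched with plain lists standing in for the SDR objects (names below are ours): five non-overlapping 50-bit patterns, 10 set bits per letter.

```python
# Build five non-overlapping 50-bit encodings, 10 bits per letter,
# mirroring the dataset above with plain lists in place of SDRs.
N_COLUMNS = 50
BITS_PER_LETTER = 10

toy_dataset = {}
for i, letter in enumerate("ABCDE"):
    bits = [0] * N_COLUMNS
    start = i * BITS_PER_LETTER
    for j in range(start, start + BITS_PER_LETTER):
        bits[j] = 1          # e.g. "C" sets mini-columns 20-29
    toy_dataset[letter] = bits

print("C:", toy_dataset['C'])
```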

print("################################################################################")
print("")
print("""Send this simple sequence to the temporal memory for learning.""")
print("""
The compute method performs one step of learning and/or inference. Note: here
we just perform learning but you can perform prediction/inference and learning
in the same step if you want (online learning).
""")
for inp in "ABCDE": # Send each letter in the sequence in order
  print("Input:", inp)
  activeColumns = dataset[inp]

  print(">>> tm.compute()")
  tm.compute(activeColumns, learn = True)

  printStateTM(tm)
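With one cell per column, the TM in this example effectively learns first-order transitions. A toy stand-in for the learn-then-predict flow (ours, not the TM API) shows what one pass over A->B->C->D->E buys at this level of abstraction:

```python
# Toy first-order "sequence memory": a single training pass records
# each input's successor, which is roughly what one-cell-per-column
# learning amounts to for this simple sequence.
successor = {}
prev = None
for inp in "ABCDE":
    if prev is not None:
        successor[prev] = inp
    prev = inp

# "Inference": after seeing 'C', the model predicts 'D', analogous to
# the predictive cells printed after input C.
print(successor)
```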

print("""The reset command tells the TM that a sequence just ended and essentially
zeros out all the states. It is not strictly necessary but it's a bit
messier without resets, and the TM learns quicker with resets.
""")
print(">>> tm.reset()")
print("")
tm.reset()

print("################################################################################")
print("")
print("""Send the same sequence of vectors and look at predictions made by
temporal memory.

The following prints out the active cells, predictive cells, active segments and
winner cells.

What you should notice is that the mini-columns where active state is 1
represent the SDR for the current input pattern and the columns where predicted
state is 1 represent the SDR for the next expected pattern.
""")
for inp in "ABCDE":
  print("Input:", inp)
  activeColumns = dataset[inp]

  print(">>> tm.compute()")
  tm.compute(activeColumns, learn = False)

  printStateTM(tm)
A TM with 1 cell per column does not really make sense (no higher-order learning). Unless it's necessary for the example, this should use a "normal" number of cells, i.e. 4, 8, ...
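To illustrate the reviewer's point with a pure-Python toy (ours, not the TM): a first-order model, which is all one cell per column can represent, cannot distinguish two sequences that share an element, while multiple cells per column let the TM represent the shared element in context (e.g. "B after A" vs. "B after X").

```python
from collections import defaultdict

# Toy first-order sequence memory: each symbol maps to the set of
# symbols ever observed to follow it. Training on two sequences that
# share 'B' makes them indistinguishable after the shared element.
transitions = defaultdict(set)
for seq in ("ABC", "XBY"):
    for prev, nxt in zip(seq, seq[1:]):
        transitions[prev].add(nxt)

# After 'B' the model predicts both 'C' and 'Y': it has lost the
# context of which sequence it is in. That ambiguity is what
# higher-order memory (cellsPerColumn > 1) removes.
print(sorted(transitions['B']))  # ['C', 'Y']
```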