Update working-with-sdrs.md
MahdiehPirmoradian committed Jan 21, 2024
1 parent 642e18e commit ba79eb2
source/NeoCortexUtils/docs/working-with-sdrs.md (4 additions, 3 deletions)
```diff
@@ -1,12 +1,13 @@
 # working-with-sdrs
 
 ## Introduction:
-Neural network is a focal element in the area of machine learning. Inspired by the biological neurons that are present in the human brain, an artificial neural network is designed that mimics the human brain’s behavior, helping computer programs to identify patterns and answers to these related issues. It would be able to perform actions like the human brain and have the capability of learning things. These neural networks work on the principle of learning input/output operations. In this project, SDR representation has been implemented in a variety of ways, including SDR as indices and bitmaps. Furthermore, we developed methods for comparing two SDRs by using intersection, union, and overlap. In addition, we have added a new representation of Spatial pooler learning as a "Column/Overlap" ratio, which is another representation of a heatmap.
+Neural networks represent a cornerstone in the field of machine learning, drawing inspiration from the biological neurons in the human brain. These artificial neural networks are engineered to emulate the brain's ability to recognize patterns and solve complex problems. Central to this capability is their potential to learn and perform tasks akin to human cognition, grounded in the principle of learning from input/output operations.
 
-The inputs that we are using are scalar values and images. We specified how these inputs are converted to SDR. Furthermore, this procedure of SDR representations involves the use of Encoders, Spatial Pooler (SP), and Temporal Memory (TM). Encoders are the basic components used in this network, which takes human justifiable information as input data i.e. (image, scalar value), and changes it to machine-readable format, binary array with n size. SP uses these encoded binary arrays from encoders as input for the generation of SDRs.
+In our project, we have explored various implementations of Sparse Distributed Representation (SDR), including using SDRs as indices and bitmaps. Our methods for comparing SDRs involve techniques like intersection, union, and overlap calculations. Additionally, we've introduced a novel concept: representing Spatial Pooler learning through a "Column/Overlap" ratio, akin to a heatmap representation.
 
-TM is used to learn the sequence of these generated SDRs which are given as input from the Spatial Pooler (SP).
+The inputs for our neural network are scalar values and images. We have detailed the process of converting these inputs into SDRs, which is a crucial step in our methodology. This conversion involves several key components: Encoders, the Spatial Pooler (SP), and Temporal Memory (TM). Encoders serve as the initial processing unit, transforming human-interpretable data (such as images or scalar values) into a binary array format that is machine-readable. The Spatial Pooler then takes these encoded arrays and generates SDRs.
 
+Finally, Temporal Memory plays a pivotal role in learning the sequences of these SDRs, which are fed from the Spatial Pooler. This learning process is fundamental in enabling the neural network to understand and predict patterns in the data, a critical aspect of machine learning.
 
 #### What is an SDR:
 According to recent research in neuroscience, our brain uses SDRs to process information. SDRs are the binary representation of data which is approximately 2% of bits that are active. In SDRs, each bit has a meaning i.e. the active bits in the same places of two different vectors make them semantically similar. By comparing SDRs of different samples, the similarity between them can be estimated. For storing the SDRs, a list of indices of active bits is kept which saves a lot of space.
```
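The "What is an SDR" paragraph in the diff notes that SDRs are stored as a list of active-bit indices to save space. A minimal sketch of that idea in Python follows (the NeoCortexApi project itself is C#; the function names here are illustrative, not the library's API):

```python
# Sketch: storing an SDR as a list of active-bit indices instead of a
# dense binary vector. Helper names are hypothetical.

def to_indices(dense):
    """Dense 0/1 list -> sorted indices of the active bits."""
    return [i for i, bit in enumerate(dense) if bit == 1]

def to_dense(indices, size):
    """Index list -> dense 0/1 list of the given size."""
    dense = [0] * size
    for i in indices:
        dense[i] = 1
    return dense

# A toy 20-bit SDR with 2 active bits. Real SDRs have thousands of bits
# with only ~2% active, so the index form saves far more space there.
sdr = [0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
idx = to_indices(sdr)                    # [2, 13]
assert to_dense(idx, len(sdr)) == sdr    # round-trips losslessly
```

With ~2% sparsity, the index list holds about n/50 integers instead of n bits, and the dense form is recoverable whenever the vector size is known.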
Expand Down
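The introduction mentions comparing two SDRs via intersection, union, and overlap, and that shared active bits make two SDRs semantically similar. A minimal sketch of those three operations on index-list SDRs, assuming hypothetical helper names (not the project's actual methods):

```python
# Sketch: comparing two SDRs, each given as a list of active-bit indices.

def overlap(a, b):
    """Number of active bits the two SDRs share."""
    return len(set(a) & set(b))

def intersection(a, b):
    """Active bits present in both SDRs."""
    return sorted(set(a) & set(b))

def union(a, b):
    """Active bits present in either SDR."""
    return sorted(set(a) | set(b))

sdr1 = [2, 7, 13, 21]
sdr2 = [2, 9, 13, 40]
print(overlap(sdr1, sdr2))       # 2
print(intersection(sdr1, sdr2))  # [2, 13]
print(union(sdr1, sdr2))         # [2, 7, 9, 13, 21, 40]
```

A higher overlap count indicates greater semantic similarity between the two representations, since matching bit positions carry matching meaning.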

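The introduction describes encoders as the first stage of the pipeline, turning a human-interpretable value into an n-bit binary array that the Spatial Pooler consumes. The toy below sketches that idea for a scalar input; it is an illustrative simplification, not the NeoCortexApi scalar encoder:

```python
# Sketch: a toy scalar encoder. It maps a value in [min_val, max_val]
# to an n-bit binary array with w contiguous active bits, so nearby
# values share active bits (and thus overlap in their SDRs).

def encode_scalar(value, min_val, max_val, n=20, w=3):
    if not (min_val <= value <= max_val):
        raise ValueError("value out of range")
    # First active bit, scaled over the n - w possible start positions.
    start = round((value - min_val) / (max_val - min_val) * (n - w))
    bits = [0] * n
    for i in range(start, start + w):
        bits[i] = 1
    return bits

print(encode_scalar(0, 0, 10))   # active bits at the low end of the array
print(encode_scalar(10, 0, 10))  # active bits at the high end
```

Arrays produced this way would then feed the Spatial Pooler, which learns to produce stable SDRs from them; close input values yield overlapping encodings, which is what lets downstream comparisons capture similarity.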