World ComplexiPhi
THIS WORLD DOES NOT SHIP WITH DEFAULT MABE. The version of the code is out of date, but it could be brought up to current standards without too much effort. If you are interested in getting the code for this world, please contact us.
This world is also known as the "blocky catch" world. It implements the so-called active categorical perception task defined by Beer (1996, 2003), which we have already used in several publications (Marstaller et al. 2013, Albantakis et al. 2014, Schossau et al. 2015). In this environment, small and large blocks fall toward the agent, which has to navigate in such a way that it catches the small ones and avoids the large ones. All of this happens in a 2D world, resembling the game Pong quite a bit.
The blocks are typically 2 or 4 pixels (units) wide and fall toward the agent over 20 time steps; while falling, they move either to the left or to the right, never straight down. The agent has upward-facing sensors on its left and right sides, with a gap in the middle, and it can move left or right. Typically the agent is configured in such a way that it cannot tell from a single world observation what the size of a block is; hence it has to evolve memory (R) and also needs to integrate information temporally and spatially (phi).
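The mechanics described above can be sketched in a few lines. This is an illustrative toy model, not MABE's actual implementation: the function and parameter names (`sensor_readout`, `drop_block`, the catch criterion of final sensor overlap) are assumptions, and the world is assumed to wrap horizontally.

```python
# Toy sketch of a single block drop in a 16-unit-wide world that wraps
# horizontally. Names and the catch criterion are illustrative only.
WORLD_WIDTH = 16
FALL_STEPS = 20

def sensor_readout(block_left, block_width, agent_pos,
                   n_left=2, n_right=2, gap=4):
    """Return sensor bits: 1 where a sensor sits directly under the block."""
    block_cells = {(block_left + i) % WORLD_WIDTH for i in range(block_width)}
    span = n_left + gap + n_right
    bits = []
    for offset in range(span):
        if n_left <= offset < n_left + gap:
            continue  # the gap in the middle has no sensors
        cell = (agent_pos + offset) % WORLD_WIDTH
        bits.append(1 if cell in block_cells else 0)
    return bits

def drop_block(block_left, block_width, drift, agent_policy, agent_pos=0):
    """Run one 20-step drop; agent_policy maps sensor bits to a -1/0/+1 move."""
    for _ in range(FALL_STEPS):
        bits = sensor_readout(block_left, block_width, agent_pos)
        agent_pos = (agent_pos + agent_policy(bits)) % WORLD_WIDTH
        block_left = (block_left + drift) % WORLD_WIDTH  # always drifts sideways
    # one plausible catch criterion: block overlaps a sensor at the end
    return any(sensor_readout(block_left, block_width, agent_pos))
```

Because a width-2 and a width-4 block can produce the same sensor pattern at a single time step, a memoryless policy cannot distinguish them; only the sequence of readings over the 20-step fall does.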
The world needs to be set up by defining the number of sensors on the left and right side (nrOfSensorsLeft, nrOfSensorsRight) as well as the gap width (gapWidth). Keep in mind that the world is 16 pixels (units) wide, so the sum of those three values should not be larger than 16, obviously. The blocks you can drop can be varied; you supply a comma-separated list of bitmasks. Sounds difficult, but isn't: a "3" in binary is 11 and thus encodes a block two units wide, while a "15" in binary is 1111 and thus encodes a block four units wide. The default value of blockPatternsString is therefore 3,15 . When you change that, the entries at odd positions in the list (1st, 3rd, ...) are to be caught, and those at even positions (2nd, 4th, ...) are to be avoided. 3,15,1,31 for example encodes blocks of widths 2, 4, 1, and 5; the blocks of width 1 and 2 should be caught, and the blocks of width 4 and 5 should be avoided.
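The bitmask convention above can be checked with a short decoder. This is a sketch for illustration; the function name `decode_block_patterns` is hypothetical and not part of MABE, and it assumes the bitmasks are contiguous runs of ones (as in all the examples above), so the bit length of the mask equals the block width.

```python
def decode_block_patterns(block_patterns_string):
    """Decode a blockPatternsString into (width, action) pairs.

    Each comma-separated value is a bitmask whose binary form gives the
    block shape ("3" -> 11 -> width 2; "15" -> 1111 -> width 4).
    Entries at odd list positions (1st, 3rd, ...) are to be caught,
    entries at even positions (2nd, 4th, ...) are to be avoided.
    """
    result = []
    for index, token in enumerate(block_patterns_string.split(",")):
        mask = int(token)
        # bit_length gives the width for contiguous masks like 3, 15, 31
        width = mask.bit_length()
        action = "catch" if index % 2 == 0 else "avoid"
        result.append((width, action))
    return result
```

For example, `decode_block_patterns("3,15,1,31")` yields widths 2, 4, 1, 5 with actions catch, avoid, catch, avoid, matching the description above.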
Cheers Arend
Beer, R. (1996). Toward the evolution of dynamical neural networks for minimally cognitive behavior. In P. Maes, M. Mataric, J. Meyer, J. Pollack, & S. Wilson (Eds.), From animals to animats: Proceedings of the Fourth International Conference on Simulation of Adaptive Behavior (pp. 421–429). Cambridge, MA: MIT Press.
Beer, R. (2003). The dynamics of active categorical perception in an evolved model agent. Adaptive Behavior, 11, 209–243.
Marstaller, L., Hintze, A., & Adami, C. (2013). The evolution of representation in simple cognitive networks. Neural Computation, 25(8), 2079–2107.
Albantakis, L., Hintze, A., Koch, C., Adami, C., & Tononi, G. (2014). Evolution of integrated causal structures in animats exposed to environments of increasing complexity. PLoS Computational Biology, 10(12), e1003966.
Schossau, J., Adami, C., & Hintze, A. (2015). Information-theoretic neuro-correlates boost evolution of cognitive systems. Entropy, 18(1), 6.