
ViciousSquid/Dosidicus


"What if a Tamagotchi had a neural network and could learn stuff?" - Gigazine , Hackaday


Dosidicus electronicus

A transparent cognitive sandbox disguised as a digital pet squid with a neural network you can see thinking

  • Part educational neuro tool, part sim game, part fever dream
  • A unique intersection of 1990s retro-gaming aesthetics and modern computational neuroscience
  • Build-your-own neural network: learn how an NN works by raising one as a pet

Compiled binaries for Windows, macOS, and Linux: see the Releases page.

Linux quick setup (runs the repo's linux_setup.sh):

curl -sSL https://raw.githubusercontent.com/ViciousSquid/Dosidicus/2.6.2.0_LatestVersion/linux_setup.sh | bash

Myth & Mechanism

Dosidicus is a digital squid born with a randomly wired brain.

Feed him, stimulate neurons, watch him learn.

  • He starts with 8 neurons.
  • He grows new structure via neurogenesis and rewires it through Hebbian learning.
  • He forms memories.
  • He develops quirks.

Every squid is different. Every save file is a cognitive history.
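The Hebbian rewiring mentioned above can be sketched in a few lines of NumPy. This is a minimal illustration, not the project's actual update rule; the learning rate, decay, and initial wiring here are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

n_neurons = 8                                          # the squid starts with 8 neurons
weights = rng.normal(0, 0.1, (n_neurons, n_neurons))   # random initial wiring
np.fill_diagonal(weights, 0.0)                         # no self-connections

def hebbian_step(weights, activations, lr=0.01, decay=0.001):
    """'Neurons that fire together wire together': strengthen connections
    between co-active neurons, with a small decay so weights stay bounded."""
    weights = weights + lr * np.outer(activations, activations)
    weights = weights * (1.0 - decay)
    np.fill_diagonal(weights, 0.0)
    return weights

# Stimulate two neurons together repeatedly; the weight between them grows.
acts = np.zeros(n_neurons)
acts[[0, 3]] = 1.0
for _ in range(100):
    weights = hebbian_step(weights, acts)

print(weights[0, 3])   # much stronger than any untouched connection
```

Early stimulation permanently biases which connections dominate, which is why no two squids end up wired the same.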

Under the hood runs the STRINg simulation engine:

  • Built from scratch in NumPy
  • No TensorFlow. No PyTorch. No NEAT.
  • Fully visible neuron activations
  • Structural growth over time
  • Dual memory system
  • Headless training mode
  • Most AI is a black box - Dosidicus lets you see the mind forming: every neuron is visible, stimulatable, understandable.
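Structural growth can be pictured the same way. In a toy NumPy model (an illustration, not the STRINg engine's actual code), neurogenesis amounts to padding the square weight matrix with a new row and column of small random connections:

```python
import numpy as np

def add_neuron(weights, init_scale=0.05, rng=None):
    """Grow the network by one neuron: pad the square weight matrix
    with a new row and column of weak random connections."""
    rng = rng or np.random.default_rng()
    n = weights.shape[0]
    grown = np.zeros((n + 1, n + 1))
    grown[:n, :n] = weights                       # keep the existing wiring intact
    grown[n, :n] = rng.normal(0, init_scale, n)   # new neuron's outgoing weights
    grown[:n, n] = rng.normal(0, init_scale, n)   # new neuron's incoming weights
    return grown

weights = np.zeros((8, 8))   # the initial 8-neuron brain
weights = add_neuron(weights)
print(weights.shape)         # now a 9x9 matrix
```

Because the old wiring is preserved, everything the network has already learned survives the growth step; the new neuron starts weakly connected and earns its role through later learning.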

The squid serves as a digital pioneer in our quest to understand the mechanisms of thought and the evolution of autonomy in a synthetic world.

Want the full conceptual philosophy behind Dosidicus? Read the Cognitive Sandbox Manifesto


Share Your Squid

No two squids are wired the same.

Early interactions permanently alter their structure. Tiny differences amplify. Habits form. Fears emerge. Personalities drift.

Your squid's brain is a cognitive history - shaped by you.

So share it.

  • Export save files and let others explore your squid's neural structure.

  • Post screenshots of strange activation patterns and unexpected growth.

  • Show bizarre learned behaviors (Why is yours afraid of poop?)

  • Compare cognitive histories and trace how experience shaped structure.

  • Did yours grow 40 neurons?

  • Did it develop a persistent avoidance loop?

  • Did you accidentally create a neurotic reward spiral?

Every squid is an experiment.


Docker

Two targets are provided: headless (CLI trainer) and gui (PyQt5 app with X11).

Headless (recommended for containers):

docker build -t dosidicus:headless --target headless .
docker run --rm -v ${PWD}/headless_output:/app/output dosidicus:headless --ticks 10000 --output /app/output/trained_brain.json

GUI (Linux host with X11 or WSLg):

docker build -t dosidicus:gui --target gui .
docker run --rm \
  -e DISPLAY=$DISPLAY \
  -e QT_X11_NO_MITSHM=1 \
  -v /tmp/.X11-unix:/tmp/.X11-unix:rw \
  -v ${PWD}/saves:/app/saves \
  -v ${PWD}/logs:/app/logs \
  dosidicus:gui

Compose:

docker compose up --build
docker compose --profile gui up --build

WSLg note: If the GUI fails to start with a Qt platform plugin error, try:

export QT_QPA_PLATFORM=wayland
docker compose --profile gui up --build

Note: On Windows without WSLg, you will need an X server and a valid DISPLAY value to run the GUI container.

Note: Building the Docker container on Windows ARM64 will fail because there is no PyQt5 wheel for that platform - use the prebuilt binary from the Releases page instead.

Troubleshooting (quick):

  • If DISPLAY is empty in WSL: WSLg is not active. Use WSLg or run an X server on Windows.
  • If Docker errors mention docker_engine/pipe not found: start Docker Desktop and ensure WSL integration is enabled.
  • If GUI still exits with Qt plugin errors: rebuild the image (docker compose --profile gui build --no-cache) and retry.

Project Overview

  • 51,994 lines, one developer, 28 months, GPL-2.0 license

  • Dependencies:

    • Python ^3.9
    • PyQt5 ^5.15 (GUI framework)
    • numpy ^1.21 (neural network computations)
    • Optional: onnxruntime or onnxruntime-directml
  • Core Structure: Modular codebase in src/ including brain designer, decision engine, learning algorithms, personality traits, memory management, UI components, and interaction systems. Entry point via main.py.

Key Project Components

  • Plugin System: Extensible architecture with built-in plugins for achievements (tracking milestones) and multiplayer (networked interactions).
  • Save System: Persistent saves in saves/ for pet states, autosaves, and achievement logs.
  • Headless Mode: Standalone training and simulation in headless/ for GUI-less operation, ideal for background training or server environments (experimental)
  • Custom Brains: Library of pre-configured neural networks in custom_brains/ (e.g., "Plant-Seeker", "Insomniac") for quick behavior setup.
  • Memory Management: Dual memory system (_memory/) with long-term and short-term storage for learning persistence.
  • Examples and Tools: Example squids, configuration files (config.ini), and version tracking.
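The dual memory system can be pictured as a short-term buffer whose most salient entries get consolidated into long-term storage. The class below is a hypothetical sketch of that idea - the names, capacity, and threshold are illustrative, not Dosidicus's actual API:

```python
from collections import deque

class DualMemory:
    """Illustrative two-tier memory: a bounded short-term buffer plus a
    long-term store that only keeps sufficiently salient experiences."""
    def __init__(self, short_capacity=10, consolidation_threshold=0.7):
        self.short_term = deque(maxlen=short_capacity)  # recent, volatile
        self.long_term = []                             # persistent
        self.threshold = consolidation_threshold

    def record(self, event, salience):
        """Store an event; strongly salient events are also consolidated."""
        self.short_term.append((event, salience))
        if salience >= self.threshold:
            self.long_term.append((event, salience))

memory = DualMemory()
memory.record("saw food", 0.3)                # stays short-term only
memory.record("shocked by decoration", 0.9)   # consolidated to long-term
print(len(memory.short_term), len(memory.long_term))
```

The bounded short-term buffer means mundane experiences fade as new ones arrive, while high-salience events persist - which is how a single bad encounter can leave a squid with a lasting fear.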

A year ago I got a tattoo of this project to celebrate its first development milestone!

