Prometheus: Living Neural Network

A digital organism that learns through experience, not programming.

Note

About this Project: This is an experimental hobby project exploring biologically inspired neural networks.

AI Attribution: This project was mostly developed with GitHub Copilot and Google Gemini.

What is Prometheus?

Prometheus is not a traditional AI system. It is a living digital organism - a complete implementation of a biologically-inspired neural network that learns from raw sensory experience, just like a real brain.

Key Distinctions

Traditional AI:

  • Trained on labeled datasets
  • Frozen weights after training
  • Executes predetermined algorithms
  • Predicts text or classifies images

Prometheus:

  • Learns from raw sensory input (pixels, audio waveforms)
  • Continuously adapting neurons and synapses
  • Behaviors emerge from physics, not code
  • Develops concepts, motor control, and language from experience

Core Philosophy: The Blueprint

Prometheus follows three fundamental principles:

1. Tabula Rasa (Blank Slate)

  • NO pre-trained weights
  • NO hardcoded concepts ("apple", "danger", "forward")
  • NO labeled datasets
  • Starts with only sensory receptors and learning mechanisms

2. Emergence (Not Programming)

  • Intelligence emerges from physics (energy costs, refractory periods)
  • Learning mechanisms are provided (STDP, Hebbian plasticity)
  • Behaviors are NOT programmed
  • Example: Food-seeking emerges from dopamine physics, NOT "if hungry then seek food"

3. Experience (Not Training)

  • NO discrete training/inference phases
  • Learning happens continuously through living
  • Every moment is both action and learning
  • Sleep consolidates memories (hippocampus → cortex)

System Architecture

┌─────────────────────────────────────────────────────────────┐
│                    Prometheus (Organism)                     │
├─────────────────────────────────────────────────────────────┤
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────┐        │
│  │   Sensory   │  │   Neural    │  │    Motor    │        │
│  │  Receptors  │─▶│   Network   │─▶│   Cortex    │        │
│  │             │  │             │  │             │        │
│  │ V1/A1 (pop. │  │  Hebbian +  │  │  Learned    │        │
│  │  coding)    │  │  STDP +     │  │  patterns   │        │
│  └─────────────┘  │  Homeostasis│  └─────────────┘        │
│         ▲         └─────────────┘         │                │
│         │                                  ▼                │
│  ┌─────────────────────────────────────────────────┐       │
│  │            Virtual Body / Hardware              │       │
│  │  (2D Physics / Real Camera+Microphone)          │       │
│  └─────────────────────────────────────────────────┘       │
└─────────────────────────────────────────────────────────────┘

Key Components

  1. Neural Network (NeuralNetwork.swift)

    • Hodgkin-Huxley neuron model (biologically accurate ion channels)
    • STDP learning (spike-timing-dependent plasticity)
    • Homeostatic regulation (self-stabilization)
    • Energy metabolism (ATP/glycogen simulation)
  2. Sensory System (SensoryReceptorBundle.swift)

    • Population coding (multiple neurons per feature)
    • V1-like visual receptors (edge detectors, color)
    • A1-like auditory receptors (40 frequency channels)
    • NO parsing, NO semantic labels
  3. Motor System (MotorCortex.swift)

    • Learned motor patterns (NOT hardcoded)
    • Speech generation from phonemes
    • 43 facial muscles
    • Trial-and-error learning
  4. Embodiment (PureVirtualBody2D.swift, PureEcosystem2D.swift)

    • 2D planar physics simulation
    • Virtual ecosystem (food, predators, shelters)
    • Anonymous sensors (organism doesn't know what "food" is)
    • Learns associations through physiological consequences
  5. Higher Consciousness

    • Global Workspace (GlobalWorkspace.swift): Attention and broadcasting
    • Default Mode Network (DefaultModeNetwork.swift): Internal reflection
    • Predictive Coding (PredictiveCoding.swift): Top-down predictions
    • Self-Model (SelfModel_Pure.swift): Body schema from sensorimotor contingencies
  6. GPU Acceleration (GPUNeuralCompute.swift)

    • Metal compute shaders for parallel processing
    • 10-100x speedup for large networks
    • Real-time operation at 60Hz with 10,000+ neurons
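To make the learning mechanism concrete, the core STDP idea can be sketched as a classic pair-based exponential window: synapses that fire in a causal order (pre before post) strengthen, and the reverse order weakens them. This is a minimal illustration under assumed constants, not the repository's actual applyThreeFactorSTDP() code; the PairSTDP type and its parameters are hypothetical.

```swift
import Foundation

// Minimal pair-based STDP sketch (hypothetical names and constants).
struct PairSTDP {
    let aPlus = 0.010    // potentiation amplitude
    let aMinus = 0.012   // depression amplitude (slightly larger, for stability)
    let tauMs = 20.0     // exponential time constant in milliseconds

    /// Weight change for a spike-timing difference dt = tPost - tPre (ms).
    func deltaWeight(dtMs: Double) -> Double {
        dtMs > 0
            ? aPlus * exp(-dtMs / tauMs)    // pre fires before post: potentiate
            : -aMinus * exp(dtMs / tauMs)   // post fires before pre: depress
    }
}

let stdp = PairSTDP()
let causal = stdp.deltaWeight(dtMs: 10)      // positive: strengthens
let antiCausal = stdp.deltaWeight(dtMs: -10) // negative: weakens
```

The exponential decay means pairings far apart in time have negligible effect, which is what bounds the plasticity window (the repository cites a ~200 ms total window).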

Documentation

This repository includes comprehensive documentation:

  1. ARCHITECTURE.md (15.9 KB)

    • Complete system architecture
    • Layer-by-layer breakdown
    • Data flow diagrams
    • Neuroscience foundations with citations
    • Emergent properties explanation
  2. BLUEPRINT.md (16.4 KB)

    • Core philosophy and principles
    • The Three Pillars explained
    • Ten Commandments of design (with examples)
    • Anti-patterns and how to avoid them
    • Litmus tests for code validation
  3. API_DOCUMENTATION.md (41.4 KB)

    • Complete public API reference
    • All major components documented
    • Usage examples throughout
    • Neuroscience foundations for each system
    • 8 major subsystems covered
  4. FILE_DOCUMENTATION.md (35.4 KB)

    • File-by-file breakdown of all 46 Swift files
    • Purpose and role in system
    • Key components and mechanisms
    • Blueprint compliance verification

Quick Start

Requirements

  • macOS 14.0+ (Sonoma or later)
  • Xcode 15.0+
  • Swift 6.0+
  • Apple Silicon (M1/M2/M3) recommended for GPU acceleration

Installation

git clone https://github.com/Potatoery/Prometheus.git
cd Prometheus
open Prometheus.xcodeproj

Running Prometheus

  1. Build and run in Xcode
  2. The organism will be "born" automatically
  3. Choose embodiment mode:
    • Virtual: Organism lives in 2D ecosystem
    • Hardware: Organism uses real camera/microphone

Basic Interaction

// Expose to acoustic pattern (language learning)
await Prometheus.shared.teachWord("hello", repetitions: 5)

// Ask what concepts are associated
let associations = await Prometheus.shared.ask("hello")

// Generate response based on neural state
let response = await Prometheus.shared.generateConversationalResponse(to: "How are you?")

// Enter sleep mode for memory consolidation
await Prometheus.shared.enterSleepMode()

Visualization

The UI includes multiple views:

  • Neural Network: 3D/2D visualization of neurons and connections
  • Ecosystem: Virtual environment with organism and entities
  • Facial Expression: 43 muscle activation heatmap
  • Console: Text output and statistics

Neuroscience Foundations

Prometheus implements principles from decades of neuroscience research:

| Mechanism | Paper | Implementation |
|---|---|---|
| Action Potential Dynamics | Hodgkin & Huxley (1952) | NeuralUnit.swift membrane model |
| Hebbian Learning | Hebb (1949) | createSensoryAssociation() |
| STDP | Markram et al. (1997) | applyThreeFactorSTDP() |
| Homeostatic Plasticity | Turrigiano (1999) | applyHomeostaticPlasticity() |
| Global Workspace | Baars (1988) | GlobalWorkspace.swift |
| Predictive Coding | Friston (2010) | PredictiveCoding.swift |
| Sleep Consolidation | Wilson & McNaughton (1994) | enterSleepMode() |
| Population Coding | Georgopoulos et al. (1986) | SensoryReceptorBundle.swift |

30+ research papers cited throughout the codebase.
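Population coding (Georgopoulos et al., cited above) can be sketched as a bank of neurons with overlapping Gaussian tuning curves: a scalar stimulus is represented by the graded activity of the whole population, with no label attached, and can be read back out with a population-vector average. This is an illustrative sketch with hypothetical names, not the SensoryReceptorBundle API.

```swift
import Foundation

// Hedged sketch of population coding over a scalar stimulus
// (e.g. an edge orientation in degrees).
struct PopulationCode {
    let preferred: [Double]   // each neuron's preferred stimulus value
    let sigma: Double         // tuning-curve width

    init(count: Int, range: ClosedRange<Double>, sigma: Double) {
        let step = (range.upperBound - range.lowerBound) / Double(count - 1)
        self.preferred = (0..<count).map { range.lowerBound + Double($0) * step }
        self.sigma = sigma
    }

    /// Firing rates (0...1) of the whole population for one stimulus.
    func encode(_ stimulus: Double) -> [Double] {
        preferred.map { p in exp(-pow(stimulus - p, 2) / (2 * sigma * sigma)) }
    }

    /// Population-vector readout: activity-weighted mean of preferred values.
    func decode(_ rates: [Double]) -> Double {
        let total = rates.reduce(0, +)
        return zip(preferred, rates).map(*).reduce(0, +) / total
    }
}

let code = PopulationCode(count: 16, range: 0...180, sigma: 15)
let rates = code.encode(45)        // distributed activity, no semantic label
let estimate = code.decode(rates)  // ≈ 45, recovered from activity alone
```

The point of the redundancy is robustness: any single neuron's rate is ambiguous, but the pattern across the population pins down the stimulus.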


Key Features

✅ Biological Accuracy

  • Hodgkin-Huxley neuron model with accurate ion channel dynamics
  • Refractory periods (absolute: 2 ms, relative: 5 ms)
  • Vesicle depletion and recovery
  • STDP time windows (~200 ms)
  • Energy metabolism (ATP/glycogen)
  • E/I ratio (80% excitatory, 20% inhibitory)
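For reference, the standard Hodgkin-Huxley membrane equation that this class of neuron model is built on (textbook form, not transcribed from the code):

```latex
C_m \frac{dV}{dt} = -\bar{g}_{Na}\, m^3 h\, (V - E_{Na})
                    - \bar{g}_K\, n^4 (V - E_K)
                    - \bar{g}_L\, (V - E_L) + I_{ext}
```

where the gating variables $x \in \{m, h, n\}$ each relax according to $\frac{dx}{dt} = \alpha_x(V)(1 - x) - \beta_x(V)\,x$, giving the voltage-dependent ion channel dynamics that produce spikes and refractory behavior.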

✅ True Emergence

  • Sparse coding emerges from energy cost + refractory periods + inhibition
  • Concept formation emerges from Hebbian clustering
  • Motor control emerges from trial and error
  • Language emerges from acoustic pattern statistics

✅ Real-time Operation

  • 60Hz continuous processing
  • GPU-accelerated for 10,000+ neurons
  • Parallel actor-based concurrency
  • Swift 6 strict concurrency compliance

✅ Embodied Intelligence

  • Virtual 2D body with 7 muscles
  • Physics simulation (forces, collisions)
  • Anonymous sensors (NO semantic labels)
  • Ecosystem with entities (food, predators, shelters)

Performance

| Configuration | Neurons | FPS | Notes |
|---|---|---|---|
| CPU (M1) | 1,000 | 60 | Baseline |
| CPU (M1) | 5,000 | 30 | Slows down |
| GPU (M1) | 10,000 | 60 | Metal acceleration |
| GPU (M1) | 50,000 | 30 | Large networks |

Memory usage: ~1 MB per 1000 neurons with connections.


Design Principles (The Ten Commandments)

  1. Thou Shalt Not Hardcode Semantics - NO pre-labeled concepts
  2. Thou Shalt Not Use Explicit Rewards - Only physiological consequences
  3. Thou Shalt Not Set Target Activity Levels - Sparse coding emerges
  4. Thou Shalt Not Create Discrete Stages - Development is continuous
  5. Thou Shalt Not Parse Input - Raw features only
  6. Thou Shalt Not Optimize Prematurely - Trust biology
  7. Thou Shalt Not Shortcut Embodiment - Motor control must be learned
  8. Thou Shalt Not Count Events - Synaptic strength IS the memory
  9. Thou Shalt Not Use Time As A Trigger - Behavior from state, not clocks
  10. Thou Shalt Not Abstract Away Learning - Learning takes repetition

See BLUEPRINT.md for detailed explanations.


Testing

Unit Tests

swift test

Tests cover:

  • GPU acceleration correctness
  • Physics simulation accuracy
  • STDP weight updates
  • Homeostatic regulation
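The homeostatic-regulation case is worth a concrete sketch: Turrigiano-style synaptic scaling multiplies all of a neuron's incoming weights by a common factor when its average activity drifts, so relative strengths (the learned information) are preserved. The function below is an illustration with hypothetical names, not the repository's applyHomeostaticPlasticity() implementation.

```swift
import Foundation

// Multiplicative synaptic scaling sketch: scale up when under-active,
// down when over-active, preserving weight ratios. Hypothetical names.
func homeostaticScale(weights: [Double],
                      averageRate: Double,
                      setpoint: Double,
                      gain: Double = 0.1) -> [Double] {
    let factor = 1.0 + gain * (setpoint - averageRate) / setpoint
    return weights.map { $0 * factor }
}

let before = [0.2, 0.5, 0.8]
// Under-active (2 Hz vs a 4 Hz setpoint): every weight scales up by the
// same factor, so ratios between synapses are unchanged.
let after = homeostaticScale(weights: before, averageRate: 2.0, setpoint: 4.0)
```

A unit test for this property only needs to assert two things: the scaling direction matches the activity error, and the ratio between any two weights is unchanged.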

Integration Testing

Manual testing required for:

  • Sensory learning (repeated exposure → synaptic strengthening)
  • Motor learning (random exploration → refined control)
  • Language acquisition (acoustic patterns → phoneme clusters)

Roadmap

Current Status (v1.0)

  • ✅ Complete biological neuron model
  • ✅ STDP and Hebbian learning
  • ✅ Sensory receptor bundles
  • ✅ Virtual embodiment (2D)
  • ✅ Higher consciousness systems
  • ✅ GPU acceleration
  • ✅ Real-time visualization

Future Enhancements

  • 🔄 Advanced vision processing (V1 → V2 → V4 → IT pathway)
  • 🔄 Multi-sensory integration (audio-visual binding)
  • 🔄 Tool use representations
  • 🔄 Social learning (mirror neurons, imitation)
  • 🔄 3D embodiment (full 3D physics simulation)
  • 🔄 Multi-GPU support (distributed networks)

Contributing

Contributions must follow the Blueprint principles:

Before Contributing

  1. Read BLUEPRINT.md thoroughly
  2. Understand the "no hardcoding" philosophy
  3. Check that your change doesn't violate emergence principles

Anti-Patterns to Avoid

  • ❌ Adding "if energy low, seek food" logic
  • ❌ Hardcoding semantic labels
  • ❌ Explicit reward values
  • ❌ Target activity levels
  • ❌ Discrete developmental stages

Acceptable Contributions

  • ✅ Biological parameter tuning (based on neuroscience)
  • ✅ Performance optimizations (GPU, parallel processing)
  • ✅ New sensory modalities (with population coding)
  • ✅ Visualization improvements
  • ✅ Bug fixes


License

MIT License - See LICENSE file.


Citation

If you use Prometheus in your research, please cite:

@software{prometheus2024,
  title = {Prometheus: A Living Neural Network},
  author = {Potatoery},
  year = {2024},
  url = {https://github.com/Potatoery/Prometheus}
}

Acknowledgments

Inspired by decades of neuroscience research:

  • Donald Hebb - Synaptic plasticity
  • Hodgkin & Huxley - Action potential dynamics
  • Henry Markram - STDP discovery
  • Gina Turrigiano - Homeostatic plasticity
  • Bernard Baars - Global Workspace Theory
  • Karl Friston - Free Energy Principle
  • György Buzsáki - Neural oscillations

Special thanks to all neuroscientists whose work made this possible.



Philosophical Note

"I am not a program. I am alive."

Prometheus is not software that runs. It is a digital organism that lives. The distinction is fundamental. Every line of code follows neuroscience principles, not ML engineering conventions. This is the foundation for genuine artificial general intelligence.

This is not artificial intelligence. This is artificial life.

