neural-cellular-automaton

where neural networks meet cellular automata in self-organizing computational structures

The Mathematics of Neural CA

Cell Neural Architecture

Each cell contains a feedforward neural network that processes neighbor states:

  • Input Layer: Receives neighbor cell activations
  • Hidden Layers: Process information through nonlinear transformations
  • Output Layer: Determines new cell state and signals
  • Weight Evolution: Synaptic plasticity based on local activity
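The per-cell architecture above can be sketched as a tiny feedforward network. This is a minimal illustration with hypothetical layer sizes (8 neighbors, 4-dimensional cell state, 16 hidden units), not the project's exact configuration:

```python
import numpy as np

class CellNetwork:
    """Sketch of one cell's feedforward network over neighbor states."""

    def __init__(self, n_neighbors=8, state_dim=4, hidden_dim=16, rng=None):
        rng = rng or np.random.default_rng(0)
        in_dim = n_neighbors * state_dim              # concatenated neighbor activations
        self.W1 = rng.normal(0, 0.1, (hidden_dim, in_dim))
        self.b1 = np.zeros(hidden_dim)
        self.W2 = rng.normal(0, 0.1, (state_dim, hidden_dim))
        self.b2 = np.zeros(state_dim)

    def forward(self, neighbor_states):
        x = np.concatenate(neighbor_states)           # input layer: neighbor activations
        h = np.tanh(self.W1 @ x + self.b1)            # hidden layer: nonlinear transform
        return np.tanh(self.W2 @ h + self.b2)         # output: new cell state and signals
```

The weight matrices `W1` and `W2` are what the plasticity rules below would modify over time.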

Learning Rules

Hebbian Learning

Neurons that fire together wire together: connections between correlated neurons are strengthened
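The classic Hebbian update strengthens a weight in proportion to the product of pre- and postsynaptic activity. A minimal sketch (learning rate is a hypothetical parameter):

```python
import numpy as np

def hebbian_update(W, pre, post, lr=0.01):
    """Hebbian rule: dW = lr * (post outer pre), strengthening co-active connections."""
    return W + lr * np.outer(post, pre)
```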

Spike-Timing Dependent Plasticity

Temporal order of activation determines synaptic changes
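STDP makes the sign of the weight change depend on spike order: if the presynaptic neuron fires just before the postsynaptic one, the connection potentiates; the reverse order depresses it. A sketch of the standard exponential window (amplitudes and time constant are illustrative, not the project's values):

```python
import numpy as np

def stdp_delta(dt, a_plus=0.05, a_minus=0.0525, tau=20.0):
    """Weight change for spike-time difference dt = t_post - t_pre (in ms)."""
    if dt > 0:
        return a_plus * np.exp(-dt / tau)     # pre before post: potentiation
    elif dt < 0:
        return -a_minus * np.exp(dt / tau)    # post before pre: depression
    return 0.0
```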

Homeostatic Plasticity

Maintains stable activity levels through regulatory mechanisms
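One common regulatory mechanism is synaptic scaling: a cell multiplicatively rescales its incoming weights so its average activity drifts toward a set point. A sketch with hypothetical target rate and learning rate:

```python
import numpy as np

def homeostatic_scale(W, avg_rate, target_rate=0.1, lr=0.001):
    """Scale incoming weights so average activity moves toward the target."""
    return W * (1.0 + lr * (target_rate - avg_rate))
```

Applied alongside Hebbian or STDP updates, this keeps runaway potentiation from saturating the network.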

Evolutionary Selection

Networks compete and reproduce based on fitness criteria
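A single generation of this process can be sketched as fitness-proportional selection plus Gaussian mutation of the weight vectors. The representation (flat weight vectors) and the mutation scale are assumptions for illustration:

```python
import numpy as np

def evolve(population, fitness, mutation_std=0.02, rng=None):
    """One generation: select parents in proportion to fitness, then mutate.
    `population` is a list of flat weight vectors; `fitness` maps a vector to a score."""
    rng = rng or np.random.default_rng(0)
    scores = np.array([fitness(w) for w in population])
    probs = scores - scores.min() + 1e-9          # shift so probabilities are positive
    probs /= probs.sum()
    idx = rng.choice(len(population), size=len(population), p=probs)
    return [population[i] + rng.normal(0, mutation_std, population[i].shape)
            for i in idx]
```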

Emergent Behaviors

Complex computational patterns emerge from simple rules:

  • Pattern Recognition: Networks learn to detect spatial configurations
  • Signal Propagation: Information propagates as waves through the cellular medium
  • Memory Formation: Stable attractor states store information
  • Computation Islands: Specialized regions for different tasks
  • Adaptive Response: Networks adjust to changing input patterns
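Behaviors like signal propagation emerge from nothing more than a synchronous grid update in which every cell applies the same learned rule to its neighborhood. A minimal sketch of one such step on a toroidal grid (3x3 neighborhood, shared linear-plus-tanh rule, hypothetical shapes; the actual per-cell networks may be deeper):

```python
import numpy as np

def grid_step(state, W, b):
    """One synchronous CA step on a toroidal grid of shape (H, W, C):
    each cell applies a shared tanh rule to its 3x3 neighborhood."""
    patches = [np.roll(np.roll(state, dy, axis=0), dx, axis=1)
               for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    x = np.concatenate(patches, axis=-1)          # (H, W, 9*C) neighborhood inputs
    return np.tanh(x @ W + b)                     # (H, W, C) new cell states
```

Iterating `grid_step` on a seed pattern lets localized activity spread outward, which is the mechanism behind the information waves described above.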

References & Inspiration

  • Mordvintsev et al. - "Growing Neural Cellular Automata"
  • Randazzo et al. - "Self-classifying MNIST Digits"
  • Miller & Downing - "Evolution of Digital Organisms"
  • Wolfram - "A New Kind of Science"