MH-FLOCKE

Population Coding — Gaussian Tuning Curves

MH-FLOCKE encodes continuous sensor values as patterns of spiking activity across populations of neurons, just like biological sensory cortex. Each sensor channel is represented by 8 neurons with overlapping Gaussian receptive fields.

Encoding (Sensors → SNN)

activation_i = exp(-0.5 × ((value - preferred_i) / σ)²) × gain

8 neurons per channel, preferred values spread from -1 to +1
σ = range / (n-1) × 1.5  (overlap for smooth coding)
gain = v_threshold × 2.0
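The encoder above can be sketched in a few lines of NumPy. This is a hedged sketch: the real `rate_encode` returns a torch Tensor and derives `gain` from `v_threshold`; the defaults below are illustrative only.

```python
import numpy as np

def rate_encode(value, n_neurons=8, min_val=-1.0, max_val=1.0, gain=2.0):
    """Gaussian population coding: one activation per neuron (sketch)."""
    # Preferred values spread evenly across [min_val, max_val]
    preferred = np.linspace(min_val, max_val, n_neurons)
    # Overlapping receptive fields: sigma = range / (n - 1) * 1.5
    sigma = (max_val - min_val) / (n_neurons - 1) * 1.5
    # Gaussian tuning curve per neuron, scaled by gain
    return gain * np.exp(-0.5 * ((value - preferred) / sigma) ** 2)
```

A value of 0.0 peaks on the two middle neurons; a value at either edge of the range drives the corresponding edge neuron to the full `gain`.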

Biology: this is how place cells, head direction cells, and motor cortex neurons encode continuous variables (Georgopoulos 1986).

Decoding (SNN → Motors)

Motor output uses population voting with push/pull neuron pairs (6 per joint: 3 push + 3 pull):

control = (push_rate - pull_rate) / (substeps × half)  (half = neurons per side, i.e. n_per_joint / 2 = 3)
clipped to [-1, 1]
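A minimal sketch of the decoder, assuming `spike_counts` holds `n_per_joint` consecutive counts per joint with the push neurons first; this layout and the default values are assumptions for illustration, not documented behavior:

```python
import numpy as np

def decode_motor_spikes(spike_counts, n_per_joint=6, substeps=10):
    """Population voting: push/pull pairs -> joint commands in [-1, 1] (sketch)."""
    half = n_per_joint // 2  # neurons per side (3 push, 3 pull)
    commands = []
    for j in range(0, len(spike_counts), n_per_joint):
        push_rate = np.sum(spike_counts[j : j + half])
        pull_rate = np.sum(spike_counts[j + half : j + n_per_joint])
        # Normalize by the maximum possible spike count per side,
        # so all-push or all-pull saturates at +/-1
        control = (push_rate - pull_rate) / (substeps * half)
        commands.append(float(np.clip(control, -1.0, 1.0)))
    return commands
```

With `substeps=10`, three push neurons all firing every substep and silent pull neurons yield a command of +1.0; the mirror image yields -1.0.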

Vision Channels (Issue #76d)

Two additional channels encode visual target information: heading (-1 to +1) and distance (0 to 1). These 16 neurons (8 heading + 8 distance) are the creature’s “eyes” — they let the SNN learn to orient toward the ball via R-STDP.
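Feeding a hypothetical target through the same Gaussian scheme shows how the 16 vision neurons are assembled. The helper and the example heading/distance values here are illustrative, not the project's API:

```python
import numpy as np

def gaussian_population(value, n=8, lo=-1.0, hi=1.0, gain=2.0):
    # Same Gaussian tuning-curve scheme as the sensor encoder
    preferred = np.linspace(lo, hi, n)
    sigma = (hi - lo) / (n - 1) * 1.5
    return gain * np.exp(-0.5 * ((value - preferred) / sigma) ** 2)

# Hypothetical target: ball somewhat to the left, at mid-range
heading, distance = -0.33, 0.5          # heading in [-1, 1], distance in [0, 1]
vision_input = np.concatenate([
    gaussian_population(heading, lo=-1.0, hi=1.0),   # 8 heading neurons
    gaussian_population(distance, lo=0.0, hi=1.0),   # 8 distance neurons
])
# 16 vision activations, appended to the rest of the sensor vector
```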

References

  • Georgopoulos et al. (1986). Neuronal population coding of movement direction. Science

API Reference

rate_encode(value, n_neurons, min_val, max_val, gain) → Tensor

Gaussian population coding. Returns [n_neurons] activation values.

decode_motor_spikes(spike_counts, n_per_joint, substeps) → list[float]

Population voting: push/pull pairs → joint commands [-1, 1].

MuJoCoCreature.get_sensor_input() → Tensor

Full sensor encoding pipeline: position(3) + velocity(3) + orientation(3) + height(1) + upright(1) + forward_vel(1) + joint_angles(n) + joint_velocities(n) + vision(2) = ~54 channels × 8 = ~432 input neurons.
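The channel arithmetic above can be made explicit. Assuming the ~54 figure corresponds to a creature with 20 actuated joints (an inference from the sum, not a documented number):

```python
def n_input_neurons(n_joints, neurons_per_channel=8):
    """Total sensor channels and input neurons for a creature with n_joints."""
    channels = 3 + 3 + 3 + 1 + 1 + 1   # position, velocity, orientation, height, upright, forward_vel
    channels += 2 * n_joints           # joint angles + joint velocities
    channels += 2                      # vision: heading + distance
    return channels, channels * neurons_per_channel
```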