Synaptogenesis — SNN ↔ Knowledge Graph Bridge
Synaptogenesis bridges real-time SNN activity with a symbolic concept graph for long-term memory. Repeated spike patterns are consolidated into named concepts; retrieved concepts prime the SNN via apical context injection.
Three Mechanisms
- Observation — SNN spike windows are compressed via random projection into 64-dimensional patterns
- Consolidation — Frequent patterns (appearing 5+ times with >0.7 similarity) become concept nodes in the graph. Each concept stores: pattern, valence, properties (behavior, heading, distance), activation count
- Retrieval — Current SNN activity is matched against stored concepts. Top-k matches inject their patterns as apical context into hidden neurons, scaled by retrieval_strength (default 0.5)
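The observation step can be sketched as follows. This is a minimal illustration, not the real implementation: the names (`N_NEURONS`, `WINDOW`, `project_window`) and the rate-then-project scheme are assumptions; only the fixed random projection to 64 dimensions comes from the doc.

```python
import numpy as np

N_NEURONS = 256     # assumed SNN size, for illustration only
WINDOW = 20         # assumed spike-window length
PATTERN_DIM = 64    # documented pattern dimensionality

rng = np.random.default_rng(0)
# A fixed projection matrix, shared across observations so patterns are comparable.
projection = rng.standard_normal((N_NEURONS, PATTERN_DIM)) / np.sqrt(PATTERN_DIM)

def project_window(spike_frames: np.ndarray) -> np.ndarray:
    """Compress a [window, n_neurons] spike window into a unit-norm 64-d pattern."""
    rates = spike_frames.mean(axis=0)   # per-neuron firing rate over the window
    pattern = rates @ projection        # random projection to 64 dims
    norm = np.linalg.norm(pattern)
    return pattern / norm if norm > 0 else pattern

spikes = (rng.random((WINDOW, N_NEURONS)) < 0.1).astype(float)
pattern = project_window(spikes)
```

Normalizing the pattern makes repeated windows comparable by cosine similarity, which is what the consolidation similarity threshold suggests.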
Concept Graph
Lightweight standalone graph (no external DB). Nodes = concepts, edges = similarity relations. Auto-evicts least-activated concepts when full (max 1000).
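A dict-backed sketch of such a graph, assuming least-activated eviction and cosine similarity. Only the fields the doc lists (label, pattern, valence, properties, activation count) and the 1000-concept cap are from the source; everything else is illustrative.

```python
import numpy as np

class ConceptGraph:
    def __init__(self, max_concepts: int = 1000):
        self.max_concepts = max_concepts
        self.nodes: dict[int, dict] = {}
        self.edges: dict[int, set[int]] = {}
        self._next_id = 0

    def add_concept(self, label, pattern, valence, properties) -> int:
        if len(self.nodes) >= self.max_concepts:
            # Evict the least-activated concept to stay within the cap.
            victim = min(self.nodes, key=lambda i: self.nodes[i]["activations"])
            del self.nodes[victim]
            self.edges.pop(victim, None)
        cid = self._next_id
        self._next_id += 1
        self.nodes[cid] = {"label": label, "pattern": np.asarray(pattern, float),
                           "valence": valence, "properties": properties,
                           "activations": 0}
        self.edges[cid] = set()
        return cid

    def find_similar(self, pattern, top_k: int = 5):
        """Cosine similarity against all stored patterns; returns [(id, sim)]."""
        q = np.asarray(pattern, float)
        q = q / (np.linalg.norm(q) or 1.0)
        scored = [(cid, float(q @ n["pattern"] / (np.linalg.norm(n["pattern"]) or 1.0)))
                  for cid, n in self.nodes.items()]
        scored.sort(key=lambda t: -t[1])
        return scored[:top_k]
```

Linear scan in `find_similar` is fine at 1000 concepts; an approximate index would only matter at much larger scales.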
References
- Holtmaat, A. & Svoboda, K. (2009). Experience-dependent structural synaptic plasticity in the mammalian brain. Nature Reviews Neuroscience, 10, 647–658
API Reference
Synaptogenesis(config, snn, multi_compartment=None)
observe_spikes(spikes: Tensor)
Add a spike frame to the observation window.
record_experience(context: dict, valence: float)
Record the current SNN pattern and context into the experience buffer.
consolidate() → dict
Dream-phase: cluster frequent patterns into concepts. Returns n_new_concepts, n_updated, graph_size.
retrieve(current_context=None) → Tensor
Return an apical modulation vector of shape [n_neurons] built from matching concepts.
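One way retrieval could produce that vector, sketched under stated assumptions: the top-k 64-d concept patterns are lifted back to neuron space with the observation projection matrix, weighted by similarity, and scaled by retrieval_strength. The back-projection via the same matrix is an assumption, not documented behavior.

```python
import numpy as np

N_NEURONS, PATTERN_DIM = 256, 64   # illustrative sizes
RETRIEVAL_STRENGTH = 0.5           # documented default

rng = np.random.default_rng(1)
# Assumed to be the same fixed projection used during observation.
projection = rng.standard_normal((N_NEURONS, PATTERN_DIM)) / np.sqrt(PATTERN_DIM)

def apical_modulation(matches, strength=RETRIEVAL_STRENGTH):
    """matches: list of (pattern[64], similarity). Returns a [n_neurons] vector."""
    mod = np.zeros(N_NEURONS)
    for pattern, sim in matches:
        mod += sim * (projection @ pattern)   # lift the 64-d pattern to neuron space
    return strength * mod
```

Weighting by similarity means weak matches contribute proportionally less apical context.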
ConceptGraph(max_concepts=1000)
add_concept(label, pattern, valence, properties) → int
find_similar(pattern, top_k=5) → list[(id, similarity)]
SynaptogenesisConfig
- consolidation_threshold: 5
- similarity_threshold: 0.7
- max_concepts: 1000
- pattern_dimensions: 64
- retrieval_strength: 0.5
- retrieval_top_k: 5
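Restated as a dataclass for clarity; the field names and default values are from the doc, while the dataclass shape itself is a sketch.

```python
from dataclasses import dataclass

@dataclass
class SynaptogenesisConfig:
    consolidation_threshold: int = 5    # pattern must recur this many times
    similarity_threshold: float = 0.7   # minimum similarity to count as the same pattern
    max_concepts: int = 1000            # graph capacity before eviction
    pattern_dimensions: int = 64        # random-projection target dimension
    retrieval_strength: float = 0.5     # scale of injected apical context
    retrieval_top_k: int = 5            # concepts matched per retrieval
```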