nuonuo/doc/p5_snn_hopfield.md
Fam Zheng d923aa1e31 NuoNuo: Hippocampal memory module prototype
Hopfield + Hebbian hybrid memory system for LLMs.
Two nights of experiments (16 iterations), validated on LongMemEval (ICLR 2025).

Architecture:
- Single-hop: Two-Stage Hopfield (NN top-20 → softmax settle)
- Multi-hop: Hebbian W matrix with WTA pattern separation
- 64% on LongMemEval (500 questions), retrieval-only, no LLM dependency
- 4ms latency @ 20K memories, ~1GB VRAM

Key findings:
- Hopfield attention solved noise tolerance (20% → 100% vs flat Hebbian)
- WTA pattern separation enables 20K+ capacity
- Multi-hop associative chains (6 hops, CosSim=1.0) — RAG can't do this
- MiniLM-L6 is optimal (discrimination gap > absolute similarity)
- Paraphrase cue augmentation: 55% → 100% on synthetic, 36% → 64% on benchmark
- SNN encoder viable (CosSim 0.99) but not needed for current architecture
2026-04-07 10:37:24 +01:00


# P5: SNN-native Hopfield
## Conclusion: not currently viable; the standard softmax Hopfield far outperforms LIF dynamics
## Comparison
| Metric | SNN Hopfield | Standard Hopfield |
|---|---|---|
| Paraphrase recall (5 pairs) | 20% | **100%** |
| With background (10+) | 0% | **90%+** |
| Latency | 10.8ms | **1.9ms** |
| Scalability | O(steps × dim²) | O(N × dim) |
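The "Standard Hopfield" column corresponds to the usual modern-Hopfield retrieval step, ξ ← normalize(Xᵀ softmax(β · X ξ)). A minimal numpy sketch of that update (function names, β, and the toy data are illustrative, not the repo's code):

```python
import numpy as np

def softmax(z):
    z = z - z.max()           # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def hopfield_retrieve(patterns, query, beta=8.0, steps=3):
    """Modern Hopfield update: xi <- normalize(X^T softmax(beta * X xi)).

    patterns: (N, dim) stored memories, rows unit-normalized.
    query:    (dim,) noisy or paraphrased cue.
    """
    xi = query / np.linalg.norm(query)
    for _ in range(steps):
        attn = softmax(beta * patterns @ xi)   # (N,) attention over memories
        xi = patterns.T @ attn                 # attention-weighted recall
        xi = xi / np.linalg.norm(xi)
    return xi

# Toy check: the update snaps a corrupted cue back toward the stored memory.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 64))
X /= np.linalg.norm(X, axis=1, keepdims=True)
noise = rng.normal(size=64)
noisy = X[3] + 0.5 * noise / np.linalg.norm(noise)
out = hopfield_retrieve(X, noisy)
print(float(out @ X[3]))  # cosine similarity to the true memory, close to 1
```

Each iteration is one softmax attention pass over the stored patterns, which is why the cost scales as O(N × dim) per step.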
## Why the SNN fails
1. **More steps = worse**: 5/5 correct at steps=20, but only 1/5 at steps=200. The LIF dynamics do not converge to the correct attractor; they diverge or get stuck in a wrong state.
2. **LIF is not softmax**: the softmax in a modern Hopfield network performs exact energy minimization, whereas LIF spike dynamics carry no guarantee of converging to the Boltzmann equilibrium distribution.
3. **Membrane-potential decay corrupts the signal**: exponential decay with β=0.9 loses the signal quickly, so a long settle turns into pure noise.
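Point 3 is just geometric decay: with leak factor β=0.9 and no sustained input, the membrane potential retains almost none of the cue after a few dozen settling steps. A minimal illustration (the scalar membrane model is a simplification of a full LIF layer):

```python
# With leak beta=0.9 and no input during settling, the membrane
# potential decays geometrically: v[t] = beta**t * v[0].
beta = 0.9
v = 1.0                  # initial membrane potential carrying the cue
trace = []
for t in range(50):
    v = beta * v         # leaky integration step, no external input
    trace.append(v)

print(trace[19])  # ~0.12: only 12% of the cue survives 20 steps
print(trace[-1])  # ~0.005: after 50 steps the cue is effectively gone
```

So the longer the network is allowed to settle, the less of the original cue remains to steer it, which is consistent with point 1 (steps=200 performing worse than steps=20).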
## What an SNN Hopfield would need to work
1. Richer neuron models (not just LIF): AdEx, Izhikevich, or stochastic neurons
2. Precisely tuned E/I (excitation/inhibition) balance
3. Possibly stochastic neurons for proper Boltzmann sampling
4. Dedicated neuromorphic hardware (e.g. Loihi 2's programmable neuron models)
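Item 3 is the key distinction: stochastic binary neurons updated by Gibbs sampling provably converge to a Boltzmann distribution over a classical Hopfield energy, which deterministic LIF settling does not guarantee. A hedged sketch on a tiny ±1 Hebbian network (sizes, β, and the corruption level are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

patterns = np.sign(rng.normal(size=(3, 32)))   # three ±1 stored patterns
W = patterns.T @ patterns / 32.0               # Hebbian weight matrix
np.fill_diagonal(W, 0.0)                       # no self-connections

def gibbs_settle(s, W, beta=4.0, sweeps=30, rng=rng):
    """Stochastic neurons: sample s_i = +1 with prob sigmoid(2*beta*h_i),
    where h_i is the local field. This is exact Boltzmann sampling."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            h = W[i] @ s
            p_on = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
            s[i] = 1.0 if rng.random() < p_on else -1.0
    return s

cue = patterns[0].copy()
flip = rng.choice(32, size=4, replace=False)   # corrupt 4 of 32 bits
cue[flip] *= -1
out = gibbs_settle(cue, W)
print((out == patterns[0]).mean())  # bitwise overlap with the stored pattern
```

At low temperature (large β) the sampler concentrates on the energy minima, so the corrupted cue is pulled back to the stored pattern with high probability.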
## Where the SNN fits in the overall system
| Component | SNN viability | Current approach |
|------|-----------|---------|
| Encoder (emb↔spike) | ✅ validated (CosSim 0.99) | kept in reserve |
| Hopfield attention | ❌ not viable | use softmax |
| Hebbian multi-hop | ✅ WTA codes + W matrix | implemented |
| Pattern separation | ✅ WTA = biological DG | implemented |
**The real value of SNNs lies in neuromorphic deployment, not in replacing softmax on GPUs.**
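The "WTA = biological DG" row refers to winner-take-all sparsification: keeping only the top-k activations, as the dentate gyrus is thought to do, turns similar dense embeddings into sparse codes that typically overlap much less. A minimal sketch (k, dimensions, and the perturbation level are illustrative):

```python
import numpy as np

def wta_code(x, k=8):
    """Winner-take-all: keep the k largest activations as 1s, zero the rest."""
    code = np.zeros_like(x)
    code[np.argsort(x)[-k:]] = 1.0
    return code

rng = np.random.default_rng(2)
a = rng.normal(size=256)
b = a + 0.4 * rng.normal(size=256)   # a similar, slightly perturbed input

# Dense cosine similarity stays high, while the sparse codes usually share
# fewer winners -- the pattern-separation effect.
dense_sim = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
ca, cb = wta_code(a), wta_code(b)
sparse_sim = (ca @ cb) / 8.0          # fraction of shared winners

print(float(dense_sim), float(sparse_sim))
```

Separated sparse codes are what make a Hebbian W matrix over many memories workable: with fewer shared active units, stored associations interfere less, which is the basis of the 20K+ capacity claim.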