Classic spaced repetition treats words as detached units. But language is relational code — every token’s meaning warps subtly with intent, register, and surrounding discourse.

| Principle | Effect |
|---|---|
| Narrative Embedding | Creates episodic anchors |
| Role Variation | Prevents rigid encoding |
| Consequence Loops | Boosts salience |
| Interleaved Reuse | Natural spacing without grind |

Instead of a review queue, you progress through branching micro-stories. Previously encountered lexemes resurface in new pragmatic frames: question → command → negotiation → introspection.
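A minimal sketch of this idea, under stated assumptions: the frame cycle is the four-stage sequence above, lexemes are interleaved round-robin across stories, and each resurfacing records an episodic anchor (the "where did I use this last?" trace). All names here (`Lexeme`, `interleave`, the sample German words) are hypothetical, not an existing library.

```python
from collections import deque

# Pragmatic frame cycle from the text; each reuse shifts the frame.
FRAMES = ["question", "command", "negotiation", "introspection"]

class Lexeme:
    """A word plus the episodic trace of its last appearance."""
    def __init__(self, word):
        self.word = word
        self.frame_idx = -1        # no frame encountered yet
        self.last_context = None   # episodic anchor: (story_id, frame)

    def resurface(self, story_id):
        """Advance to the next pragmatic frame and record the episode."""
        self.frame_idx = (self.frame_idx + 1) % len(FRAMES)
        self.last_context = (story_id, FRAMES[self.frame_idx])
        return self.last_context

def interleave(lexemes, n_stories):
    """Round-robin reuse: spacing emerges from story flow, not a queue."""
    queue = deque(lexemes)
    log = []
    for story in range(n_stories):
        lex = queue.popleft()
        log.append((lex.word, *lex.resurface(story)))
        queue.append(lex)   # back of the line until the next story
    return log
```

Run over four micro-stories, each lexeme returns in a new frame while the other fills the gap, which is the "natural spacing without grind" from the table:

```python
log = interleave([Lexeme("verhandeln"), Lexeme("zögern")], 4)
# → [('verhandeln', 0, 'question'), ('zögern', 1, 'question'),
#    ('verhandeln', 2, 'command'), ('zögern', 3, 'command')]
```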
If you can’t answer “Where did I use this last?”, the memory trace is brittle.
Words live longer when they do something.