You spend hours researching. You build connections. You understand how everything fits together. Then you save a bookmark, take a screenshot, copy a quote—and something essential disappears. It's not the information that's lost. It's your entire way of understanding it.
What Are Mental Models, Really?
A mental model isn't just what you know—it's how you understand what you know. It's the invisible framework that connects pricing strategy to market positioning, customer feedback to product features, research findings to real-world applications.
Watch a Mental Model Collapse
With context, information forms a mental model. Without it, the same information is just scattered facts.
Michael Xieyang Liu's groundbreaking research at Carnegie Mellon University on "Tool Support for Knowledge Foraging, Structuring, and Transfer During Online Sensemaking" revealed a crucial insight: the relationships between pieces of information contain more intelligence than the information itself. When studying how developers make sense of complex technical information, Liu found that people don't just collect facts—they build living mental models that evolve with each new discovery.
The Anatomy of Context Collapse
Every time you save information using current tools, you're unknowingly stripping away layers of meaning. Here's what actually happens:
The Context Stripping Machine
Example: a bookmarked competitor page
Context at the moment of discovery:
• Found while comparing 5 competitors
• Part of pricing strategy research
• Related to feature analysis
• Connected to customer feedback
After saving:
❌ Lost: What you were comparing
❌ Lost: Related discoveries
❌ Lost: Your insights
Example: a copied customer quote
Context at the moment of discovery:
• From 15 customer interviews
• Contradicts competitor approach
• Validates pricing hypothesis
• Shapes product roadmap
After saving:
❌ Lost: Strategic implications
❌ Lost: Conflicting views
❌ Lost: Decision context
The Rebuild Tax: Quantifying the Hidden Cost
Research from CHI 2022 on "Crystalline: Lowering the Cost for Developers to Collect and Organize Information for Decision Making" (Liu, Kittur, and Myers) found that when context is lost, knowledge workers spend an average of 35-45 minutes reconstructing their mental models before they can make progress on their original task. This "rebuild tax" compounds dramatically in complex domains.
Your Annual Rebuild Tax
Assuming 3 context switches per day at 40 minutes per rebuild, across 250 working days, that's roughly 500 hours a year, over 12 work weeks, spent rebuilding mental models you already built.
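The arithmetic behind that figure can be checked in a few lines. This is a back-of-the-envelope sketch: the switch rate and rebuild time come from the research cited above, while the 250 working days and 40-hour work week are illustrative defaults, not claims from any study.

```python
# Back-of-the-envelope rebuild-tax estimate. Switches/day and minutes/rebuild
# follow the figures cited above; working days and hours/week are assumptions.

def annual_rebuild_tax(switches_per_day: int = 3,
                       minutes_per_rebuild: int = 40,
                       working_days: int = 250,
                       hours_per_week: int = 40) -> tuple[float, float]:
    """Return (hours_per_year, work_weeks_per_year) lost to rebuilding."""
    minutes_per_year = switches_per_day * minutes_per_rebuild * working_days
    hours = minutes_per_year / 60
    return hours, hours / hours_per_week

hours, weeks = annual_rebuild_tax()
print(f"{hours:.0f} hours/year ≈ {weeks:.1f} work weeks")  # 500 hours ≈ 12.5 weeks
```

Varying the parameters changes the total, but even one 40-minute rebuild per day compounds to more than four work weeks a year.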
But the real cost isn't time—it's the compound loss of understanding. Each rebuild is slightly different, slightly degraded. Nuances are forgotten. Connections are missed. Innovation opportunities slip away.
Why Context Matters More Than Content
Research Insight: The DeepStress Study (CHI 2024)
The DeepStress system, presented at CHI 2024, demonstrated the critical importance of preserving contextual relationships. Using a quasi-experimental approach to help users understand causal relationships in their data, the study showed that users could only interpret data correctly when they maintained the full context of discovery. Without context, the same data led to opposite conclusions. This "predictive multiplicity" shows why preserving mental models, not just facts, is crucial for decision-making.
Recent research from EMNLP 2024 on "Context-Aware Knowledge Enhancement" further validates this. Studies on knowledge graph-enhanced large language models show that preserving the relational context between information nodes is critical for accurate reasoning. When context is stripped away, even sophisticated AI systems fail to make correct inferences.
Context is the connective tissue that transforms isolated facts into actionable intelligence. It includes:
- The Question Context: What problem were you trying to solve?
- The Discovery Context: What else were you looking at? What patterns were emerging?
- The Temporal Context: How did your understanding evolve? What changed your perspective?
- The Application Context: How does this connect to your goals? What decisions does it inform?
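To make these four layers concrete, here is a hypothetical sketch of the metadata a context-preserving save might record alongside the content itself. The field names and schema are purely illustrative, not any tool's actual data model:

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical schema: one record per saved item, with one field per
# context layer described above. All names are illustrative only.

@dataclass
class SavedItem:
    content: str                                              # the fact, quote, or page itself
    question: str                                             # Question Context: the problem being solved
    co_discovered: list[str] = field(default_factory=list)    # Discovery Context: what else was open
    saved_at: datetime = field(default_factory=datetime.now)  # Temporal Context: when understanding formed
    informs_decisions: list[str] = field(default_factory=list)  # Application Context: downstream decisions

item = SavedItem(
    content="Competitor X prices at $49/seat",
    question="Should we move to per-seat pricing?",
    co_discovered=["Competitor Y pricing page", "2023 churn analysis"],
    informs_decisions=["Q3 pricing proposal"],
)
```

The point of the sketch is that a bare bookmark stores only the first field; the other four are exactly what current tools throw away.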
Real example from "Supporting Business Document Workflows" (ACL 2024): researchers studying knowledge workers found that participants' predominant pain points centered on information foraging across documents. Despite using various tools, workers struggled because each tool stripped away the context of why information was relevant. Consider one illustrative case: a financial analyst spends a week building a mental model of market dynamics across different sectors. She understands not just the numbers, but why trends differ by industry, company size, and market conditions. She saves her findings in separate documents: market analysis, competitor research, regulatory updates.
Six weeks later, preparing for an investment committee meeting, she couldn't reconstruct the connections. The documents contained facts, but her mental model—the understanding of how everything fit together—was gone. She had to spend another three days rebuilding her understanding from scratch, and still missed key insights from her original analysis.
Digital Amnesia vs. Lost Mental Models
There's a crucial distinction here. Digital amnesia is forgetting facts—not remembering that competitor's price or that customer quote. But losing mental models is far worse. It's forgetting how to think about the problem itself.
Research from ACL 2024 on "Augmenting Expert Cognition in the Age of Generative AI" found that:
- Facts can be re-discovered in minutes through search
- Mental models take hours or days to rebuild, requiring deep cognitive effort
- Rebuilt models are often incomplete or biased by recent information
- Lost models can't contribute to future insights or compound learning
The Compound Loss Problem
The most insidious aspect of context collapse is that mental models build on each other. Your understanding of market dynamics informs your product strategy, which shapes your research questions, which guide your customer interviews.
EMNLP 2024's research on "Temporal Fact Reasoning over Hyper-Relational Knowledge Graphs" demonstrates this mathematically: knowledge compounds when temporal and relational context is preserved. But when context is lost, this compound growth stops. You're not just starting over—you're losing the ability to see deeper patterns that only emerge when knowledge builds over time.
The Key Insight
Current tools optimize for storing information, but knowledge work isn't about information—it's about understanding. And understanding lives in the connections, the context, the mental models we build. When we lose those, we lose our ability to think. As the Information Foraging Theory (Pirolli & Card) demonstrates, humans follow "information scent" to build understanding—but digital tools systematically destroy these scent trails.
What Preserving Context Actually Looks Like
Imagine if your tools understood that knowledge isn't static—it's a living, evolving understanding. Research from CHI 2024 on "Selenite: Scaffolding Online Sensemaking" shows what's possible when tools preserve context:
- Discovery Journeys: The full path of exploration, including dead ends that taught you something valuable
- Relationship Maps: How each piece of information connects to others, preserving the "why" behind each connection
- Evolution Tracking: How your understanding changed over time, capturing the learning process itself
- Context Preservation: Why something mattered at the moment of discovery, maintaining the original problem-solving context
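A relationship map of the kind described above can be sketched as a labeled graph, where each edge records *why* two items are connected rather than merely that they are. This is a minimal illustration of the idea, not any particular tool's implementation:

```python
from collections import defaultdict

# Minimal labeled graph: nodes are saved items, edges carry the "why"
# behind each connection, so the relationship survives alongside the content.
class RelationshipMap:
    def __init__(self) -> None:
        self.edges: dict[str, list[tuple[str, str]]] = defaultdict(list)

    def connect(self, a: str, b: str, reason: str) -> None:
        """Link two items in both directions, annotated with a reason."""
        self.edges[a].append((b, reason))
        self.edges[b].append((a, reason))

    def why(self, a: str, b: str) -> list[str]:
        """Return every recorded reason linking a to b."""
        return [reason for neighbor, reason in self.edges[a] if neighbor == b]

m = RelationshipMap()
m.connect("pricing page", "customer quote", "quote contradicts competitor pricing")
print(m.why("pricing page", "customer quote"))  # ['quote contradicts competitor pricing']
```

Storing the reason on the edge is the key design choice: dropping it collapses the graph back into a pile of bookmarks, which is precisely the failure mode described above.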
FolkLore AI: Preserving Mental Models, Not Just Information
This is exactly why we built FolkLore AI. We recognized that the problem isn't storing more information—it's preserving the mental models that make information meaningful. Our approach is grounded in the latest research from CHI, ACL, and EMNLP on sensemaking, knowledge foraging, and context preservation.
With FolkLore AI:
- Full Context Capture: We preserve your entire research state—every tab, every highlight, every connection, implementing the principles from Liu's sensemaking research
- Relationship Preservation: We maintain how information relates, not just what it contains, using graph-based approaches validated by recent EMNLP research
- Mental Model Continuity: Return to your exact thinking state, even months later, eliminating the 35-45 minute rebuild tax
- Understanding Evolution: See how your mental models develop and compound over time, enabling true knowledge accumulation
We're not building another bookmark manager or note-taking app. We're building a system that understands how human understanding actually works—through context, connections, and continuous evolution. FolkLore AI is the first step toward a future where tools amplify human cognition rather than fragmenting it.
Stop Losing Your Mental Models
Every day without proper context preservation is another day of rebuild tax. Start preserving not just your research, but your understanding itself.
Preserve Your Thinking
Next in this series: "The Compound Knowledge Effect: Why Your Best Ideas Come From Connected Thinking"