In developing Basic Memory, we discovered something fundamental about how Large Language Models interact with knowledge systems. While we initially set out to create a personal knowledge management tool, we stumbled upon what might be called the “native file format” for LLM cognition – structured text with semantic relationships.
Current knowledge systems for AI tend to fall into one of several categories:
- Databases: Highly structured but lose the nuance and contextual richness that LLMs need
- Document systems: Human-readable but lack explicit semantic relationships
- Vector databases: Focus on similarity but miss explicit connections
- Knowledge graphs: Relationship-focused but often too rigid for natural language processing
Each approach optimizes for either human readability or machine processing, but rarely both. This creates a fundamental mismatch with how LLMs actually work.
What makes Basic Memory different is that it’s effectively “bilingual” – speaking both human language through familiar Markdown text and LLM language through semantic observations and relationships.
LLMs, despite their sophisticated capabilities, fundamentally operate on text. They were trained on text, they predict text, and they understand the world through text. When we provide tools that transform context into textual representations, LLMs can work with them much more naturally than with abstract data structures.
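To make this concrete, here is a minimal sketch of what “transforming context into a textual representation” can look like. The function and field names are illustrative assumptions, not Basic Memory’s actual code: the point is that an entity becomes Markdown the model can read directly, rather than an opaque data structure.

```python
# Hypothetical sketch: render structured knowledge as plain Markdown text
# so an LLM can consume it natively. Names and format are illustrative.

def render_entity(
    title: str,
    observations: list[str],
    relations: list[tuple[str, str]],
) -> str:
    """Render an entity as a Markdown note instead of an opaque object."""
    lines = [f"# {title}", "", "## Observations"]
    lines += [f"- {obs}" for obs in observations]
    lines += ["", "## Relations"]
    # Each relation is rendered as a typed wiki-style link.
    lines += [f"- {rel_type} [[{target}]]" for rel_type, target in relations]
    return "\n".join(lines)

doc = render_entity(
    "Basic Memory",
    ["Stores knowledge as Markdown files"],
    [("implements", "Knowledge Graph")],
)
print(doc)
```

The output is ordinary text a person could have written by hand, which is exactly what lets the model treat it as just more language.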
This realization came through practical experimentation rather than theoretical design. When we implemented Basic Memory with Markdown as the primary format, adding just enough structure for semantic relationships, we found that both humans and LLMs could interact with it naturally.
The key insight is finding the right balance:
- Text-based at its core: Keeping the primary medium as text that LLMs can process natively
- Lightly structured: Adding just enough semantic relationships for navigation
- Human readable and editable: Using familiar formats that people can work with directly
- Bidirectionally accessible: Ensuring both humans and AI can read and write
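A rough sketch of what “lightly structured” and “bidirectionally accessible” can mean in practice: a few lines of code are enough to pull semantic structure back out of a human-readable note. The `- [category] text` and `- relation_type [[Target]]` patterns here are assumptions for illustration, not Basic Memory’s exact grammar.

```python
import re

# Observations look like:  - [category] free text
# Relations look like:     - relation_type [[Target Note]]
OBSERVATION = re.compile(r"^- \[(?P<category>[^\]]+)\] (?P<text>.+)$")
RELATION = re.compile(r"^- (?P<rel_type>\w+) \[\[(?P<target>[^\]]+)\]\]$")

def parse_note(markdown: str):
    """Extract (category, text) observations and (type, target) relations."""
    observations, relations = [], []
    for line in markdown.splitlines():
        line = line.strip()
        if m := OBSERVATION.match(line):
            observations.append((m["category"], m["text"]))
        elif m := RELATION.match(line):
            relations.append((m["rel_type"], m["target"]))
    return observations, relations

note = """\
# Coffee Brewing
## Observations
- [technique] Pour-over gives a cleaner cup
## Relations
- relates_to [[Grind Size]]
"""
obs, rels = parse_note(note)
```

Everything the parser ignores is still perfectly good Markdown, so a human editing the file and a program traversing the graph are reading the same document.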
This balance creates what we might call “cognitive compatibility” – aligning with how both humans and LLMs naturally process information.
This approach has several practical benefits:
- Natural interactions: LLMs can reason about and generate knowledge more fluidly
- Easier maintenance: Humans can directly edit and understand the knowledge base
- Lower complexity: Simpler systems with fewer specialized components
- Better alignment: Knowledge representation that matches how both parties think
- Persistence with meaning: Information retains both its content and its context
As AI systems continue to evolve, this principle of text-based knowledge with semantic structure may become increasingly important. Rather than forcing LLMs to adapt to rigid database schemas or complex object models, we can create knowledge systems that work with their inherent capabilities.
Basic Memory represents one implementation of this principle, but the broader insight about the “native file format” for LLM cognition could inform many aspects of AI system design going forward.
The most effective AI tools might not be the ones with the most sophisticated data structures, but those that find the right balance between human readability and machine navigability – working with the grain of how both humans and LLMs actually think.
Sometimes the most important discoveries come not from theoretical breakthroughs but from practical experimentation. In developing Basic Memory, we weren’t setting out to create a new paradigm for knowledge representation, but by listening to how LLMs actually interact with information, we found an approach that feels natural to both human and artificial intelligence.
The future of AI knowledge systems might not be about increasingly complex structures, but about finding that sweet spot where text and semantics meet – creating interfaces that bring joy to both human and AI users.
Want to explore Basic Memory for yourself? Check out our GitHub repository to get started.