As models have become better as agents, we’ve found success by providing fewer details up front, making it easier for the agent to pull relevant context on its own. Dynamic discovery is more token-efficient and improves response quality by reducing context window bloat. Bossa provides the files primitive that this pattern relies on.
Why Files Work
Files are the universal interface for dynamic context discovery. Agents already understand ls, grep, read, and tail. No new abstractions to learn. Cursor notes that “files have been a simple and powerful primitive to use”—and Bossa gives your agents a persistent, searchable filesystem for exactly that.
When context lives in files, the agent can:
- Discover structure with ls and glob before loading anything
- Search with grep to find relevant files without reading everything
- Read only what it needs, when it needs it
- Persist context (scratchpad, memory) for later discovery
How Bossa Fits
Bossa’s operations map directly to the dynamic context discovery pattern:

| Operation | Role in Dynamic Discovery |
|---|---|
ls / glob | Discover structure before loading; see what’s available |
grep | Find relevant files without reading everything |
read | Pull only what’s needed into the context window |
write | Persist context (summaries, preferences, session notes) for later discovery |
The agent uses ls, grep, or glob to find the right files, then read to load only those. No embedding calls. No retrieval pipeline. Just files.
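The whole loop can be sketched against a local directory. This is an illustration of the pattern only, using the Python standard library; Bossa’s own ls/glob/grep/read/write operations are analogous but run against its hosted filesystem, and the paths and file contents below are made up.

```python
import glob
import os
import re
import tempfile

root = tempfile.mkdtemp()

# write: persist context for later discovery (e.g. a session summary).
os.makedirs(os.path.join(root, "memory"))
with open(os.path.join(root, "memory", "session-01.md"), "w") as f:
    f.write("summary: user prefers short answers\n")

# ls/glob: discover structure before loading anything.
files = sorted(glob.glob(os.path.join(root, "**", "*.md"), recursive=True))

# grep: find relevant files. The contents get scanned, but nothing has
# entered the agent's context window yet -- only the matching paths survive.
relevant = [p for p in files if re.search(r"prefers", open(p).read())]

# read: pull only what's needed into the context window.
context = open(relevant[0]).read()
```

The point of the shape is that each step narrows the candidate set before any content is loaded, which is what keeps the context window small.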
Practical Patterns
The patterns in Agent Integration are dynamic context discovery in action.

Explore Before Reading
Use ls to discover structure, then read only the relevant files:
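A minimal local-filesystem sketch of this pattern, assuming nothing about Bossa’s API (the file names here are invented): listing is cheap and returns names only, so the agent can decide what to read before loading any content.

```python
import os
import tempfile

root = tempfile.mkdtemp()
for name, text in [("prefs.md", "concise answers\n"), ("huge_log.txt", "x" * 10000)]:
    with open(os.path.join(root, name), "w") as f:
        f.write(text)

# Discover structure first -- names only, no content loaded.
entries = sorted(os.listdir(root))
print(entries)  # ['huge_log.txt', 'prefs.md']

# Read only the file that looks relevant, skipping the large log entirely.
wanted = [e for e in entries if e.endswith(".md")]
content = open(os.path.join(root, wanted[0])).read()
```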
Search Without Reading Everything
Use grep with output_mode="files_with_matches" to find files, then read only the matches:
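The files_with_matches behavior can be sketched in a few lines of stdlib Python; the helper name and sample files below are hypothetical, not part of Bossa. Returning only paths, never matching lines, is what keeps the search step out of the context window.

```python
import os
import re
import tempfile

def grep_files_with_matches(root, pattern):
    """Return paths of files containing the pattern -- paths only, no content."""
    rx = re.compile(pattern)
    hits = []
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            with open(path) as f:
                if any(rx.search(line) for line in f):
                    hits.append(path)
    return sorted(hits)

root = tempfile.mkdtemp()
with open(os.path.join(root, "a.md"), "w") as f:
    f.write("deploys on fridays are banned\n")
with open(os.path.join(root, "b.md"), "w") as f:
    f.write("unrelated note\n")

# Find matching files, then read only those.
matches = grep_files_with_matches(root, r"deploy")
context = open(matches[0]).read()
```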
Next Steps
- Why Filesystem Over RAG? — Filesystem vs RAG for agent memory
- Agent Integration — CLI and MCP examples, tool usage patterns
- Getting Started — Sign up and make your first request
References
- Dynamic context discovery — Cursor (Jan 2026)
- Context Engineering for Agents — LangChain (Jul 2025)