Context Window Observability

Your agent is running in the dark.

starnose turns the lights on. See exactly what context your LLM agent is sending, with zero code changes.

$ pip install starnose
Local-first · No account · No config · MIT License · Python 3.10+ · macOS / Linux / WSL2
snose watch

Live session monitor

See token spend, context composition, and warnings in real time as your agent works.

snose inspect

Post-hoc analysis

Review any past session: what was read, what was re-read, and how much each call cost.

snose diff

Context diffs

Compare context windows between calls to see exactly what changed and why.
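Together, the three subcommands form a simple workflow. The sketch below uses only the commands named on this page; any default behavior (such as `snose inspect` targeting the most recent session) is an assumption, not documented behavior.

```
$ pip install starnose

$ snose watch      # live monitor: token spend and context composition as the agent runs
$ snose inspect    # post-hoc: review a past session (assumed to default to the latest)
$ snose diff       # compare context windows between calls to see what changed
```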