starnose turns the lights on: see exactly what context your LLM agent is sending, with zero code changes.
pip install starnose
See token spend, context composition, and warnings in real time as your agent works.
Review any past session: what was read, what was re-read, and how much it cost.
Compare context windows between calls to see what changed and why.