code-xray is a terminal-based code exploration and explanation tool powered by local LLMs served via Ollama.
Select lines of code interactively, send them for explanation, and get human-friendly insights – right in your terminal.
- ✅ Terminal-based file viewer with syntax highlighting
- ✅ Line-by-line navigation and selection
- ✅ Interactive directory tree when run without arguments
- ✅ Integration with local LLMs via Ollama
- ✅ On-demand code explanation using selected lines and full-file context
- ✅ Works fully offline
- ✅ Switch between file viewer and file tree (`b` to go back)
- ✅ Customizable LLM model and port via CLI
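Under the hood, explanations come from a local model served by Ollama. As a minimal sketch (a hypothetical helper, not code-xray's actual source; it assumes Ollama's standard `/api/generate` endpoint and a `mistral` model), a request could look like this:

```python
import json
from urllib import request

def build_payload(selected: str, context: str, model: str = "mistral") -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    prompt = (
        "Explain the selected lines in the context of the full file.\n\n"
        f"Selected lines:\n{selected}\n\nFull file:\n{context}"
    )
    # stream=False asks Ollama for a single complete response
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def explain(selected: str, context: str, model: str = "mistral",
            port: int = 11434) -> str:
    """Send the prompt to a locally running `ollama serve` and return its answer."""
    req = request.Request(
        f"http://localhost:{port}/api/generate",
        data=build_payload(selected, context, model),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because the model runs locally, nothing leaves your machine.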
```shell
code-xray
```
This opens a directory tree starting from your current working directory. You can navigate folders and open files for explanation. Press `b` inside a viewer to return to the file tree.

```shell
code-xray /path/to/your/file.py
```
This opens an interactive terminal interface to browse and explain code.
```shell
code-xray /path/to/your/file.py --model mistral --port 11434
```
- `--model` or `-m`: LLM model name (e.g. `mistral`, `llama3`, `codellama`)
- `--port` or `-p`: Port where Ollama is running (default is `11434`)
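The flags above map onto a small argument parser. A hypothetical sketch (illustrative only, not the tool's actual source; the `mistral` default is an assumption) of how such a CLI could be defined:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """CLI mirroring the documented options (illustrative sketch)."""
    parser = argparse.ArgumentParser(prog="code-xray")
    # Optional positional path: omit it to start in the directory tree
    parser.add_argument("path", nargs="?", default=None,
                        help="file to open; omit to browse a directory tree")
    parser.add_argument("-m", "--model", default="mistral",
                        help="Ollama model name")
    parser.add_argument("-p", "--port", type=int, default=11434,
                        help="port where Ollama is running")
    return parser
```

For example, `build_parser().parse_args(["file.py", "-m", "llama3"])` yields `path="file.py"`, `model="llama3"`, and the default port `11434`.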
| Key | Action |
|---|---|
| `h` | Move up one line |
| `l` | Move down one line |
| `Shift+h` | Expand selection up |
| `Shift+l` | Expand selection down |
| `e` | Explain selected code |
| `b` | Go back to file tree |
| `q` | Quit viewer or popup |
| `Esc` | Close explanation popup |
| `Enter` | Select file or enter folder |
| `../` | Navigate up in the file tree |
- Python 3.10+
- Ollama running locally with your preferred model
Example to pull a model:
```shell
ollama pull mistral
```
Then start the server:
```shell
ollama serve
```

```shell
pip install code-xray
```
Make sure `code-xray` is available in your PATH or create an alias.
- Textual for the beautiful terminal UI
- Ollama for local model hosting
- Rich for the syntax highlighting
Pull requests welcome! Feel free to fork and build on top of this.

