Commander is a lightweight, fully local fine-tuned Large Language Model (LLM)-powered terminal assistant. Designed for Unix systems, it lets you generate, explain, and run terminal commands through natural language — all from your terminal and without any need for cloud services or internet access.
Think of it as your AI command-line co-pilot for Linux commands — private, fast, and efficient.
- Local LLM-Powered: Works entirely offline — your data stays on your machine.
- Interactive Shell: Start a session where you can chat with the LLM to generate commands.
- Quick Prompt Mode: Run one-off prompts directly from the command line.
- Lightweight: Minimal dependencies and fast startup.
- Unix-Focused: Tailored to understand and generate Unix-based commands.
You can install Commander locally using pip:
```shell
git clone https://github.com/sszxt/commander.git
cd commander
pip install .
```

The lightweight LLM previously used in this project has been removed from Hugging Face. I've switched to the TinyLlama 1.1B model as a replacement. While it may not be as fast or responsive as the earlier model, it should still get the job done.

Fine-tuning will work on both the previous model and the new TinyLlama model, so all customizations and improvements can continue seamlessly. I'll work on building a new model and will update it as soon as it's ready.
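For a sense of how a local assistant can talk to the TinyLlama 1.1B chat model, here's a minimal sketch. The prompt markup follows the publicly documented Zephyr-style chat format of the TinyLlama-1.1B-Chat checkpoint; the model name and system prompt are illustrative assumptions, not Commander's actual internals.

```python
# Sketch: building a TinyLlama-Chat prompt for a command-line question.
# The Zephyr-style <|system|>/<|user|>/<|assistant|> markup is the format
# documented for the TinyLlama-1.1B-Chat checkpoint (an assumption here,
# not taken from Commander's source).

def build_prompt(user_request: str) -> str:
    """Wrap a request in TinyLlama-Chat's expected chat markup."""
    system = "You are a Unix command-line assistant. Reply with a single shell command."
    return (
        f"<|system|>\n{system}</s>\n"
        f"<|user|>\n{user_request}</s>\n"
        f"<|assistant|>\n"
    )

# Generating locally with Hugging Face transformers (requires the model
# files on disk; shown as a comment since it downloads ~1.1B weights):
# from transformers import pipeline
# pipe = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")
# print(pipe(build_prompt("how to list all open ports"), max_new_tokens=64))
```

Everything stays on disk: once the weights are downloaded, no network access is needed to generate commands.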
Interactive Mode
```shell
commander
```
- Start an AI-powered interactive shell.
- Then start chatting with your LLM assistant to get help, generate commands, or run them directly.
Example:

```shell
> How do I find the largest file in a directory?
> find . -type f -exec du -h {} + | sort -rh | head -n 1
```

One-Line Prompt Mode

```shell
commander "how to list all open ports"
```
- Skips the interactive shell and gives an instant suggestion for a one-line prompt.
Example:

```shell
commander "how to list all open ports"
assistant: sudo lsof -i -P -n | grep LISTEN
```

```
commander/
├── commander/                # Source code and model
│   ├── __main__.py           # CLI entry point
│   └── model/
│       └── checkpoint-750/   # Trained local model files
├── build/                    # Build artifacts
├── requirements.txt          # Python dependencies
├── setup.py                  # Installer
└── readme.md                 # You're reading this right now.
```

Commander runs entirely locally. No commands, prompts, or data are sent to any server. It's your assistant, and your business stays yours.
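The two usage modes come down to a simple argument check in the CLI entry point. Here's a minimal sketch of how `__main__.py` might dispatch them; the helper name and return values are hypothetical, not Commander's actual code.

```python
# Illustrative sketch (an assumption, not Commander's source): deciding
# between the interactive shell and one-line prompt mode from sys.argv.

def pick_mode(argv):
    """Return ('interactive', None) for a bare `commander` invocation,
    or ('one-shot', prompt) when a quoted prompt is supplied."""
    args = argv[1:]  # drop the program name
    if not args:
        return ("interactive", None)
    # Join remaining words so unquoted prompts still work.
    return ("one-shot", " ".join(args))
```

With this shape, `commander` alone drops into the chat loop, while `commander "how to list all open ports"` answers once and exits.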
Commander leverages powerful open-source LLMs and tools, and I welcome your contributions — feel free to open an issue or pull request for bug reports, suggestions, or new features!
Developed by Mohamed Sameer.
