# Niko CLI

Niko translates natural language queries into executable shell commands. Describe what you want in plain English and get the exact command. Works offline with local AI models.

```shell
winget install --id=rgcsekaraa.NikoCLI -e
```
AI-powered CLI: explain code, generate shell commands, use any LLM provider.
Built in Rust. Works on macOS, Linux, and Windows.
```
$ niko cmd "find all files larger than 100MB"
find . -type f -size +100M
Copied to clipboard

$ cat main.rs | niko explain
42 lines analyzed → completed in 2.1s

## Overview
...
```
Three commands: `cmd`, `explain`, `settings`.

## Installation

macOS / Linux:

```shell
curl -fsSL https://raw.githubusercontent.com/rgcsekaraa/niko-cli/main/install.sh | sh
```

Windows (PowerShell):

```shell
iwr -useb https://raw.githubusercontent.com/rgcsekaraa/niko-cli/main/install.ps1 | iex
```

From source:

```shell
# Install latest version from git
cargo install --git https://github.com/rgcsekaraa/niko-cli

# Or install from local source
cargo install --path .
```
## First Run

```shell
# First run → interactive setup wizard
niko settings configure
```

This will walk you through selecting a provider and model, then write the resulting configuration to `~/.niko/config.yaml`.

## cmd → Generate Shell Commands

```
$ niko cmd "find python files modified today"
find . -name "*.py" -mtime 0
Copied to clipboard
```
```shell
$ niko cmd "kill process on port 3000"
$ niko cmd "compress logs folder to tar.gz"
$ niko cmd "git commits from last week"
$ niko cmd "show disk usage by directory"
```
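As an illustration of what these queries resolve to, here is a plausible result for the "compress logs folder to tar.gz" query, with a quick sanity check. The actual command depends on the configured model; this is not Niko's guaranteed output.

```shell
# Illustrative: the kind of command a model typically emits for
# "compress logs folder to tar.gz" (actual output varies by model).
mkdir -p logs && echo "sample entry" > logs/app.log
tar -czf logs.tar.gz logs/
```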
## explain → Explain Code

```shell
# From a file
niko explain -f src/main.rs

# Pipe code in
cat complex_module.py | niko explain

# Paste interactively (live line counter, Ctrl-D or two empty lines to finish)
niko explain
```

For large files, Niko splits the code into chunks and carries a 10-line overlap between consecutive chunks, so explanations stay continuous across chunk boundaries.
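The chunking with boundary overlap (see "Context memory" in the reliability table below) can be sketched in plain shell. The 40-line chunk size is an arbitrary illustration; the 10-line overlap matches the table. This is not Niko's actual code.

```shell
# Sketch: split a file into 40-line chunks with a 10-line overlap
# between consecutive chunks (chunk size is illustrative; the
# 10-line overlap matches the documented "Context memory" feature).
seq 1 100 > code.txt
chunk=40 overlap=10 start=1 total=$(wc -l < code.txt)
while [ "$start" -le "$total" ]; do
  end=$(( start + chunk - 1 ))
  sed -n "${start},${end}p" code.txt > "chunk_${start}.txt"
  start=$(( end - overlap + 1 ))   # next chunk re-reads the last 10 lines
done
```

This produces chunks starting at lines 1, 31, 61, and 91, where each chunk's first 10 lines repeat the previous chunk's last 10.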
## settings → Configuration

```shell
# Interactive setup wizard
niko settings configure

# Show current config
niko settings show

# Set a value directly
niko settings set openai.api_key sk-xxx
niko settings set openai.model gpt-4o
niko settings set active_provider openai

# Reset to defaults
niko settings init

# Print config path
niko settings path
```
Override the configured provider for a single command with `--provider`:

```shell
niko cmd "list files" --provider openai
niko explain -f main.rs --provider claude
```
Niko is designed for production use with reliability and speed:
| Feature | Details |
|---|---|
| Streaming | Tokens appear immediately as the LLM generates them (all providers) |
| Retry | 3 attempts with exponential backoff (500 ms → 2 s, plus jitter) |
| Retryable errors | Timeouts, connection resets, 429/5xx, rate limits, model loading |
| Connection pooling | HTTP keep-alive, 4 idle connections/host, TCP keepalive 30s |
| Model keep-alive | Ollama keeps model in VRAM for 30 min (no reload between calls) |
| Flash attention | Enabled by default for Ollama (faster on Apple Silicon / GPU) |
| Adaptive tokens | cmd mode uses 512 max tokens, explain uses 4096 (less KV cache for short tasks) |
| Adaptive context | Ollama context window scales with prompt size (4K → 16K) |
| Empty response guard | Detects and retries empty/null LLM responses |
| Truncation detection | Warns when response hits max_tokens (Claude, OpenAI) |
| Context memory | Multi-chunk explanations carry 10-line code overlap for boundary continuity |
| Structured errors | Parses API error responses for clear, actionable messages |
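The retry schedule from the table works out to base delays of 500 ms, 1000 ms, and 2000 ms across the three attempts. A minimal sketch of the arithmetic (not Niko's actual implementation):

```shell
# Sketch of the retry backoff from the table: base delay starts at
# 500 ms and doubles each attempt (500 -> 1000 -> 2000), plus jitter.
delay_ms=500
for attempt in 1 2 3; do
  jitter=$(( RANDOM % 100 ))     # small random jitter in ms
  echo "attempt $attempt: back off $(( delay_ms + jitter )) ms on failure"
  delay_ms=$(( delay_ms * 2 ))
done
```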
| Provider | Type | How to set up |
|---|---|---|
| Ollama | Local (free) | Auto-installed, models downloaded on demand |
| OpenAI | API | niko settings configure → select OpenAI → enter key |
| Claude | API | niko settings configure → select Claude → enter key |
| DeepSeek | API | niko settings configure → select DeepSeek → enter key |
| Grok | API | niko settings configure → select Grok → enter key |
| Groq | API | niko settings configure → select Groq → enter key |
| Mistral | API | niko settings configure → select Mistral → enter key |
| Together | API | niko settings configure → select Together → enter key |
| OpenRouter | API | niko settings configure → select OpenRouter → enter key |
| Custom | API | niko settings configure → choose "Custom" → enter URL + key |
All API providers fetch models dynamically from their /models endpoint; nothing is hardcoded.
API keys can also be set via environment variables:
```shell
export OPENAI_API_KEY=sk-xxx
export ANTHROPIC_API_KEY=sk-ant-xxx
export DEEPSEEK_API_KEY=xxx
export GROK_API_KEY=xxx
export GROQ_API_KEY=xxx
export TOGETHER_API_KEY=xxx
export MISTRAL_API_KEY=xxx
export OPENROUTER_API_KEY=xxx
```
For local models (Ollama), Niko estimates the maximum model size your system can handle:
| System RAM | Max Model Size |
|---|---|
| 8 GB | ~4B parameters |
| 16 GB | ~12B parameters |
| 32 GB | ~28B parameters |
| 64 GB | ~60B parameters |
Models exceeding your RAM limit are hidden from the selection list. You can still force-select them with a confirmation prompt.
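The table is consistent with a simple headroom heuristic: model size in billions of parameters ≈ system RAM in GB minus ~4 GB reserved, at roughly 1 GB per billion parameters for quantized models. This is an inference from the numbers above, not Niko's documented formula:

```shell
# Hypothetical rule of thumb reproducing the table: reserve ~4 GB
# for the OS, assume ~1 GB per billion parameters (quantized).
max_params_b() { echo $(( $1 - 4 )); }
max_params_b 8    # prints 4
max_params_b 64   # prints 60
```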
All settings are stored in `~/.niko/config.yaml`. The file uses a dynamic structure: providers are a map, so you can add as many as you want:
```yaml
active_provider: openai
providers:
  ollama:
    kind: ollama
    base_url: http://127.0.0.1:11434
    model: qwen2.5-coder:7b
  openai:
    kind: openai_compat
    api_key: sk-xxx
    base_url: https://api.openai.com/v1
    model: gpt-4o
  claude:
    kind: anthropic
    api_key: sk-ant-xxx
    model: claude-sonnet-4-20250514
```
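Because providers are a map, adding another entry is enough. For example, a self-hosted OpenAI-compatible endpoint might look like this (the name, URL, and model here are placeholders, not values Niko ships with):

```yaml
  my-local-llm:                          # placeholder name
    kind: openai_compat
    api_key: xxx
    base_url: http://localhost:8000/v1   # placeholder URL
    model: my-model                      # placeholder model id
```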
## Uninstall

```shell
rm $(which niko)
rm -rf ~/.niko
```

## License

MIT