Prerequisites
- Node.js >= 22.12.0
- kubectl configured with access to a Kubernetes cluster
- An LLM API key (Anthropic, OpenAI, or any OpenAI-compatible provider)
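The prerequisites above can be checked from a shell. The `version_ge` helper below is a generic sketch for comparing version strings, not part of Siclaw:

```shell
# Pure-shell version comparison: version_ge CURRENT MINIMUM
# succeeds when CURRENT >= MINIMUM (relies on `sort -V`).
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n 1)" = "$2" ]
}

# Node.js: strip the leading "v" from `node --version` before comparing.
NODE_VERSION=$(node --version 2>/dev/null | sed 's/^v//')
if version_ge "${NODE_VERSION:-0}" "22.12.0"; then
  echo "Node.js $NODE_VERSION OK"
else
  echo "Node.js >= 22.12.0 required (found: ${NODE_VERSION:-none})" >&2
fi

# kubectl: confirm it is configured and can reach a cluster.
if kubectl cluster-info >/dev/null 2>&1; then
  echo "kubectl OK"
else
  echo "kubectl cannot reach a cluster" >&2
fi
```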
Install
Option 1: npx (Fastest)
Option 2: npm global install
Option 3: Clone and build
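The three options might look like the following. This is a sketch assuming the package is published on npm as `siclaw`; the repository URL is a placeholder, so check the project's actual package name and repo before running:

```shell
# Option 1: run without installing (package name assumed)
npx siclaw

# Option 2: global install
npm install -g siclaw
siclaw

# Option 3: clone and build (repository URL is a placeholder)
git clone https://github.com/example/siclaw.git
cd siclaw
npm install
npm run build
npm start
```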
Configure Your LLM
On first run, Siclaw prompts you to configure an LLM provider. You can also override the configuration via environment variables, or by editing ~/.siclaw/config/settings.json.
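As an illustration, environment-variable overrides could look like this. The variable names below are assumptions based on common provider SDK conventions, not confirmed Siclaw settings; see the LLM Providers page for the exact names:

```shell
# Assumed variable names -- verify against the LLM Providers documentation.
export ANTHROPIC_API_KEY="sk-ant-your-key"          # Anthropic
export OPENAI_API_KEY="sk-your-key"                 # OpenAI / OpenAI-compatible
export OPENAI_BASE_URL="http://localhost:11434/v1"  # e.g. a local Ollama endpoint
echo "provider variables exported"
```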
Siclaw supports any OpenAI-compatible API. See LLM Providers for Ollama, vLLM, Azure, and other setups.
Run Your First Investigation
Start Siclaw and describe an issue. It will then:
- Gather context — cluster state, events, pod logs, recent deployments
- Generate hypotheses — ranked list of possible root causes
- Validate in parallel — up to 3 sub-agents independently test each hypothesis
- Conclude — structured report with root cause, confidence score, and remediation steps
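A session covering the four phases above might begin like this. The binary name and the prompt are illustrative, not captured output:

```shell
# Binary name assumed to be `siclaw`
siclaw
# > describe the issue in plain language, for example:
# > "checkout pods have been in CrashLoopBackOff since the 14:00 deploy"
```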
Completed investigation reports are saved to ~/.siclaw/reports/.
What’s Next?
- Core Concepts — understand the investigation engine
- Your First Investigation — walk through a complete diagnosis
- LLM Providers — detailed provider configuration
- Deploy for your team — multi-user setup with Web UI