Setup
Prerequisites
- Supported OS — Use a package built for your operating system and architecture (available from the download page).
- AI backend — A local LLM server (for example llama.cpp or Ollama) or an external OpenAI-compatible API.
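Before continuing, you may want to confirm that a backend is actually reachable. A minimal sketch, assuming a local Ollama server on its default port 11434 (adjust the URL for llama.cpp's server or a remote OpenAI-compatible API):

```shell
# Probe a local Ollama server (default port 11434; /api/tags lists installed models)
if curl -fsS "http://localhost:11434/api/tags" >/dev/null 2>&1; then
  echo "local backend reachable"
else
  echo "no local backend on :11434"
fi
```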
Install from a prebuilt package
Download the package for your platform. The official site serves `olaf.tar.gz` from its download page; other formats may come from your organization’s mirror.

Extract the tarball (or run your installer), then locate the `olaf` binary:

```shell
tar -xzf olaf.tar.gz
```

Make it executable (Unix-like systems), if needed:

```shell
chmod +x olaf
```

Put it on your `PATH` — either move the binary to a directory already on your `PATH`, or add its directory:

```shell
export PATH="/path/to/olaf/dir:$PATH"
```

For a permanent setup, use your shell profile or system-wide path configuration.
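For example, a permanent setup under bash might look like this sketch (other shells use different profile files, e.g. `~/.zshrc` for zsh):

```shell
# Persist the PATH change for future bash login shells
echo 'export PATH="/path/to/olaf/dir:$PATH"' >> ~/.bashrc
# Apply it to the current session as well
export PATH="/path/to/olaf/dir:$PATH"
```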
Verify:
```shell
olaf --help
```

(Exact flags depend on the shipped binary; `--help` or running without arguments may print usage.)
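If verification fails, a quick way to check whether the shell can find the binary at all, using the standard POSIX `command -v`:

```shell
# Report where the shell resolves `olaf`, if it is on PATH at all
if command -v olaf >/dev/null 2>&1; then
  echo "olaf resolves to: $(command -v olaf)"
else
  echo "olaf is not on PATH; revisit the PATH step above" >&2
fi
```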
Run
After configuring backends (see Configuration):
```shell
olaf
```
Use your platform’s terminal or SSH session; Olaf is a full-screen TUI.
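Because Olaf takes over the whole screen, one defensive pattern (a sketch, not part of Olaf itself) is to check for an interactive terminal before launching:

```shell
# A TUI needs a real terminal; [ -t 1 ] tests whether stdout is a TTY
if [ -t 1 ]; then
  echo "terminal detected; launching olaf"
  olaf
else
  echo "stdout is not a terminal; run Olaf from a terminal or SSH session" >&2
fi
```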
Gateway (optional)
Some deployments include `olaf-gateway` as a separate binary in the same package. See your release notes or administrator documentation for proxy setup and the optional web UI served at `/web/` on the gateway port.
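If your package does include the gateway, a quick reachability check might look like this sketch; port 8080 is a placeholder (hypothetical), so substitute whatever port your deployment actually uses:

```shell
# Placeholder port: replace 8080 with your gateway's configured port
GATEWAY_PORT=8080
if curl -fsS "http://localhost:${GATEWAY_PORT}/web/" >/dev/null 2>&1; then
  echo "gateway web UI reachable on :${GATEWAY_PORT}"
else
  echo "no gateway responding on :${GATEWAY_PORT}"
fi
```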