
Quickstart (Docker Compose)

Prerequisites:

  • Docker ≥ 24 and Docker Compose v2
  • An API key from at least one LLM provider (OpenAI, Anthropic, or Azure OpenAI)
```sh
git clone https://github.com/geeper-io/relay
cd relay
cp .env.example .env
```

Edit .env and set at least one provider key:

```sh
# .env
OPENAI_API_KEY=sk-...        # OpenAI
ANTHROPIC_API_KEY=sk-ant-... # Anthropic (optional)
# Auto-generated on first start if left empty:
PROXY_MASTER_KEY=
```
```sh
docker compose up -d
```

This starts:

  • proxy — the Geeper Relay on port 8000
  • postgres — PostgreSQL 16 for API keys, users, usage records
  • chromadb — vector store for RAG (optional, controlled by config.yaml)
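For orientation, the stack defined by the repo's docker-compose.yml covers the three services above. The sketch below is illustrative only; the image tags, ports, and env wiring are assumptions, and the repo's actual compose file is authoritative:

```yaml
# Sketch only; see docker-compose.yml in the repo for the real definition.
services:
  proxy:
    build: .
    ports:
      - "8000:8000"     # the Geeper Relay API
    env_file: .env
    depends_on: [postgres, chromadb]
  postgres:
    image: postgres:16  # API keys, users, usage records
  chromadb:
    image: chromadb/chroma  # optional; enabled via config.yaml
```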
```sh
curl http://localhost:8000/healthz
# {"status":"ok"}
```
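In scripts (CI, integration tests) you may want to wait for the proxy to become healthy rather than check once. Here is a self-contained sketch of that readiness poll; a stub HTTP server stands in for the real proxy so the snippet runs on its own, and in practice you would point `base` at http://localhost:8000:

```python
# Poll GET {base}/healthz until it reports {"status": "ok"} or a timeout expires.
import json
import threading
import time
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer


def wait_for_healthy(base: str, timeout: float = 30.0) -> bool:
    """Return True once /healthz answers {"status": "ok"}, False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(f"{base}/healthz", timeout=2) as resp:
                if json.load(resp).get("status") == "ok":
                    return True
        except (urllib.error.URLError, ConnectionError):
            pass  # proxy not accepting connections yet; retry
        time.sleep(0.2)
    return False


class _StubHealth(BaseHTTPRequestHandler):
    """Stand-in for the proxy's health endpoint (illustration only)."""

    def do_GET(self):
        body = json.dumps({"status": "ok"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass


server = HTTPServer(("127.0.0.1", 0), _StubHealth)
threading.Thread(target=server.serve_forever, daemon=True).start()

ready = wait_for_healthy(f"http://127.0.0.1:{server.server_port}")
print("healthy:", ready)
server.shutdown()
```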

Use the master key (from .env or from the startup logs) to create a user key:

```sh
curl -X POST http://localhost:8000/internal/api-keys \
  -H "Authorization: Bearer $PROXY_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{"name": "dev", "user_id": "alice"}'
```

Response:

```json
{
  "id": "ak_01j...",
  "name": "dev",
  "key": "llmp_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "key_prefix": "llmp_xxxx",
  "user_id": "alice"
}
```
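When creating keys programmatically, you parse this response and persist the `key` value securely; typically such keys are shown in full only at creation time, with only `key_prefix` visible afterwards (an assumption here; confirm against the Relay docs). A sketch using the sample values above:

```python
# Handle the key-creation response (sample body copied from the docs above).
import json

response_body = """
{
  "id": "ak_01j...",
  "name": "dev",
  "key": "llmp_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "key_prefix": "llmp_xxxx",
  "user_id": "alice"
}
"""

record = json.loads(response_body)
api_key = record["key"]

# Sanity checks before storing the secret (e.g. in a secrets manager):
assert api_key.startswith("llmp_")
assert api_key.startswith(record["key_prefix"])
print(record["key_prefix"], "->", record["user_id"])
```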
```sh
export API_KEY=llmp_xxxx... # the key you just created
curl http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello from Geeper Relay!"}]
  }'
```

Or point the OpenAI Python SDK at the proxy; no other code changes are needed:

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="llmp_xxxx...",  # the proxy key, not a provider key
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```