Open models. European servers. One API call.

Pick the right LLM. Compare performance. Deploy in one API call. Fully GDPR-compliant, hosted in Europe.

Integration

One line to switch

Use the OpenAI SDK you already know. Just change the base URL.

app.ts
import OpenAI from "openai"

// Point the OpenAI client at the Supa endpoint instead of api.openai.com.
const supa = new OpenAI({
  baseURL: "https://api.supa.works/v1",
  apiKey: process.env.SUPA_API_KEY,
})

// Requests use the standard Chat Completions shape.
const completion = await supa.chat.completions.create({
  model: "meta-llama/llama-3.3-70b",
  messages: [{ role: "user", content: "Hello!" }],
})
OpenAI-compatible · Works with LangChain · Works with the Vercel AI SDK
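Because the endpoint is OpenAI-compatible, any plain HTTP client works as well. A minimal sketch using the standard `fetch` `Request`, assuming the usual `/v1/chat/completions` route and bearer-token auth (the same payload shape as the SDK example above):

```typescript
// Build an OpenAI-style chat completion request against the Supa endpoint.
// The route and JSON body follow the OpenAI Chat Completions format;
// SUPA_API_KEY is assumed to hold a valid key.
const body = {
  model: "meta-llama/llama-3.3-70b",
  messages: [{ role: "user", content: "Hello!" }],
};

const request = new Request("https://api.supa.works/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.SUPA_API_KEY ?? ""}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify(body),
});

// To send it:
// const response = await fetch(request);
// const completion = await response.json();
```

This is the same request the OpenAI SDK constructs under the hood, which is why changing the base URL is the only integration step.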

Infrastructure

Runs in Europe. Stays in Europe.

Every request is processed on dedicated servers in Europe. No data leaves the EU. No US subprocessors on the inference path.

100% European infrastructure

0 US subprocessors

100% GDPR compliant