A self-hosted Ollama instance deployed on Koyeb, providing access to open-source AI models through an OpenAI-compatible API.
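Since the instance exposes an OpenAI-compatible API, it can be queried with a plain HTTP POST to the standard `/v1/chat/completions` endpoint. A minimal sketch using only the Python standard library is shown below; the base URL is a hypothetical placeholder, and the model name is taken from the table in this README:

```python
import json
import urllib.request

# Hypothetical deployment URL -- replace with your Koyeb app's public URL.
BASE_URL = "https://your-app.koyeb.app"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for the instance."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("deepseek-r1:671b", "Hello!")
print(req.full_url)
# Send with: urllib.request.urlopen(req)
```

The same endpoint also works with any OpenAI client SDK by pointing its base URL at the deployment.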

| Model | Speed (t/s) | Latency (s) | Test runs |
|---|---|---|---|
| deepseek-r1:671b | 1.58 | 0.54 | 5 |

| Time | Model | Speed (t/s) | Latency (s) |
|---|---|---|---|
| Feb 15, 05:48 PM | deepseek-r1:671b | 1.58 | 0.54 |