A self-hosted LLM API gateway that unifies multiple providers with intelligent routing and monitoring.
Veloera is a self-hosted LLM API gateway built in Rust for performance. It exposes a unified interface to multiple LLM providers, including OpenAI, Anthropic, and Google.

Key features:

- Intelligent load balancing with sub-millisecond routing decisions
- Built-in analytics dashboard with metrics and logs
- Response caching and connection pooling
- YAML-based configuration with hot reloading
- OpenAPI specification with auto-generated SDKs for multiple languages
- Built-in authentication and authorization
- SOC 2-compliant architecture

Veloera is MIT licensed and designed for self-hosting via Docker.
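To make the YAML-based configuration concrete, here is a minimal sketch of what a gateway config might look like. The key names (`providers`, `routing`, `cache`, and so on) are hypothetical illustrations, not Veloera's actual schema; consult the project's documentation for the real field names.

```yaml
# Hypothetical Veloera config sketch -- field names are illustrative only.

providers:
  - name: openai-primary
    type: openai
    api_key: ${OPENAI_API_KEY}     # read from the environment, not stored in the file
  - name: anthropic-fallback
    type: anthropic
    api_key: ${ANTHROPIC_API_KEY}

routing:
  strategy: least_latency          # e.g. pick the provider with the lowest recent latency
  fallback: anthropic-fallback     # route here when the primary provider fails

cache:
  enabled: true
  ttl_seconds: 300                 # cache identical requests for five minutes
```

With hot reloading, edits to a file like this would be picked up by the running gateway without a restart, so routing or cache settings can be tuned in place.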
