21 Apr, 2026 1 commit
      Initial release of FreeLLMAPI · 04e15037
      tashfeenahmed authored
      Self-hosted OpenAI-compatible proxy that aggregates the free tiers of
      fourteen LLM providers — Google, Groq, Cerebras, SambaNova, NVIDIA,
      Mistral, OpenRouter, GitHub Models, Hugging Face, Cohere, Cloudflare,
      Zhipu, Moonshot, MiniMax — behind a single /v1/chat/completions endpoint.
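      Because the proxy speaks the OpenAI wire format, any OpenAI-style
      request body should work against it. A minimal sketch of such a call;
      the base URL, port, token placeholder, and model id below are
      illustrative assumptions, not values from the repository:

      ```javascript
      // Hypothetical sketch: build an OpenAI-compatible chat request aimed
      // at the proxy's single endpoint. URL, token, and model id are
      // assumptions for illustration.
      function buildChatRequest(model, userMessage) {
        return {
          method: "POST",
          url: "http://localhost:3000/v1/chat/completions", // assumed local port
          headers: {
            "Authorization": "Bearer YOUR_UNIFIED_TOKEN", // one token for all providers
            "Content-Type": "application/json",
          },
          body: JSON.stringify({
            model, // e.g. a provider-prefixed model id (assumed convention)
            messages: [{ role: "user", content: userMessage }],
            stream: false, // streaming is also supported per the notes below
          }),
        };
      }

      const req = buildChatRequest("groq/llama-3.1-8b-instant", "Hello!");
      console.log(JSON.parse(req.body).messages[0].content); // prints "Hello!"
      ```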
      
      Server:
      - Express + SQLite, per-provider adapters with streaming and non-streaming
  support, automatic failover on 429/5xx, per-key RPM/RPD/TPM/TPD tracking,
        sticky sessions for multi-turn, AES-256-GCM encrypted key storage,
        unified bearer-token auth, periodic health checks.
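      
      The failover behavior above can be sketched as a loop over the fallback
      chain that skips any provider answering 429 or 5xx. The provider objects
      and `chat` method here are stand-ins for illustration, not the
      repository's actual adapter interface:

      ```javascript
      // Minimal failover sketch: try each provider in chain order and move on
      // when one returns a retryable status (429 rate limit or 5xx error).
      async function completeWithFailover(providers, request) {
        let lastError;
        for (const provider of providers) {
          try {
            const res = await provider.chat(request);
            if (res.status === 429 || res.status >= 500) {
              lastError = new Error(`${provider.name} returned ${res.status}`);
              continue; // fall through to the next provider in the chain
            }
            return { provider: provider.name, ...res };
          } catch (err) {
            lastError = err; // network failure: also try the next provider
          }
        }
        throw lastError ?? new Error("no providers configured");
      }
      ```

      Sticky sessions would additionally pin a multi-turn conversation to the
      provider that first succeeded; that bookkeeping is omitted here.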
      
      Client:
      - React + Vite + shadcn/ui admin dashboard: keys, fallback chain (drag
        to reorder, color-coded per-provider monthly token budget), playground,
        analytics with per-provider breakdowns.
      
      Tooling:
      - GitHub Actions CI (server tests + client build), MIT license,
        README with provider-by-provider ToS review.
      
      For personal experimentation, not production.