
LiteLLM

by BerriAI

Open-source proxy to call 100+ LLM APIs in the OpenAI format with load balancing and cost tracking

Open source (MIT); enterprise tier with support available. Open-source API.

About LiteLLM

LiteLLM is an open-source proxy that provides a unified interface to call 100+ LLM providers using the OpenAI API format. It handles authentication, load balancing, fallbacks, and cost tracking across providers. LiteLLM can run as a proxy server or be used as a Python SDK, making it ideal for teams managing multiple LLM providers.
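Because the proxy speaks the OpenAI chat-completions format, any OpenAI-compatible client can talk to it. Below is a minimal stdlib-only sketch of such a request; it assumes a LiteLLM proxy running on `localhost:4000` (LiteLLM's default port), and the model alias `gpt-4o` is a hypothetical entry from the proxy's config.

```python
# Sketch of an OpenAI-format request to a LiteLLM proxy, using only
# the Python standard library. The proxy URL, API key, and model
# alias below are illustrative assumptions, not fixed values.
import json
import urllib.request

payload = {
    "model": "gpt-4o",  # alias resolved by the proxy's model_list
    "messages": [{"role": "user", "content": "Hello"}],
}
req = urllib.request.Request(
    "http://localhost:4000/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-placeholder",  # proxy virtual key
    },
)
# Uncomment to send the request against a running proxy:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The same request body works unchanged if you later point it at a different provider, since the proxy translates the OpenAI format to each backend.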

Key Features

  • 100+ LLM provider support
  • OpenAI-compatible format
  • Load balancing
  • Automatic fallbacks
  • Cost tracking
  • Rate limiting
  • Caching

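Load balancing and fallbacks are driven by the proxy's YAML config: listing two deployments under the same `model_name` alias makes the proxy balance requests between them. The sketch below follows LiteLLM's documented `model_list` shape; the deployment names and environment-variable references are illustrative assumptions.

```yaml
# Hedged sketch of a LiteLLM proxy config. Two deployments share the
# alias "gpt-4o", so requests to that alias are load balanced.
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY        # read from the environment
  - model_name: gpt-4o
    litellm_params:
      model: azure/my-gpt4o-deployment           # hypothetical Azure deployment
      api_base: os.environ/AZURE_API_BASE
      api_key: os.environ/AZURE_API_KEY
```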
Pros

  • Truly open source
  • Massive provider support
  • Easy provider switching
  • Good cost management

Cons

  • Proxy adds latency
  • Configuration can be complex
  • Enterprise features limited in OSS

Tags

llm-proxy, api-gateway, open-source, load-balancing, cost-tracking