Introduction: An LLM gateway providing model access, logging, and usage tracking across 100+ LLMs in the OpenAI format.
Added on: Jan 20, 2025
Berri AI

What is Berri AI

LiteLLM, Berri AI's LLM gateway, provides seamless access to over 100 large language models (LLMs) through the OpenAI API format. It offers comprehensive logging, usage tracking, and access-control features, enabling developers to manage model access, track spend, and set budgets efficiently. Widely adopted by developers and enterprises, LiteLLM is designed for high uptime and scalability.

How to Use Berri AI

  1. Logging + Spend Tracking: Log requests, responses, and usage data to platforms such as S3, Datadog, OTEL, or Langfuse.
  2. Control Model Access: Manage access to models using virtual keys, teams, and model access groups.
  3. Budgets & Rate Limits: Set budget limits and track spending across models, keys, teams, and custom tags.
  4. Pass-through Endpoints: Migrate existing projects to the proxy and gain built-in spend tracking and logging.
  5. OpenAI-Compatible API: Access 100+ LLMs in the OpenAI format across /chat/completions, /embeddings, and other endpoints.
  6. Self-serve Portal: Let teams manage their own keys via SSO in a self-serve portal.
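The OpenAI-compatible API in step 5 is the core workflow: every model behind the proxy accepts the same OpenAI-format request body, so switching providers is a one-string change. A minimal sketch of that request shape; the endpoint path follows the OpenAI format, while the model names here are illustrative placeholders:

```python
import json

# Sketch: the same OpenAI-format request body works for any model behind
# the proxy -- only the "model" field changes. Model names are examples.
ENDPOINT = "/chat/completions"  # the proxy also exposes /embeddings, etc.

def chat_request(model, prompt):
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Swapping providers is a one-string change; the request shape is identical.
for model in ("gpt-4o", "claude-3-5-sonnet", "gemini-1.5-pro"):
    body = chat_request(model, "Hello")
    print(model, "->", json.dumps(body))
```

In practice, a client would POST this body to the proxy's `/chat/completions` endpoint (or point an OpenAI SDK's `base_url` at the proxy) using a virtual key as the API key.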

Features of Berri AI

  • Stay in Control

    Tools for logging, spend tracking, and controlling model access:

      ◦ Logging + Spend Tracking: Logs requests, responses, and usage data to platforms such as S3, Datadog, OTEL, and Langfuse.
      ◦ Control Model Access: Manages access to models using virtual keys, teams, and model access groups.
      ◦ Budgets & Rate Limits: Tracks spending and sets budget limits across models, keys, teams, and custom tags.
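As a sketch of how budgets and rate limits attach to virtual keys: the proxy issues keys through a `/key/generate` endpoint, and the field names below follow LiteLLM's documented parameters, but treat the exact values, limits, and endpoint URL as illustrative assumptions rather than a definitive recipe:

```python
import json

# Sketch: request body for the proxy's /key/generate endpoint, which
# issues a virtual key with a spend budget and rate limits attached.
# Field names follow LiteLLM's docs; the values here are illustrative.
def budgeted_key_request(models, max_budget_usd, rpm_limit):
    return {
        "models": models,              # models this key may call
        "max_budget": max_budget_usd,  # hard spend cap in USD
        "rpm_limit": rpm_limit,        # requests per minute
        "duration": "30d",             # key expires after 30 days
    }

payload = budgeted_key_request(["gpt-4o"], max_budget_usd=25.0, rpm_limit=60)
print(json.dumps(payload, indent=2))
```

Once the key is issued, the proxy tracks spend against it on every request, so the budget and rate limits apply uniformly across whichever models the key is scoped to.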

  • Move Fast

    Pass-through endpoints and OpenAI-compatible APIs for seamless integration:

      ◦ Pass-through Endpoints: Enables easy migration of projects to the proxy with built-in spend tracking and logging.
      ◦ OpenAI-Compatible API: Supports access to 100+ LLMs in the OpenAI format across various endpoints.
      ◦ Self-serve Portal: Allows teams to manage their own keys via SSO in a self-serve portal.