OpenAI-Proxy

OpenAI Proxy (100+ LLMs) - OpenAI, Azure, Bedrock, Anthropic, HuggingFace

Deploy OpenAI-Proxy

Service: litellm (image: berriai/litellm:main-stable)

A fast and lightweight OpenAI-compatible server to call 100+ LLM APIs.

Call all LLM APIs using the OpenAI format. Use Azure, OpenAI, Cohere, Anthropic, Ollama, vLLM, SageMaker, HuggingFace, Replicate, and more (100+ LLMs).

Test your deployed proxy:

import openai

client = openai.OpenAI(
    api_key="your-master-key",  # the proxy's master key
    base_url="your-proxy-url"   # your deployed proxy's URL
)

# request sent to model set on litellm proxy, `litellm --model`
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {
            "role": "user",
            "content": "this is a test request, write a short poem"
        }
    ]
)

print(response)
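Because the proxy speaks the standard OpenAI wire format, any HTTP client can call it, not just the openai SDK. A minimal stdlib sketch that builds (but does not send) the equivalent chat-completions request; the URL, port, and key are placeholders, and the /v1/chat/completions path assumes the proxy mirrors OpenAI's endpoint layout:

```python
import json
import urllib.request

# Placeholder values: substitute your deployed proxy's URL and master key.
PROXY_URL = "http://localhost:4000"
MASTER_KEY = "your-master-key"

# The same request body the openai client sends under the hood.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "user", "content": "this is a test request, write a short poem"}
    ],
}

request = urllib.request.Request(
    f"{PROXY_URL}/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {MASTER_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(request) would actually send it; omitted so the
# sketch stays runnable without a deployed proxy.
```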

Details

Ishaan Jaffer

Created on May 24, 2024

113 total projects

81 active projects

89% success on recent deploys

AI/ML

More templates in this category

Chat Chat
Your own unified chat and search-to-AI platform.
By Harry Yep

openui
Deploy OpenUI: AI-powered UI generation with GitHub OAuth and OpenAI API.
By zexd

firecrawl
Firecrawl API server + worker without auth; works with Dify.
By Neuron Capital