openai-proxy

OpenAI Proxy (100+ LLMs) - OpenAI, Azure, Bedrock, Anthropic, HuggingFace

Deploy openai-proxy

BerriAI/litellm

A fast and lightweight OpenAI-compatible server for calling 100+ LLM APIs.

Call all LLM APIs using the OpenAI format: Azure, OpenAI, Cohere, Anthropic, Ollama, vLLM, SageMaker, HuggingFace, Replicate, and more (100+ LLMs).

Test your deployed proxy:

import openai

openai.api_base = "http://0.0.0.0:8000"   # your deployed proxy URL
openai.api_key = "anything"               # placeholder; the proxy holds the real provider keys

print(openai.ChatCompletion.create(model="test", messages=[{"role": "user", "content": "Hey!"}]))
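
The same OpenAI-format call can be routed to any provider the proxy is configured for. A minimal sketch, assuming the deployed proxy has Anthropic credentials and exposes a model named "claude-2" (the model name and proxy URL are assumptions about your deployment, not part of the template):

import openai

openai.api_base = "http://0.0.0.0:8000"   # your deployed proxy URL (assumption)
openai.api_key = "anything"               # placeholder; provider keys live on the proxy

# Hypothetical call: the proxy translates this OpenAI-format request into an
# Anthropic API request, assuming "claude-2" is configured on the proxy.
response = openai.ChatCompletion.create(
    model="claude-2",
    messages=[{"role": "user", "content": "What does an LLM proxy do?"}],
)
print(response["choices"][0]["message"]["content"])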


Template Content

openai-proxy

BerriAI/litellm

Details

Ishaan Jaff

Created on Oct 21, 2023

183 total projects

120 active projects

100% success on recent deploys

Python, Dockerfile, Shell

AI/ML



More templates in this category

Chat Chat

Chat Chat, your own unified chat and search-to-AI platform.


Harry Yep

openui

Deploy OpenUI: AI-powered UI generation with GitHub OAuth and the OpenAI API.


zexd

firecrawl

Firecrawl API server + worker without auth; works with Dify.


Neuron Capital