OpenAI-Proxy
OpenAI Proxy (100+ LLMs) - OpenAI, Azure, Bedrock, Anthropic, HuggingFace
litellm
berriai/litellm:main-stable
A fast, lightweight OpenAI-compatible server for calling 100+ LLM APIs.
Call all LLM APIs using the OpenAI format: Azure, OpenAI, Cohere, Anthropic, Ollama, vLLM, SageMaker, HuggingFace, Replicate, and more (100+ LLMs).
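The snippet below tests the deployed proxy itself; the litellm library behind this image exposes the same unified, OpenAI-style call shape directly in Python. A minimal sketch, assuming the provider API keys are already set in the environment; the model names are examples, not part of this template:

from litellm import completion  # litellm's unified completion API

# Assumes OPENAI_API_KEY and ANTHROPIC_API_KEY are set in the environment.
messages = [{"role": "user", "content": "this is a test request, write a short poem"}]

# The same OpenAI-style call shape works across providers;
# the model names below are illustrative.
openai_response = completion(model="gpt-3.5-turbo", messages=messages)
anthropic_response = completion(model="claude-3-haiku-20240307", messages=messages)

print(openai_response.choices[0].message.content)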
Test your deployed proxy:
import openai

client = openai.OpenAI(
    api_key="your-master-key",  # the master key set on the proxy
    base_url="your-proxy-url"   # the URL of your deployed proxy
)

# request is sent to the model set on the litellm proxy, `litellm --model`
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {
            "role": "user",
            "content": "this is a test request, write a short poem"
        }
    ]
)

print(response)
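Because the proxy is OpenAI-compatible, standard client features such as streaming work unchanged. A minimal sketch reusing the client from the snippet above, assuming the model configured on the proxy supports streaming:

# Stream tokens from the proxy instead of waiting for the full response
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "write a short poem"}],
    stream=True,
)

for chunk in stream:
    # Some chunks carry no content (e.g. the final one), so guard before printing
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
print()

Routing the same request to a different provider only requires changing the model name to one defined in the proxy's model list.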
Template Content
Details
Ishaan Jaffer
Created on May 24, 2024
176 total projects
122 active projects
100% success on recent deploys
AI/ML
More templates in this category
firecrawl
Firecrawl API server + worker without auth; works with Dify
Neuron Capital
81