Berri AI
The fastest way to take your LLM app to production
Repositories
Showing 6 of 44 repositories
- litellm (Public): Python SDK and Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]. A minimal usage sketch follows this list.
- locust-load-tester (Public)
- proxy_load_tester (Public)
- provider-litellm-http (Public, forked from crossplane-contrib/provider-http): Crossplane Provider designed to facilitate sending LiteLLM HTTP requests as resources.
- example_litellm_gcp_cloud_run (Public): Example repo to deploy the LiteLLM Proxy (AI Gateway) on GCP Cloud Run. See the proxy call sketch after this list.
- prometheus-deploy (Public, forked from devato/render-prometheus): A Blueprint for deploying Prometheus to render.com.
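A minimal sketch of calling a model through the litellm SDK in the OpenAI chat-completions format. The model name is illustrative, and the matching provider API key (for example OPENAI_API_KEY) is assumed to be set in the environment.

```python
from litellm import completion

# The model string selects the backend provider; swapping it (e.g. to an
# Anthropic or Bedrock model identifier) keeps the same call shape.
response = completion(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "What is an LLM gateway?"}],
)

# Responses follow the OpenAI response format regardless of provider.
print(response.choices[0].message.content)
```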
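For a proxy deployed along the lines of example_litellm_gcp_cloud_run, the standard OpenAI client can be pointed at the gateway, since it exposes an OpenAI-compatible API. The base_url and api_key below are placeholders, not values taken from the repository.

```python
from openai import OpenAI

# Point the OpenAI client at the deployed LiteLLM Proxy (AI Gateway).
client = OpenAI(
    base_url="https://your-litellm-proxy.example.run.app",  # placeholder Cloud Run URL
    api_key="sk-your-litellm-virtual-key",                  # placeholder proxy key
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name routed by the proxy
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```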