Hi @DaehanKim, I'm the maintainer of LiteLLM. We let you create a proxy server that calls 100+ LLMs behind a single OpenAI-compatible API, which makes it easier to run benchmarks/evals.
I'm opening this issue because I believe LiteLLM makes it easier for you to run benchmarks and evaluate LLMs (I'd love your feedback if it does not).
Try it here: https://docs.litellm.ai/docs/simple_proxy
https://github.com/BerriAI/litellm

Using LiteLLM Proxy Server
Creating a proxy server
Ollama models
$ litellm --model ollama/llama2 --api_base http://localhost:11434
Hugging Face Models
$ export HUGGINGFACE_API_KEY=my-api-key # [OPTIONAL]
$ litellm --model huggingface/bigcode/starcoder # any Hugging Face Inference API model works here
Anthropic
$ export ANTHROPIC_API_KEY=my-api-key
$ litellm --model claude-instant-1
PaLM
$ export PALM_API_KEY=my-palm-key
$ litellm --model palm/chat-bison
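Once the proxy is up, any OpenAI-compatible client can call it. A minimal sketch with curl, assuming the proxy is listening on http://0.0.0.0:8000 (the default port can differ between LiteLLM versions, so check the address printed when litellm starts); the "model" field here is just a placeholder, since the proxy forwards requests to whichever model it was started with:

$ curl http://0.0.0.0:8000/chat/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hey, how are you?"}]}'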
Using the proxy to run an eval with lm-evaluation-harness:
$ python3 -m lm_eval \
    --model openai-completions \
    --model_args engine=davinci \
    --tasks crows_pairs_english_age
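For the command above to hit the proxy instead of api.openai.com, the harness's OpenAI client has to be pointed at it. A sketch, assuming the proxy address used above and that the OpenAI client in lm-eval honours the standard OPENAI_API_BASE / OPENAI_API_KEY environment variables:

$ export OPENAI_API_BASE=http://0.0.0.0:8000  # assumed local proxy address; adjust to what litellm prints
$ export OPENAI_API_KEY=anything              # placeholder; the proxy does not need a real OpenAI key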