Collmbo

A Slack bot that lets you choose your preferred LLM using LiteLLM. Pronounced the same as "Colombo".

Quick Start

Collmbo supports multiple LLMs, but let's begin with OpenAI's gpt-4o model for a quick setup.

1. Create a Slack App

Create a Slack app and obtain the required tokens:

  • App-level token (xapp-1-...)
  • Bot token (xoxb-...)

2. Create a .env File

Save your credentials in a .env file:

SLACK_APP_TOKEN=xapp-1-...
SLACK_BOT_TOKEN=xoxb-...
LITELLM_MODEL=gpt-4o
OPENAI_API_KEY=sk-...
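
A quick way to catch typos before launching is to check that the .env file defines every variable from the example above. A minimal sketch (check_env is a helper name introduced here, not part of Collmbo):

```shell
# Verify that an env file defines every variable from the quick-start
# example above; prints one "missing: NAME" line per absent variable.
check_env() {
  for key in SLACK_APP_TOKEN SLACK_BOT_TOKEN LITELLM_MODEL OPENAI_API_KEY; do
    grep -q "^${key}=" "$1" || echo "missing: ${key}"
  done
}
```

For example, check_env .env prints nothing when the file is complete.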

3. Run Collmbo Container

Start the bot using Docker:

docker run -it --env-file .env ghcr.io/iwamot/collmbo:latest-slim
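
For longer-running setups, the same image can be started detached. A sketch, assuming you want Docker to restart the bot automatically (the container name collmbo is arbitrary):

```shell
# Run in the background, restart unless explicitly stopped,
# and follow the bot's logs.
docker run -d --name collmbo --restart=unless-stopped \
  --env-file .env ghcr.io/iwamot/collmbo:latest-slim
docker logs -f collmbo
```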

Why latest-slim Here?

Collmbo provides two official Docker image flavors:

  • slim: a minimal image with only essential dependencies
  • full: a full-featured image with additional libraries (e.g., boto3 for Amazon Bedrock)

You must specify a flavor explicitly. If you want to use the latest image, use latest-slim or latest-full.

Additionally, you can specify a versioned tag like x.x.x-slim. For more details, please check the list of available tags.
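
If you prefer the command line to the web UI for browsing tags, one option (assuming skopeo is installed; public images can usually be queried anonymously):

```shell
# List the published tags for the Collmbo image without pulling anything.
skopeo list-tags docker://ghcr.io/iwamot/collmbo
```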

4. Say Hello!

Mention the bot in Slack and start chatting:

@Collmbo hello!

Collmbo should respond in channels, threads, and DMs.

Want to Use a Different LLM?

First, pick your favorite LLM from LiteLLM supported providers.

To use it, update the relevant environment variables in your .env file and restart the container.

Here are some examples:

Gemini - Google AI Studio (Gemini 2.0 Flash)

SLACK_APP_TOKEN=xapp-1-...
SLACK_BOT_TOKEN=xoxb-...
LITELLM_MODEL=gemini/gemini-2.0-flash-001
GEMINI_API_KEY=...

Azure OpenAI (gpt-4o)

SLACK_APP_TOKEN=xapp-1-...
SLACK_BOT_TOKEN=xoxb-...
LITELLM_MODEL=azure/<your_deployment_name>

# Specify the model type to grab details like max input tokens
LITELLM_MODEL_TYPE=azure/gpt-4o

AZURE_API_KEY=...
AZURE_API_BASE=...
AZURE_API_VERSION=...

Amazon Bedrock (Claude 3.7 Sonnet)

SLACK_APP_TOKEN=...
SLACK_BOT_TOKEN=...
LITELLM_MODEL=bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0

# You can specify a Bedrock region if it's different from your default AWS region
AWS_REGION_NAME=us-west-2

# You can use your access key for authentication, but IAM roles are recommended
AWS_ACCESS_KEY_ID=...
AWS_SECRET_ACCESS_KEY=...

As noted above, Amazon Bedrock requires the full-flavor image:

docker run -it --env-file .env ghcr.io/iwamot/collmbo:latest-full

Deployment

Collmbo exposes no HTTP endpoints (it connects to Slack over a Socket Mode connection, which is why an app-level token is required), so it can run in any environment with outbound internet access.

Features

Configuration

Collmbo runs with default settings, but you can customize its behavior by setting optional environment variables.

Contributing

Contributions are welcome! Feel free to open an issue or submit a pull request.

Before opening a PR, please run:

./validate.sh

This helps maintain code quality.

Related Projects

License

The code in this repository is licensed under the MIT License.

The Collmbo icon (assets/icon.png) is licensed under CC BY-NC-SA 4.0. For example, you may use it as a Slack profile icon.
