
Adds Ollama and OpenWebUI services as tools #351

Merged
merged 7 commits into main on Dec 6, 2024
Conversation

tonysm
Contributor

@tonysm tonysm commented Nov 16, 2024

Changelog

  • Adds Ollama as a tool
  • Adds OpenWebUI as a tool

These two tools are designed to work together, but Ollama can also be used on its own without the OpenWebUI tool (for example, when an application talks to Ollama directly).

To enable the Ollama tool (using the defaults):

php ./takeout enable ollama

To enable the OpenWebUI tool (using the defaults):

php ./takeout enable openwebui

This should start Ollama at port 11434 and OpenWebUI at port 3000.
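With the services up, an application can talk to Ollama directly over its HTTP API on port 11434 (the "use Ollama without OpenWebUI" case mentioned above). A minimal sketch using only the standard library; the `generate` call assumes the Ollama container is running and the model has already been pulled:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default port used by the Ollama tool


def build_generate_request(model: str, prompt: str) -> dict:
    # Payload shape for Ollama's /api/generate endpoint;
    # stream=False asks for a single JSON response instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    # Requires the Ollama service to be running; not invoked at import time.
    payload = json.dumps(build_generate_request(model, prompt)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Usage would be along the lines of `generate("llama3", "Why is the sky blue?")` once a model is available.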

Open http://localhost:3000 (OpenWebUI takes a while to load), then navigate to the Admin Panel menu in the top-right corner:

[Screenshot: the Admin Panel menu]

Then go to the Settings -> Models section, choose a model (like llama3), and hit the download button. The download will take a while.

[Screenshot: the Settings -> Models section]

Once the download finishes, hit "New Chat" in the top-left corner, choose the downloaded model from the model dropdown, and start chatting.

[Screenshot: a chat using the downloaded model]
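Downloading a model through the Admin Panel is equivalent to pulling it through Ollama's API, so an application can confirm the model arrived by reading the `/api/tags` endpoint. A small sketch that parses a response shaped like what `/api/tags` returns (the sample payload below is illustrative, not live output):

```python
import json


def installed_models(tags_json: str) -> list[str]:
    # Ollama's /api/tags endpoint returns {"models": [{"name": ...}, ...]};
    # this extracts just the model names.
    return [m["name"] for m in json.loads(tags_json)["models"]]


# Sample payload shaped like an /api/tags response:
sample = '{"models": [{"name": "llama3:latest"}]}'
print(installed_models(sample))  # → ['llama3:latest']
```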

@tonysm tonysm changed the title Adds Ollama and OpenWebui services as tools Adds Ollama and OpenWebUI services as tools Nov 16, 2024
@tonysm tonysm merged commit c7ecea9 into main Dec 6, 2024
11 checks passed
@tonysm tonysm deleted the tm/ollama branch December 6, 2024 18:30