# Ollama_question.nvim

This is a quick and dirty implementation that opens a chat window in Neovim so you can talk to an Ollama-hosted model.

Requires [nvim-lua/plenary.nvim](https://github.com/nvim-lua/plenary.nvim).
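If you use lazy.nvim (an assumption; the README does not name a plugin manager), a minimal spec declaring the plenary dependency might look like this:

```lua
-- Sketch of a lazy.nvim plugin spec; the plugin manager choice is an
-- assumption, the plenary.nvim dependency comes from the README.
{
    "CaKellum/ollama_question.nvim",
    dependencies = { "nvim-lua/plenary.nvim" },
    config = function()
        require("ollamachat").setup({
            url = "http://127.0.0.1:11434/api/generate",
        })
    end,
}
```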

require("ollamachat").setup({
    url = "http://127.0.0.1:11434/api/generate"
})

Replace the url with whatever your instance's URL is. That said, if you are thinking about using this, you probably won't want to.
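For example, pointing the plugin at an instance hosted elsewhere on your local network might look like the following (the host address is hypothetical; 11434 is Ollama's default port):

```lua
-- Hypothetical example: an Ollama instance on another machine on the LAN.
require("ollamachat").setup({
    url = "http://192.168.1.50:11434/api/generate",
})
```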

I will not be fixing anything and will generally only update it if I find an issue within my own use case. If you find an issue and want to use this as a base, fork it and fix the issue yourself.
