Replies: 2 comments
-
I am having the same issue. I have checked that Ollama is running by opening http://127.0.0.1:11434/ in my browser, which shows "Ollama is running". Using the Ollama LLM component still gives the "missing 1 required positional argument: 'base_url'" error. There are no errors if I use OpenAI. The only model I have installed is llama3.
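The browser check above can also be done programmatically. A minimal sketch using only the Python standard library (the function name and defaults are my own; a healthy Ollama server answers GET / with the body "Ollama is running"):

```python
import urllib.error
import urllib.request


def ollama_is_running(base_url: str = "http://127.0.0.1:11434",
                      timeout: float = 3.0) -> bool:
    """Return True if an Ollama server answers at base_url with HTTP 200."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: server not reachable.
        return False


if __name__ == "__main__":
    print(ollama_is_running())
```

This only confirms the server is reachable; it does not rule out the component-side `base_url` error discussed in this thread.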
-
Hi! 👋 We are using other channels as our official means of communication with users. We apologize for the delayed response. Thank you for your understanding. Best regards,
-
I am new to Langflow and was trying to use Llama2 through Ollama as the model, but I am getting the following error:
ValueError: Error building vertex Ollama: ChatOllamaComponent.build() missing 1 required positional argument: 'base_url'
The base URL is left at its default, http://localhost:11434/
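The traceback means that when the flow was built, `build()` was called without a value for its required `base_url` parameter, i.e. the component's Base URL field was effectively empty even though the UI shows a default. As an illustration only (this is not Langflow's actual `ChatOllamaComponent` source; the class body and the `model` parameter here are assumptions), a build method with a required positional argument fails in exactly this way:

```python
class ChatOllamaComponent:
    """Hypothetical sketch of a component whose build() requires base_url."""

    def build(self, base_url: str, model: str = "llama3") -> dict:
        # If base_url is not supplied, Python raises:
        #   TypeError: build() missing 1 required positional argument: 'base_url'
        # which is what surfaces inside the ValueError above.
        return {"base_url": base_url.rstrip("/"), "model": model}


component = ChatOllamaComponent()
# Passing the field explicitly avoids the error:
config = component.build(base_url="http://localhost:11434/")
```

In the Langflow UI, the equivalent fix is to type the URL into the component's Base URL field (rather than relying on the displayed placeholder) before building the flow.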