How can I use ollama models in OpenUI? #87
Comments
No need for an API key. Just set OPENAI_API_KEY to a dummy value.
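For anyone landing here, a minimal sketch of what that looks like with the Docker image (the image name, port 7878, and the `xxx` placeholder are my assumptions, not confirmed in this thread):

```shell
# Sketch only: the image name and port 7878 are assumptions.
# OPENAI_API_KEY just needs to be any non-empty placeholder when using ollama;
# OLLAMA_HOST points at ollama's default port 11434 on the Docker host.
docker run -d -p 7878:7878 \
  -e OPENAI_API_KEY=xxx \
  -e OLLAMA_HOST=http://host.docker.internal:11434 \
  --add-host=host.docker.internal:host-gateway \
  ghcr.io/wandb/openui
```

The `--add-host` flag maps `host.docker.internal` to the host gateway so the container can reach an ollama instance running natively on a Linux host.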
I do not have an OpenAI API key, but I do have my own ollama instance. If I remove the OPENAI_API_KEY var and set the OLLAMA_HOST var to my ollama URL, the container fails to start, complaining that OPENAI_API_KEY is not set.
No no no, I don't use the OpenUI Docker image; I just run OpenUI locally.
I don't know if you have solved your problem already, but it seems similar to this issue. The solution there worked for me.
So, if I unset OPENAI_API_KEY I get errors on startup. After setting OLLAMA_HOST to my localhost, I get a choice of models from ollama and can choose one, but then I get lots of errors and a 500. What is the correct way of running ollama here?
@sokoow you can just set OPENAI_API_KEY to a placeholder value, i.e. any non-empty string.
Indeed, after I set OPENAI_API_KEY to an empty string I got a bunch of errors, but after setting it to something else everything works fine. Thanks for the reply @vanpelt.
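To recap the working local setup (a sketch: the placeholder value is arbitrary, and `python -m openui` as the start command is an assumption about how a non-Docker install is launched):

```shell
# Any non-empty placeholder works; an empty string caused the errors above.
export OPENAI_API_KEY=xxx
# ollama's default listen address; adjust host/port if yours differ
export OLLAMA_HOST=http://127.0.0.1:11434
# assumed start command for a local (non-Docker) install
python -m openui
```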
My OpenUI runs in an Ubuntu 18 VMware Workstation VM at 192.168.1.169; my ollama instance and models are on the physical host at 192.168.1.103. How can I use the ollama models in the OpenUI running inside the VM?
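A sketch of one way to wire this up across the two machines, relying on ollama's standard OLLAMA_HOST behavior (the `xxx` placeholder and the firewall note are assumptions about your environment):

```shell
# On the physical host (192.168.1.103): bind ollama to all interfaces so the
# VM can reach it; by default it listens only on 127.0.0.1, port 11434.
OLLAMA_HOST=0.0.0.0 ollama serve

# In the VM (192.168.1.169): point OpenUI at the host's ollama and keep a
# non-empty placeholder API key, as discussed above.
export OPENAI_API_KEY=xxx
export OLLAMA_HOST=http://192.168.1.103:11434
```

If the VM still can't connect, check that the host's firewall allows inbound TCP on port 11434 from the VM's network.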