[Feature]: support Ollama backend #1064
Comments
On my Mac I did this to make it work. Installed ollama as application, then:
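The original commands were not preserved in this comment. As a sketch, assuming Ollama's default OpenAI-compatible endpoint on port 11434 and `llama2` as a placeholder model name, pointing the existing localai backend at Ollama looks roughly like this:

```shell
# Pull a model into the local Ollama instance (model name is an assumption)
ollama pull llama2

# Register the local endpoint via k8sgpt's localai backend; --baseurl points
# at Ollama's OpenAI-compatible API (default port 11434)
k8sgpt auth add --backend localai --model llama2 --baseurl http://localhost:11434/v1

# Analyze the cluster using the local model
k8sgpt analyze --explain --backend localai
```

This works because Ollama serves an OpenAI-compatible API, which is the same interface the localai backend already speaks.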
Yes, we can use localai, since both support the OpenAI API spec, but I guess we can surface it as its own backend since it's a different project than localai.
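Because both projects implement the OpenAI chat-completions spec, the same request shape works against either one. A minimal check against a locally running Ollama (default port 11434; the model name is an assumption):

```shell
# Send an OpenAI-style chat completion request to Ollama's
# OpenAI-compatible endpoint; localai accepts the same payload shape
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama2",
    "messages": [{"role": "user", "content": "Explain this Kubernetes error: CrashLoopBackOff"}]
  }'
```

If this returns a completion, any backend that already targets the OpenAI spec should work against Ollama with only a base-URL change.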
If they have the same inference interface, how about using an open-source backend which can …
Hello, any comment on this code to support Ollama directly? Seems to be working locally for me.
Hi @ronaldpetty, follow the #1065 (comment).
Checklist
Is this feature request related to a problem?
None
Problem Description
No response
Solution Description
Support Ollama in the same way as localai.
Benefits
Ollama can make it easier for users to interact with K8sGPT.
Potential Drawbacks
No response
Additional Information
No response