So I ran across your project and wanted to try it out. I have a GPU, so I'm trying the beta-test curl installer. However, it fails because it assumes I don't already have Ollama installed. The reason I run a local Ollama is that I keep my downloaded LLMs on a separate partition mounted at /mnt/llm. I also want a local Ollama for other tools such as aider-chat. What would be the best way to deal with this?
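For reference, this is roughly how my existing install points at that partition; it uses `OLLAMA_MODELS`, Ollama's documented environment variable for the model directory (the path is just my setup, not anything the installer knows about):

```shell
# Point the local Ollama at the models partition before starting it.
# OLLAMA_MODELS is Ollama's documented env var for the model store location.
export OLLAMA_MODELS=/mnt/llm
echo "$OLLAMA_MODELS"   # → /mnt/llm
# then: ollama serve
```

Ideally the installer would detect an existing `ollama` binary (or a reachable server on the default port) and reuse it instead of installing its own copy.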