## Stable releases

| Version | Release date |
|---|---|
| 2.2.10 | Nov 19, 2024 |
| 2.2.6 | |
| 1.0.4 | May 29, 2024 |
| 1.0.3 | |
| 1.0.0 | |
#212 in Machine learning · 585KB · 1K SLoC
# Llama Desktop
Desktop app to connect to Ollama and send queries.

Llama Desktop reads the Ollama service URI from the `OLLAMA_HOST` environment variable, defaulting to `http://localhost:11434`.
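
For example, to point the app at an Ollama instance running on another machine, export the variable before starting it (the address below is only a placeholder):

```sh
# Use a non-default Ollama endpoint; replace with your server's address
export OLLAMA_HOST=http://192.168.0.42:11434
```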
## Installation

### Ollama

If you have an NVIDIA GPU and want to run Ollama locally:

```sh
# Install Ollama and register it as a systemd service
curl -fsSL https://ollama.com/install.sh | sh
sudo systemctl enable ollama
sudo systemctl start ollama

# Pull a couple of models to chat with
ollama pull mistral:latest
ollama pull phind-codellama:latest
```
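
To verify the service is reachable before launching Llama Desktop, you can query Ollama's HTTP version endpoint and list the pulled models:

```sh
# Both commands should succeed if the service is running
curl http://localhost:11434/api/version
ollama list
```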
### Last stable release

```sh
cargo install llama-desktop
```
### Development version

```sh
cargo install --git https://github.com/cacilhas/llama-desktop.git
```
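
Either command installs the binary into Cargo's install directory (`~/.cargo/bin` by default). Assuming the binary keeps the crate's name, make sure that directory is on your `PATH` and then launch it:

```sh
# cargo installs binaries into ~/.cargo/bin by default
export PATH="$HOME/.cargo/bin:$PATH"
llama-desktop
```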
## License
## Dependencies

~26–48MB · ~839K SLoC