5 stable releases

2.2.6 Jun 8, 2024
2.2.5 Jun 7, 2024
1.0.4 May 29, 2024
1.0.3 Apr 10, 2024
1.0.0 Jan 24, 2024

#196 in Machine learning

27 downloads per month

MIT license

585KB
1K SLoC

Llama Desktop

Desktop app to connect to Ollama and send queries.

Llama Desktop reads the Ollama service URI from the OLLAMA_HOST environment variable, defaulting to http://localhost:11434.
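
For example, to point Llama Desktop at an Ollama instance on another machine, set the variable before launching; the host address below is illustrative, and llama-desktop is assumed to be the binary name cargo installs for this crate:

# Hypothetical remote host; replace with your own Ollama address
export OLLAMA_HOST=http://192.168.0.10:11434
llama-desktop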

Installation

Ollama

If you have an NVIDIA GPU and want to run Ollama locally:

# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh
# Enable and start the service (requires root privileges)
sudo systemctl enable ollama
sudo systemctl start ollama
# Pull the models to query
ollama pull mistral:latest
ollama pull phind-codellama:latest
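
As a quick sanity check (not part of the upstream steps), you can confirm the service is answering on its default port; Ollama's /api/tags endpoint lists the models you have pulled:

# Should return a JSON list of locally available models
curl http://localhost:11434/api/tags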

Latest stable release

cargo install llama-desktop
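
cargo installs binaries to ~/.cargo/bin; if that directory is not already on your PATH, add it before launching (a minimal sketch, assuming the default cargo install location):

export PATH="$HOME/.cargo/bin:$PATH"
llama-desktop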

Development version

cargo install --git https://github.com/cacilhas/llama-desktop.git
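
Alternatively, you can build from a local checkout; this sketch assumes the standard cargo clone-and-install flow rather than anything specific to this project:

git clone https://github.com/cacilhas/llama-desktop.git
cd llama-desktop
cargo install --path .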

License

This project is released under the MIT License.

Dependencies

~24–45MB
~792K SLoC