
llama-sys

llama-sys is a set of bindgen-generated wrappers for llama.cpp. This crate provides a low-level interface to llama.cpp, allowing you to use it in your Rust projects. To use llama-sys, add the following to your Cargo.toml file:

[dependencies]
llama-sys = "0.1.0"

Then bring the bindings into scope in your Rust code:

use llama_sys::*;

Note that llama-sys exposes a lower-level interface than llm-chain-llama and is correspondingly harder to use: the bindings are raw FFI, so every call is unsafe and you manage model and context lifetimes yourself. However, if you need fine-grained control over llama.cpp, llama-sys is the way to go.
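To give a feel for what "low-level" means here, the sketch below loads a model and tears it down through the raw bindings. It is a minimal sketch, not a tested program: the function names (llama_context_default_params, llama_init_from_file, llama_free) follow the llama.cpp C API of the era this crate wraps and may differ in other llama.cpp versions, and the model path is a placeholder you must supply yourself.

```rust
use std::ffi::CString;
// Glob-import the bindgen-generated bindings.
use llama_sys::*;

fn main() {
    // Model path is a placeholder; point this at a real ggml model file.
    let path = CString::new("models/7B/ggml-model.bin").unwrap();

    // Every llama-sys call is raw FFI, hence the unsafe block.
    unsafe {
        // Default context parameters (context size, seed, etc.);
        // the exact fields vary across llama.cpp versions.
        let params = llama_context_default_params();

        // Load the model and create an inference context.
        let ctx = llama_init_from_file(path.as_ptr(), params);
        if ctx.is_null() {
            eprintln!("failed to load model");
            return;
        }

        // ... tokenize the prompt and run inference here ...

        // Unlike a safe wrapper, you must free the context yourself.
        llama_free(ctx);
    }
}
```

Because nothing here is wrapped in safe Rust types, forgetting the llama_free call leaks the context; higher-level crates such as llm-chain-llama handle this for you.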
