2 releases

| Version | Date |
|---|---|
| 0.0.2 | Oct 10, 2024 |
| 0.0.1 | Oct 4, 2024 |
llm_interface: The Backend for the llm_client Crate
This crate contains the build.rs, data types, and behaviors for running and interacting with LLMs. Supported backends:

- Llama.cpp (through llama-server)
- Various LLM APIs, including generic OpenAI-format LLMs
You can use this crate to run local LLMs and to send requests to LLM APIs. It's designed to be easy to integrate into other projects.
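The generic OpenAI-format backends mentioned above all accept the same chat-completions request shape on the wire. As a minimal sketch of that request body (the helper function and model name here are illustrative, not part of llm_interface's API; a real integration would build the JSON with serde_json rather than by hand):

```rust
// Sketch of the generic OpenAI-format chat request body that such
// backends accept. Field names follow the OpenAI chat-completions wire
// format; the model name is a placeholder, not a value from llm_interface.
fn chat_request_body(model: &str, prompt: &str) -> String {
    // Minimal manual JSON for illustration only; assumes `prompt` needs
    // no JSON escaping. A real integration would use serde_json.
    format!(
        r#"{{"model":"{}","messages":[{{"role":"user","content":"{}"}}]}}"#,
        model, prompt
    )
}

fn main() {
    let body = chat_request_body("some-model", "Hello");
    println!("{}", body);
}
```

The same body is POSTed to a `/v1/chat/completions`-style endpoint regardless of which OpenAI-compatible provider is behind it, which is what lets one backend cover many APIs.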
See the various Builders implemented in the lib.rs file for examples of using this crate.
For a higher-level API built on this crate, check out the llm_client crate and its lib.rs file.