#llm #back-end #openai #interface #local #api #llm-client

bin+lib llm_interface

llm_interface: The backend for the llm_client crate

3 releases

0.0.3 Jan 29, 2025
0.0.2 Oct 10, 2024
0.0.1 Oct 4, 2024

#54 in #backend


166 downloads per month
Used in llm_client

MIT license

8.5MB
10K SLoC

llm_interface: The backend for the llm_client crate

API Documentation

The llm_interface crate is a workspace member of the llm_client project.

This crate contains the build.rs script, data types, and behaviors for interacting with LLMs.

Features

  • Integration with Llama.cpp (through llama-server)
    • Repo cloning and building
    • Managing Llama.cpp server
  • Support for various LLM APIs including generic OpenAI format LLMs
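The "generic OpenAI format" mentioned above refers to request bodies shaped like OpenAI's chat completions API, which many local servers (including llama-server) also accept. A minimal sketch; the endpoint path follows the OpenAI convention, and the model name and values are placeholders:

```json
POST /v1/chat/completions
{
  "model": "local-model",
  "messages": [
    { "role": "user", "content": "Hello" }
  ],
  "temperature": 0.7
}
```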

This crate enables running local LLMs and making requests to LLM APIs, and is designed for easy integration into other projects.
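To illustrate the "managing Llama.cpp server" feature, here is a minimal, self-contained sketch of assembling a llama-server invocation; this is not llm_interface's own API, just a plain std::process::Command example, and the model path, port, and context size are placeholders:

```rust
use std::process::Command;

/// Build (but do not spawn) a llama-server invocation.
/// The flags (--model, --port, --ctx-size) are llama-server CLI options;
/// the values passed in are illustrative placeholders.
fn server_command(model_path: &str, port: u16) -> Command {
    let mut cmd = Command::new("llama-server");
    cmd.arg("--model")
        .arg(model_path)
        .arg("--port")
        .arg(port.to_string())
        .arg("--ctx-size")
        .arg("4096");
    cmd
}

fn main() {
    let cmd = server_command("models/llama-3.gguf", 8080);
    // Print the arguments that would be passed; actually spawning the
    // server (cmd.spawn()) is left to the caller.
    let args: Vec<String> = cmd
        .get_args()
        .map(|a| a.to_string_lossy().into_owned())
        .collect();
    println!("llama-server {}", args.join(" "));
}
```

In llm_interface itself, lifecycle details such as building the llama.cpp repo and supervising the server process are handled by the crate's builders rather than by hand-rolled commands like this.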

Examples

For examples of using this crate, see the various builders exercised in the integration tests.

For a higher-level API built on top of this crate, see the llm_client crate.



Dependencies

~35–67MB
~1M SLoC