4 releases

| Version | Date |
|---|---|
| 0.1.3 (new) | Nov 21, 2024 |
| 0.1.2 | Nov 18, 2024 |
| 0.1.1 | Nov 18, 2024 |
| 0.1.0 | Nov 17, 2024 |

#335 in Configuration · 456 downloads per month · 15KB · 134 lines
riscv_ai_infer
A Rust-based, lightweight AI inference engine optimized for RISC-V boards. This project aims to enable efficient AI model inference on RISC-V systems, especially useful for resource-constrained environments like IoT devices and edge computing.
Introduction
riscv_ai_infer is a tool designed to perform AI model inference on RISC-V hardware. Leveraging Rust's performance and safety guarantees, this engine is optimized for low-power, resource-constrained devices, making it ideal for edge computing and IoT applications.
✨ Features
- Efficient AI inference using Rust and `nalgebra`.
- Optimized for the RISC-V architecture.
- Caching system to reduce redundant API calls.
- Terminal-based user interface for quick insights.
- Flexible configuration options.
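The caching system's internals are not shown in this README; as a rough sketch of the idea (reduce redundant fetches by memoizing results per key), a minimal in-memory cache might look like the following. The `Cache` type and `get_or_fetch` method are hypothetical illustrations, not the crate's actual API.

```rust
use std::collections::HashMap;

// Hypothetical sketch of a cache that avoids redundant API calls by
// memoizing results per key. Not the crate's actual implementation.
struct Cache {
    entries: HashMap<String, String>,
}

impl Cache {
    fn new() -> Self {
        Cache { entries: HashMap::new() }
    }

    // Return the cached value for `key`, or compute and store it via `fetch`.
    // The closure runs only on a cache miss.
    fn get_or_fetch<F>(&mut self, key: &str, fetch: F) -> String
    where
        F: FnOnce() -> String,
    {
        self.entries
            .entry(key.to_string())
            .or_insert_with(fetch)
            .clone()
    }
}

fn main() {
    let mut cache = Cache::new();
    // The first call invokes the closure; the second is served from the cache.
    let a = cache.get_or_fetch("serde", || "fetched".to_string());
    let b = cache.get_or_fetch("serde", || "refetched".to_string());
    println!("{} {}", a, b); // prints "fetched fetched"
}
```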
⚙️ Installation
First, ensure you have the Rust toolchain installed. Then, clone this repository and build the project:
```bash
git clone https://github.com/bensatlantik/riscv_ai_infer.git
cd riscv_ai_infer
cargo build --release
```
Usage
You can run the program directly using:
```bash
cargo run
```
Command-Line Arguments (Optional)
To specify a crate for which you want to fetch statistics:
```bash
cargo run -- <crate_name>
```
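The optional positional argument can be read from the process arguments. This is a minimal sketch of that pattern using `std::env::args`; the binary's actual argument handling and the `crate_name_from_args` helper are assumptions for illustration.

```rust
use std::env;

// Hypothetical helper: args[0] is the program path, so the first real
// argument, if present, is treated as the crate name.
fn crate_name_from_args(args: &[String]) -> Option<String> {
    args.get(1).cloned()
}

fn main() {
    let args: Vec<String> = env::args().collect();
    match crate_name_from_args(&args) {
        Some(name) => println!("Fetching statistics for crate: {}", name),
        None => println!("No crate specified; using defaults."),
    }
}
```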
Configuration
The tool supports a configuration file named `config.toml`. This file is optional and will be skipped if not present.
```toml
# Example config.toml
api_key = ""
```
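The skip-if-absent behaviour described above can be sketched as follows. This is an illustration only: the naive line-based parser and the `parse_api_key` helper are assumptions, not the crate's actual config loading (which may use a full TOML parser).

```rust
use std::fs;

// Hypothetical helper: extract the value of `api_key = "..."` from a
// config file's contents. A real implementation would use a TOML parser.
fn parse_api_key(contents: &str) -> Option<String> {
    contents
        .lines()
        .filter_map(|line| line.split_once('='))
        .find(|(key, _)| key.trim() == "api_key")
        .map(|(_, value)| value.trim().trim_matches('"').to_string())
}

fn main() {
    // If config.toml is missing, skip it rather than failing.
    match fs::read_to_string("config.toml") {
        Ok(contents) => {
            let api_key = parse_api_key(&contents).unwrap_or_default();
            println!("Loaded config; api_key set: {}", !api_key.is_empty());
        }
        Err(_) => println!("No configuration file found. Skipping..."),
    }
}
```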
Example Output
```text
No configuration file found. Skipping...
Cache data: Object {"example_key": String("example_value")}
Crate: serde
Total Downloads: 0
No recent downloads data available.
```
License
This project is licensed under the MIT License.
Author
bensatlantik
Dependencies
~7–18MB
~273K SLoC