137 releases

0.21.9 Jan 8, 2025
0.21.8 Dec 5, 2024
0.21.7 Sep 23, 2024
0.21.6 Jul 24, 2024
…
0.1.1 Nov 2, 2018

#1256 in Machine learning

21,765 downloads per month
Used in 37 crates (9 directly)

MIT/Apache

2MB
51K SLoC

Rust 42K SLoC // 0.0% comments
Templ 9K SLoC // 0.1% comments
GNU Style Assembly 12 SLoC // 0.2% comments

Tract

Tiny, no-nonsense, self-contained, portable TensorFlow and ONNX inference.

Example

use tract_core::internal::*;

// build a simple model that just adds 3 to each input component
let mut model = TypedModel::default();

let input_fact = f32::fact(&[3]);
let input = model.add_source("input", input_fact).unwrap();
let three = model.add_const("three".to_string(), tensor1(&[3f32])).unwrap();
let add = model.wire_node(
    "add".to_string(),
    tract_core::ops::math::add(),
    &[input, three],
).unwrap();

// mark the graph's terminal nodes as the model outputs
model.auto_outputs().unwrap();

// We build an execution plan. Default inputs and outputs are inferred from
// the model graph.
let plan = SimplePlan::new(&model).unwrap();

// run the computation.
let input = tensor1(&[1.0f32, 2.5, 5.0]);
let mut outputs = plan.run(tvec![input.into()]).unwrap();

// take the first and only output tensor
let tensor = outputs.pop().unwrap();

assert_eq!(tensor, tensor1(&[4.0f32, 5.5, 8.0]).into());

While creating a model from Rust code is useful for testing the library, real-life use cases will usually load a TensorFlow or ONNX model with the tract-tensorflow or tract-onnx crate.
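For instance, loading and running an ONNX model with tract-onnx looks roughly like the sketch below. This is a minimal sketch, not the crate's reference documentation: the file name model.onnx and the input shape are placeholders, and the builder-style API (model_for_path, with_input_fact, into_optimized, into_runnable) is assumed from recent tract-onnx releases.

use tract_onnx::prelude::*;

fn main() -> TractResult<()> {
    let model = tract_onnx::onnx()
        // load the ONNX graph from disk ("model.onnx" is a placeholder path)
        .model_for_path("model.onnx")?
        // declare the type and shape of input 0 (here: three f32 values)
        .with_input_fact(0, f32::fact([3]).into())?
        // let tract optimize the graph, then turn it into an executable plan
        .into_optimized()?
        .into_runnable()?;

    // run the plan on a single input tensor
    let input = tensor1(&[1.0f32, 2.5, 5.0]);
    let outputs = model.run(tvec!(input.into()))?;
    println!("{:?}", outputs[0]);
    Ok(())
}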

Dependencies

~13–25MB
~395K SLoC