EntityGym for Rust
EntityGym is a Python library that defines a novel entity-based abstraction for reinforcement learning environments, enabling highly ergonomic and efficient training of deep reinforcement learning agents. This crate provides bindings that allow Rust programs to be used as EntityGym training environments, and to load and run neural network agents trained with Entity Neural Network Trainer natively in pure Rust applications.
Overview
The core abstraction in entity-gym-rs is the `Agent` trait. It defines a high-level API for neural network agents which allows them to directly interact with Rust data structures. To use any of the `Agent` implementations provided by entity-gym-rs, you just need to derive the `Action` and `Featurizable` traits, which define what information the agent can observe and what actions it can take:

- The `Action` trait allows a Rust type to be returned as an action by an `Agent`. This trait can be derived automatically for enums with only unit variants.
- The `Featurizable` trait converts objects into a format that can be processed by neural networks. It can be derived for most fixed-size `struct`s, and for `enum`s with unit variants (a minimal sketch of these derives follows below). `Agent`s can observe collections containing any number of `Featurizable` objects.
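As a rough illustration of these derives (the `Command`, `Position`, and `Tile` types below are made up for this sketch and are not part of the crate), the two traits might be used like this:

```rust
use entity_gym_rs::agent::{Action, Featurizable};

// Hypothetical action type: an enum with only unit variants,
// so `Action` can be derived automatically.
#[derive(Action, Debug)]
enum Command { Wait, Jump, Shoot }

// Hypothetical fixed-size struct; deriving `Featurizable` converts it
// into a format the neural network can process.
#[derive(Featurizable)]
struct Position { x: i32, y: i32 }

// Unit-variant enums can also derive `Featurizable`.
#[derive(Featurizable)]
enum Tile { Empty, Wall, Food }
```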
Example
Basic example that demonstrates how to construct an observation and sample a random action from an `Agent`:
```rust
use entity_gym_rs::agent::{Agent, AgentOps, Obs, Action, Featurizable};

#[derive(Action, Debug)]
enum Move { Up, Down, Left, Right }

#[derive(Featurizable)]
struct Player { x: i32, y: i32 }

#[derive(Featurizable)]
struct Cake {
    x: i32,
    y: i32,
    size: u32,
}

fn main() {
    // Creates an agent that acts completely randomly.
    let mut agent = Agent::random();
    // Alternatively, load a trained neural network agent from a checkpoint.
    // let mut agent = Agent::load("agent");

    // Construct an observation with one `Player` entity and two `Cake` entities.
    let obs = Obs::new(0.0)
        .entities([Player { x: 0, y: 0 }])
        .entities([
            Cake { x: 4, y: 0, size: 4 },
            Cake { x: 10, y: 42, size: 12 },
        ]);

    // To obtain an action from an agent, we simply call the `act` method
    // with the observation we constructed.
    let action = agent.act::<Move>(obs);
    println!("{:?}", action);
}
```
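Building on the example above, the following sketch shows one way the same calls might be used in a simple step loop. It reuses the `Player` and `Move` types defined above; the loop itself and the per-step score passed to `Obs::new` are illustrative assumptions, not something prescribed by the crate.

```rust
use entity_gym_rs::agent::{Agent, AgentOps, Obs};

// Illustrative sketch only: drives a random agent for a few steps,
// constructing a fresh observation each step (reusing `Player` and `Move`
// from the example above).
fn run_episode() {
    let mut agent = Agent::random();
    for step in 0..10 {
        // Assumption: the value passed to `Obs::new` is used as the score,
        // as in the example above.
        let obs = Obs::new(step as f32).entities([Player { x: step, y: 0 }]);
        let action = agent.act::<Move>(obs);
        println!("step {step}: {:?}", action);
    }
}
```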
For a more complete example that includes training a neural network to play Snake, see examples/bevy_snake.
Docs
- bevy_snake: Example of how to use entity-gym-rs in a Bevy game.
- bevy-snake-ai: More complex Bevy application with adversarial training of multiple agents to create AI opponents.
- EntityGym Rust API Docs: Rust API reference.
- If you have any questions, you can also get help on our Discord server.