Neural-Network

A simple neural network written in Rust.

About

This implementation of a neural network using gradient descent is written completely from the ground up in Rust. You can specify the shape of the network as well as its learning rate. Additionally, you can choose from one of several predefined datasets, for example the XOR and CIRCLE datasets, which represent their respective functions inside the unit square, or more complex datasets like RGB_DONUT, which represents a donut-like shape with a rainbow-like color transition.
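
For example, the XOR dataset can be thought of as points (x, y) in the unit square whose label is the XOR of whether each coordinate lies above 0.5. A minimal sketch of how such a dataset could be generated (an illustration only, not the crate's actual dataset code):

```rust
// Illustration only: one way to interpret the XOR dataset over the unit square.
// Each sample is a point (x, y) in [0, 1]^2; its label is 1.0 when exactly one
// coordinate lies above 0.5, and 0.0 otherwise.
fn xor_dataset(resolution: usize) -> Vec<([f64; 2], f64)> {
    let mut samples = Vec::new();
    for i in 0..resolution {
        for j in 0..resolution {
            let x = i as f64 / (resolution - 1) as f64;
            let y = j as f64 / (resolution - 1) as f64;
            let label = if (x > 0.5) != (y > 0.5) { 1.0 } else { 0.0 };
            samples.push(([x, y], label));
        }
    }
    samples
}

fn main() {
    let samples = xor_dataset(10);
    println!("{} samples, first = {:?}", samples.len(), samples[0]);
}
```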

Below, you can see a training process in which the network learns the color values of the RGB_DONUT dataset.

Features

The following features are currently implemented:

  • Optimizers
    1. Adam
    2. RMSProp
    3. SGD
  • Loss Functions
    1. Quadratic (squared error; see the sketch after this list)
  • Activation Functions
    1. Sigmoid
    2. ReLU
  • Layers
    1. Dense
  • Plotting
    1. Plotting the cost-history during training
    2. Plotting the final predictions, either in grayscale or RGB
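
Of these, the quadratic loss and the plain SGD update are the simplest to write down. A minimal sketch (illustration only; the function names below are not the crate's API):

```rust
// Illustration only: the quadratic (squared-error) loss and a plain SGD step.
// The names below are not the crate's API.

/// Quadratic loss: 0.5 * sum((prediction - target)^2)
fn quadratic_loss(prediction: &[f64], target: &[f64]) -> f64 {
    prediction
        .iter()
        .zip(target)
        .map(|(p, t)| 0.5 * (p - t).powi(2))
        .sum()
}

/// One SGD step: w <- w - learning_rate * gradient
fn sgd_step(weights: &mut [f64], gradient: &[f64], learning_rate: f64) {
    for (w, g) in weights.iter_mut().zip(gradient) {
        *w -= learning_rate * g;
    }
}

fn main() {
    let mut weights = vec![0.2, -0.4];
    sgd_step(&mut weights, &[0.1, -0.3], 0.01);
    println!("loss = {}", quadratic_loss(&[0.9], &[1.0]));
    println!("updated weights = {:?}", weights);
}
```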

Usage

The process of creating and training the neural network is pretty straightforward:

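As an illustration of that process, the sketch below trains a tiny hand-rolled 2-4-1 network on the four XOR points using sigmoid activations, the quadratic loss, and plain SGD. It is a self-contained example of what the crate automates, not a call into the crate's actual API:

```rust
// Illustration only: a tiny 2-4-1 network with sigmoid activations, quadratic
// loss, and plain SGD, trained on the four XOR corner points. This sketch does
// not use the crate's actual API.

fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

fn main() {
    // Fixed, asymmetric initial weights so the example is deterministic.
    let mut w1 = [[0.5, -0.4], [0.3, 0.8], [-0.6, 0.1], [0.9, -0.2]]; // hidden x input
    let mut b1 = [0.1, -0.1, 0.2, -0.2];
    let mut w2 = [0.7, -0.5, 0.6, -0.3]; // output x hidden
    let mut b2 = 0.0;
    let lr = 0.5; // learning rate

    let data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0), ([1.0, 0.0], 1.0), ([1.1 - 0.1, 1.0], 0.0)];

    for epoch in 0..20_000 {
        let mut cost = 0.0;
        for &(x, t) in data.iter() {
            // Forward pass.
            let mut h = [0.0; 4];
            for i in 0..4 {
                h[i] = sigmoid(w1[i][0] * x[0] + w1[i][1] * x[1] + b1[i]);
            }
            let y = sigmoid((0..4).map(|i| w2[i] * h[i]).sum::<f64>() + b2);
            cost += 0.5 * (y - t).powi(2);

            // Backward pass (quadratic loss, sigmoid derivatives).
            let dy = (y - t) * y * (1.0 - y);
            for i in 0..4 {
                let dh = dy * w2[i] * h[i] * (1.0 - h[i]);
                w2[i] -= lr * dy * h[i];
                w1[i][0] -= lr * dh * x[0];
                w1[i][1] -= lr * dh * x[1];
                b1[i] -= lr * dh;
            }
            b2 -= lr * dy;
        }
        if epoch % 5_000 == 0 {
            println!("epoch {epoch}: cost = {cost:.5}");
        }
    }
}
```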

Example Training Process

Below, you can see how the network learns:

Learning Animation

https://user-images.githubusercontent.com/54124311/195410077-7a02b075-0269-4ff2-965f-97f224ab2cf1.mp4

Final Result

Predictions for RGB_DONUT trained with SGD, network shape [2, 64, 64, 64, 64, 64, 3]

Cool training results

RGB_DONUT

Big Network

Predictions and cost history for RGB_DONUT trained with RMSProp, network shape [2, 128, 128, 128, 3]

Small Network

Predictions and cost history for RGB_DONUT trained with SGD, network shape [2, 8, 8, 8, 3]

XOR_PROBLEM

Predictions and cost history for XOR trained with SGD, network shape [2, 8, 8, 8, 1]
