#ndarray #entropy #statistics #information-theory

information

Information theory functions to use with ndarray

2 releases

0.1.1 Oct 10, 2022
0.1.0 Sep 2, 2022

#1966 in Math

MIT license

33KB
622 lines

information


ndarray-based information theory utilities.

About

This crate calculates information-theory metrics on ndarray arrays.

Check out the docs for usage and examples.

Functions

This crate provides entropy, conditional_entropy, joint_entropy, mutual_information, and conditional_mutual_information.
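As a sketch of the quantity `entropy` computes — Shannon entropy, H(X) = −Σ p(x) log₂ p(x) — here is a minimal standalone version. It uses a plain slice and assumes base-2 logarithms; the crate's own functions take ndarray probability arrays and their exact signatures may differ.

```rust
fn main() {
    // A probability distribution over three outcomes.
    let p = [0.5_f64, 0.25, 0.25];

    // Shannon entropy in bits: H(X) = -Σ p(x) log2 p(x).
    // Zero-probability entries are skipped, since 0·log 0 is taken as 0.
    let h: f64 = p
        .iter()
        .filter(|&&x| x > 0.0)
        .map(|&x| -x * x.log2())
        .sum();

    println!("{h}"); // 1.5 bits for this distribution
}
```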

Utilities

All of the above functions expect probability matrices, but this crate also exposes utility functions to build individual and joint probability densities for multiple variables via the prob* and hist* functions.
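Conceptually, those helpers turn raw samples into a probability density: count the occurrences of each outcome, then normalize the counts by the sample size. A hedged sketch of that idea in plain Rust (the names and types here are illustrative, not the crate's actual hist*/prob* API):

```rust
use std::collections::HashMap;

fn main() {
    // Raw samples of a discrete variable.
    let samples = [0u8, 0, 1, 1, 1, 2];

    // Histogram step: count occurrences of each outcome.
    let mut counts: HashMap<u8, usize> = HashMap::new();
    for &s in &samples {
        *counts.entry(s).or_insert(0) += 1;
    }

    // Probability step: normalize counts into a distribution.
    let n = samples.len() as f64;
    let probs: HashMap<u8, f64> = counts
        .iter()
        .map(|(&outcome, &count)| (outcome, count as f64 / n))
        .collect();

    println!("{probs:?}");
}
```

The same two-step shape extends to joint densities by counting tuples of outcomes instead of single values.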


lib.rs:

Information

This is a crate to perform information theory calculations using ndarray arrays.

Entropy Functions

  • [entropy()]
  • [joint_entropy!()]
  • [conditional_entropy()]

Information Functions

  • [mutual_information()]
  • [conditional_mutual_information()]
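Mutual information relates to the entropy functions above through the identity I(X;Y) = H(X) + H(Y) − H(X,Y). A minimal sketch of that identity on a perfectly correlated 2×2 joint distribution, using plain slices rather than the crate's ndarray-based `mutual_information`:

```rust
// Shannon entropy in bits over a flattened probability table.
fn entropy(p: &[f64]) -> f64 {
    p.iter().filter(|&&x| x > 0.0).map(|&x| -x * x.log2()).sum()
}

fn main() {
    // Joint distribution p(x, y), row-major: X and Y always agree.
    let joint = [0.5, 0.0, 0.0, 0.5];

    // Marginals: sum over the other variable.
    let px = [joint[0] + joint[1], joint[2] + joint[3]];
    let py = [joint[0] + joint[2], joint[1] + joint[3]];

    // I(X;Y) = H(X) + H(Y) - H(X,Y).
    let mi = entropy(&px) + entropy(&py) - entropy(&joint);
    println!("{mi}"); // 1.0 bit: X determines Y exactly
}
```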

Utility

  • N-d histogram (hist*)
  • N-d probability (prob*)

Dependencies

~2.5MB
~47K SLoC