# Mutual Information Neural Estimation (MINE) (paper)

An implementation of MINE with TensorFlow 2.

## Toy Example

Estimate the mutual information between X and Y, denoted I(X; Y).
Let X be sampled from a Normal distribution, and let Y be a random variable denoting the outcome of rolling a die.
Since they are independent, in theory I(X; Y) = 0. The experimental results are shown below.
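The toy setup can be sketched as follows. The repo itself uses TensorFlow 2; this is a plain NumPy sketch of the data and of the Donsker-Varadhan lower bound that MINE maximizes, with illustrative names (`dv_lower_bound` is not from the repo's code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: X ~ N(0, 1), Y = outcome of a fair six-sided die.
n = 10_000
x = rng.standard_normal(n)
y = rng.integers(1, 7, size=n).astype(float)

def dv_lower_bound(t_joint, t_marginal):
    """Donsker-Varadhan lower bound on I(X; Y):
    E_{p(x,y)}[T(x,y)] - log E_{p(x)p(y)}[exp(T(x,y))],
    where t_joint holds T evaluated on paired samples and
    t_marginal holds T evaluated on shuffled (independent) pairs."""
    return t_joint.mean() - np.log(np.exp(t_marginal).mean())

# With the trivial critic T(x, y) = 0 the bound is exactly 0, which
# here coincides with the true mutual information of independent X, Y.
print(dv_lower_bound(np.zeros(n), np.zeros(n)))  # 0.0
```

MINE trains a neural network T (the statistics network) to push this bound as high as possible; for independent X and Y no critic can push it above 0.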

## Results

- The experiment runs for 10,000 iterations.
- The loss decreases steadily, indicating that the updates behave as expected.
- The estimated lower bound gradually flattens around 0, matching the theoretical result.
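For contrast with the independent case, the training loop behind results like these can be sketched on *correlated* data, where the bound should converge to a positive value. The repo trains a TF2 neural network; the sketch below instead uses a hand-derived gradient for a bilinear critic T(x, y) = w·x·y, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated toy data: Y = rho * X + noise, so I(X; Y) > 0.
n, rho = 5_000, 0.9
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

# MINE-style gradient ascent on the Donsker-Varadhan bound
#   L(w) = mean(w * x * y) - log(mean(exp(w * x * y_shuffled)))
# with the critic restricted to T(x, y) = w * x * y.
w, lr = 0.0, 0.1
for step in range(300):
    y_shuffled = rng.permutation(y)  # samples from the product p(x)p(y)
    e = np.exp(w * x * y_shuffled)
    # Analytic dL/dw for the bilinear critic.
    grad = np.mean(x * y) - np.mean(x * y_shuffled * e) / np.mean(e)
    w += lr * grad

bound = np.mean(w * x * y) - np.log(np.mean(np.exp(w * x * rng.permutation(y))))
print(bound)  # a positive lower bound on I(X; Y)
```

With independent data (as in the toy example above), the same loop keeps the bound near 0, which is the flattening behavior reported in the results.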