# Mutual Information Neural Estimation (MINE) (paper)
Implements MINE with TensorFlow 2 to estimate the mutual information between two random variables X and Y, denoted I(X; Y).
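MINE estimates I(X; Y) by training a small "statistics network" T(x, y) to maximize the Donsker-Varadhan lower bound, I(X; Y) >= E_joint[T] - log E_marginal[exp(T)]. A minimal sketch of that bound in TensorFlow 2 (the names `statistics_network` and `dv_lower_bound` are ours, not necessarily those used in this repository):

```python
import tensorflow as tf

def statistics_network(hidden_units=64):
    # Maps concatenated (x, y) pairs to a scalar score T(x, y).
    return tf.keras.Sequential([
        tf.keras.layers.Dense(hidden_units, activation="relu"),
        tf.keras.layers.Dense(1),
    ])

def dv_lower_bound(T, x, y):
    # Joint samples: the paired (x_i, y_i); marginal samples: pair each
    # x_i with a shuffled y, which approximates drawing from p(x)p(y).
    y_shuffled = tf.random.shuffle(y)
    t_joint = T(tf.concat([x, y], axis=1))
    t_marginal = T(tf.concat([x, y_shuffled], axis=1))
    return tf.reduce_mean(t_joint) - tf.math.log(
        tf.reduce_mean(tf.exp(t_marginal)))
```

Maximizing this quantity over the parameters of T tightens the bound toward the true mutual information.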
Let X be sampled from a normal distribution, and let Y be a random variable denoting the outcome of rolling a die. Since X and Y are independent, the true value is I(X; Y) = 0. The experimental results are summarized below.
- Trained for 10,000 iterations in our experiment.
- The loss decreases steadily, so the optimization behaves as expected.
- The estimated lower bound gradually flattens out around 0, matching the theoretical value.
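The experiment above can be sketched as a single training loop: sample a batch of independent X and Y, compute the Donsker-Varadhan bound, and minimize its negation. The network size, batch size, and learning rate below are illustrative assumptions, not the repository's exact settings:

```python
import tensorflow as tf

def train_mine(iterations=10000, batch_size=256, lr=1e-4):
    # Statistics network T(x, y): a small MLP with a scalar output.
    T = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    optimizer = tf.keras.optimizers.Adam(lr)
    estimates = []
    for _ in range(iterations):
        # X ~ N(0, 1); Y is an independent die roll in {1, ..., 6}.
        x = tf.random.normal((batch_size, 1))
        y = tf.cast(tf.random.uniform((batch_size, 1), 1, 7,
                                      dtype=tf.int32), tf.float32)
        y_shuffled = tf.random.shuffle(y)
        with tf.GradientTape() as tape:
            t_joint = T(tf.concat([x, y], axis=1))
            t_marginal = T(tf.concat([x, y_shuffled], axis=1))
            lower_bound = (tf.reduce_mean(t_joint)
                           - tf.math.log(tf.reduce_mean(tf.exp(t_marginal))))
            # Minimizing the negated bound maximizes the MI estimate.
            loss = -lower_bound
        grads = tape.gradient(loss, T.trainable_variables)
        optimizer.apply_gradients(zip(grads, T.trainable_variables))
        estimates.append(float(lower_bound))
    return estimates
```

Because X and Y are independent here, the per-step estimates should hover near 0, consistent with the flat curve described above.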