Mutual Information Neural Estimation (MINE) (paper)

An implementation of MINE with TensorFlow 2.

Toy Example

Calculate the mutual information between X and Y, denoted I(X; Y).
Let X be sampled from a standard normal distribution, and let Y be a random variable denoting the outcome of rolling a fair die.
Since X and Y are independent by construction, in theory I(X; Y) = 0. The experimental result is shown below.
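MINE estimates mutual information via the Donsker-Varadhan representation of the KL divergence, which gives the lower bound I(X; Y) ≥ E_{p(x,y)}[T] − log E_{p(x)p(y)}[e^T] for any critic function T. A minimal sketch of this bound on the toy data, in plain NumPy rather than the repository's TensorFlow 2 code (the critic T(x, y) = 0.1·x·y is a hand-picked illustration, not a learned network):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Toy data: X and Y are independent by construction.
x = rng.normal(size=n)                        # X ~ N(0, 1)
y = rng.integers(1, 7, size=n).astype(float)  # Y: fair six-sided die

def dv_bound(t_joint, t_marginal):
    """Donsker-Varadhan lower bound on I(X; Y):
    E_{p(x,y)}[T] - log E_{p(x)p(y)}[exp(T)]."""
    return t_joint.mean() - np.log(np.exp(t_marginal).mean())

# Shuffling Y breaks the pairing, giving samples from p(x) p(y).
y_shuffled = rng.permutation(y)

# A hand-picked critic T(x, y) = 0.1 * x * y (illustrative, not learned);
# with an arbitrary critic the bound is valid but loose.
estimate = dv_bound(0.1 * x * y, 0.1 * x * y_shuffled)

# Any constant critic gives exactly 0, the true MI of independent variables.
constant = dv_bound(np.full(n, 3.0), np.full(n, 3.0))
```

MINE replaces the hand-picked T with a neural network trained to maximize this bound, so the estimate tightens toward the true mutual information.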

Results

  • The experiment runs for 10,000 training iterations.
  • The loss decreases steadily, so the update process behaves as expected.
  • The estimated lower bound gradually flattens around 0, matching the theoretical value I(X; Y) = 0.
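The training procedure behind these results can be sketched as gradient ascent on the Donsker-Varadhan bound. The sketch below uses NumPy with a simple linear critic over hand-picked features in place of the repository's TensorFlow 2 network; the feature map, learning rate, and step count are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, steps, lr = 2000, 1000, 0.02

# Independent toy data, as in the experiment above.
x = rng.normal(size=n)
y = rng.integers(1, 7, size=n).astype(float)
y_marg = rng.permutation(y)  # shuffled Y: samples from p(x) p(y)

def features(x, y):
    # Feature map for a linear critic T(x, y) = w . f(x, y)
    # (a stand-in for the neural network used by MINE).
    return np.stack([x, y, x * y], axis=1)

f_joint = features(x, y)       # samples from p(x, y)
f_marg = features(x, y_marg)   # samples from p(x) p(y)

w = np.zeros(3)
for _ in range(steps):
    e = np.exp(f_marg @ w)
    # Gradient of the DV objective:
    #   d/dw [ mean(T_joint) - log mean(exp(T_marg)) ]
    grad = f_joint.mean(axis=0) - (e[:, None] * f_marg).mean(axis=0) / e.mean()
    w += lr * grad  # gradient *ascent* on the lower bound

# Final DV estimate of I(X; Y); for independent X, Y it stays near 0.
estimate = (f_joint @ w).mean() - np.log(np.exp(f_marg @ w).mean())
```

Because X and Y are independent here, even the maximized bound stays near 0 (up to finite-sample noise), mirroring the flattening curve reported above.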
