
[WIP] Gaussian initialization for sinkhorn #555

Open
wants to merge 1 commit into master

Conversation

@rflamary (Collaborator) commented Nov 2, 2023

Types of changes

This is a first shot at initializing empirical Sinkhorn, but the computational gain is not very clear in the tests I ran locally.

The following code

import numpy as np
import ot

# two empirical distributions with different means and scales
n = 2000
rng = np.random.RandomState(0)

x = rng.randn(n, 2)
x2 = 10 * rng.randn(n // 2, 2)
x2[:, 0] += 2

# default initialization
ot.tic()
G, log = ot.empirical_sinkhorn(x, x2, 1, method='sinkhorn_log', warmstart=None,
                               verbose=False, isLazy=False, stopThr=1e-5, log=True)
ot.toc()
print("Err=", log['err'][-1], "niter=", log['niter'])

# Gaussian (Bures-Wasserstein) initialization of the dual potentials
ot.tic()
G2, log2 = ot.empirical_sinkhorn(x, x2, 1, method='sinkhorn_log', warmstart='gaussian',
                                 verbose=False, isLazy=False, stopThr=1e-5, log=True)
ot.toc()
print("Err=", log2['err'][-1], "niter=", log2['niter'])

gives the following output:

Elapsed time : 3.0527355670928955 s
Err= 9.441113655553818e-06 niter= 140
Elapsed time : 2.391462564468384 s
Err= 9.89690596643644e-06 niter= 110

This is quite far from the computational gains reported in the paper. I will investigate it further.
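
For reference, here is a minimal sketch of the idea behind the Gaussian warmstart: initialize the dual potentials from the closed-form (Bures-Wasserstein) optimal transport map between Gaussian approximations of the two samples, obtained with ot.gaussian.empirical_bures_wasserstein_mapping. The sign and scaling conventions below (half squared-Euclidean cost, no division by the regularization) are assumptions for illustration and may differ from what this PR actually implements.

import numpy as np
import ot

n = 2000
rng = np.random.RandomState(0)
x = rng.randn(n, 2)
x2 = 10 * rng.randn(n // 2, 2)
x2[:, 0] += 2

# closed-form affine map T(z) = A z + b between Gaussian approximations
A, b = ot.gaussian.empirical_bures_wasserstein_mapping(x, x2)
b = np.asarray(b).ravel()
Ainv = np.linalg.inv(A)

# Brenier potential of the affine map: phi(z) = 0.5 z^T A z + b^T z, with
# conjugate phi*(y) = 0.5 (y - b)^T A^{-1} (y - b). For the cost 0.5*||x-y||^2
# the Kantorovich potentials are f(x) = 0.5*||x||^2 - phi(x) and
# g(y) = 0.5*||y||^2 - phi*(y); other cost/solver conventions need rescaling.
f = 0.5 * np.sum(x ** 2, axis=1) - (0.5 * np.sum(x * (x @ A), axis=1) + x @ b)
g = 0.5 * np.sum(x2 ** 2, axis=1) - 0.5 * np.sum((x2 - b) * ((x2 - b) @ Ainv), axis=1)

# dual potentials are only defined up to an additive constant
f -= f.mean()
g -= g.mean()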

Motivation and context / Related issue

How has this been tested (if it applies)

PR checklist

  • I have read the CONTRIBUTING document.
  • The documentation is up-to-date with the changes I made (check build artifacts).
  • All tests passed, and additional code has been covered with new tests.
  • I have added the PR and Issue fix to the RELEASES.md file.


codecov bot commented Nov 2, 2023

Codecov Report

Merging #555 (b3be5a6) into master (53dde7a) will decrease coverage by 12.79%.
The diff coverage is 91.83%.

Additional details and impacted files
@@             Coverage Diff             @@
##           master     #555       +/-   ##
===========================================
- Coverage   96.49%   83.70%   -12.79%     
===========================================
  Files          67       67               
  Lines       14663    14708       +45     
===========================================
- Hits        14149    12312     -1837     
- Misses        514     2396     +1882     

@cedricvincentcuaz (Collaborator) commented

I believe that your implementation is correct. I am not sure how the authors handle the bias term in empirical_bures_wasserstein_mapping, which is set to True by default in POT.
From the experiments in the paper, I would say that the gains seem specific to the low entropic-regularization regime. Did you check that?
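
A quick way to check this would be a sweep over the regularization, as in the hypothetical sketch below. It reuses the benchmark data from the PR description and the warmstart='gaussian' option added by this PR; the regularization values are arbitrary.

import numpy as np
import ot

n = 2000
rng = np.random.RandomState(0)
x = rng.randn(n, 2)
x2 = 10 * rng.randn(n // 2, 2)
x2[:, 0] += 2

# compare iteration counts with and without the Gaussian warmstart
# for decreasing values of the entropic regularization
for reg in [1.0, 0.5, 0.1, 0.05]:
    _, log0 = ot.empirical_sinkhorn(x, x2, reg, method='sinkhorn_log',
                                    warmstart=None, stopThr=1e-5, log=True)
    _, log1 = ot.empirical_sinkhorn(x, x2, reg, method='sinkhorn_log',
                                    warmstart='gaussian', stopThr=1e-5, log=True)
    print("reg=", reg, "niter default=", log0['niter'],
          "niter gaussian=", log1['niter'])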

@rflamary (Collaborator, Author) commented

Will do. I have students looking into this; I will come back to this PR afterwards.
