
Test and fix all conditionals #1386

Merged
merged 13 commits from hybrid/more_tests into develop
Jan 15, 2023

Conversation

dellaert (Member)

Turns out GaussianMixture error() was wrong! I added very thorough checking of all conditionals with all supported arguments, in both C++ and Python, and made sure everything compiled and worked. In the process I found that GaussianMixture did not satisfy its invariants. The resolution is documented in GaussianMixture.h:

   * @brief Compute the error of this Gaussian Mixture.
   *
   * This requires some care, as different mixture components may have
   * different normalization constants. Let's consider p(x|y,m), where m is
   * discrete. We need the error to satisfy the invariant:
   *
   *    error(x;y,m) = K - log(probability(x;y,m))
   *
   * For all x,y,m. But note that K, for the GaussianMixture, cannot depend on
   * any arguments. Hence, we delegate to the underlying Gaussian
   * conditionals, indexed by m, which do satisfy:
   * 
   *    log(probability_m(x;y)) = K_m - error_m(x;y)
   * 
   * We resolve by having K == 0.0 and
   * 
   *    error(x;y,m) = error_m(x;y) - K_m

This was a lot of work but I think this level of care is needed to ensure correctness in hybrid inference.
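The invariant above can be illustrated numerically. Below is a minimal Python sketch (not the GTSAM API; the component parameters and helper names are invented for illustration) with two 1-D Gaussian components whose different sigmas give different normalization constants K_m. It checks that, with the mixture-level K fixed at 0.0, error(x;m) = error_m(x) - K_m satisfies error(x;m) = -log(probability(x;m)) for every x and m:

```python
import math

# Hypothetical 1-D Gaussian components indexed by discrete mode m:
# m -> (mean, sigma). Different sigmas => different normalizers K_m.
components = {0: (0.0, 1.0), 1: (2.0, 3.0)}

def error_m(x, m):
    """Unnormalized Gaussian error of component m: 0.5 * z^2."""
    mu, sigma = components[m]
    return 0.5 * ((x - mu) / sigma) ** 2

def K_m(m):
    """Log-normalization constant, so log p_m(x) = K_m - error_m(x)."""
    _, sigma = components[m]
    return -math.log(sigma * math.sqrt(2.0 * math.pi))

def mixture_error(x, m):
    """Mixture-level error with K == 0.0: error(x;m) = error_m(x) - K_m."""
    return error_m(x, m) - K_m(m)

def probability(x, m):
    """Normalized Gaussian density of component m at x."""
    mu, sigma = components[m]
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

# Invariant: error(x;m) == 0.0 - log(probability(x;m)) for all x, m.
for m in components:
    for x in (-1.0, 0.5, 4.0):
        assert abs(mixture_error(x, m) + math.log(probability(x, m))) < 1e-12
```

The point of subtracting K_m per component is that a constant K for the whole mixture cannot absorb mode-dependent normalizers; pushing each K_m into the component's error is what keeps the invariant mode-independent.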

dellaert (Member, Author)

For reviewing, look at Conditional-inst.h, the new file testHybridConditional.cpp, test_HybridBayesNet.py and GaussianMixture.*. Almost everything else is just small additions to make things compile.

varunagrawal (Collaborator) left a comment


Approving, but I would really like to understand why K cannot be dependent on any arguments. Or should it be "K should not depend on any arguments"?

dellaert (Member, Author)

> Approving, but I would really like to understand why K cannot be dependent on any arguments. Or should it be "K should not depend on any arguments"?

See my comment above. It is predicated on invariants set forth in Conditional.h. I will do some more commits in this PR as I think GaussianMixtureFactor is still wrong as well.

ProfFan (Collaborator) left a comment


I think this looks great.

To explain the previous behavior: I put the normalization constants as a DiscreteFactor, which should be somewhat equivalent, but of course a lot more hacky.

ProfFan (Collaborator) commented Jan 14, 2023

Yeah, GMF will need the same modifications

dellaert (Member, Author)

I started addressing issues with GMF in a different PR, and I also made the requested changes there…

dellaert merged commit 618ac28 into develop Jan 15, 2023
dellaert deleted the hybrid/more_tests branch January 15, 2023 16:15
dellaert added this to the Hybrid Inference milestone Feb 7, 2023