Probabilities containing NaNs when choosing n_samples too small #4
I'm experiencing an error whose root cause I'm having trouble finding. Apparently, some probabilities contain NaN values when the number of samples for the prior(s) (n_samples) is chosen too small. I found that the chance of getting this error increases as n_samples decreases. Please find the error below.

Any help is much appreciated. Thank you in advance.

Comments
One workaround for bad weight values is to implement the OptBayesExpt.enforce_parameter_constraints() method in a child class of OptBayesExpt() to ensure that your likelihood function can't generate NaNs. There are examples in demos/sweeper/obe_sweeper.py and in demos/lockin/lockin_of_coil.py.

Here's how the problem NaNs can happen: the particles that represent parameter values are given random displacements in the ParticlePDF.resample() method. With these random steps, parameters that are close enough to zero may become very small, or may change sign, leading to large or nonsensical model_function outputs and subsequent NaN likelihood values.

Note that while reducing n_samples may make optbayesexpt run faster, it also increases the chance of incorrect results. "Sample impoverishment" may prevent distribution mean values from converging to known parameter values, and may produce falsely narrow distributions.
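For illustration, a minimal sketch of such a constraint might look like the following. The parameter layout (index 2 as the constrained parameter) is hypothetical, and the attribute names self.parameters and self.particle_weights follow the pattern used in the demos; check those files against your installed version:

```python
import numpy as np
from optbayesexpt import OptBayesExpt

class ConstrainedOBE(OptBayesExpt):
    def enforce_parameter_constraints(self):
        # Zero out the weights of particles whose third parameter
        # (index 2, hypothetical layout) drifted below zero in resampling.
        bad_ones = np.argwhere(self.parameters[2] < 0)
        if len(bad_ones) > 0:
            self.particle_weights[bad_ones] = 0
            # Renormalize so the weights still form a probability distribution.
            self.particle_weights = (self.particle_weights
                                     / np.sum(self.particle_weights))
```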
Thanks for your answer! I'm working on it and hope to get back to you soon.

EDIT: Implementing the OptBayesExpt.enforce_parameter_constraints() method seems to work as suggested, thanks!
Excuse me, I thought it worked, but I still keep encountering this problem even after enforcing parameter constraints. My model function has 3 parameters, and the priors of two of them are nowhere close to zero (one is Unif(5, 15) and the other Unif(2720, 3030)). The prior of the third parameter (C0) is Unif(0, 0.4), but after ensuring that particles with C0 < 0 are assigned a weight of 0, the behaviour didn't change. Even after setting the weight to 0 for particles whose third parameter falls below 0.05, the errors persisted. Now I seem to get a mix of the error above and an error saying that the "SVD did not converge" when I decrease n_samples from 50000 to, for example, 10000. For n_samples=50000 everything still works fine. Any thoughts?
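For reference, a prior with these bounds could be set up as uniform particle samples like this (the parameter names and their ordering are my assumptions, only the bounds come from the description above):

```python
import numpy as np

n_samples = 50000
# Hypothetical names; bounds taken from the priors described above.
fwhm_prior = np.random.uniform(5, 15, n_samples)
center_prior = np.random.uniform(2720, 3030, n_samples)
c0_prior = np.random.uniform(0, 0.4, n_samples)
parameter_samples = (fwhm_prior, center_prior, c0_prior)
```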
Hi Stennebroek, thanks for helping to get to the bottom of this. I'm wondering if parameter values might be the wrong target; maybe we should examine the weight values too. I'd like to pinpoint the routines that are raising these errors. Could you post the error messages? How do you determine the uncertainty values? Could they ever be negative or NaN? It might be helpful to see your model function as well.
Sure, this is the error saying that the SVD did not converge:
The other error is the same as the one in my first message. I'm fitting many Lorentzian dips to noisy data (assuming shot noise, so the standard deviation is known). These dips all have a random location, random FWHM and random depth, though they are guaranteed to be confined to an interval that I have defined. Sometimes the SVD error pops up when fitting many of these spectra, and sometimes the NaN error pops up. My model function is the following:
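(The original snippet isn't reproduced here. Purely as an illustration of the shape described, a single-dip model in the model_function(settings, parameters, constants) convention might look like the sketch below; every name and the exact parametrization are my assumptions.)

```python
import numpy as np

def lorentzian_dip_model(settings, parameters, constants):
    # Hypothetical single Lorentzian dip on a flat background.
    # f: probe setting (e.g. frequency); f0: dip center; w: FWHM;
    # c0: dip depth (contrast); baseline: mean count level (a constant).
    f, = settings
    f0, w, c0 = parameters
    baseline, = constants
    return baseline * (1.0 - c0 / (1.0 + ((f - f0) / (w / 2.0)) ** 2))
```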
If you mean the standard deviations that are given as input to pdf_update(), then I estimate them from the measured value assuming Poissonian noise (simply sqrt(M) if the measured value is M, because M is large (100000-200000) for all measurements). I have looked at my script again and I cannot identify anything that would make the uncertainties negative or NaN.
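In code, that estimate would enter the update roughly like this. The record layout (settings, measured value, standard deviation) is what I believe pdf_update() expects, and all names here are hypothetical; verify against your optbayesexpt version:

```python
import numpy as np

f_probe = 2870.0   # hypothetical probe setting
M = 150000         # hypothetical raw count value
sigma = np.sqrt(M) # shot-noise estimate; sqrt(M) is valid for large M
# my_obe is an OptBayesExpt (or child class) instance created elsewhere.
my_obe.pdf_update(((f_probe,), M, sigma))
```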