
Expose parameter uncertainties #1597

Merged
8 commits merged into spacetelescope:main on Aug 26, 2022

Conversation

@rosteen (Collaborator) commented on Aug 25, 2022

Implements retrieving and displaying uncertainties from the fit models in the Model Fitting plugin. Opening as draft since I need to mess around with the plugin layout to accommodate the new information in a way that's visually appealing.

@codecov (bot) commented on Aug 25, 2022

Codecov Report

Merging #1597 (99834a0) into main (d70425c) will decrease coverage by 0.02%.
The diff coverage is 50.00%.

❗ Current head 99834a0 differs from the pull request's most recent head 0e6af97. Consider uploading reports for commit 0e6af97 to get more accurate results.

```diff
@@            Coverage Diff             @@
##             main    #1597      +/-   ##
==========================================
- Coverage   86.12%   86.09%   -0.03%
==========================================
  Files          94       94
  Lines        9321     9330       +9
==========================================
+ Hits         8028     8033       +5
- Misses       1293     1297       +4
```

| Impacted Files | Coverage Δ |
| --- | --- |
| ...igs/default/plugins/model_fitting/model_fitting.py | 77.50% <50.00%> (-0.64%) ⬇️ |


@rosteen (Collaborator, Author) commented on Aug 25, 2022

The uncertainties returned from model fitting are now exposed; I still need to play with the styling a bit more to improve the layout. I moved the parameter name and "fixed" checkbox to their own row since everything was getting far too squished horizontally, so now I mainly need to tighten things up vertically.

[Screenshot: Screen Shot 2022-08-25 at 5 46 14 PM]

@rosteen added this to the 2.10 milestone on Aug 26, 2022
@rosteen added the "plugin" label (for plugins common to multiple configurations) on Aug 26, 2022
@rosteen marked this pull request as ready for review on Aug 26, 2022, 13:45
@rosteen (Collaborator, Author) commented on Aug 26, 2022

I improved the UI slightly since the last screenshot; this is ready for review.

[Screenshot: Screen Shot 2022-08-26 at 9 48 44 AM]

@kecnry (Member) left a comment

This will be great to have exposed! Just a few comments/suggestions, but otherwise looks good to me.

Comment on lines +140 to +145
```python
# The submodels don't have uncertainties attached, only the compound model
if self._fitted_model.stds is not None:
    std_name = temp_param[0]["name"]
    if submodel_index is not None:
        std_name = f"{std_name}_{submodel_index}"
    temp_param[0]["std"] = self._fitted_model.stds[std_name]
```
@kecnry (Member) commented:

This block confused me... what situation triggers this logic, and what does it do?

@rosteen (Collaborator, Author) replied:

This is because in specutils the stds attribute is only populated on the top-level compound model, not on each of the component models, and the entries are labeled things like amplitude_0, amplitude_1. So we have to determine which model component is 0, 1, etc. from the order of the submodel_names attribute in order to retrieve the correct uncertainties. It's a bit of a pain, but fixing it would require some upstream refactoring.
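For context, a minimal sketch of that lookup, with `get_param_std` as a hypothetical helper name (not the plugin's actual code); it assumes, per the comment above, that the compound model's stds container is indexed by "<parameter>_<submodel index>" strings:

```python
# Sketch only: specutils attaches `stds` to the top-level compound model,
# keyed like "amplitude_0", "amplitude_1" in submodel_names order.
def get_param_std(fitted_model, param_name, submodel_index=None):
    if fitted_model.stds is None:
        return None
    key = param_name
    if submodel_index is not None:
        # Compound model: append the component's position in submodel_names
        key = f"{param_name}_{submodel_index}"
    return fitted_model.stds[key]
```

Here `submodel_index` would come from the order of the component models, e.g. `list(fitted_model.submodel_names).index(label)` for a labeled component.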

Comment on lines +216 to +217
```js
roundUncertainty(uncertainty) {
  return uncertainty.toPrecision(2)
}
```
@kecnry (Member) commented:

I think this is probably fine for now, but we might need to make it more advanced for some edge cases with really large/small uncertainties (or if we ever display values in scientific notation).
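For a sense of the edge cases being flagged, here is a rough Python analogue of two-significant-figure rounding (Python's `.2g` format behaves similarly, though not identically, to JavaScript's `toPrecision(2)`): very large or very small values flip into exponent notation, which the display would then need to accommodate.

```python
# Rough analogue of two-significant-figure rounding; extreme magnitudes
# switch to exponent notation.
for value in (0.123, 1234.5, 0.000012345, 9.87e12):
    print(f"{value:.2g}")
# 0.12
# 1.2e+03
# 1.2e-05
# 9.9e+12
```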

@pllim (Contributor) replied:

Oooo I wish I knew this trick for the Angle PR. 😆

@kecnry (Member) commented on Aug 26, 2022

Also tested with specutils 1.7 (our minimum pinned version, which does not yet support exposing uncertainties), and this just skips displaying them without raising any errors (which is what I'd expect).
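A minimal sketch of the defensive pattern this implies (hypothetical names, with `getattr` swapped in for illustration): on a specutils release without fit uncertainties, the display step is simply skipped.

```python
# Sketch: older specutils releases don't provide fit uncertainties, so
# guard for a missing/None `stds` and skip the display instead of raising.
stds = getattr(fitted_model, "stds", None)
if stds is not None:
    show_uncertainties(stds)  # hypothetical display hook
```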

@kecnry (Member) left a comment

LGTM!

@pllim (Contributor) left a comment

I didn't run this, but code looks like code and it is Friday, so :shipit:!

@pllim pllim merged commit 0460d72 into spacetelescope:main Aug 26, 2022