v2.0.2 #447
Conversation
…ent code
Torch is very slow, so I had to increase the timeout accordingly.
* summary networks: add tests for using functional API
* fix build functions for use with functional API
* fix docs of coupling flow
* add additional tests
In addition, this PR limits the slow test to Windows and Python 3.10. The choices are somewhat arbitrary; my thought was to test a setup that is not covered as much through day-to-day use by the devs.
…move update-workflows branch from workflow style tests, add __init__ and conftest to test_point_approximators (#443)
* implement compile_from_config and get_compile_config (see the sketch after this list)
* add optimizer build to compile_from_config
* remove the is_symbolic_tensor check because this would otherwise skip the whole function for compiled contexts
* skip pyabc test
* fix sinkhorn and log_sinkhorn message formatting for jax by making the warning message worse
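For context, here is a minimal sketch of the get_compile_config / compile_from_config pattern as it works in Keras 3, with the optimizer built eagerly so a loaded model can resume training right away. This is illustrative only: the class name is a placeholder, and the actual BayesFlow approximators may use their own serialize/deserialize helpers rather than keras.saving.

```python
import keras


class DemoApproximator(keras.Model):  # placeholder name, not the BayesFlow class
    def get_compile_config(self):
        # Serialize whatever is needed to restore the compiled state later.
        return {"optimizer": keras.saving.serialize_keras_object(self.optimizer)}

    def compile_from_config(self, config):
        optimizer = keras.saving.deserialize_keras_object(config["optimizer"])
        self.compile(optimizer=optimizer)
        # Build the optimizer variables up front so a freshly loaded model is
        # immediately trainable instead of building lazily on the first fit().
        if self.built:
            self.optimizer.build(self.trainable_variables)
```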
Pull Request Overview
This PR releases version v2.0.2 with several improvements and fixes across the BayesFlow codebase, including new tests, better handling of input shapes via decorators, refined network layers, improved error handling in distributions, updated workflow configurations, and new issue templates.
- Added sanitize_input_shape decorators to several network functions (see the sketch after this list)
- Refactored distributions to use trainable_parameters instead of use_learnable_parameters and adjusted related operations
- Enhanced configuration methods for approximators and updated docs/workflows
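To make the first bullet concrete, here is a hypothetical illustration of what an input-shape-sanitizing decorator could look like; the actual bayesflow implementation was not checked and may differ.

```python
import functools


def sanitize_input_shape(build_fn):
    """Hypothetical sketch: coerce an undefined batch dimension to a dummy
    size of 1 before delegating to the wrapped build method."""

    @functools.wraps(build_fn)
    def wrapper(self, input_shape):
        if input_shape and input_shape[0] is None:
            input_shape = (1, *input_shape[1:])
        return build_fn(self, input_shape)

    return wrapper
```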
Reviewed Changes
Copilot reviewed 51 out of 51 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| bayesflow/networks/transformers/*.py | Added sanitize_input_shape decorators and adjusted imports |
| bayesflow/metrics/*.py | Added serializable decorator and minor import adjustments |
| bayesflow/distributions/*.py | Refactored parameter naming and tensor operations |
| bayesflow/approximators/*.py | Added compile_from_config and get_compile_config methods |
| README.md, .github/* | Updated docs, affiliation badge, workflows, and issue templates |
Comments suppressed due to low confidence (3)
bayesflow/distributions/mixture.py:56
- The identifier 'ops' is used here without an import. Please add an import statement (e.g., 'from keras import ops') at the top of the file.
self.mixture_logits = ops.ones(shape=len(distributions))
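A minimal standalone illustration of the suggested fix, assuming the keras namespace the comment itself proposes:

```python
from keras import ops  # the import the review says is missing

distributions = ["comp_0", "comp_1", "comp_2"]  # placeholder component list
mixture_logits = ops.ones(shape=(len(distributions),))  # equal initial logit per component
```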
bayesflow/approximators/model_comparison_approximator.py:121
- The methods compile_from_config and get_compile_config use 'deserialize' and 'serialize' but there is no visible import for these functions. Ensure they are imported (e.g., from bayesflow.utils.serialization) to avoid runtime errors.
def compile_from_config(self, config):
bayesflow/approximators/continuous_approximator.py:107
- Similar to model_comparison_approximator.py, this file uses 'deserialize' and 'serialize' without an explicit import. Please verify that these functions are imported properly.
def compile_from_config(self, config):
- use torch as default backend (see the sketch below)
- reduce range of N so users of jax won't be stuck with a slow notebook
- use BayesFlow built-in MLP instead of keras.Sequential solution
- general code cleanup
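A hedged sketch of the first and third items: selecting the torch backend (which must happen before keras is imported) and swapping a hand-rolled keras.Sequential stack for BayesFlow's built-in MLP. The MLP constructor argument shown is an assumption, not a verified signature.

```python
import os

os.environ["KERAS_BACKEND"] = "torch"  # must be set before importing keras

import keras  # noqa: E402
import bayesflow as bf  # noqa: E402

# Assumed constructor argument; check bf.networks.MLP for the actual signature.
network = bf.networks.MLP(widths=(64, 64))
```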
🚀 for compile_from_config when loading approximators