Neural Adaptor tests set as @test_broken #513

Closed
ChrisRackauckas opened this issue May 9, 2022 · 1 comment

@ChrisRackauckas (Member)

I didn't want that to continue blocking Flux v0.13 (and thus GalacticOptim.jl v3 as well), so I set them as @test_broken. I believe it might be just a tolerance thing, though it would be good to have someone double check that nothing broke in the differentiation setup, particularly with the restructure/destructure. I think this might be a case where just switching to Lux could make it more robust.
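For context, here is a minimal sketch of the pattern described above (not the actual NeuralPDE.jl test; the model, data, and tolerance are made up for illustration): `Flux.destructure` flattens a model's parameters into a vector and returns a closure that rebuilds the model, which is the restructure/destructure path to double-check, and `@test_broken` records a known failure as `Broken` instead of failing CI, flagging it later if the test unexpectedly starts passing.

```julia
using Test, Flux

# Hypothetical stand-in for the adaptor test setup.
model = Chain(Dense(2 => 8, tanh), Dense(8 => 1))

# Flatten the parameters to a vector θ and get a closure `re` that rebuilds the model.
θ, re = Flux.destructure(model)
x = rand(Float32, 2, 5)

# Objective written in terms of the flat parameter vector, as an optimizer sees it.
loss(p) = sum(abs2, re(p)(x))

# An overly tight tolerance stands in for the suspected tolerance issue; marking it
# @test_broken records a Broken result instead of failing CI, and Test will flag it
# if the check ever starts passing so the marker can be removed.
@test_broken loss(θ) < 1f-8
```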

@sathvikbhagavan (Member)

@ChrisRackauckas, can this be closed? The tests are back and fixed.
