Fix custom loss for MXNet backend. Fix bug in Concat layer #110
Conversation
LGTM, minor comment, thanks for the contribution!
keras/engine/training.py
Outdated
# If sample_weights shape is like (100, ), we convert it to (100, 1),
# because MXNet treats the shape (100, ) as (100), leading to broadcast
# operator failures in the operations below.
if K.backend() == 'mxnet' and K.ndim(weights) == 1:
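The reshape rationale can be illustrated outside Keras with plain NumPy (a hedged sketch; the variable names here are illustrative, not the PR's actual code):

```python
import numpy as np

# A 1-D weights vector of shape (100,) broadcasts differently than
# its 2-D column form (100, 1) when combined with per-sample losses.
weights = np.ones(100)       # shape (100,)
losses = np.ones((100, 1))   # per-sample losses, shape (100, 1)

# (100,) * (100, 1) broadcasts to (100, 100) -- not an elementwise product.
bad = weights * losses
assert bad.shape == (100, 100)

# Reshaping the weights to (100, 1), as the patch does for the MXNet
# backend, keeps the product elementwise at shape (100, 1).
good = weights.reshape(-1, 1) * losses
assert good.shape == (100, 1)
```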
We can use weight_ndim, which is already calculated:
if K.backend() == 'mxnet' and weight_ndim == 1:
Done. Thanks
@@ -82,7 +82,7 @@ def test_batchnorm_correctness_2d():
 @pytest.mark.skipif((K.backend() == 'mxnet'),
-                    reason='MXNet backend does not allow predict() before compile()')
+                    reason='MXNet backend uses native BatchNorm operator. Donot do updates in the model.')
Do not?
Done
@@ -160,7 +160,7 @@ def test_batchnorm_convnet_no_center_no_scale():
 @pytest.mark.skipif((K.backend() == 'mxnet'),
-                    reason='MXNet backend uses native BatchNorm operator. Do do updates in the model.')
+                    reason='MXNet backend uses native BatchNorm operator. Donot do updates in the model.')
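For readers unfamiliar with the skip pattern used in these hunks, here is a minimal, self-contained sketch. The `backend()` stub below is a stand-in for `keras.backend.backend()`, so the example runs without a Keras install:

```python
import pytest

def backend():
    # Stand-in for keras.backend.backend(); assume MXNet is active.
    return 'mxnet'

# pytest evaluates the skipif condition at collection time and skips
# the test (with the given reason) when it is true.
@pytest.mark.skipif(backend() == 'mxnet',
                    reason='MXNet backend uses native BatchNorm operator. '
                           'Do not do updates in the model.')
def test_batchnorm_example():
    pass
```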
same - Do not?
Done
@roywei - Fixed your comments
* Fix custom loss usage in MXNet backend. Issue - 25
* Fix CR comments
* Fix CR comments
@roywei @kalyanee