
Fix custom loss for MXNet backend. Fix bug in Concat layer #110

Merged

Conversation

sandeep-krishnamurthy

  • Fix bug in Concat operator leading to failure in Layers.Concatenate.
  • Fix shape mismatch errors when using Custom Loss (a usage sketch of both fixes follows this list).
  • Enable 6 more unit tests.
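
For context, a minimal sketch exercising both code paths; the model, loss, and shapes here are illustrative, not taken from the PR:

```python
from keras import backend as K
from keras.layers import Input, Dense, Concatenate
from keras.models import Model

def custom_mae(y_true, y_pred):
    # Any user-defined loss; sample_weight shape handling for such
    # losses previously failed on the MXNet backend.
    return K.mean(K.abs(y_true - y_pred), axis=-1)

# Layers.Concatenate exercises the Concat operator fixed in this PR.
a = Input(shape=(8,))
b = Input(shape=(8,))
merged = Concatenate(axis=-1)([a, b])
out = Dense(1)(merged)

model = Model(inputs=[a, b], outputs=out)
model.compile(optimizer='sgd', loss=custom_mae)
```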

@roywei @kalyanee


roywei left a comment


LGTM, minor comment, thanks for the contribution!

```python
# If sample_weights shape is like (100,), we convert it to (100, 1),
# because MXNet treats the shape (100,) as (100), leading to broadcast
# operator failures in the operations below.
if K.backend() == 'mxnet' and K.ndim(weights) == 1:
```

We can use `weight_ndim`, which is already calculated:

```python
if K.backend() == 'mxnet' and weight_ndim == 1:
```


Done. Thanks
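
For clarity, a minimal sketch of the resolved check, assuming it sits in a context (such as Keras's weighted-loss wrapper) where `weights` is already defined; the helper name is hypothetical:

```python
from keras import backend as K

def _expand_1d_weights(weights):
    # Hypothetical helper illustrating the fix: reshape (100,)
    # sample weights to (100, 1) so MXNet broadcast ops do not
    # fail on rank-1 tensors.
    weight_ndim = K.ndim(weights)
    # Reuse the already-computed weight_ndim, per the review comment.
    if K.backend() == 'mxnet' and weight_ndim == 1:
        weights = K.expand_dims(weights, axis=-1)  # (100,) -> (100, 1)
    return weights
```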

```diff
@@ -82,7 +82,7 @@ def test_batchnorm_correctness_2d():

 @pytest.mark.skipif((K.backend() == 'mxnet'),
-                    reason='MXNet backend does not allow predict() before compile()')
+                    reason='MXNet backend uses native BatchNorm operator. Donot do updates in the model.')
```

Do not?


Done

```diff
@@ -160,7 +160,7 @@ def test_batchnorm_convnet_no_center_no_scale():

 @pytest.mark.skipif((K.backend() == 'mxnet'),
-                    reason='MXNet backend uses native BatchNorm operator. Do do updates in the model.')
+                    reason='MXNet backend uses native BatchNorm operator. Donot do updates in the model.')
```

same - Do not?


Done
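
For readers unfamiliar with the pattern being edited, a self-contained sketch of how such a backend-conditional skip reads; the test body here is illustrative only:

```python
import pytest
from keras import backend as K

@pytest.mark.skipif(K.backend() == 'mxnet',
                    reason='MXNet backend uses native BatchNorm operator. '
                           'Do not do updates in the model.')
def test_batchnorm_correctness_2d():
    # Runs on other backends; skipped entirely on MXNet.
    ...
```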

@sandeep-krishnamurthy (Author)

@roywei - Fixed your comments

@sandeep-krishnamurthy sandeep-krishnamurthy merged commit a9a6b9e into awslabs:dev May 31, 2018
sandeep-krishnamurthy added a commit that referenced this pull request Jun 15, 2018
* Fix custom loss usage in MXNet backend. Issue - 25

* Fix CR comments

* Fix CR comments