[MXNET-817] Fixes to speech recognition example #12291
Conversation
50b01af to 77f22a4
Has the whole example been tested after switching from tensorboard to mxboard?
```diff
@@ -32,7 +32,7 @@ def getInstance(self):

 def __new__(class_, *args, **kwargs):
     print("__new__")
-    class_.instances[class_] = super(Singleton, class_).__new__(class_, *args, **kwargs)
+    class_.instances[class_] = super().__new__(class_, *args, **kwargs)
```
Why remove passing Singleton and class_ while calling super?

Typo in line 42, should be SingletonInstance.

Line 52 is redundant. In fact, the whole SingletonInstance class defined in line 42 is redundant: it is not used anywhere and I think it can be removed.
In fact, in this whole speech recognition example, the code around the use of singletons can be refactored. The Singleton class defined in line 21 is used only once, as a decorator in the label_util.py module. The log_util.py module again defines another class, SingletonType, which is used to make the LogUtil class a singleton. There is duplication of code and logic here.
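To illustrate the duplication point, here is a minimal sketch of a single decorator-based singleton that could, in principle, serve both call sites. The `singleton` decorator and the `LogUtil` body below are illustrative assumptions, not the example's actual code:

```python
# Sketch: one class decorator that could replace both the Singleton
# decorator and the SingletonType metaclass described above.
def singleton(cls):
    instances = {}

    def get_instance(*args, **kwargs):
        # Create the instance on first use, then always return that one.
        if cls not in instances:
            instances[cls] = cls(*args, **kwargs)
        return instances[cls]

    return get_instance


@singleton
class LogUtil:
    # Hypothetical body; the real LogUtil lives in log_util.py.
    def __init__(self):
        self.messages = []
```

With this in place, every `LogUtil()` call returns the same object, so state set through one reference is visible through all of them.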
Python 3 complained about this usage of super(Singleton, class_).
LogUtil's singleton looks much cleaner. Will refactor the code around this. I wanted to make minimal changes to get the example working without breaking it, but I think this refactor is essential. Will make the changes and resubmit. Thanks @anirudhacharya
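For context, the change being discussed is Python 3's zero-argument `super()`, which inside a method body is equivalent to the explicit two-argument form but less error-prone. A minimal sketch (class and attribute names are illustrative, not the example's exact code):

```python
class Singleton:
    instances = {}

    def __new__(class_, *args, **kwargs):
        # Zero-argument super() here behaves like super(Singleton, class_),
        # resolving to object.__new__ via the MRO.
        if class_ not in class_.instances:
            class_.instances[class_] = super().__new__(class_)
        return class_.instances[class_]
```

Note that `object.__new__` itself is called with only the class argument; forwarding `*args, **kwargs` to it can raise a TypeError in Python 3 depending on how `__init__`/`__new__` are overridden.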
```diff
@@ -86,7 +86,10 @@ def update(self, labels, preds):

 def get_batch_loss(self):
     return self.batch_loss

 def get_name_value(self):
-    total_cer = float(self.total_l_dist) / float(self.total_n_label)
+    try:
```
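The hunk above wraps the CER division in a try block, presumably to guard against a zero label count. A hedged sketch of what such a guard could look like; the attribute names follow the diff, but the class wrapper, the except clause, and the fallback value are assumptions:

```python
class CERMetric:
    """Illustrative stand-in for the example's metric class."""

    def __init__(self, total_l_dist=0, total_n_label=0):
        self.total_l_dist = total_l_dist
        self.total_n_label = total_n_label

    def get_name_value(self):
        try:
            # Character error rate: total edit distance over total label count.
            total_cer = float(self.total_l_dist) / float(self.total_n_label)
        except ZeroDivisionError:
            # Assumed fallback when no labels have been seen yet.
            total_cer = float('inf')
        return total_cer
```

This keeps `get_name_value` safe to call before the first `update`, instead of crashing on an empty batch.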
nit: newlines between methods are inconsistent in this file.
Testing has been done for a small dataset. Training for the entire LibriSpeech dataset has been going on for 2 days now.
LGTM!
Thanks for your contributions @vandanavk
Is this PR good to go?
LGTM for most parts. Is the singleton refactor being tracked anywhere?
@anirudhacharya Opened issue #12384 to track the refactor.
Description
Changes the example to use mxboard instead of tensorboard (tensorboard's SummaryWriter is deprecated, as identified in #12024 (comment)). Also fixes a couple of other bugs in the example.
Checklist
Essentials
Please feel free to remove inapplicable items for your PR.
Changes
Comments