
Residual Net / ResNet in TL #196


Closed

geometrikal opened this issue Aug 21, 2017 · 3 comments

@geometrikal

I have been successfully using the ResNet example code here. It would be nice to have a more concise form.

What do you think about adding a ResNet layer type? The interface would be similar to Conv2d, with the addition of some flags such as type (which topology: BN-C-R, C-BN-R, etc.) and subsample (whether to subsample by 2, etc.).
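
For illustration only, a block helper with that kind of interface might look like the following. This is a sketch in the same TensorLayer functional style as the example further down; resnet_block, its flags, and both topology branches are hypothetical, not an existing TL API, and the shortcut is only projected when subsampling (it assumes n already has n_filter channels otherwise).

    # hypothetical sketch only -- `resnet_block` is not an existing TensorLayer API
    import tensorflow as tf
    from tensorlayer.layers import Conv2d, BatchNorm2d, Elementwise

    def resnet_block(n, n_filter, w_init, g_init, topology='C-BN-R', subsample=False):
        """One residual block; `topology` picks the layer ordering,
        `subsample` halves the spatial size with stride-2 convs."""
        strides = (2, 2) if subsample else (1, 1)
        shortcut = n
        if subsample:
            # project the shortcut so its shape matches the strided main path
            shortcut = Conv2d(n_filter, (1, 1), strides, act=None, W_init=w_init, b_init=None)(n)
        if topology == 'C-BN-R':  # conv -> batch norm -> ReLU (the "traditional" ordering)
            nn = Conv2d(n_filter, (3, 3), strides, act=None, W_init=w_init, b_init=None)(n)
            nn = BatchNorm2d(decay=0.9, act=tf.nn.relu, gamma_init=g_init)(nn)
            nn = Conv2d(n_filter, (3, 3), (1, 1), act=None, W_init=w_init, b_init=None)(nn)
            nn = BatchNorm2d(decay=0.9, act=None, gamma_init=g_init)(nn)
        else:  # 'BN-C-R': batch norm, then conv, then ReLU
            nn = BatchNorm2d(decay=0.9, act=None, gamma_init=g_init)(n)
            nn = Conv2d(n_filter, (3, 3), strides, act=tf.nn.relu, W_init=w_init, b_init=None)(nn)
            nn = BatchNorm2d(decay=0.9, act=None, gamma_init=g_init)(nn)
            nn = Conv2d(n_filter, (3, 3), (1, 1), act=None, W_init=w_init, b_init=None)(nn)
        return Elementwise(tf.add)([shortcut, nn])

A real implementation would also need to handle channel changes on the shortcut; this only shows the shape of the proposed interface.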

@zsdonghao (Member) commented Aug 21, 2017

I am thinking about it as well~
But there are many kinds of ResNet; for me, it is easier to build residual blocks with a for loop (see the loop sketch after the example below). This is an open topic, so please let me know if anyone has suggestions.

BTW, I think this ResNet example is clearer than the one you read.

    # excerpt -- assumes `import tensorflow as tf`, the TensorLayer layers
    # Conv2d / BatchNorm2d / Elementwise, an input tensor `n`, and weight /
    # gamma initializers `w_init` / `g_init` defined earlier in the script
    n = Conv2d(256, (1,1), (1,1), act=None, W_init=w_init, b_init=None)(n)
    n = BatchNorm2d(decay=0.9, act=tf.nn.relu, gamma_init=g_init)(n)

    # residual blocks: Conv-BN(ReLU)-Conv-BN on the main path,
    # identity shortcut added via Elementwise(tf.add)
    nn = Conv2d(256, (3,3), (1,1), act=None, W_init=w_init, b_init=None)(n)
    nn = BatchNorm2d(decay=0.9, act=tf.nn.relu, gamma_init=g_init)(nn)
    nn = Conv2d(256, (3,3), (1,1), act=None, W_init=w_init, b_init=None)(nn)
    nn = BatchNorm2d(decay=0.9, act=None, gamma_init=g_init)(nn)
    n = Elementwise(tf.add)([n, nn])

    nn = Conv2d(256, (3,3), (1,1), act=None, W_init=w_init, b_init=None)(n)
    nn = BatchNorm2d(decay=0.9, act=tf.nn.relu, gamma_init=g_init)(nn)
    nn = Conv2d(256, (3,3), (1,1), act=None, W_init=w_init, b_init=None)(nn)
    nn = BatchNorm2d(decay=0.9, act=None, gamma_init=g_init)(nn)
    n = Elementwise(tf.add)([n, nn])

    nn = Conv2d(256, (3,3), (1,1), act=None, W_init=w_init, b_init=None)(n)
    nn = BatchNorm2d(decay=0.9, act=tf.nn.relu, gamma_init=g_init)(nn)
    nn = Conv2d(256, (3,3), (1,1), act=None, W_init=w_init, b_init=None)(nn)
    nn = BatchNorm2d(decay=0.9, act=None, gamma_init=g_init)(nn)
    n = Elementwise(tf.add)([n, nn])

    nn = Conv2d(256, (3,3), (1,1), act=None, W_init=w_init, b_init=None)(n)
    nn = BatchNorm2d(decay=0.9, act=tf.nn.relu, gamma_init=g_init)(nn)
    nn = Conv2d(256, (3,3), (1,1), act=None, W_init=w_init, b_init=None)(nn)
    nn = BatchNorm2d(decay=0.9, act=None, gamma_init=g_init)(nn)
    n = Elementwise(tf.add)([n, nn])
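
Since the four blocks are identical, the same thing can be written with the for loop mentioned above (equivalent to the repeated blocks, under the same assumptions about n, w_init, and g_init):

    # the same four residual blocks, built in a loop
    for _ in range(4):
        nn = Conv2d(256, (3,3), (1,1), act=None, W_init=w_init, b_init=None)(n)
        nn = BatchNorm2d(decay=0.9, act=tf.nn.relu, gamma_init=g_init)(nn)
        nn = Conv2d(256, (3,3), (1,1), act=None, W_init=w_init, b_init=None)(nn)
        nn = BatchNorm2d(decay=0.9, act=None, gamma_init=g_init)(nn)
        n = Elementwise(tf.add)([n, nn])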


@zsdonghao changed the title from “Adding a ResNet layer type to Tensorflow” to “Residual Net / ResNet in TL” on Aug 21, 2017
@geometrikal (Author) commented Aug 21, 2017

Yes, that is nicely compact and clear, thanks! Maybe it is simpler to just add a ResNet example (one that includes the downsampling and padding)? Off topic, but have you found that different ResNet block topologies are better suited to different types of data?
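
For the downsampling case, a common pattern (sketched here in the same style as the example above, not taken from an existing TL example) is a strided 3×3 conv on the main path plus a strided 1×1 projection on the shortcut so the shapes still match; the default SAME padding is assumed:

    # hypothetical downsampling block: stride 2 halves H and W, so the
    # shortcut needs a 1x1 projection conv with the same stride
    shortcut = Conv2d(512, (1,1), (2,2), act=None, W_init=w_init, b_init=None)(n)
    nn = Conv2d(512, (3,3), (2,2), act=None, W_init=w_init, b_init=None)(n)
    nn = BatchNorm2d(decay=0.9, act=tf.nn.relu, gamma_init=g_init)(nn)
    nn = Conv2d(512, (3,3), (1,1), act=None, W_init=w_init, b_init=None)(nn)
    nn = BatchNorm2d(decay=0.9, act=None, gamma_init=g_init)(nn)
    n = Elementwise(tf.add)([shortcut, nn])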

@zsdonghao (Member)
Yes, I will summarise the examples.
I usually use the traditional one; I didn't find any big performance difference between topologies in image applications.

@luomai closed this as completed on Feb 14, 2018