any wise way to implement a residual network with 1000 layers ? #85

Closed
wagamamaz opened this issue Feb 12, 2017 · 2 comments

Comments

@wagamamaz (Collaborator)

No description provided.

wagamamaz changed the title from "how to implement a residual network with 1000 layers ?" to "any smart way to implement a residual network with 1000 layers ?" on Feb 12, 2017
wagamamaz changed the title from "any smart way to implement a residual network with 1000 layers ?" to "any wise way to implement a residual network with 1000 layers ?" on Feb 12, 2017
zsdonghao (Member) commented Feb 12, 2017

@wagamamaz you can use a for loop to define it.

# imports assumed from the surrounding script (TensorLayer 1.x API), e.g.:
#   import tensorflow as tf
#   import tensorlayer as tl
#   from tensorlayer.layers import Conv2d, BatchNormLayer, ElementwiseLayer
lrelu = lambda x: tl.act.lrelu(x, 0.2)
...
net_h1 = Conv2d(net_h0, df_dim*2, (4, 4), (2, 2), act=None,
        padding='SAME', W_init=w_init, b_init=b_init, name='e_h1/conv2d')
net_h1 = BatchNormLayer(net_h1, #act=lrelu,
        is_train=is_train, gamma_init=gamma_init, name='e_h1/batchnorm')

# stack 1000 bottleneck residual blocks in a for loop;
# unique layer names are generated from the loop index
for i in range(1000):
    net = Conv2d(net_h1, df_dim*1, (1, 1), (1, 1),
        padding='SAME', W_init=w_init, b_init=b_init, name='e{}/c1'.format(i))
    net = BatchNormLayer(net, act=lrelu,
        is_train=is_train, gamma_init=gamma_init, name='e{}/b1'.format(i))
    net = Conv2d(net, df_dim*1, (3, 3), (1, 1),
        padding='SAME', W_init=w_init, b_init=b_init, name='e{}/c2'.format(i))
    net = BatchNormLayer(net, act=lrelu,
        is_train=is_train, gamma_init=gamma_init, name='e{}/b2'.format(i))
    net = Conv2d(net, df_dim*2, (3, 3), (1, 1),
        padding='SAME', W_init=w_init, b_init=b_init, name='e{}/c3'.format(i))
    net = BatchNormLayer(net, #act=tf.nn.relu,
        is_train=is_train, gamma_init=gamma_init, name='e{}/b3'.format(i))
    # shortcut: add the block input to the block output, then activate
    net_h1 = ElementwiseLayer(layer=[net_h1, net], combine_fn=tf.add, name='e{}/add'.format(i))
    net_h1.outputs = lrelu(net_h1.outputs)

net_h2 = Conv2d(net_h1, df_dim*4, (4, 4), (2, 2), act=None,
        padding='SAME', W_init=w_init, b_init=b_init, name='e_h2/conv2d')
net_h2 = BatchNormLayer(net_h2,
        is_train=is_train, gamma_init=gamma_init, name='e_h2/batchnorm')
...
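
For readability with that many blocks, the loop body could also be factored into a small helper function. A minimal sketch, assuming the same TensorLayer 1.x API and the same w_init / b_init / gamma_init / is_train / df_dim / lrelu variables defined earlier in the script (the residual_block name is hypothetical, not part of TensorLayer):

def residual_block(net_in, dim, i):
    # 1x1 -> 3x3 -> 3x3 bottleneck, then add the shortcut and activate
    net = Conv2d(net_in, dim, (1, 1), (1, 1), padding='SAME',
        W_init=w_init, b_init=b_init, name='e{}/c1'.format(i))
    net = BatchNormLayer(net, act=lrelu, is_train=is_train,
        gamma_init=gamma_init, name='e{}/b1'.format(i))
    net = Conv2d(net, dim, (3, 3), (1, 1), padding='SAME',
        W_init=w_init, b_init=b_init, name='e{}/c2'.format(i))
    net = BatchNormLayer(net, act=lrelu, is_train=is_train,
        gamma_init=gamma_init, name='e{}/b2'.format(i))
    net = Conv2d(net, dim*2, (3, 3), (1, 1), padding='SAME',
        W_init=w_init, b_init=b_init, name='e{}/c3'.format(i))
    net = BatchNormLayer(net, is_train=is_train,
        gamma_init=gamma_init, name='e{}/b3'.format(i))
    net_out = ElementwiseLayer(layer=[net_in, net], combine_fn=tf.add, name='e{}/add'.format(i))
    net_out.outputs = lrelu(net_out.outputs)
    return net_out

for i in range(1000):
    net_h1 = residual_block(net_h1, df_dim*1, i)

This keeps the 1000-layer network definition to a few lines and keeps the per-block layer naming in one place.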

zsdonghao (Member) commented

I think the problem has been solved?
