
The following Variables were used a Lambda layer's call, BatchNormalization #522

Open
chenroundsquare opened this issue May 4, 2023 · 0 comments

Comments

@chenroundsquare

Describe the bug
The following Variables were used a Lambda layer's call (tf.compat.v1.nn.fused_batch_norm), but
are not present in its tracked objects:
<tf.Variable 'batch_normalization/gamma:0' shape=(32,) dtype=float32>
<tf.Variable 'batch_normalization/beta:0' shape=(32,) dtype=float32>
It is possible that this is intended behavior, but it is more likely
an omission. This is a strong indication that this layer should be
formulated as a subclassed Layer rather than a Lambda layer.
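For context, the fix the warning itself suggests can be sketched as a subclassed Layer that creates its variables in `build()`, so Keras tracks them automatically. This is a minimal illustration under that assumption, not deepctr's actual code; `TrackedBatchNorm` is a hypothetical name:

```python
import tensorflow as tf

class TrackedBatchNorm(tf.keras.layers.Layer):
    """Minimal batch-norm-style layer whose variables ARE tracked.

    A Lambda layer that closes over externally created tf.Variables
    does not track them, which is what triggers the warning above;
    creating the weights via add_weight() in build() avoids that.
    """

    def build(self, input_shape):
        dim = input_shape[-1]
        self.gamma = self.add_weight(name="gamma", shape=(dim,),
                                     initializer="ones")
        self.beta = self.add_weight(name="beta", shape=(dim,),
                                    initializer="zeros")

    def call(self, inputs):
        # Normalize over the batch axis using the tracked gamma/beta.
        mean, variance = tf.nn.moments(inputs, axes=[0])
        return tf.nn.batch_normalization(inputs, mean, variance,
                                         self.beta, self.gamma, 1e-3)

layer = TrackedBatchNorm()
out = layer(tf.zeros((4, 32)))
# gamma and beta now appear in layer.trainable_weights
```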

To Reproduce
I think there is a problem with how BatchNormalization is imported.
I tried from tensorflow.python.keras.layers import BatchNormalization, but it failed,
and I found the following code in tensorflow/python/keras/layers/__init__.py:

class VersionAwareLayers(object):
  """Utility to be used internally to access layers in a V1/V2-aware fashion.

  When using layers within the Keras codebase, under the constraint that
  e.g. `layers.BatchNormalization` should be the `BatchNormalization` version
  corresponding to the current runtime (TF1 or TF2), do not simply access
  `layers.BatchNormalization` since it would ignore e.g. an early
  `compat.v2.disable_v2_behavior()` call. Instead, use an instance
  of `VersionAwareLayers` (which you can use just like the `layers` module).
  """

  def __getattr__(self, name):
    serialization.populate_deserializable_objects()
    if name in serialization.LOCAL.ALL_OBJECTS:
      return serialization.LOCAL.ALL_OBJECTS[name]
    return super(VersionAwareLayers, self).__getattr__(name)

But I don't know how to use it.
I tried using tf.keras.layers.BatchNormalization instead of importing BatchNormalization directly,
and I also tried from keras.layers import BatchNormalization.
With both of the above methods, I still ran into the original Lambda-layer error.
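As a side note, under the assumption that in TF 2.8 tf.keras is backed by the standalone keras package, the two import paths tried above should resolve to the same class object, which would explain why switching between them changes nothing:

```python
import tensorflow as tf
from keras.layers import BatchNormalization

# If tf.keras re-exports the keras package (the case since TF 2.6),
# both names refer to one class, so the Lambda-tracking warning
# cannot depend on which import path is used.
same = tf.keras.layers.BatchNormalization is BatchNormalization
print(same)
```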

Operating environment:

  • python version: 3.9
  • tensorflow version: 2.8.0
  • deepctr version: 0.9.2

