Week 3 - Task 2 issue #34

Open
nietoo opened this issue Feb 8, 2021 · 1 comment
nietoo commented Feb 8, 2021

In one of the last cells,

model.compile(
    loss='categorical_crossentropy',  # we train 102-way classification
    optimizer=keras.optimizers.adamax(lr=1e-2),  # we can take big lr here because we fixed first layers
    metrics=['accuracy']  # report accuracy during training
)

AttributeError: module 'keras.optimizers' has no attribute 'adamax'

This can be fixed by changing "adamax" to "Adamax":
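
The compile call then becomes (the same cell as above, with only the optimizer line changed):

model.compile(
    loss='categorical_crossentropy',  # we train 102-way classification
    optimizer=keras.optimizers.Adamax(lr=1e-2),  # "Adamax" instead of "adamax"
    metrics=['accuracy']  # report accuracy during training
)

However, after applying that fix, the cell two steps further down: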

# fine tune for 2 epochs (full passes through all training data)
# we make 2*8 epochs, where epoch is 1/8 of our training data to see progress more often
model.fit_generator(
    train_generator(tr_files, tr_labels), 
    steps_per_epoch=len(tr_files) // BATCH_SIZE // 8,
    epochs=2 * 8,
    validation_data=train_generator(te_files, te_labels), 
    validation_steps=len(te_files) // BATCH_SIZE // 4,
    callbacks=[keras_utils.TqdmProgressCallback(), 
               keras_utils.ModelSaveCallback(model_filename)],
    verbose=0,
    initial_epoch=last_finished_epoch or 0
)

throws the following error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-183-faf1b24645ff> in <module>()
     10                keras_utils.ModelSaveCallback(model_filename)],
     11     verbose=0,
---> 12     initial_epoch=last_finished_epoch or 0
     13 )

2 frames
/usr/local/lib/python3.6/dist-packages/keras/legacy/interfaces.py in wrapper(*args, **kwargs)
     85                 warnings.warn('Update your `' + object_name +
     86                               '` call to the Keras 2 API: ' + signature, stacklevel=2)
---> 87             return func(*args, **kwargs)
     88         wrapper._original_function = func
     89         return wrapper

/usr/local/lib/python3.6/dist-packages/keras/engine/training.py in fit_generator(self, generator, steps_per_epoch, epochs, verbose, callbacks, validation_data, validation_steps, class_weight, max_queue_size, workers, use_multiprocessing, initial_epoch)
   1723 
   1724         do_validation = bool(validation_data)
-> 1725         self._make_train_function()
   1726         if do_validation:
   1727             self._make_test_function()

/usr/local/lib/python3.6/dist-packages/keras/engine/training.py in _make_train_function(self)
    935                 self._collected_trainable_weights,
    936                 self.constraints,
--> 937                 self.total_loss)
    938             updates = self.updates + training_updates
    939             # Gets loss and metrics. Updates weights at each call.

TypeError: get_updates() takes 3 positional arguments but 4 were given

keras.optimizers.Adamax() inherits the get_updates() method from keras.optimizers.Optimizer(), and that method takes only three arguments (self, loss, params), but _make_train_function is trying to pass four arguments to it.
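
The mismatch can be reproduced in isolation with a toy class (illustration only, not the real Keras code; the argument order of the old-style call is taken from the traceback above):

# toy new-style optimizer: get_updates() takes (self, loss, params), as in recent Keras
class NewStyleOptimizer:
    def get_updates(self, loss, params):
        return []

opt = NewStyleOptimizer()
# the old-style engine call passes (params, constraints, loss), i.e. three positional arguments:
opt.get_updates(['w1', 'w2'], {}, 'total_loss')
# -> TypeError: get_updates() takes 3 positional arguments but 4 were given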

As I understand it, the issue here is compatibility between TF 1.x and TF 2. I'm using Colab and running the %tensorflow_version 1.x line, as well as the setup cell with the week 3 setup uncommented, at the start of the notebook.

All checkpoints up to this point have been passed successfully.

nietoo commented Feb 8, 2021

Running !pip install -q keras==2.0.6 in the Colab init cell, as suggested in #29, fixes it.
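
For reference, the init cell then looks roughly like this (a sketch only; the exact contents of the setup cell vary, and the runtime may need to be restarted if keras was already imported):

%tensorflow_version 1.x
!pip install -q keras==2.0.6  # pin suggested in #29; -q just keeps pip quiet
# ... then run the week 3 setup cell as before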
