Jupyter notebook test #540

Closed · wants to merge 11 commits

Conversation

PhilipMay
Contributor

First version of the Jupyter notebook test, as discussed in #485.

@PhilipMay
Contributor Author

There are still the following issues:

  • the path (NBDIR) is hard-wired and may be wrong
  • Bazel integration is missing
  • is this the right place for this file?

@Smokrow
Contributor

Smokrow commented Sep 23, 2019

I think the hard-wired NBDIR should not be a problem, since the folder should not move around once it is included in the docs generator (see this comment).

I am not quite sure we want to put this into the Bazel testing pipeline, since it is kind of self-contained. We could add it to the Travis pipeline, though. @seanpmorgan maybe we can add another test step after the Bazel test pipeline and before the build.

@PhilipMay do you know whether this testing environment runs `!pip install lalala` lines?

Member

@seanpmorgan left a comment


Thanks @PhilipMay! Could you please add this to our make commands:
https://github.com/tensorflow/addons/blob/master/makefile

We can then add this to the dockerized tests in .travis.yml

# ==============================================================================
"""Test for example jupyter notebooks."""

import testipynb
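For context, the rest of the file is truncated above; it presumably looks something like the sketch below. The testipynb API used here (`TestNotebooks`, `get_tests`) is an assumption based on that package's README, the `NBDIR` value is a guess, and the 2100-second timeout figure comes from a later comment in this thread.

```python
"""Hedged sketch of what the notebook test file might look like."""
import unittest

# Hard-wired notebook directory; flagged as an open issue in this PR.
NBDIR = "docs/tutorials"  # assumed location of the tutorial notebooks


def build_notebook_tests(directory=NBDIR, timeout=2100):
    """Build a unittest.TestCase with one test per notebook in `directory`."""
    # Deferred import so this module can be inspected without testipynb installed.
    import testipynb

    nb = testipynb.TestNotebooks(directory=directory, timeout=timeout)
    return nb.get_tests()


# In the real test file one would then run the suite at module level:
#     TestNotebooks = build_notebook_tests()
#     if __name__ == "__main__":
#         unittest.main()
```

The timeout parameter matters here: testipynb's default of 600 seconds is too short for notebooks that download datasets and train models.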
Member


We'll need a script to install this dependency during test time. Could you add this to:
https://github.com/tensorflow/addons/blob/master/tools/ci_build/install/ci_requirements.txt

Contributor Author


Done, but without pinning a specific version number.
Is that OK?
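For reference, the unpinned entry added to ci_requirements.txt is presumably just the package name on its own line:

```
testipynb
```

Pinning it instead (for example `testipynb~=0.1`; that version is only illustrative) would make CI runs more reproducible, at the cost of manual version bumps.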

@PhilipMay
Contributor Author

PhilipMay commented Sep 24, 2019

Thanks @PhilipMay! Could you please add this to our make commands:
https://github.com/tensorflow/addons/blob/master/makefile

We can then add this to the dockerized tests in .travis.yml

I am done with that.

@PhilipMay do you know whether this testing environment runs `!pip install lalala` lines?

Yes, it runs pip this way. I uninstalled TensorFlow with pip, ran `make tutorials-test`, and then checked whether TensorFlow got installed. It was.

@PhilipMay
Contributor Author

From my point of view this is done. These steps are still missing:

  • integrate into Travis (do this in another PR?)
  • the test now shows errors, so fix the notebooks (also a separate PR?)

Member

@seanpmorgan left a comment


Thanks @PhilipMay! I'm seeing several notebooks fail for DeadKernelError: Kernel died possibly from timeout. Could you increase the timeout for the tests and see if it fixes it.

@PhilipMay
Contributor Author

Thanks @PhilipMay! I'm seeing several notebooks fail for DeadKernelError: Kernel died possibly from timeout. Could you increase the timeout for the tests and see if it fixes it.

@seanpmorgan I see no `DeadKernelError: Kernel died` messages in my log (see below). The timeout is already increased from the default of 600 seconds to 2100. Can you maybe post your log? Where do these differences come from?

FAIL: test_optimizers_lazyadam (testipynb.testipynb.NbTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/miniconda3/envs/tf2/lib/python3.6/site-packages/testipynb/testipynb.py", line 96, in test_func
    assert passing, msg
AssertionError: 

 ... optimizers_lazyadam FAILED 

SyntaxError in cell [2] 
-----------
!pip install tensorflow-gpu==2.0.0rc0
!pip install tensorflow-addons~=0.5
from __future__ import absolute_import, division, print_function, unicode_literals

import tensorflow as tf
import tensorflow_addons as tfa
import tensorflow_datasets as tfds
import numpy as np
from matplotlib import pyplot as plt
-----------

                            

======================================================================
FAIL: test_template (testipynb.testipynb.NbTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/miniconda3/envs/tf2/lib/python3.6/site-packages/testipynb/testipynb.py", line 96, in test_func
    assert passing, msg
AssertionError: 

 ... template FAILED 

NameError in cell [5] 
-----------
result = model(tf.constant(np.random.randn(10,5), dtype = tf.float32)).numpy()

print("min:", result.min())
print("max:", result.max())
print("mean:", result.mean())
print("shape:", result.shape)
-----------

                            

----------------------------------------------------------------------
Ran 6 tests in 94.164s

FAILED (failures=6)
make: *** [tutorials-test] Error 1
(tf2) mikes-MacBook:addons mike$ clear

(tf2) mikes-MacBook:addons mike$ make tutorials-test
python tools/ci_testing/tutorials_test.py

--------------------- Testing image_ops.ipynb ---------------------
/usr/local/miniconda3/envs/tf2/lib/python3.6/site-packages/jupyter_client/session.py:371: DeprecationWarning: Session._key_changed is deprecated in traitlets 4.1: use @observe and @unobserve instead.
  def _key_changed(self):


 ... image_ops FAILED 

SyntaxError in cell [2] 
-----------
!pip install -q tensorflow-gpu==2.0.0rc0
!pip install -q tensorflow-addons~=0.5

from __future__ import absolute_import, division, print_function, unicode_literals

import numpy as np
import tensorflow as tf
import tensorflow_addons as tfa

import matplotlib.pyplot as plt
-----------

                            
----------------- >> begin Traceback << ----------------- 

  File "<ipython-input-2-b46ec5485964>", line 7
    import tensorflow as tf
           ^
SyntaxError: from __future__ imports must occur at the beginning of the file



----------------- >> end Traceback << -----------------

                            
F
--------------------- Testing layers_normalizations.ipynb ---------------------


 ... layers_normalizations FAILED 

SyntaxError in cell [2] 
-----------
!pip install -q tensorflow==2.0.0rc0 
!pip install -q tensorflow-addons~=0.5
from __future__ import absolute_import, division, print_function
import tensorflow as tf
import tensorflow_addons as tfa
-----------

                            
----------------- >> begin Traceback << ----------------- 

  File "<ipython-input-2-ea1d9ed991dc>", line 6
SyntaxError: from __future__ imports must occur at the beginning of the file



----------------- >> end Traceback << -----------------

                            
F
--------------------- Testing layers_weightnormalization.ipynb ---------------------


 ... layers_weightnormalization FAILED 

SyntaxError in cell [2] 
-----------
!pip install tensorflow-gpu==2.0.0rc0
!pip install tensorflow-addons~=0.5
from __future__ import absolute_import, division, print_function, unicode_literals

import tensorflow as tf
import tensorflow_addons as tfa
import numpy as np
from matplotlib import pyplot as plt
-----------

                            
----------------- >> begin Traceback << ----------------- 

  File "<ipython-input-2-6458efa20296>", line 6
    import tensorflow_addons as tfa
           ^
SyntaxError: from __future__ imports must occur at the beginning of the file



----------------- >> end Traceback << -----------------

                            
F
--------------------- Testing losses_triplet.ipynb ---------------------


 ... losses_triplet FAILED 

SyntaxError in cell [2] 
-----------
!pip install -q tensorflow-gpu==2.0.0rc0
!pip install -q tensorflow-addons~=0.5
from __future__ import absolute_import, division, print_function, unicode_literals

import io
import numpy as np
import tensorflow as tf
import tensorflow_addons as tfa
import tensorflow_datasets as tfds
from matplotlib import pyplot as plt
from google.colab import files
-----------

                            
----------------- >> begin Traceback << ----------------- 

  File "<ipython-input-2-aeb553bdcf4f>", line 6
    import numpy as np
           ^
SyntaxError: from __future__ imports must occur at the beginning of the file



----------------- >> end Traceback << -----------------

                            
F
--------------------- Testing optimizers_lazyadam.ipynb ---------------------


 ... optimizers_lazyadam FAILED 

SyntaxError in cell [2] 
-----------
!pip install tensorflow-gpu==2.0.0rc0
!pip install tensorflow-addons~=0.5
from __future__ import absolute_import, division, print_function, unicode_literals

import tensorflow as tf
import tensorflow_addons as tfa
import tensorflow_datasets as tfds
import numpy as np
from matplotlib import pyplot as plt
-----------

                            
----------------- >> begin Traceback << ----------------- 

  File "<ipython-input-2-444c83055f39>", line 6
    import tensorflow_addons as tfa
           ^
SyntaxError: from __future__ imports must occur at the beginning of the file



----------------- >> end Traceback << -----------------

                            
F
--------------------- Testing template.ipynb ---------------------
2019-09-25 16:20:27.875261: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2019-09-25 16:20:27.899003: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7fd57d5aa130 executing computations on platform Host. Devices:
2019-09-25 16:20:27.899078: I tensorflow/compiler/xla/service/service.cc:175]   StreamExecutor device (0): Host, Default Version
Error in atexit._run_exitfuncs:
Traceback (most recent call last):
  File "/usr/local/miniconda3/envs/tf2/lib/python3.6/logging/__init__.py", line 1945, in shutdown
    h.flush()
  File "/usr/local/miniconda3/envs/tf2/lib/python3.6/site-packages/absl/logging/__init__.py", line 891, in flush
    self._current_handler.flush()
  File "/usr/local/miniconda3/envs/tf2/lib/python3.6/site-packages/absl/logging/__init__.py", line 785, in flush
    self.stream.flush()
  File "/usr/local/miniconda3/envs/tf2/lib/python3.6/site-packages/ipykernel/iostream.py", line 341, in flush
    if self.pub_thread.thread.is_alive():
AttributeError: 'NoneType' object has no attribute 'thread'


 ... template FAILED 

NameError in cell [5] 
-----------
result = model(tf.constant(np.random.randn(10,5), dtype = tf.float32)).numpy()

print("min:", result.min())
print("max:", result.max())
print("mean:", result.mean())
print("shape:", result.shape)
-----------

                            
----------------- >> begin Traceback << ----------------- 

---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
<ipython-input-5-3ed927615e9f> in <module>
----> 1 result = model(tf.constant(np.random.randn(10,5), dtype = tf.float32)).numpy()
      2 
      3 print("min:", result.min())
      4 print("max:", result.max())
      5 print("mean:", result.mean())

NameError: name 'np' is not defined


----------------- >> end Traceback << -----------------

                            
F
======================================================================
FAIL: test_image_ops (testipynb.testipynb.NbTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/miniconda3/envs/tf2/lib/python3.6/site-packages/testipynb/testipynb.py", line 96, in test_func
    assert passing, msg
AssertionError: 

 ... image_ops FAILED 

SyntaxError in cell [2] 
-----------
!pip install -q tensorflow-gpu==2.0.0rc0
!pip install -q tensorflow-addons~=0.5

from __future__ import absolute_import, division, print_function, unicode_literals

import numpy as np
import tensorflow as tf
import tensorflow_addons as tfa

import matplotlib.pyplot as plt
-----------

                            

======================================================================
FAIL: test_layers_normalizations (testipynb.testipynb.NbTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/miniconda3/envs/tf2/lib/python3.6/site-packages/testipynb/testipynb.py", line 96, in test_func
    assert passing, msg
AssertionError: 

 ... layers_normalizations FAILED 

SyntaxError in cell [2] 
-----------
!pip install -q tensorflow==2.0.0rc0 
!pip install -q tensorflow-addons~=0.5
from __future__ import absolute_import, division, print_function
import tensorflow as tf
import tensorflow_addons as tfa
-----------

                            

======================================================================
FAIL: test_layers_weightnormalization (testipynb.testipynb.NbTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/miniconda3/envs/tf2/lib/python3.6/site-packages/testipynb/testipynb.py", line 96, in test_func
    assert passing, msg
AssertionError: 

 ... layers_weightnormalization FAILED 

SyntaxError in cell [2] 
-----------
!pip install tensorflow-gpu==2.0.0rc0
!pip install tensorflow-addons~=0.5
from __future__ import absolute_import, division, print_function, unicode_literals

import tensorflow as tf
import tensorflow_addons as tfa
import numpy as np
from matplotlib import pyplot as plt
-----------

                            

======================================================================
FAIL: test_losses_triplet (testipynb.testipynb.NbTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/miniconda3/envs/tf2/lib/python3.6/site-packages/testipynb/testipynb.py", line 96, in test_func
    assert passing, msg
AssertionError: 

 ... losses_triplet FAILED 

SyntaxError in cell [2] 
-----------
!pip install -q tensorflow-gpu==2.0.0rc0
!pip install -q tensorflow-addons~=0.5
from __future__ import absolute_import, division, print_function, unicode_literals

import io
import numpy as np
import tensorflow as tf
import tensorflow_addons as tfa
import tensorflow_datasets as tfds
from matplotlib import pyplot as plt
from google.colab import files
-----------

                            

======================================================================
FAIL: test_optimizers_lazyadam (testipynb.testipynb.NbTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/miniconda3/envs/tf2/lib/python3.6/site-packages/testipynb/testipynb.py", line 96, in test_func
    assert passing, msg
AssertionError: 

 ... optimizers_lazyadam FAILED 

SyntaxError in cell [2] 
-----------
!pip install tensorflow-gpu==2.0.0rc0
!pip install tensorflow-addons~=0.5
from __future__ import absolute_import, division, print_function, unicode_literals

import tensorflow as tf
import tensorflow_addons as tfa
import tensorflow_datasets as tfds
import numpy as np
from matplotlib import pyplot as plt
-----------

                            

======================================================================
FAIL: test_template (testipynb.testipynb.NbTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/miniconda3/envs/tf2/lib/python3.6/site-packages/testipynb/testipynb.py", line 96, in test_func
    assert passing, msg
AssertionError: 

 ... template FAILED 

NameError in cell [5] 
-----------
result = model(tf.constant(np.random.randn(10,5), dtype = tf.float32)).numpy()

print("min:", result.min())
print("max:", result.max())
print("mean:", result.mean())
print("shape:", result.shape)
-----------

                            

----------------------------------------------------------------------
Ran 6 tests in 96.071s

FAILED (failures=6)
make: *** [tutorials-test] Error 1
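The repeated SyntaxError in the log above has a single cause: once the `!pip install` shell lines in cell [2] are converted to code, the `from __future__` import is no longer the first statement in the cell. A minimal stdlib sketch of a lint that detects this in the .ipynb JSON (the function names are hypothetical; a real version might use nbformat instead of raw json):

```python
import json


def future_import_misplaced(source_lines):
    """True if a `from __future__` import appears after an earlier code line."""
    seen_code = False
    for line in source_lines:
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue
        if stripped.startswith("from __future__"):
            if seen_code:
                return True
        else:
            # `!pip ...` lines count here: they execute as code after conversion.
            seen_code = True
    return False


def bad_cells(notebook_path):
    """Indices of code cells where a __future__ import is misplaced."""
    with open(notebook_path) as f:
        nb = json.load(f)
    return [
        i
        for i, cell in enumerate(nb.get("cells", []))
        if cell.get("cell_type") == "code"
        and future_import_misplaced(cell.get("source", []))
    ]
```

The fix in the notebooks themselves is simply to move the `!pip install` lines into their own cell, or to place the `from __future__` import first.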

@PhilipMay
Contributor Author

@seanpmorgan what do you think the next steps are? Fix the `DeadKernelError: Kernel died` issue that you see and I can't reproduce, right?

@seanpmorgan
Member

Hi Philip, could you please verify how you're testing the code? With the current commit I'm getting:

root@a4cbd4d4bf35:/addons# make tutorials-test
makefile:35: *** missing separator.  Stop

The indentation must be consistent with the other commands.
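For reference, recipe lines in a makefile must be indented with a literal tab character; space indentation produces exactly this "missing separator" error. A hypothetical target (the name and command come from the test log above; the recipe shown is only a sketch):

```make
# The recipe line below starts with a literal TAB, not spaces.
tutorials-test:
	python tools/ci_testing/tutorials_test.py
```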

Also, could you please run this in the Docker container so we have consistent environments; that is how this will be run during automated testing.

For a non-GPU environment I'm getting a mix of issues related to #532 (which is unrelated to this) and DeadKernel errors:
tutorialtest_output.txt

When running in a GPU environment, just to mitigate the #532 issues, I get:
tutorialtest_gpu_output.txt

As you can see, a number of imports fail (matplotlib, tensorflow_datasets, numpy). So what we'll need to do is create a separate install_nbtest_dependecy.sh script that runs for this make command.

We can't merge this until we know that the setup is capable of running these tests.

@PhilipMay
Contributor Author

Hi Philip, could you please verify how you're testing the code? With the current commit I'm getting:

root@a4cbd4d4bf35:/addons# make tutorials-test
makefile:35: *** missing separator.  Stop

This is fixed now.

@Smokrow
Contributor

Smokrow commented Sep 26, 2019

As you can see, a number of imports fail (matplotlib, tensorflow_datasets, numpy). So what we'll need to do is create a separate install_nbtest_dependecy.sh script that runs for this make command.

@seanpmorgan Do you think we could use the Colab notebook container as an environment, since that is the environment people will actually run these notebooks in? I am not quite sure whether those images are public, though. Maybe a Googler can help out? @karmel

@karmel
Contributor

karmel commented Sep 26, 2019

I don't quite follow: what Colab notebook container are you referring to, exactly?

@Smokrow
Contributor

Smokrow commented Sep 26, 2019

I always thought that the "standard" Google Colab notebook is actually a container running on Google's servers. If it is, do you know whether the container image is public?

@karmel
Contributor

karmel commented Sep 27, 2019

@MarkDaoust @yashk2810 @tomerk, who know more about how the notebook tests are run and what the options here might be.

@yashk2810
Member

One option: since we are creating a subsite for Addons, its notebooks will automatically get tested by the infrastructure we have in place for all the other subsites and for TensorFlow. If a notebook fails, an email will be sent to the Addons team to look into what went wrong.

For CI testing, I can look into open-sourcing the tooling that we use to test the notebooks.

@seanpmorgan @PhilipMay One thing to note for enabling the tests with our infra: do not save the outputs in the notebook, and install whichever version of addons you want to test your notebook with. You can follow the template notebook (https://github.com/tensorflow/docs/blob/master/tools/templates/notebook.ipynb).
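The "do not save outputs" advice above is easy to automate. A minimal stdlib sketch (the function names are hypothetical; nbformat or `jupyter nbconvert --clear-output` would be the more standard tools):

```python
import json


def strip_outputs(nb):
    """Clear saved outputs and execution counts from a parsed .ipynb dict."""
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            cell["outputs"] = []
            cell["execution_count"] = None
    return nb


def strip_outputs_file(path):
    """Strip outputs from a notebook file in place."""
    with open(path) as f:
        nb = json.load(f)
    with open(path, "w") as f:
        json.dump(strip_outputs(nb), f, indent=1)
```

Run as a pre-commit step, this keeps diffs small and ensures the CI run is what actually executes the cells.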

@facaiy
Member

facaiy commented Sep 28, 2019

For CI testing, I can look into open-sourcing the tooling that we use to test the notebooks.

Good news! Thanks, Yash. It sounds like we can share the same notebook testing library (and even workflow) once it is open-sourced. Would you mind telling us the open-sourcing schedule, if possible?

@yashk2810
Member

Would you mind telling us the open-sourcing schedule, if possible?

Probably within a few weeks :)

@PhilipMay
Contributor Author

Hi @yashk2810 - what is the status here?
Thanks
Philip

@PhilipMay
Contributor Author

Bumping it again... @yashk2810 @karmel

@yashk2810
Member

Hi @PhilipMay,

Open-sourcing what we use will take some time, since it needs to be decoupled from a lot of internal things, and it will require some effort. I will triage it and set a priority on it. But you are free to explore any other alternative that you have in mind :)

On a positive note, since addons is now a subsite tensorflow.org/addons, every notebook is already getting tested. If something fails, we will send an email to the appropriate handle.

@lc0
Contributor

lc0 commented Dec 9, 2019

@seanpmorgan does this mean the issue is redundant and could be closed?

@seanpmorgan
Member

IMO yes, provided that the above system works. @yashk2810 What email are failures sent to?
@PhilipMay would this be an acceptable solution?

@PhilipMay
Contributor Author

@PhilipMay would this be an acceptable solution?

For me this is 100% OK. No need to implement something that is already implemented.

On a positive note, since addons is now a subsite tensorflow.org/addons, every notebook is already getting tested. If something fails, we will send an email to the appropriate handle.

Are we sure that "send an email to the appropriate handle" actually works, and that this will always be converted to a ticket here on GitHub?

@yashk2810
Member

yashk2810 commented Dec 10, 2019

always be converted to a ticket here in GitHub?

I don't think it will be converted to a ticket on GitHub.

But, is there an email with all the maintainers of addons that we can forward it to? Then you can monitor that email for failure emails from our side.

@PhilipMay
Contributor Author

But, is there an email with all the maintainers of addons that we can forward it to? Then you can monitor that email for failure emails from our side.

I don't know... @seanpmorgan, do you?
Before we close this PR we should test the mail functionality.

@lc0
Contributor

lc0 commented Dec 10, 2019 via email

@karmel
Contributor

karmel commented Dec 10, 2019

(@yashk2810: not sure how these particular tests are set up, but when I previously tried, in several ways, to get internal testing emails forwarded to external aliases, I failed. We could possibly set up an internal alias with external members or something like that, but note that there may be some hoops to jump through.)

@yashk2810
Member

We could possibly set up an internal alias

Yup, that's how it works. The email is sent to our internal alias and then we forward it to the appropriate external handle after scrubbing all the internal email IDs.

We have been doing this for translations already.

@seanpmorgan
Member

@yashk2810 The email for failing tests is: addons-testing@tensorflow.org. It's open to all as a google group.

Do you have any more information as to when the notebooks are tested? Is this only done once the API docs are re-generated after every release? We'll be releasing 0.7 in the next few days. I can put a broken tutorial as a test on the branch.

@yashk2810
Member

The email for failing tests is: addons-testing@tensorflow.org

Thanks

done once the API docs are re-generated after every release

Yes, currently. If changes to the notebooks happen often, I can move it to weekly and eventually nightly.

I can put a broken tutorial as a test on the branch

If you want to test it out, sure. But we pull docs from the master branch, so you won't have to put it on the 0.7 branch :)

@seanpmorgan
Member

The email for failing tests is: addons-testing@tensorflow.org

Thanks

done once the API docs are re-generated after every release

Yes, currently. If changes to the notebooks happen often, I can move it to weekly and eventually nightly.

I can put a broken tutorial as a test on the branch

If you want to test it out, sure. But we pull docs from the master branch, so you won't have to put it on the 0.7 branch :)

Great thanks for your help @yashk2810! We'll put up a test prior to requesting the next update of API docs.

I did want to confirm that addons docs are pulled from master, though. We only generate API docs on our release branches, so can you confirm that the addons site generates them from the master branch, separately from what we do?

@yashk2810
Member

addons site generates them from master branch

So, for the narrative docs (guides, tutorials, etc) we pull from master.

For API docs we use your pip package to generate the api docs.

Does that make sense?

@seanpmorgan
Member

Thanks all for the support on this! Closing this PR; I've created a subsequent issue to track this testing:
#768


8 participants