A Dockerfile defining a Docker image that contains CUDA, TensorFlow 2 (GPU), Miniconda, MLflow and Jupyter Notebook.
This Dockerfile borrows from various Dockerfiles (including some of the official TensorFlow Dockerfiles). It is recommended for local development, not for production.
- Install Docker.
- Make sure you have an NVIDIA GPU and install nvidia-docker.
docker build -t <image_tag> .
Where <image_tag> is the tag you choose for this image.
docker run -it -p 8888:8888 -p 5000:5000 --runtime=nvidia --name <container_name> <image_tag>
Where <container_name> is the name you want to give your container and <image_tag> is the tag you chose when building the image.
You can also map volumes to /mlflow/mlruns, /mlflow/projects and /mlflow/notebooks, e.g.
-v /your/home/mlruns:/mlflow/mlruns
When you run the container, you'll see a URL similar to this in the logs:
http://127.0.0.1:8888/?token=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Click it and the Jupyter Notebook interface will open in your browser.
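If you want to confirm that TensorFlow can see your GPU inside the container, a quick sanity check you can run in a notebook cell is the following sketch (API names assume TensorFlow 2.1+; it is not part of the image itself):

```python
import tensorflow as tf

# List the GPUs TensorFlow can see inside the container;
# an empty list means the NVIDIA runtime isn't wired up correctly.
print(tf.config.list_physical_devices('GPU'))
```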
You can access the MLflow UI at http://localhost:5000.
In your notebooks, make sure you include this code snippet so you'll see your experiments in the MLflow UI:
import mlflow
mlflow.set_tracking_uri('http://localhost:5000')
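As a minimal sketch of how a notebook might then log an experiment to this server (the experiment name, parameter and metric values below are just placeholders):

```python
import mlflow

mlflow.set_tracking_uri('http://localhost:5000')
mlflow.set_experiment('demo-experiment')  # hypothetical experiment name

with mlflow.start_run():
    # Anything logged here shows up in the MLflow UI at localhost:5000.
    mlflow.log_param('learning_rate', 0.001)
    mlflow.log_metric('accuracy', 0.9)
```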