This is a container that runs the DEEP as a Service (DEEPaaS) API component together with tf_cnn_benchmarks (src: tf_cnn_benchmarks).
To run the Docker container directly from Docker Hub and start using the API, simply run the following command:
$ docker run -ti -p 5000:5000 -p 6006:6006 agrupp/deep-oc-tf_cnn_benchmarks
This command pulls the Docker container from the agrupp repository on Docker Hub and starts the default command (deepaas-run --listen-ip=0.0.0.0).
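To verify from another terminal that the API is answering, a quick sanity check (assuming the default port mapping shown above) is:

```
$ curl -fs http://localhost:5000 && echo "DEEPaaS API is up"
```

Here -f makes curl fail on HTTP errors and -s suppresses progress output, so the echo only fires if the API actually responded.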
The docker-compose.yml file allows you to run the application with various configurations via docker-compose.
N.B.: docker-compose.yml uses file format version '2.3', which requires Docker 17.06.0+ and docker-compose 1.16.0+; see https://docs.docker.com/compose/install/
If you want to use an Nvidia GPU, you also need nvidia-docker and docker-compose 1.19.0+; see nvidia/FAQ
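As an illustration, a GPU-enabled service entry in the '2.3' format might look like the sketch below. The service name and the exact keys are assumptions for illustration only; check the actual docker-compose.yml in this repository for the real configuration.

```yaml
# Hypothetical docker-compose service for GPU use; requires
# nvidia-docker and docker-compose 1.19.0+ as noted above.
version: '2.3'
services:
  deep-oc-tf_cnn_benchmarks:
    image: agrupp/deep-oc-tf_cnn_benchmarks
    runtime: nvidia          # provided by nvidia-docker
    ports:
      - "5000:5000"
      - "6006:6006"
```

The runtime key is the '2.3'-format way of selecting the Nvidia container runtime, which is why this compose file format version is required for GPU use.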
If you want to build the container directly on your machine (for instance, because you want to modify the Dockerfile), follow these instructions:
Building the container:

- Get the DEEP-OC-tf_cnn_benchmarks repository (this repo):

  $ git clone https://github.com/adriangrupp/DEEP-OC-tf_cnn_benchmarks

- Build the container:

  $ cd DEEP-OC-tf_cnn_benchmarks
  $ docker build -t agrupp/deep-oc-tf_cnn_benchmarks .

- Run the container:

  $ docker run -ti -p 5000:5000 -p 6006:6006 agrupp/deep-oc-tf_cnn_benchmarks
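Before running the freshly built container, you can optionally confirm that the image exists locally (a sanity check, not a required step):

```
$ docker images agrupp/deep-oc-tf_cnn_benchmarks
```

An empty result here means the docker build step did not complete successfully.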
These three steps will download the repository from GitHub and build the Docker container locally on your machine. You can inspect and modify the Dockerfile to check what is going on. For instance, you can pass the --debug=True flag to the deepaas-run command in order to enable debug mode.
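For example, overriding the container's default command to enable debug mode could look like this (a sketch; the flags follow the deepaas-run invocation mentioned above):

```
$ docker run -ti -p 5000:5000 -p 6006:6006 \
      agrupp/deep-oc-tf_cnn_benchmarks \
      deepaas-run --listen-ip=0.0.0.0 --debug=True
```

Anything given after the image name replaces the default command baked into the image, so no Dockerfile change is needed for a one-off debug run.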
Once the container is up and running, browse to http://localhost:5000 to get the OpenAPI (Swagger) documentation page.