# Dockerfiles for Reference

Dockerfiles published and shared for reference.
## CUDA Dockerfiles

- Container that installs the CUDA SDK samples and runs the `p2pBandwidthLatencyTest`.
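As an illustration, such a container can be sketched as the Dockerfile below. The base image tag and sample path are assumptions based on where the CUDA toolkit normally ships its samples; they are not taken from the actual file in this repo.

```dockerfile
# Hypothetical sketch: build and run the P2P bandwidth/latency sample.
# Adjust the base image tag to match your CUDA version and driver.
FROM nvidia/cuda:9.0-devel-ubuntu16.04

# The CUDA toolkit ships its samples under /usr/local/cuda/samples.
WORKDIR /usr/local/cuda/samples/1_Utilities/p2pBandwidthLatencyTest
RUN make

CMD ["./p2pBandwidthLatencyTest"]
```

Run it with GPU access enabled (e.g. `docker run --runtime=nvidia <image>`); the test reports peer-to-peer bandwidth and latency between the GPUs visible to the container.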
## TensorFlow Dockerfiles

- `Dockerfile.tf18.03_tf1.4.0_ssh`: TensorFlow 1.4.0 container with an SSH client and server installed. This enables the container to be run in multinode setups with MPI (Horovod installed) via Docker.
- `Dockerfile.tf1.7.0py2_cuda9.0_cudnn7_nccl2.1.15_hvd_ompi3_ibverbs`: TensorFlow v1.7.0 (with CUDA 9.0 for slightly older drivers), with an SSH client and server installed for multinode MPI (Horovod) runs via Docker.
- `Dockerfile.tf1.7.0py2_cuda9.1_cudnn7_nccl2.1.15_hvd_ompi3_ibverbs`: TensorFlow v1.7.0 (with CUDA 9.1), with an SSH client and server installed for multinode MPI (Horovod) runs via Docker.
- `Dockerfile.tf1.12.0py3_cuda10.0_cudnn7_ubuntu16_nccl2.3.7_hvd_ompi3_ibverbs`: TensorFlow v1.12.0 (with CUDA 10.0), with an SSH client and server installed for multinode MPI (Horovod) runs via Docker.
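As a rough sketch of how SSH-enabled containers like these are typically launched for a multinode Horovod run (the image name, host names, port, rank counts, and training script below are placeholders, not values from this repo):

```shell
# Hypothetical two-node launch; adjust names, hosts, and ranks to your cluster.
# On each worker node, start the container with sshd listening on a custom port:
docker run -d --runtime=nvidia --network=host \
    my-tf-hvd-image /usr/sbin/sshd -D -p 12345

# On the launch node, start the MPI job across both hosts (4 GPUs each):
docker run --runtime=nvidia --network=host my-tf-hvd-image \
    mpirun -np 8 -H host1:4,host2:4 \
        -mca plm_rsh_args "-p 12345" \
        python train.py
```

Running sshd on a non-default port inside the container avoids clashing with the host's own SSH daemon; `mpirun` then spawns the remote ranks through those in-container sshd instances.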
## BVLC Caffe Dockerfile

- BVLC Caffe container built against CUDA 9 and NCCL2. Volta architecture support is enabled via CUDA 9 by adding compute capability `70` to the known GPU architectures in the CMake configuration:

  ```cmake
  set(Caffe_known_gpu_archs "30 35 50 60 61 70")
  ```
## PyTorch Dockerfiles

- PyTorch 0.4.0 container (Ubuntu 16, CUDA 9) with Horovod and Apex (https://github.com/NVIDIA/apex) installed. Originally based on the Horovod Dockerfile (https://github.com/uber/horovod/blob/master/Dockerfile).
## Singularity

- Use this container to run Singularity, to build Singularity recipes, and to pull containers with `sregistry-cli`.

  Once the container is launched, use the `sregistry` client:

  ```shell
  export SREGISTRY_NVIDIA_TOKEN=<YOURAPIKEY>

  # method 1
  SREGISTRY_CLIENT=nvidia sregistry pull tensorflow:17.12

  # method 2
  sregistry pull nvidia://tensorflow:17.12
  ```

  More references can be found here:
  https://singularityhub.github.io/sregistry-cli/client-nvidia

  Pulling NGC (NVIDIA GPU Cloud) containers can also be done with Singularity version 2.5.x and above. Example:

  ```shell
  export SINGULARITY_DOCKER_USERNAME='$oauthtoken'
  export SINGULARITY_DOCKER_PASSWORD='NVCR-TOKEN-HERE'
  singularity run docker://nvcr.io/nvidia/pytorch:17.12
  ```