Run tf_mnist.py in the local conda environment.
$ pip install tensorflow
$ az ml experiment submit -c local tf_mnist.py
Run tf_mnist.py in a local Docker container.
$ az ml experiment submit -c docker tf_mnist.py
Run tf_mnist.py in a Docker container on a remote machine. Note that you need to create and configure myvm.compute first.
$ az ml experiment submit -c myvm tf_mnist.py
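As a rough sketch of creating the remote target (the address, username, and password values are placeholders you must replace; run these before the first submit so the Docker image is built on the VM):
$ az ml computetarget attach remotedocker --name myvm --address <ip-or-dns-name> --username <user> --password <password>
$ az ml experiment prepare -c myvm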
Run tf_mnist.py in a Docker container on a remote machine with a GPU.
- Create a new compute context and name it gpu (or any arbitrary name).
- Use az ml computetarget attach to target the GPU-equipped VM.
- In the conda_dependencies.yml file, use tensorflow-gpu instead of tensorflow.
- In the gpu.compute file, use microsoft/mmlspark:gpu as the base Docker image.
- In the gpu.compute file, add the line nvidiaDocker: true.
- In the gpu.runconfig file, set Framework to Python.
- Now run the script.
$ az ml experiment submit -c gpu tf_mnist.py
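The edits described above might look like the following sketch (only the relevant lines are shown; the python version and the exact surrounding keys in your generated files are assumptions and will differ):

In conda_dependencies.yml:
  dependencies:
    - python=3.5
    - pip:
      - tensorflow-gpu

In gpu.compute:
  baseDockerImage: microsoft/mmlspark:gpu
  nvidiaDocker: true

In gpu.runconfig:
  Framework: "Python"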
For more information on using GPUs with Vienna execution, read this article.