A Stable Diffusion web UI configuration for AMD ROCm. This is developed to run on Linux only, because ROCm is only officially supported on Linux.
This Docker container deploys an AMD ROCm 5.4.2 environment based on Ubuntu 22.04 with PyTorch 2.0. It has a custom version of AUTOMATIC1111 deployed to it, so it is optimized for AMD GPUs.
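
ROCm containers generally need the host's `/dev/kfd` and `/dev/dri` devices passed through, plus membership in the `video` group (and on some distributions `render`). As a minimal sketch only, the `docker run` line below shows the kind of GPU passthrough that docker-compose.yml is expected to configure; the exact flags may differ from the shipped compose file.

```sh
# Sketch of a plain `docker run` equivalent of the expected GPU passthrough;
# adjust devices, groups, and ports to match your system and compose file.
docker run -it --rm \
  --device=/dev/kfd --device=/dev/dri \
  --group-add video \
  -p 7860:7860 \
  stable-diffusion-webui-rocm
```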
- A working Docker installation on 64-bit Linux. Kernel 5.10 or newer is required for AMD ROCm support.
- AMD ROCm kernel driver modules loaded. Version 5.4.2 is recommended to match the current container release.
- Permissions to create and deploy docker containers
- 12+ GB of system RAM recommended. Swap can be used, but it is slower.
- 8 GB of VRAM to produce 512x512 images. Command-line options may help with optimization.
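
The commands below are a quick way to check these prerequisites on the host. They assume the standard ROCm tools (`rocminfo`, `rocm-smi`) are installed; the version file path can vary between ROCm releases.

```sh
uname -r                        # kernel version (want 5.10 or newer)
docker --version                # Docker is installed and on PATH
cat /opt/rocm/.info/version     # installed ROCm release (path may vary)
rocminfo | grep -i gfx          # GPU agent is visible to ROCm
rocm-smi --showmeminfo vram     # available VRAM
free -h                         # system RAM and swap
```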
There is currently no Docker Hub entry for this image, so you have to build the Docker image yourself.
- Change directory to the git clone of stable-diffusion-webui-rocm
- Run the command:
    docker build . -t 'stable-diffusion-webui-rocm'
- Wait for the build to complete.
- Verify that docker-compose.yml meets your environment's needs
- Run the command:
    docker-compose up
- Wait for the deployment to complete
- Browse to http://localhost:7860/
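
If the UI does not come up, the following checks can help confirm the deployment; they assume you are running them from the directory containing docker-compose.yml.

```sh
docker-compose ps               # the container should be listed as Up
docker-compose logs -f          # watch the web UI start-up output
curl -I http://localhost:7860/  # expect an HTTP response once the UI is ready
```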
ENV Parameter | Default Value | Possible Values | Notes |
---|---|---|---|
COMMANDLINE_ARGS | (empty) | (string) | |
PORT | 7860 | (valid TCP port number) | |
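
As an example of using these parameters, and assuming docker-compose.yml forwards the variables from the shell environment (e.g. via `${COMMANDLINE_ARGS}` and `${PORT}`), low-VRAM options could be passed like this. `--medvram` is a standard AUTOMATIC1111 flag; whether it is needed depends on your GPU.

```sh
# Hypothetical invocation, assuming the compose file reads these
# variables from the environment:
COMMANDLINE_ARGS="--medvram" PORT=7860 docker-compose up
```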
Default Volume Name | Container Path | Notes |
---|---|---|
configs | /sd/configs | This does not include web-user.sh or web-ui.json. See Issue #3. |
models | /sd/models | This directory can use a lot of space. Plan accordingly. |
outputs | /sd/outputs | Location where saved generations are stored |
extensions | /sd/extensions | Installed custom extensions |
plugins | /sd/plugins | Installed plugins |
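
Since the models volume can grow large, the sketch below shows one way to inspect the named volumes and copy a checkpoint into the models volume. The volume names may be prefixed with the compose project name on your system, and the `Stable-diffusion/` subdirectory and file name are only illustrative, following AUTOMATIC1111's usual layout.

```sh
docker volume ls                # list volumes (names may be prefixed)
docker volume inspect models    # show where the data lives on the host
# Copy a checkpoint into the models volume via a throwaway container:
docker run --rm -v models:/data -v "$PWD":/src alpine \
  cp /src/example-model.safetensors /data/Stable-diffusion/
```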