This is a repost of Andrew's article "How to run Stable Diffusion on AWS".
Running Stable Diffusion in the cloud (AWS) has many advantages. You rent the hardware on-demand and only pay for the time you use. You don’t need to worry about maintaining the hardware.
The author set up a personal cloud server to run AUTOMATIC1111, ComfyUI, and SD Forge. To save storage space, the three Stable Diffusion programs share the same model files.
When do you want to use the Cloud?
The benefits of using a personal cloud server to run Stable Diffusion are:
- You don’t need to buy and maintain hardware. The cloud provider is responsible for the capital cost and maintenance.
- You can easily rent a more powerful GPU if you need it.
- You can access the machine anywhere, even when you are traveling.
Setting up a cloud server requires some technical expertise and can be time-consuming. You can use Google Colab notebooks as an alternative.
Cloud server setup
This article will guide you in setting up a Stable Diffusion cloud server for personal use. You will use Amazon Web Service (AWS) to set up the cloud system.
AWS is Amazon’s cloud computing service. You can rent computer resources such as CPU, GPU, RAM, storage, and public IP addresses on demand. You only pay for the hours you use.
We will use:
- EC2: Compute instance to host the Stable Diffusion server. You can select the CPU, GPU, and RAM specs. The instance will have options to run A1111, ComfyUI, or SD Forge.
- Elastic IP: Fixes the IP address of the EC2 instance. Without it, the IP address will change every time you stop and start the instance.
- S3 bucket (optional): For storing the AI models more economically.
Notes:
You should stop the instance after each session. Otherwise, it will keep charging at the rate of a running instance. The storage is persistent, meaning that all the files and settings stay the same between sessions. It is no different from your local PC. See the summary of commands at the end.
Prerequisite: to follow this tutorial, you should have basic knowledge of using Linux from a terminal.
PS: of course, you can instead choose a Windows 11 Amazon Machine Image and simply upload your portable builds of A1111, ComfyUI, and SD Forge to the running EC2 instance.
Create a new EC2 instance
Log in to AWS. In the top search bar, type “EC2”. Click the EC2 service.
Click Launch instance.
Now, you should be on the Launch Instance page. Use the following settings.
EC2 instance settings
Enter the following settings for the EC2 instance.
Name: stable diffusion
Amazon Machine Image: Ubuntu Server 24.04 LTS
Instance type: g4dn.xlarge
The g4dn.xlarge instance has 4 vCPU, 16 GB RAM, and one T4 GPU with 16GB of VRAM.
If you want more RAM, the next level up is g4dn.2xlarge with 8 vCPU and 32 GB RAM.
G4dn is not the only option. You can pick a different instance type with a different price and speed tradeoff. You can see the pricing on the pages below:
- G4dn – T4 GPU
- G6 – L4 GPU (Up to 2x faster)
- G5 – A10 GPU (Up to 3x faster)
Security key
Next, you will need to create a security key pair. In Key pair (login), click Create new key pair. Give it a name (e.g. aws_sd) and click Create key pair. The .pem key file should download automatically.
Configure Storage
In Configure Storage, change the storage to 100 GiB.
Click Launch instance.
Set up EC2 instance
Open the Amazon EC2 console.
Select Instances in the sidebar. You should see your instance initializing. The machine is ready to use when the Status check changes to checks passed.
Select the EC2 instance and click Connect.
In the EC2 Instance Connect, ensure the Username is ubuntu and click Connect.
You should now have access to the machine's terminal. Alternatively, you can SSH into the machine from your local PC using the information in the SSH client tab.
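For example, assuming the key pair from earlier is saved as aws_sd.pem in your current directory, the connection from a local terminal would look roughly like this:
chmod 400 aws_sd.pem
ssh -i aws_sd.pem ubuntu@12.3.456.789   # replace with your instance's public IPv4 address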
Update software
The packages on a fresh machine image are usually outdated. Update the machine by running the following commands.
sudo apt update
sudo apt upgrade
Install NVidia driver
You need to install the NVidia driver before using the GPU. Run the following command.
sudo apt install nvidia-driver-535
You can use any newer driver version (the last three digits) that is available.
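If you are not sure which driver versions are available, you can list the packages first (the exact versions offered depend on the Ubuntu release):
apt-cache search --names-only '^nvidia-driver-[0-9]+$'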
Install Python 3.10
You will need Python 3.10 to run the Stable Diffusion software.
First, add the deadsnakes repository:
sudo add-apt-repository ppa:deadsnakes/ppa
Install Python 3.10:
sudo apt install python3.10
Set Python 3.10 as the default when you type python3.
sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.10 2
Verify that python3 now calls Python 3.10.
python3 --version
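Optionally, you can also register the system Python (3.12 on Ubuntu 24.04) as a lower-priority alternative, so you can switch back later with sudo update-alternatives --config python3. This is not part of the original setup, just a convenience:
sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.12 1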
Install Python 3.10 venv. You will need it later when setting up AUTOMATIC1111.
sudo apt install python3.10-venv
Restart the instance
You need to restart the EC2 instance to initialize the GPU driver.
Restart the instance by using the AWS interface. Instance state > Reboot instance.
When the reboot is complete and the instance is ready, open a new terminal session.
Confirm that the NVidia driver is working.
nvidia-smi
You should see a table listing the Tesla T4 GPU and the installed driver version.
Now, the GPU is ready to be used.
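If you prefer a shorter check, nvidia-smi can print just the GPU model, memory, and driver version; on a g4dn.xlarge it should report a Tesla T4 with about 16 GB:
nvidia-smi --query-gpu=name,memory.total,driver_version --format=csv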
Elastic IP
The public IP address of the EC2 instance will change whenever the instance is restarted.
An elastic IP is a fixed public IP address you rent from AWS. It is not strictly necessary, but for a small fee, you get a fixed IP address for your machine.
Open the Amazon EC2 console.
Select Elastic IPs under Network & Security.
Click Allocate Elastic IP > Allocate.
Select the Elastic IP > Action > Associate Elastic IP.
Choose the EC2 instance you just created under Instance.
Click Associate.
Now, your EC2 instance has a persistent IP address.
You can confirm the IP address by clicking EC2 > Instances. Select your EC2.
The public IP is listed in the Public IPv4 address.
AUTOMATIC1111
Clone repository
Go to the home directory.
cd ~
Run the following command to clone AUTOMATIC1111 to your EC2.
git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui
Set command line arguments
In the stable-diffusion-webui directory, edit the file webui-user.sh. The following example uses nano to edit the file. Feel free to use other editors.
cd stable-diffusion-webui
nano webui-user.sh
Uncomment the line with COMMANDLINE_ARGS and change to the following.
export COMMANDLINE_ARGS="--listen --opt-sdp-attention --enable-insecure-extension-access"
This enables:
- Connecting to A1111 from another computer, such as your local PC.
- Using the faster scaled-dot-product (SDP) attention optimization when running models.
- Installing extensions while connected remotely.
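Since port 7860 will later be reachable from the internet (even if restricted to your IP), you may want an extra layer of protection. A1111 supports a --gradio-auth flag; a possible variant of the line above, with placeholder credentials you should change, looks like this:
export COMMANDLINE_ARGS="--listen --opt-sdp-attention --enable-insecure-extension-access --gradio-auth myuser:mypassword"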
Install and run webui
Start webui:
./webui.sh
It will take some time to install. When it is done, you should see a local URL address http://0.0.0.0:7860.
You will need to open port 7860 on the EC2 instance to connect to it.
PS: you can also open ports 8188 (ComfyUI) and 7862 (Fooocus) at the same time to avoid coming back here twice.
Open port
Go to the instance > Security. Click the link to the security group.
Inbound rules > Edit inbound rules.
Click Add rule. Add a Custom TCP rule for port 7860. Set the source to My IP.
This restricts access to your Stable Diffusion port 7860 to your IP address only.
PS: you can also add rules for ports 8188 (ComfyUI) and 7862 (Fooocus) now to avoid coming back here twice.
Test connection
Now, you should be able to access your AUTOMATIC1111 webui by the elastic IP address and the port 7860.
Go to EC2 > Instances. Click on your instance and note the Public IPv4 address. For example, if the public IPv4 address is 12.3.456.789, you can add ":7860" to the address to access AUTOMATIC1111 from your browser.
http://12.3.456.789:7860
Test using it to make sure it works.
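You can also run a quick check from your local terminal before opening the browser, using the elastic IP from the example above; an HTTP 200 response means the webui is reachable:
curl -I http://12.3.456.789:7860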
ComfyUI
Install ComfyUI
To install ComfyUI, first go to the home folder.
cd ~
Clone ComfyUI.
git clone https://github.com/comfyanonymous/ComfyUI
Share models with AUTOMATIC1111
Use AUTOMATIC1111’s models.
cd ComfyUI
cp extra_model_paths.yaml.example extra_model_paths.yaml
Edit extra_model_paths.yaml with nano.
nano extra_model_paths.yaml
Change the line.
base_path: path/to/stable-diffusion-webui/
To:
base_path: /home/ubuntu/stable-diffusion-webui
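After the edit, the relevant section of extra_model_paths.yaml (named a111 in the example file) should look roughly like this; the other folder mappings that ship with the example are left untouched:
a111:
    base_path: /home/ubuntu/stable-diffusion-webui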
Run ComfyUI
You can use AUTOMATIC1111's Python virtual environment so that you don't need to install all the libraries again. This saves some space.
~/stable-diffusion-webui/venv/bin/python main.py --listen
You should see that it uses port 8188.
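Once port 8188 is open (next step), you can reach ComfyUI in your browser the same way as A1111, just on the other port. Using the example address from above:
http://12.3.456.789:8188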
Open port
Open the Amazon EC2 console > Instances. Select your Stable Diffusion instance > Security > Click the link of the security group.
Edit the inbound rules. Add a custom TCP rule for port 8188 and make it accessible only from your IP address.
See the instructions above.
SD Forge
Install Forge
First, go to the home folder.
cd ~
Clone the repository.
git clone https://github.com/lllyasviel/stable-diffusion-webui-forge
Edit webui-user.sh. Uncomment the following line and add the path pointing to A1111. This lets you use A1111's model files.
export COMMANDLINE_ARGS="--listen --forge-ref-a1111-home /home/ubuntu/stable-diffusion-webui"
Complete the installation and start SD Forge.
./webui.sh
You will likely find that the InsightFace library is missing.
Quit Forge and install InsightFace.
./venv/bin/python -m pip install insightface
Start Forge
./webui.sh
PS: remember that Forge uses the same port 7860 as AUTOMATIC1111.
Storing SD models in an S3 bucket is optional; see the original post from Andrew (linked above) for instructions.
Summary of usage commands:
You should normally STOP (not TERMINATE) the instance after each usage session. Save your money!
Start the instance
Open the Amazon EC2 console.
Select your Stable Diffusion instance > Instance state > Start instance.
Then connect to the terminal: Actions > Connect.
Mount S3
If you use an S3 bucket to store models, run the following command (with modifications) to mount the S3 bucket:
sudo s3fs BUCKET_NAME /s3 -o iam_role=s3fullaccess -o use_cache=/tmp -o allow_other -o uid=1001 -o mp_umask=002 -o multireq_max=5 -o use_path_request_style -o url=https://s3.us-east-1.amazonaws.com -o endpoint=us-east-1
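This assumes s3fs is installed, the /s3 mount point exists, and the instance has an IAM role (s3fullaccess in this example) that can access the bucket, as described in Andrew's original post. On a fresh instance, something like the following is needed once before the first mount:
sudo apt install s3fs
sudo mkdir /s3
You can then check the mount with ls /s3.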
Start AUTOMATIC1111
cd ~/stable-diffusion-webui; ./webui.sh
Start ComfyUI
~/stable-diffusion-webui/venv/bin/python main.py --listen
Start SD Forge
cd ~/stable-diffusion-webui-forge; ./webui.sh
Stop the instance
When you are done, stop the instance to avoid extra charges.
Open the Amazon EC2 console. Select your Stable Diffusion instance > Instance state > Stop instance.
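If you have the AWS CLI configured on your local machine, you can also stop and start the instance from a terminal instead of the console (the instance ID below is a placeholder; replace it with your own):
aws ec2 stop-instances --instance-ids i-0123456789abcdef0
aws ec2 start-instances --instance-ids i-0123456789abcdef0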
Good luck.