Run your first Fast Stable Diffusion with Jupyter Notebook
Overview
By the end of this tutorial, you’ll have deployed a Jupyter Notebook to Runpod, deployed an instance of Stable Diffusion, and generated your first image.
Time to complete: ~20 minutes
Prerequisites
Hugging Face user access token
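If you want to confirm that your token works before deploying anything, a quick check with the huggingface_hub library (a convenience, not a required step in this tutorial) might look like this:

# Optional sanity check: verify your Hugging Face user access token.
# Requires: pip install huggingface_hub
from huggingface_hub import login, whoami

login(token="hf_...")    # paste your user access token here
print(whoami()["name"])  # prints your Hugging Face username if the token is valid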
Runpod infrastructure
1. Select the Runpod Fast Stable Diffusion template.
2. Choose 1x RTX A5000 or 1x RTX 3090.
3. Select Start Jupyter Notebook, then select Deploy.
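If you would rather script this step than click through the console, the Runpod Python SDK's create_pod helper can deploy a comparable pod. This is a minimal sketch under assumptions: the image name is a placeholder for the Fast Stable Diffusion template's actual image, and the GPU type IDs should be checked against what your account shows.

# Sketch only: deploy a pod programmatically with the Runpod Python SDK.
# Requires: pip install runpod
import runpod

runpod.api_key = "YOUR_RUNPOD_API_KEY"

pod = runpod.create_pod(
    name="fast-stable-diffusion",
    image_name="runpod/fast-stable-diffusion",  # placeholder: use the template's actual image name
    gpu_type_id="NVIDIA RTX A5000",             # or "NVIDIA GeForce RTX 3090"
)
print(pod)  # inspect the response for the new pod's ID and connection details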
Run the notebook
1. Open the RNPD-A1111.ipynb notebook.
2. Enter your Hugging Face user access token.
3. Select the model you want to run: v1.5, v2-512, or v2-768.
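The notebook's cells handle the download for you, but for context, loading one of these checkpoints by hand with the diffusers library looks roughly like the sketch below. The model ID is one public example of a v2-768 checkpoint, not necessarily the exact weights the RNPD-A1111 notebook pulls, and the token is only needed for gated models.

# Sketch only: load a Stable Diffusion checkpoint with diffusers and generate an image.
# Requires: pip install diffusers transformers accelerate torch
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",  # example v2-768 checkpoint; swap in your chosen model
    torch_dtype=torch.float16,
    token="hf_...",                      # your Hugging Face user access token (needed for gated models)
).to("cuda")

image = pipe("an astronaut riding a horse on the moon").images[0]
image.save("first_image.png")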
Launch Automatic1111 on your pod
1. Run the cell labeled Start Stable-Diffusion to launch the Automatic1111 web UI on your pod.
2. (Optional) Provide login credentials for this instance.
3. Select the blue link ending in .proxy.runpod.net to open the web UI.
Explore Stable-Diffusion
Now that your pod is up and running Stable-Diffusion, explore the web UI and generate your first image.
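The web UI is the intended way to explore, but if the underlying Automatic1111 instance was started with its API enabled (the --api flag, which this notebook may or may not set), you can also generate an image programmatically through the same proxy URL. The URL and payload below are illustrative:

# Sketch only: call Automatic1111's txt2img API through the Runpod proxy URL.
# Works only if the web UI was launched with the --api flag.
import base64
import requests

BASE_URL = "https://<your-pod-proxy-subdomain>.proxy.runpod.net"  # the blue link from the previous section

payload = {
    "prompt": "a watercolor painting of a lighthouse at dawn",
    "steps": 25,
    "width": 512,
    "height": 512,
}

resp = requests.post(f"{BASE_URL}/sdapi/v1/txt2img", json=payload, timeout=300)
resp.raise_for_status()

# The API returns generated images as base64-encoded strings.
with open("output.png", "wb") as f:
    f.write(base64.b64decode(resp.json()["images"][0]))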