@AdamOswald
Last active February 2, 2023 16:13
finetuned-diffusion-gradio.ipynb
{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"provenance": [],
"name": "finetuned-diffusion-gradio.ipynb",
"include_colab_link": true
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"language_info": {
"name": "python"
},
"gpuClass": "standard",
"accelerator": "GPU"
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/gist/AdamOswald/6030606e4eb31a8aab1ab299bceeb3de/copy-of-fine-tuned-diffusion-gradio.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"source": [
"**Finetuned Diffusion demo 🪄🖼️** \n",
"\n",
"With this app you can run multiple fine-tuned Stable Diffusion models, trained on different styles: [Arcane](https://huggingface.co/nitrosocke/Arcane-Diffusion), [Archer](https://huggingface.co/nitrosocke/archer-diffusion), [Elden Ring](https://huggingface.co/nitrosocke/elden-ring-diffusion), [Spider-Verse](https://huggingface.co/nitrosocke/spider-verse-diffusion), [Modern Disney](https://huggingface.co/nitrosocke/modern-disney-diffusion), [Classic Disney](https://huggingface.co/nitrosocke/classic-anim-diffusion), [Waifu](https://huggingface.co/hakurei/waifu-diffusion), [Pokémon](https://huggingface.co/lambdalabs/sd-pokemon-diffusers), [Pony Diffusion](https://huggingface.co/AstraliteHeart/pony-diffusion), [Robo Diffusion](https://huggingface.co/nousr/robo-diffusion), [Cyberpunk Anime](https://huggingface.co/DGSpitzer/Cyberpunk-Anime-Diffusion), [Tron Legacy](https://huggingface.co/dallinmackay/Tron-Legacy-diffusion) + any other custom Diffusers 🧨 SD model hosted on HuggingFace 🤗."
],
"metadata": {
"id": "oBB6SkL2REFn"
}
},
{
"cell_type": "markdown",
"source": [
"### 1. Install dependencies"
],
"metadata": {
"id": "2o-2jPqITvtD"
}
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "8WYSV-73rslm",
"colab": {
"base_uri": "https://localhost:8080/"
},
"outputId": "88a495c0-9f11-4324-ad77-504c1d067ad3"
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Cloning into 'finetuned_diffusion'...\n",
"remote: Enumerating objects: 180, done.\u001b[K\n",
"remote: Counting objects: 100% (172/172), done.\u001b[K\n",
"remote: Compressing objects: 100% (168/168), done.\u001b[K\n",
"remote: Total 180 (delta 97), reused 0 (delta 0), pack-reused 8\u001b[K\n",
"Receiving objects: 100% (180/180), 116.82 KiB | 2.12 MiB/s, done.\n",
"Resolving deltas: 100% (97/97), done.\n",
"/content/finetuned_diffusion\n",
"Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/, https://download.pytorch.org/whl/cu113\n",
"Collecting git+https://github.com/huggingface/diffusers.git (from -r requirements.txt (line 5))\n",
" Cloning https://github.com/huggingface/diffusers.git to /tmp/pip-req-build-ajy918qn\n",
" Running command git clone --filter=blob:none --quiet https://github.com/huggingface/diffusers.git /tmp/pip-req-build-ajy918qn\n",
" Resolved https://github.com/huggingface/diffusers.git to commit 68ef0666e22a66a5ae6cb57104a8213e85b4be38\n",
" Installing build dependencies ... \u001b[?25l\u001b[?25hdone\n",
" Getting requirements to build wheel ... \u001b[?25l\u001b[?25hdone\n",
" Preparing metadata (pyproject.toml) ... \u001b[?25l\u001b[?25hdone\n",
"Collecting git+https://github.com/huggingface/transformers (from -r requirements.txt (line 7))\n",
" Cloning https://github.com/huggingface/transformers to /tmp/pip-req-build-u8pb7a7s\n",
" Running command git clone --filter=blob:none --quiet https://github.com/huggingface/transformers /tmp/pip-req-build-u8pb7a7s\n",
" Resolved https://github.com/huggingface/transformers to commit e006ab51acdccab2476fdf80ab9afda66a0f510f\n",
" Installing build dependencies ... \u001b[?25l\u001b[?25hdone\n",
" Getting requirements to build wheel ... \u001b[?25l\u001b[?25hdone\n",
" Preparing metadata (pyproject.toml) ... \u001b[?25l\u001b[?25hdone\n",
"Collecting xformers==0.0.15.dev0+4c06c79.d20221205\n",
" Downloading https://github.com/camenduru/stable-diffusion-webui-colab/releases/download/0.0.15/xformers-0.0.15.dev0+4c06c79.d20221205-cp38-cp38-linux_x86_64.whl (110.0 MB)\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m110.0/110.0 MB\u001b[0m \u001b[31m7.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[?25hRequirement already satisfied: torch in /usr/local/lib/python3.8/dist-packages (from -r requirements.txt (line 2)) (1.13.1+cu116)\n",
"Collecting torchvision==0.13.1+cu113\n",
" Downloading https://download.pytorch.org/whl/cu113/torchvision-0.13.1%2Bcu113-cp38-cp38-linux_x86_64.whl (23.4 MB)\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m23.4/23.4 MB\u001b[0m \u001b[31m56.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[?25hRequirement already satisfied: scipy in /usr/local/lib/python3.8/dist-packages (from -r requirements.txt (line 8)) (1.7.3)\n",
"Collecting ftfy\n",
" Downloading ftfy-6.1.1-py3-none-any.whl (53 kB)\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m53.1/53.1 KB\u001b[0m \u001b[31m3.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[?25hRequirement already satisfied: psutil in /usr/local/lib/python3.8/dist-packages (from -r requirements.txt (line 10)) (5.4.8)\n",
"Collecting accelerate==0.12.0\n",
" Downloading accelerate-0.12.0-py3-none-any.whl (143 kB)\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m144.0/144.0 KB\u001b[0m \u001b[31m10.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[?25hCollecting triton==2.0.0.dev20220701\n",
" Downloading triton-2.0.0.dev20220701-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.3 MB)\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m18.3/18.3 MB\u001b[0m \u001b[31m92.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[?25hRequirement already satisfied: requests in /usr/local/lib/python3.8/dist-packages (from torchvision==0.13.1+cu113->-r requirements.txt (line 3)) (2.25.1)\n",
"Requirement already satisfied: pillow!=8.3.*,>=5.3.0 in /usr/local/lib/python3.8/dist-packages (from torchvision==0.13.1+cu113->-r requirements.txt (line 3)) (7.1.2)\n",
"Requirement already satisfied: numpy in /usr/local/lib/python3.8/dist-packages (from torchvision==0.13.1+cu113->-r requirements.txt (line 3)) (1.21.6)\n",
"Collecting torch\n",
" Downloading https://download.pytorch.org/whl/cu113/torch-1.12.1%2Bcu113-cp38-cp38-linux_x86_64.whl (1837.7 MB)\n",
"\u001b[2K \u001b[91m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[91m╸\u001b[0m \u001b[32m1.8/1.8 GB\u001b[0m \u001b[31m93.1 MB/s\u001b[0m eta \u001b[36m0:00:01\u001b[0mtcmalloc: large alloc 1837744128 bytes == 0x12116000 @ 0x7f2d2dc05680 0x7f2d2dc26824 0x5b3128 0x5bbc90 0x5f714c 0x64d800 0x527022 0x504866 0x56bbe1 0x569d8a 0x5f60c3 0x56bbe1 0x569d8a 0x5f60c3 0x56bbe1 0x569d8a 0x5f60c3 0x56bbe1 0x569d8a 0x5f60c3 0x56bbe1 0x569d8a 0x5f60c3 0x56bbe1 0x5f5ee6 0x56bbe1 0x569d8a 0x5f60c3 0x56cc92 0x569d8a 0x5f60c3\n",
"\u001b[2K \u001b[91m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[91m╸\u001b[0m \u001b[32m1.8/1.8 GB\u001b[0m \u001b[31m99.0 MB/s\u001b[0m eta \u001b[36m0:00:01\u001b[0mtcmalloc: large alloc 2297184256 bytes == 0x7f9b2000 @ 0x7f2d2dc05680 0x7f2d2dc25da2 0x5f714c 0x64d800 0x527022 0x504866 0x56bbe1 0x569d8a 0x5f60c3 0x56bbe1 0x569d8a 0x5f60c3 0x56bbe1 0x569d8a 0x5f60c3 0x56bbe1 0x569d8a 0x5f60c3 0x56bbe1 0x569d8a 0x5f60c3 0x56bbe1 0x5f5ee6 0x56bbe1 0x569d8a 0x5f60c3 0x56cc92 0x569d8a 0x5f60c3 0x56bbe1 0x569d8a\n",
"tcmalloc: large alloc 1837744128 bytes == 0x12116000 @ 0x7f2d2dc05680 0x7f2d2dc26824 0x5f97c1 0x5f8ecc 0x504866 0x56bbe1 0x569d8a 0x5f60c3 0x56bbe1 0x569d8a 0x5f60c3 0x50b32c 0x5f6b7b 0x66731d 0x5f6706 0x571143 0x50b22e 0x570b82 0x569d8a 0x50b3a0 0x570b82 0x569d8a 0x50b3a0 0x56cc92 0x501044 0x56be83 0x501044 0x56be83 0x501044 0x56be83 0x5f5ee6\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m1.8/1.8 GB\u001b[0m \u001b[31m952.7 kB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[?25hRequirement already satisfied: typing-extensions in /usr/local/lib/python3.8/dist-packages (from torchvision==0.13.1+cu113->-r requirements.txt (line 3)) (4.4.0)\n",
"Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.8/dist-packages (from accelerate==0.12.0->-r requirements.txt (line 11)) (21.3)\n",
"Requirement already satisfied: pyyaml in /usr/local/lib/python3.8/dist-packages (from accelerate==0.12.0->-r requirements.txt (line 11)) (6.0)\n",
"Requirement already satisfied: filelock in /usr/local/lib/python3.8/dist-packages (from triton==2.0.0.dev20220701->-r requirements.txt (line 14)) (3.9.0)\n",
"Requirement already satisfied: cmake in /usr/local/lib/python3.8/dist-packages (from triton==2.0.0.dev20220701->-r requirements.txt (line 14)) (3.22.6)\n",
"Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.8/dist-packages (from diffusers==0.13.0.dev0->-r requirements.txt (line 5)) (2022.6.2)\n",
"Requirement already satisfied: importlib-metadata in /usr/local/lib/python3.8/dist-packages (from diffusers==0.13.0.dev0->-r requirements.txt (line 5)) (6.0.0)\n",
"Collecting huggingface-hub>=0.10.0\n",
" Downloading huggingface_hub-0.12.0-py3-none-any.whl (190 kB)\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m190.3/190.3 KB\u001b[0m \u001b[31m23.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[?25hRequirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.8/dist-packages (from transformers==4.27.0.dev0->-r requirements.txt (line 7)) (4.64.1)\n",
"Collecting tokenizers!=0.11.3,<0.14,>=0.11.1\n",
" Downloading tokenizers-0.13.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (7.6 MB)\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m7.6/7.6 MB\u001b[0m \u001b[31m104.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[?25hRequirement already satisfied: wcwidth>=0.2.5 in /usr/local/lib/python3.8/dist-packages (from ftfy->-r requirements.txt (line 9)) (0.2.5)\n",
"Collecting pyre-extensions==0.0.23\n",
" Downloading pyre_extensions-0.0.23-py3-none-any.whl (11 kB)\n",
"Collecting typing-inspect\n",
" Downloading typing_inspect-0.8.0-py3-none-any.whl (8.7 kB)\n",
"Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging>=20.0->accelerate==0.12.0->-r requirements.txt (line 11)) (3.0.9)\n",
"Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.8/dist-packages (from importlib-metadata->diffusers==0.13.0.dev0->-r requirements.txt (line 5)) (3.11.0)\n",
"Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.8/dist-packages (from requests->torchvision==0.13.1+cu113->-r requirements.txt (line 3)) (2022.12.7)\n",
"Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.8/dist-packages (from requests->torchvision==0.13.1+cu113->-r requirements.txt (line 3)) (2.10)\n",
"Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.8/dist-packages (from requests->torchvision==0.13.1+cu113->-r requirements.txt (line 3)) (1.24.3)\n",
"Requirement already satisfied: chardet<5,>=3.0.2 in /usr/local/lib/python3.8/dist-packages (from requests->torchvision==0.13.1+cu113->-r requirements.txt (line 3)) (4.0.0)\n",
"Collecting mypy-extensions>=0.3.0\n",
" Downloading mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)\n",
"Building wheels for collected packages: diffusers, transformers\n",
" Building wheel for diffusers (pyproject.toml) ... \u001b[?25l\u001b[?25hdone\n",
" Created wheel for diffusers: filename=diffusers-0.13.0.dev0-py3-none-any.whl size=614289 sha256=0178635fa370390cc252c817de8c65efa59ca981d1163d1c3ce1d2e5efa0ba76\n",
" Stored in directory: /tmp/pip-ephem-wheel-cache-jjd1bx2_/wheels/28/16/cf/d8d37579fd1e7edb978252d850ec9328b055a7582ddfae3b87\n",
" Building wheel for transformers (pyproject.toml) ... \u001b[?25l\u001b[?25hdone\n",
" Created wheel for transformers: filename=transformers-4.27.0.dev0-py3-none-any.whl size=6387991 sha256=2fac3ecc2298af3b12fe450f41051ac83efbaf5f3ee57c413a349ed2f05cb327\n",
" Stored in directory: /tmp/pip-ephem-wheel-cache-jjd1bx2_/wheels/42/68/45/c63edff61c292f2dfd4df4ef6522dcbecc603e7af82813c1d7\n",
"Successfully built diffusers transformers\n"
]
}
],
"source": [
"!git clone https://huggingface.co/spaces/AdamOswald1/finetuned_diffusion\n",
"%cd finetuned_diffusion/\n",
"!pip install -r requirements.txt\n",
"!pip install gradio==3.17.1\n",
"\n",
"!pip uninstall -y xformers\n",
"!pip install -q https://github.com/camenduru/stable-diffusion-webui-colab/releases/download/0.0.15/xformers-0.0.15.dev0+1515f77.d20221130-cp38-cp38-linux_x86_64.whl"
]
},
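{
"cell_type": "markdown",
"source": [
"Before launching, it can help to verify that the GPU and the replacement xformers wheel are usable. The check below is a small sketch added for convenience and is not part of the original Space."
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Sanity check (not part of the original Space): confirm CUDA is visible\n",
"# and that the swapped-in xformers wheel imports cleanly.\n",
"import torch\n",
"\n",
"print(\"CUDA available:\", torch.cuda.is_available())\n",
"try:\n",
"    import xformers\n",
"    print(\"xformers version:\", xformers.__version__)\n",
"except ImportError as e:\n",
"    print(\"xformers not importable:\", e)"
],
"metadata": {},
"execution_count": null,
"outputs": []
},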
{
"cell_type": "markdown",
"source": [
"### 2. Run\n",
"\n",
"Expand this cell and run it. After it finishes loading, **open the generated gradio public link** (e.g. https://xxxx.gradio.app) in the output.\n",
"\n",
"When running in a colab, models need to be downloaded the first time you use them, so give it some time."
],
"metadata": {
"id": "g51-cBLlTsO4"
}
},
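{
"cell_type": "markdown",
"source": [
"Optional: instead of the full Gradio app below, you can load a single fine-tuned model directly with Diffusers 🧨. The cell that follows is a minimal sketch added for illustration (the model id and prompt are examples, not part of the original app); any of the Hub models listed above should work the same way."
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Optional sketch (not part of the original app): run one fine-tuned model\n",
"# directly with diffusers instead of the Gradio UI.\n",
"import torch\n",
"from diffusers import StableDiffusionPipeline\n",
"\n",
"# Model id and prompt are illustrative; \"arcane style\" is that model's trigger token.\n",
"pipe = StableDiffusionPipeline.from_pretrained(\n",
"    \"nitrosocke/Arcane-Diffusion\", torch_dtype=torch.float16\n",
")\n",
"pipe = pipe.to(\"cuda\")\n",
"image = pipe(\"arcane style, portrait of a mage\").images[0]\n",
"image.save(\"arcane_mage.png\")"
],
"metadata": {},
"execution_count": null,
"outputs": []
},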
{
"cell_type": "code",
"source": [
"!python app.py"
],
"metadata": {
"id": "dMhWUqKdNzn3"
},
"execution_count": null,
"outputs": []
}
]
}