
Commit

February commit: Gradio version bump
readme edit. run colab exclusion. httpx
etherealxx committed Feb 9, 2024
1 parent c53252a commit 9aeaac0
Showing 4 changed files with 90 additions and 80 deletions.
26 changes: 17 additions & 9 deletions README.md
@@ -1,12 +1,13 @@
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/etherealxx/volatile-concentration-localux-colab/blob/main/volatile_concentration_localux_colab.ipynb) <- Click here to access the colab
# Project VCL-Colab
Another camenduru colab ~~clone~~ alternative.😋
May only work for Colab Pro users.

Features:
- All camenduru colab flavor in one single colab
- Bypass the damned google colab warning (when it detects `stable-diffusion-webui` and `sd-webui` string)
- Option to choose model from a Gradio UI directly on Colab cell output
- Automatic update, synced in real time with Camenduru's repo
- ~~Bypass the damned google colab warning (when it detects `stable-diffusion-webui` and `sd-webui` string)~~ MAY NOT WORK ANYMORE.

The automatic update works by scraping camenduru's repo, so the model list refreshes automatically whenever new models land in his repo.<br/>
As long as he doesn't change much of the repo's code, this colab should keep working without needing maintenance.
@@ -22,12 +23,19 @@ Huge thanks to [camenduru](https://github.com/camenduru), without him this colab
Read [here](https://github.com/etherealxx/volatile-concentration-localux-colab/blob/main/error403guide.md) for a guide to fix it.

### 🆙 Latest Update:
- 10/02/2024 (February): Gradio version bump to v3.41.2. Updated `choosemodel4.py` to exclude camenduru's 'run' colabs. Added an `httpx` pip install. Merged the September branch (lol i forgot).
- 10/09/2023 (September): Added `sd-webui-reactor` (roop alternative) as optional choosable extension. `additionalextensions.txt` now support running bash code if an extension is selected (mostly for dependencies).

- <details>
<summary>Older Updates</summary>

- 12/08/2023 (August): Gradio version bump to v3.37.0 (fixing the bug where the extension selection doesn't appear and pressing the orange button raises a JSON input error). ~~gradio_client version bump to v0.2.10 to match the Gradio version.~~
- 27/07/2023 (July): Memory fix. The sed lines are now synced with camenduru's repo.
- 22/07/2023 (July): Added a bit of documentation to the colab notebook. Removed unused old scripts. Fixed a bug where unticking `choose_model` while ticking `controlnet_models` on the notebook made SD fail to launch. Changing branch after running the main cell at least once now preserves previously downloaded models and generated outputs.
- 20/07/2023 (July): The extension installer now lets you choose an extension's branch and commit in `additionalextensions.txt`. Removed the `libtcmalloc` lines. Adjusted how this colab gathers the code to follow recent changes.
- 10/07/2023 (July): Added `sd-webui-cutoff`, `sd-webui-infinite-image-browsing`, `ultimate-upscale-for-automatic1111`, and `adetailer` as optional choosable extensions. Optional extensions are now stored in `additionalextensions.txt` and listed at the bottom of the extension checkboxes in the Gradio UI.
- 07/07/2023 (July): Fixed some typos in the repo extract code (fixed lite branch). Added `torchmetrics==0.11.4` as an additional dependency for the lite branch.
- 02/07/2023 (July): Bypass the new colab warning that detects the `sd-webui` string.
- 16/06/2023 (June): Added `a1111-sd-webui-tagcomplete` and `composable-lora` as optional choosable extensions. Fixed the 'all extensions are missing' bug.
</details>
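The `#run` lines in `additionalextensions.txt` pair an extension name with a shell command to run when that extension is selected (mostly for dependencies). A minimal, hypothetical parser for that format might look like this; `parse_run_directives` is an illustrative name, not part of the repo's actual installer.

```python
def parse_run_directives(lines):
    """Map extension name -> shell command from '#run <ext> <command>' lines (sketch)."""
    cmds = {}
    for line in lines:
        if line.startswith('#run '):
            # Split into the '#run' marker, the extension name, and the
            # rest of the line, which is the shell command to execute.
            _, ext, cmd = line.split(' ', 2)
            cmds[ext] = cmd
    return cmds
```

The installer would then run the looked-up command only when the matching extension was ticked in the Gradio UI.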

2 changes: 1 addition & 1 deletion additionalextensions.txt
@@ -18,4 +18,4 @@ https://github.com/Bing-su/adetailer

#@otorre1's request, roop alternative
https://github.com/Gourieff/sd-webui-reactor
#run sd-webui-reactor pip install -q insightface==0.7.3 onnx==1.14.0 onnxruntime==1.15.0 opencv-python==4.7.0.72 tqdm
#run sd-webui-reactor pip install -q insightface==0.7.3 onnx "onnxruntime-gpu>=1.16.1" opencv-python tqdm
137 changes: 69 additions & 68 deletions choosemodel4.py
@@ -1,69 +1,70 @@
import os, math, subprocess, pickle, sys

branchtype = 'lite'

if len(sys.argv) == 2:
    branchargs = sys.argv[1]
    branchtype = branchargs

import gradio as gr

# subprocess.run("apt -y install -qq aria2", shell=True, check=True)

everycolab = f'/content/camendurus/{branchtype}'
everycolabname = []
colabnamepair = []
for colabname in os.listdir(everycolab):
    if not colabname.endswith('_run.ipynb'):
        colabnamepruned = colabname.partition('_webui_colab.ipynb')[0]
        everycolabname.append(colabnamepruned)

sortedcolabname = sorted(everycolabname)

vclvarpath = '/content/vclvariables'
def pickledump(vartodump, outputfile):
    outputpath = os.path.join(vclvarpath, outputfile + '.pkl')
    with open(outputpath, 'wb') as f:
        pickle.dump(vartodump, f)

# 'sortedcolabname' will be accessed by the main colab notebook
pickledump(sortedcolabname, 'sortedcolabname')


# totalcolabcount = len(everycolabname)
# for i, colabname in enumerate(sortedcolabname):
#     halfall = math.ceil(totalcolabcount / 2)
#     numberedname = "{} | {}".format(i, colabname.ljust(30))
#     if i <= halfall:
#         colabnamepair.append(numberedname)
#     else:
#         rev_index = (i - halfall) - 1
#         colabnamepair[rev_index] += "\t" + numberedname

# for colabpair in colabnamepair:
#     print(colabpair)

# chosencolabname = ''

# while True:
#     choosenumber = input('Choose the number of the model you want: ')
#     if choosenumber.isdigit() and int(choosenumber) < totalcolabcount:
#         chosencolabname = sortedcolabname[int(choosenumber)] + '_webui_colab.ipynb'
#         print("Model from " + chosencolabname + " will be downloaded immediately after all the dependencies is installed. Please wait")
#         break
#     elif choosenumber == '':
#         print("No model will be pre-downloaded. Dependencies installation will continue.")
#         break

# aria2c_lines = []

# if chosencolabname:
#     if os.path.exists(os.path.join(everycolab, chosencolabname)):
#         with open(os.path.join(everycolab, chosencolabname), 'r', encoding='utf-8') as f:
#             for line in f:
#                 stripped_line = line.strip()
#                 if stripped_line.startswith('"!aria2c'):
#                     aria2c_lines.append(stripped_line)

# if aria2c_lines:
#     with open('/content/arialist.pkl', 'wb') as f:
5 changes: 3 additions & 2 deletions volatile_concentration_localux_colab.ipynb
@@ -18,7 +18,7 @@
"source": [
"##***Project `VCL-colab`***\n",
"### All camenduru colab in one spot, synced in realtime\n",
"###### Last time updated: Sep 10, 23\n",
"###### Last time updated: Feb 10, 24\n",
"###### (something doesn't work properly? Make sure you use the [latest version](https://colab.research.google.com/github/etherealxx/volatile-concentration-localux-colab/blob/main/volatile_concentration_localux_colab.ipynb), or [report a bug](https://github.com/etherealxx/volatile-concentration-localux-colab/issues).)"
]
},
@@ -82,7 +82,8 @@
"\n",
"%env TF_CPP_MIN_LOG_LEVEL=1\n",
"!git clone https://github.com/etherealxx/volatile-concentration-localux-colab /content/vcltools\n",
"!pip install -q gradio==3.37.0\n",
"!pip install -q gradio==3.41.2\n",
"!pip install -q pastebin-replace httpx==0.24.1\n",
"\n",
"emptymodel = False\n",
"\n",
