How to merge your Tensor Art LoRAs using Singular Value Decomposition (SVD) [Kaggle Notebook + code]



What is this about?

I coded a Kaggle notebook that lets you recast your Tensor Art FLUX/SD LoRA to a new rank to reduce its file size.

The notebook can also filter out noise, change the scale, and merge three LoRAs into one using a TIES merge and Singular Value Decomposition (SVD).

Link: https://www.kaggle.com/code/nekos4lyfe/ties-svd-merge-rank-scale-lora-rearrangement

Feel free to copy and adapt the code in this notebook for your own purposes as you see fit.

Also, Kaggle offers a very generous storage quota on their servers (220 GB!). Storing backups of your LoRA models on Kaggle can be a good solution if you wish to tinker with them.

For further info on LoRA/neural-net merge methods, I recommend this article: https://huggingface.co/blog/peft_merging

//--------//

What can I use this notebook for?

Example 1) A 600 MB LoRA at rank 64 can be cast to a 300 MB LoRA at rank 32 with the same type of output.
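To illustrate the rank-casting idea behind Example 1, here is a minimal sketch (not the notebook's actual code): recompose the full weight update from a LoRA down/up pair, truncate its SVD, and refactor it at the lower rank. The `resize_lora_pair` helper and the toy layer shapes are hypothetical; a real LoRA file contains many such pairs, one per targeted layer.

```python
import numpy as np

def resize_lora_pair(down, up, new_rank):
    """Recast a LoRA pair (up @ down = delta_W) to a lower rank via SVD."""
    delta = up @ down                                  # full update, shape (out, in)
    U, S, Vh = np.linalg.svd(delta, full_matrices=False)
    U, S, Vh = U[:, :new_rank], S[:new_rank], Vh[:new_rank, :]
    new_up = U * S                                     # fold singular values into "up"
    new_down = Vh
    return new_down, new_up

# Toy example: a rank-64 update on a 128x128 layer, cast to rank 32
rng = np.random.default_rng(0)
down = rng.standard_normal((64, 128))
up = rng.standard_normal((128, 64))
nd, nu = resize_lora_pair(down, up, 32)
print(nd.shape, nu.shape)  # (32, 128) (128, 32)
```

Truncating to the top singular values is what lets the halved file keep "the same type of output": the discarded directions carry the least energy of the update.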

Example 2) Three LoRAs of any rank can be merged into a single LoRA using the TIES algorithm, a loose abbreviation of 'TrIm, Elect Sign & Merge'.

The TIES method is explained in this paper: https://arxiv.org/pdf/2306.01708
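As a rough sketch of what the paper describes (this is an illustrative toy, not the notebook's implementation): trim each update to its largest-magnitude entries, elect a majority sign per parameter, then average only the values that agree with the elected sign. The `ties_merge` function and `density` parameter are hypothetical names.

```python
import numpy as np

def ties_merge(deltas, density=0.2):
    """TIES: Trim small entries, Elect a majority sign, Merge the rest."""
    trimmed = []
    for d in deltas:
        k = max(1, int(density * d.size))
        thresh = np.sort(np.abs(d), axis=None)[-k]     # keep top-k magnitudes
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    stack = np.stack(trimmed)
    elected = np.sign(stack.sum(axis=0))               # majority sign per entry
    agree = np.where(np.sign(stack) == elected, stack, 0.0)
    counts = np.maximum((agree != 0).sum(axis=0), 1)
    return agree.sum(axis=0) / counts                  # mean of agreeing values

a = np.array([1.0, -0.2, 0.5])
b = np.array([0.8, 0.3, -0.4])
c = np.array([0.9, 0.1, 0.6])
print(ties_merge([a, b, c], density=1.0))  # values: 0.9, 0.2, 0.55
```

In the middle entry, the elected sign is positive, so the dissenting -0.2 is dropped and only 0.3 and 0.1 are averaged; this sign election is what keeps merged LoRAs from cancelling each other out.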

This notebook is only compatible with LoRAs trained using the AdamWBit method on Tensor Art, Civitai, or similar sites that host online training.

Ideally, you can train LoRAs at rank 64 with alpha set to 32, then cast them to rank 32 using this notebook. The notebook uses a GPU, so make sure to connect to a GPU instance such as 'GPU P100' before running it.
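One detail worth keeping in mind when re-ranking: LoRA updates are applied with an effective multiplier of alpha/rank, so casting rank 64 / alpha 32 down to rank 32 changes that multiplier unless alpha is adjusted (or the weights are rescaled to compensate). A small hypothetical helper shows the arithmetic:

```python
# A LoRA update is applied as delta_W * (alpha / rank).
# Hypothetical helper: pick a new alpha that preserves the effective scale.
def rescaled_alpha(old_alpha, old_rank, new_rank):
    scale = old_alpha / old_rank   # e.g. 32 / 64 = 0.5
    return scale * new_rank        # new alpha keeping alpha/rank constant

print(rescaled_alpha(32, 64, 32))  # 16.0
```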

This is a very short article. I plan to write a more extensive one in the future showcasing this, plus a proper guide to FLUX LoRA training methods using composite images, but the tool is here for those who need it.

If you encounter problems with the notebook, write them here and I will fix them.

Cheers,

Adcom
