[AIGC] A Deep Dive into LORA Models

Tags: Machine Learning, Computer Vision, Artificial Intelligence, AIGC

A Deep Dive into LORA Models

A LORA model is a neural-network model that learns to automatically adjust the weights between the layers of a network in order to improve its performance. This article takes a close look at how LORA models work, where they are applied, and their strengths and weaknesses.

1. How LORA Models Work

LORA stands for Learnable Re-Weighting. The model improves performance by learning the weights between the layers of a neural network: it learns the correlation between the preceding and following layers and uses it to automatically adjust the current layer's weights.

The basic idea is to treat every layer of the network as a weightable feature extractor whose weight determines how much it influences the model's output. By adjusting these weights, a LORA model lets different layers contribute more effectively to different tasks.

In a LORA model, each layer's weight is built from two parts: the weights of the previous layer and of the next layer. Concretely, if the current layer is layer $i$, the previous layer is layer $i-1$, and the next layer is layer $i+1$, the current layer's weight can be written as:

$$w_i = \alpha_i \cdot W_{i-1} \cdot W_{i+1}$$

where $\alpha_i$ is a learnable parameter, and $W_{i-1}$ and $W_{i+1}$ are the weights of the previous and next layers, respectively. By learning $\alpha_i$, the LORA model automatically adjusts the current layer's weight and thereby improves performance.
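As a concrete illustration, here is a minimal PyTorch sketch of the formula above; the matrix shapes and the scalar $\alpha_i$ are assumptions chosen only to make the example run:

```python
import torch

# Minimal sketch of the re-weighting rule w_i = alpha_i * W_{i-1} @ W_{i+1}.
# Shapes are illustrative assumptions, not taken from any real model.
torch.manual_seed(0)

W_prev = torch.randn(16, 8)                  # weight of layer i-1
W_next = torch.randn(8, 4)                   # weight of layer i+1
alpha = torch.nn.Parameter(torch.ones(()))   # learnable re-weighting coefficient

w_i = alpha * (W_prev @ W_next)              # re-weighted weight for layer i
print(w_i.shape)                             # torch.Size([16, 4])
```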

2. Application Scenarios

LORA models are broadly applicable, chiefly in settings with complex data such as natural language processing and computer vision. In NLP, a LORA model can learn contextual cues to improve text classification, sentiment analysis, and similar tasks; in computer vision, it can learn the correlations between layers to improve image classification, object detection, and so on.

3. Strengths and Weaknesses

The main strength of LORA models is that they learn the correlations between layers automatically, which improves performance. Unlike traditional hand-tuned weights, a LORA model adjusts its weights from data, avoiding the limitations of manual tuning.

LORA models also have drawbacks, however. They assume that each layer is influenced only by the layers immediately before and after it. In some situations this assumption can cause problems, although in other applications it usefully simplifies the model's design and implementation.

In a LORA model, every layer has a corresponding upper-layer weight and lower-layer weight, both obtained through learning. During training, the model adjusts these weights automatically so that it captures the features in the data more accurately.

Implementing a LORA model is relatively straightforward: each layer is re-weighted by multiplying the upper-layer and lower-layer weights together to obtain a new weight, which is then used to update the model. In PyTorch, this re-weighting can be implemented with the torch.mm() function.
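The sketch below illustrates this re-weighting with torch.mm(). The matrix sizes, the base weight, and the alpha-over-rank scaling convention are assumptions for illustration, not a reference implementation:

```python
import torch

# Hedged sketch of the re-weighting step: multiply the upper and lower
# weights with torch.mm(), scale the result, and add it to a base weight.
up = torch.randn(320, 4)     # upper weight (rank-4 column factor)
down = torch.randn(4, 320)   # lower weight (rank-4 row factor)
alpha, rank = 4.0, down.shape[0]

delta = torch.mm(up, down) * (alpha / rank)  # re-weighted update
base = torch.randn(320, 320)                 # original layer weight
merged = base + delta                        # updated layer weight
print(merged.shape)                          # torch.Size([320, 320])
```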

Overall, a LORA model is a simple and effective learnable re-weighting model that can noticeably improve performance in some applications. Because of the limits of its assumptions, however, it may not suit every dataset or scenario.

4. Components of a LORA Model

A LORA model consists of three parts: lora_down.weight, lora_up.weight, and alpha.
Here lora_down.weight and lora_up.weight are the lower and upper weights in the LORA model, and alpha is the scaling coefficient applied when the weights are updated.
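For one hypothetical module, the three parts might look like the following; the rank and feature sizes are assumptions chosen for illustration:

```python
import torch

# The three parts of a LORA module (illustrative shapes only).
rank, in_features, out_features = 4, 768, 768

lora_down = torch.randn(rank, in_features)  # lora_down.weight: projects down to the low rank
lora_up = torch.zeros(out_features, rank)   # lora_up.weight: projects back up (often zero-initialized)
alpha = torch.tensor(4.0)                   # alpha: scaling coefficient for the update

print(lora_down.shape, lora_up.shape, alpha.item())
```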

5. Naming Convention

These are the names of the weights and biases of the individual layers in a PyTorch model. In PyTorch, each layer's weights and biases are stored in a dictionary called state_dict, and the naming rules are usually determined by the layer's type and its position in the model. For example, lora_te_text_model_encoder_layers_9_self_attn_k_proj.lora_up.weight denotes the up-weight of the K projection in self-attention layer 9 of the LORA model.
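A minimal sketch of listing these names is shown below; the file name is an assumption, and LORA weights are also commonly shipped as .safetensors files:

```python
import torch

# Load a LORA checkpoint and print every key with its tensor shape.
state_dict = torch.load("my_lora.pt", map_location="cpu")

for key, tensor in state_dict.items():
    # e.g. lora_te_text_model_encoder_layers_9_self_attn_k_proj.lora_up.weight
    print(key, tuple(tensor.shape))
```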

6. Example LORA Model Keys

These keys are used when loading and saving the LORA model's weight parameters in PyTorch; each key is associated with one weight tensor in the LORA model. A small sketch of grouping them by module follows, and the full key list comes after it.
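```python
from collections import defaultdict

# Sketch: group keys by module, so each module name maps to its
# alpha / lora_down.weight / lora_up.weight entries.
keys = [
    "lora_te_text_model_encoder_layers_0_mlp_fc1.alpha",
    "lora_te_text_model_encoder_layers_0_mlp_fc1.lora_down.weight",
    "lora_te_text_model_encoder_layers_0_mlp_fc1.lora_up.weight",
]

modules = defaultdict(dict)
for key in keys:
    module, _, suffix = key.partition(".")  # split module prefix from suffix
    modules[module][suffix] = key

print(dict(modules))
```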

# Keys of the weight parameters in a LORA model

- lora_te_text_model_encoder_layers_0_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_0_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_0_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_0_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_0_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_0_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_0_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_0_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_0_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_0_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_0_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_0_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_0_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_0_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_0_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_0_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_0_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_0_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_10_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_10_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_10_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_10_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_10_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_10_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_10_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_10_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_10_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_10_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_10_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_10_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_10_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_10_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_10_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_10_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_10_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_10_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_11_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_11_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_11_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_11_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_11_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_11_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_11_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_11_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_11_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_11_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_11_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_11_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_11_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_11_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_11_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_11_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_11_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_11_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_1_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_1_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_1_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_1_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_1_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_1_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_1_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_1_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_1_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_1_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_1_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_1_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_1_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_1_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_1_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_1_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_1_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_1_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_2_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_2_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_2_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_2_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_2_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_2_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_2_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_2_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_2_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_2_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_2_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_2_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_2_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_2_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_2_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_2_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_2_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_2_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_3_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_3_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_3_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_3_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_3_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_3_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_3_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_3_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_3_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_3_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_3_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_3_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_3_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_3_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_3_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_3_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_3_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_3_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_4_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_4_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_4_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_4_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_4_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_4_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_4_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_4_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_4_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_4_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_4_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_4_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_4_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_4_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_4_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_4_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_4_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_4_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_5_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_5_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_5_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_5_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_5_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_5_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_5_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_5_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_5_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_5_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_5_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_5_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_5_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_5_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_5_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_5_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_5_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_5_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_6_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_6_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_6_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_6_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_6_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_6_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_6_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_6_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_6_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_6_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_6_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_6_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_6_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_6_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_6_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_6_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_6_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_6_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_7_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_7_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_7_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_7_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_7_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_7_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_7_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_7_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_7_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_7_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_7_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_7_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_7_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_7_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_7_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_7_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_7_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_7_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_8_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_8_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_8_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_8_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_8_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_8_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_8_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_8_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_8_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_8_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_8_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_8_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_8_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_8_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_8_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_8_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_8_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_8_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_9_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_9_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_9_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_9_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_9_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_9_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_9_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_9_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_9_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_9_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_9_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_9_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_9_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_9_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_9_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_9_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_9_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_9_self_attn_v_proj.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_proj_in.alpha.
- lora_unet_down_blocks_0_attentions_0_proj_in.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_proj_in.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_proj_out.alpha.
- lora_unet_down_blocks_0_attentions_0_proj_out.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_proj_out.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_proj_in.alpha.
- lora_unet_down_blocks_0_attentions_1_proj_in.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_proj_in.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_proj_out.alpha.
- lora_unet_down_blocks_0_attentions_1_proj_out.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_proj_out.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_proj_in.alpha.
- lora_unet_down_blocks_1_attentions_0_proj_in.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_proj_in.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_proj_out.alpha.
- lora_unet_down_blocks_1_attentions_0_proj_out.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_proj_out.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_proj_in.alpha.
- lora_unet_down_blocks_1_attentions_1_proj_in.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_proj_in.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_proj_out.alpha.
- lora_unet_down_blocks_1_attentions_1_proj_out.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_proj_out.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_proj_in.alpha.
- lora_unet_down_blocks_2_attentions_0_proj_in.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_proj_in.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_proj_out.alpha.
- lora_unet_down_blocks_2_attentions_0_proj_out.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_proj_out.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_proj_in.alpha.
- lora_unet_down_blocks_2_attentions_1_proj_in.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_proj_in.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_proj_out.alpha.
- lora_unet_down_blocks_2_attentions_1_proj_out.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_proj_out.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_mid_block_attentions_0_proj_in.alpha.
- lora_unet_mid_block_attentions_0_proj_in.lora_down.weight.
- lora_unet_mid_block_attentions_0_proj_in.lora_up.weight.
- lora_unet_mid_block_attentions_0_proj_out.alpha.
- lora_unet_mid_block_attentions_0_proj_out.lora_down.weight.
- lora_unet_mid_block_attentions_0_proj_out.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_proj_in.alpha.
- lora_unet_up_blocks_1_attentions_0_proj_in.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_proj_in.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_proj_out.alpha.
- lora_unet_up_blocks_1_attentions_0_proj_out.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_proj_out.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_proj_in.alpha.
- lora_unet_up_blocks_1_attentions_1_proj_in.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_proj_in.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_proj_out.alpha.
- lora_unet_up_blocks_1_attentions_1_proj_out.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_proj_out.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_proj_in.alpha.
- lora_unet_up_blocks_1_attentions_2_proj_in.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_proj_in.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_proj_out.alpha.
- lora_unet_up_blocks_1_attentions_2_proj_out.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_proj_out.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_proj_in.alpha.
- lora_unet_up_blocks_2_attentions_0_proj_in.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_proj_in.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_proj_out.alpha.
- lora_unet_up_blocks_2_attentions_0_proj_out.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_proj_out.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_proj_in.alpha.
- lora_unet_up_blocks_2_attentions_1_proj_in.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_proj_in.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_proj_out.alpha.
- lora_unet_up_blocks_2_attentions_1_proj_out.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_proj_out.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_up_blocks_2_attentions_2_proj_in.alpha.
- lora_unet_up_blocks_2_attentions_2_proj_in.lora_down.weight.
- lora_unet_up_blocks_2_attentions_2_proj_in.lora_up.weight.
- lora_unet_up_blocks_2_attentions_2_proj_out.alpha.
- lora_unet_up_blocks_2_attentions_2_proj_out.lora_down.weight.
- lora_unet_up_blocks_2_attentions_2_proj_out.lora_up.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_up_blocks_3_attentions_0_proj_in.alpha.
- lora_unet_up_blocks_3_attentions_0_proj_in.lora_down.weight.
- lora_unet_up_blocks_3_attentions_0_proj_in.lora_up.weight.
- lora_unet_up_blocks_3_attentions_0_proj_out.alpha.
- lora_unet_up_blocks_3_attentions_0_proj_out.lora_down.weight.
- lora_unet_up_blocks_3_attentions_0_proj_out.lora_up.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_up_blocks_3_attentions_1_proj_in.alpha.
- lora_unet_up_blocks_3_attentions_1_proj_in.lora_down.weight.
- lora_unet_up_blocks_3_attentions_1_proj_in.lora_up.weight.
- lora_unet_up_blocks_3_attentions_1_proj_out.alpha.
- lora_unet_up_blocks_3_attentions_1_proj_out.lora_down.weight.
- lora_unet_up_blocks_3_attentions_1_proj_out.lora_up.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_up_blocks_3_attentions_2_proj_in.alpha.
- lora_unet_up_blocks_3_attentions_2_proj_in.lora_down.weight.
- lora_unet_up_blocks_3_attentions_2_proj_in.lora_up.weight.
- lora_unet_up_blocks_3_attentions_2_proj_out.alpha.
- lora_unet_up_blocks_3_attentions_2_proj_out.lora_down.weight.
- lora_unet_up_blocks_3_attentions_2_proj_out.lora_up.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.lora_up.weight.
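The keys above follow a regular pattern: the prefix (`lora_te_...` or `lora_unet_...`) identifies the target network, the middle part mirrors the module path inside that network (here, UNet `up_blocks` → `attentions` → `transformer_blocks`, where `attn1` is the self-attention and `attn2` the cross-attention block), and the suffix selects one of the module's three tensors (`alpha`, `lora_down.weight`, `lora_up.weight`). Below is a minimal sketch (not from the original article) of how these keys can be enumerated and how one module's weight delta can be rebuilt in PyTorch. It assumes a safetensors checkpoint that uses the key naming listed above; the file name is hypothetical, and the `alpha / rank` scaling is the convention used by common LoRA trainers, not something the key format itself guarantees.

```python
# Minimal sketch: enumerate the modules in a LoRA checkpoint and rebuild
# one module's weight delta. Assumes a safetensors file using the key
# naming listed above; the file name is hypothetical.
from safetensors.torch import load_file

state_dict = load_file("my_lora.safetensors")  # hypothetical path

# Every module contributes three keys: <module>.alpha,
# <module>.lora_down.weight and <module>.lora_up.weight.
modules = sorted({key.split(".")[0] for key in state_dict})
print(f"{len(modules)} LoRA modules in this file")

# Rebuild the weight delta for one cross-attention V projection.
name = "lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v"
down = state_dict[f"{name}.lora_down.weight"]  # shape: (rank, in_features)
up = state_dict[f"{name}.lora_up.weight"]      # shape: (out_features, rank)
alpha = state_dict[f"{name}.alpha"].item()     # scaling constant saved at training time

rank = down.shape[0]
# Common trainer convention: scale the low-rank product by alpha / rank.
delta_w = (alpha / rank) * (up @ down)         # shape: (out_features, in_features)
print(name, tuple(delta_w.shape))
```

When such a checkpoint is applied at inference time, `delta_w` is typically added to the corresponding base-model weight, usually multiplied further by a user-chosen LoRA strength.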

Copyright notice: This is an original article by the blogger, licensed under the CC 4.0 BY-SA agreement. When reproducing it, please include the original source link and this notice.
Original article link: https://blog.csdn.net/qq_44824148/article/details/130522248
