lora key not loaded #6381

Open
llm8047 opened this issue Jan 7, 2025 · 6 comments
Labels
Potential Bug User is reporting a bug. This should be tested.

Comments


llm8047 commented Jan 7, 2025

Expected Behavior

Requested to load SD1ClipModel
loaded completely 9.5367431640625e+25 235.84423828125 True
lora key not loaded: lora_te_text_model_encoder_layers_0_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_0_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_0_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_0_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_0_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_0_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_10_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_10_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_10_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_10_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_10_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_10_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_11_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_11_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_11_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_11_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_11_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_11_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_1_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_1_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_1_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_1_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_1_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_1_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_2_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_2_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_2_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_2_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_2_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_2_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_3_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_3_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_3_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_3_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_3_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_3_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_4_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_4_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_4_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_4_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_4_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_4_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_5_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_5_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_5_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_5_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_5_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_5_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_6_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_6_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_6_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_6_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_6_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_6_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_7_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_7_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_7_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_7_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_7_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_7_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_8_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_8_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_8_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_8_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_8_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_8_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_9_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_9_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_9_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_9_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_9_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_9_self_attn_v_proj.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.weight
Requested to load SD1ClipModel

Actual Behavior

no

Steps to Reproduce

no

Debug Logs

no

Other

No response

llm8047 added the Potential Bug label on Jan 7, 2025

LukeG89 commented Jan 7, 2025

Not a bug; you just used a LoRA that is incompatible with the model.
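
[Editor's note] Messages like "lora key not loaded" generally mean the tensor names inside the LoRA file don't match any pattern the loader recognizes. One way to diagnose this is to list the file's keys (e.g. with `safetensors.safe_open`) and check their suffixes: standard kohya-style LoRAs store pairs ending in `.lora_up.weight` / `.lora_down.weight` plus `.alpha`, while the keys in the log above end in `.weight` / `.on_input`. A minimal sketch of such a check (the helper name and sample keys are illustrative, not part of ComfyUI's API):

```python
# Hypothetical helper: split LoRA state-dict key names into the usual
# kohya-ss suffixes vs. everything else, to spot format mismatches.
STANDARD_SUFFIXES = (".lora_up.weight", ".lora_down.weight", ".alpha")

def classify_lora_keys(keys):
    """Return (standard, other) lists based on the key suffix."""
    standard, other = [], []
    for k in keys:
        (standard if k.endswith(STANDARD_SUFFIXES) else other).append(k)
    return standard, other

# Keys like those in the log end in ".on_input" / plain ".weight",
# not the usual ".lora_up.weight" / ".lora_down.weight" pair:
std, odd = classify_lora_keys([
    "lora_te_text_model_encoder_layers_0_mlp_fc2.on_input",
    "lora_te_text_model_encoder_layers_0_mlp_fc2.weight",
    "lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight",
])
print(len(std), len(odd))  # prints: 1 2
```

If most keys land in the "other" bucket, the file uses a layout (or targets a base model) the loader doesn't map, which is consistent with the incompatibility diagnosis above.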


llm8047 commented Jan 7, 2025

My LoRA is trained with Kohya_Ss and runs on web_ui.


llm8047 commented Jan 7, 2025

> Not a bug, you just used a LoRA that is incompatible with the model

My LoRA is trained with Kohya_Ss and runs on web_ui.


LukeG89 commented Jan 7, 2025

> My Lora is trained using Kohya_Ss and can run on web_ui

And are you using the LoRA with a compatible model? For example, if the LoRA is for SD1.5, are you using an SD1.5 checkpoint?


llm8047 commented Jan 8, 2025 via email


llm8047 commented Jan 8, 2025

> My Lora is trained using Kohya_Ss and can run on web_ui
>
> And you are using the LoRA with a compatible model? For example, the LoRA is for SD1.5 and you are using a SD1.5 checkpoint?

Yes, the problem still exists.
