Traceback (most recent call last):
  File "/home/pixel/oobabooga_linux/text-generation-webui/server.py", line 70, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name)
  File "/home/pixel/oobabooga_linux/text-generation-webui/modules/models.py", line 95, in load_model
    output = load_func(model_name)
  File "/home/pixel/oobabooga_linux/text-generation-webui/modules/models.py", line 275, in GPTQ_loader
    model = modules.GPTQ_loader.load_quantized(model_name)
  File "/home/pixel/oobabooga_linux/text-generation-webui/modules/GPTQ_loader.py", line 177, in load_quantized
    model = load_quant(str(path_to_model), str(pt_path), shared.args.wbits, shared.args.groupsize, kernel_switch_threshold=threshold)
  File "/home/pixel/oobabooga_linux/text-generation-webui/modules/GPTQ_loader.py", line 84, in _load_quant
    model.load_state_dict(safe_load(checkpoint), strict=False)
  File "/home/pixel/oobabooga_linux/installer_files/env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 2041, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for LlamaForCausalLM:
    size mismatch for model.layers.0.self_attn.k_proj.qzeros: copying a param with shape torch.Size([1, 832]) from checkpoint, the shape in current model is torch.Size([52, 832]).
    size mismatch for model.layers.0.self_attn.k_proj.scales: copying a param with shape torch.Size([1, 6656]) from checkpoint, the shape in current model is torch.Size([52, 6656]).
    size mismatch for model.layers.0.self_attn.o_proj.qzeros: copying a param with shape torch.Size([1, 832]) from checkpoint, the shape in current model is torch.Size([52, 832]).
    size mismatch for model.layers.0.self_attn.o_proj.scales: copying a param with shape torch.Size([1, 6656]) from checkpoint, the shape in current model is torch.Size([52, 6656]).
    size mismatch for model.layers.0.self_attn.q_proj.qzeros: copying a param with shape torch.Size([1, 832]) from checkpoint, the shape in current model is torch.Size([52, 832]).
    size mismatch for model.layers.0.self_attn.q_proj.scales: copying a param with shape torch.Size([1, 6656]) from checkpoint, the shape in current model is torch.Size([52, 6656]).
    size mismatch for model.layers.0.self_attn.v_proj.qzeros: copying a param with shape torch.Size([1, 832]) from checkpoint, the shape in current model is torch.Size([52, 832]).
    size mismatch for model.layers.0.self_attn.v_proj.scales: copying a param with shape torch.Size([1, 6656]) from checkpoint, the shape in current model is torch.Size([52, 6656]).
    size mismatch for model.layers.0.mlp.down_proj.qzeros: copying a param with shape torch.Size([1, 832]) from checkpoint, the shape in current model is torch.Size([140, 832]).
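
These mismatches are all consistent with a GPTQ groupsize disagreement between the checkpoint and the model being constructed. The checkpoint's qzeros/scales tensors have a leading dimension of 1 (a single quantization group per tensor, i.e. the model was quantized with groupsize -1), while the model the loader built expects groupsize 128: 6656 / 128 = 52 groups for the attention projections and 17920 / 128 = 140 for mlp.down_proj. In other words, the webui was apparently launched with --groupsize 128 for a checkpoint that was quantized without grouping.

A minimal sketch of a corrected launch, assuming a 4-bit checkpoint; "your-model-name" is a placeholder for the actual folder under models/:

    # Hypothetical command: leave groupsize at -1 so the constructed model
    # matches the ungrouped (single-group) checkpoint.
    python server.py --model your-model-name --wbits 4 --groupsize -1

The general rule is that --wbits and --groupsize must match the settings the checkpoint was quantized with; if a checkpoint was made with 128 groups (often marked "128g" in its filename), pass --groupsize 128 instead.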