The model is deployed and scripts/webui.sh runs successfully on my laptop under WSL2 Ubuntu. However, an error is raised in the terminal as soon as I submit a prompt; the traceback reads:
Traceback (most recent call last):
File ".../LaWGPT/utils/callbacks.py", line 46, in gentask
ret = self.mfunc(callback=_callback, **self.kwargs)
File ".../LaWGPT/webui.py", line 140, in generate_with_callback
model.generate(**kwargs)
File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/peft/peft_model.py", line 627, in generate
outputs = self.base_model.generate(**kwargs)
File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/transformers/generation/utils.py", line 1596, in generate
return self.greedy_search(
File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/transformers/generation/utils.py", line 2444, in greedy_search
outputs = self(
File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
output = old_forward(*args, **kwargs)
File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 809, in forward
outputs = self.model(
File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 651, in forward
inputs_embeds = self.embed_tokens(input_ids)
File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/torch/nn/modules/sparse.py", line 162, in forward
return F.embedding(
File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/torch/nn/functional.py", line 2210, in embedding
return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0! (when checking argument for argument index in method wrapper_CUDA__index_select)
...
The word Error is displayed in the output textbox.
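For context, the RuntimeError indicates that the input_ids tensor is being created on the CPU while the model's embedding weights sit on cuda:0 (likely placed there by accelerate's device_map). Below is a minimal, self-contained sketch of the usual remedy, moving the encoded inputs onto the model's device before calling generate(). This is not the actual LaWGPT code; the model path, prompt, and generation parameters are placeholders.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_path = "path/to/base_model"  # placeholder, not the real LaWGPT checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto")

prompt = "..."  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt")

# Key step: send the input tensors to the same device as the model weights,
# otherwise F.embedding receives CPU indices against CUDA weights and raises
# the "Expected all tensors to be on the same device" error shown above.
inputs = {k: v.to(model.device) for k, v in inputs.items()}

with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```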