
Successfully Deployed but Error Raised during Runtime #135

Open
SwordJack opened this issue Aug 22, 2023 · 2 comments

Comments

@SwordJack
The model is deployed and scripts/webui.sh runs successfully on my laptop under WSL2 Ubuntu. However, an error was raised in the terminal as soon as I submitted a prompt:

Traceback (most recent call last):
  File ".../LaWGPT/utils/callbacks.py", line 46, in gentask
    ret = self.mfunc(callback=_callback, **self.kwargs)
  File ".../LaWGPT/webui.py", line 140, in generate_with_callback
    model.generate(**kwargs)
  File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/peft/peft_model.py", line 627, in generate
    outputs = self.base_model.generate(**kwargs)
  File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/transformers/generation/utils.py", line 1596, in generate
    return self.greedy_search(
  File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/transformers/generation/utils.py", line 2444, in greedy_search
    outputs = self(
  File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 809, in forward
    outputs = self.model(
  File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 651, in forward
    inputs_embeds = self.embed_tokens(input_ids)
  File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/torch/nn/modules/sparse.py", line 162, in forward
    return F.embedding(
  File ".../LaWGPT/lawgpt/lib/python3.10/site-packages/torch/nn/functional.py", line 2210, in embedding
    return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0! (when checking argument for argument index in method wrapper_CUDA__index_select)
...

The word Error is displayed in the output textbox.
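A common cause of this RuntimeError is that the model weights (or at least the embedding layer) were placed on the GPU, while the input_ids tensor produced by the tokenizer stays on the CPU. A minimal sketch of the usual fix, moving the inputs to the model's device before the forward pass (this is a general PyTorch illustration with a toy embedding layer, not the actual LaWGPT webui code):

```python
import torch

# Pick the same device the model weights would live on.
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

# Toy stand-in for the model's embedding layer, placed on `device`.
embedding = torch.nn.Embedding(num_embeddings=100, embedding_dim=8).to(device)

# Tokenizers typically return tensors on the CPU.
input_ids = torch.tensor([[1, 2, 3]])

# Moving the inputs to the weights' device avoids
# "Expected all tensors to be on the same device, cpu and cuda:0".
input_ids = input_ids.to(device)

out = embedding(input_ids)
print(out.shape)  # torch.Size([1, 3, 8])
```

In the webui, the equivalent change would be to send the generation inputs to `model.device` (e.g. `input_ids.to(model.device)`) before calling `model.generate(...)`; exactly where that belongs in webui.py depends on how the inputs are built.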

@kaclea
kaclea commented Aug 29, 2023

I have the same problem.

@yaosh000
Has this been solved? @SwordJack @kaclea
