
ModuleNotFoundError: No module named 'mm_builder' #1

Open
bozagina opened this issue Dec 25, 2024 · 5 comments

@bozagina

image

@bingwork
Collaborator

hi, @bozagina
I noticed that in the code you referenced from the image, line 3 of inference.py reads `from mm_builder import load_pretrained_model`.
However, in the version of the code in the official repository, this line is not included.
Could you please clarify if there has been an update or if there might be a discrepancy between the versions?

企业微信截图_20241225184333
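As a general aside (not part of the thread), a quick way to diagnose this kind of `ModuleNotFoundError` is to check whether the module is resolvable on the current `sys.path` without importing it:

```python
import importlib.util

def module_available(name: str) -> bool:
    """Return True if `name` can be resolved by the import system
    (without actually executing the module)."""
    return importlib.util.find_spec(name) is not None

# mm_builder will only be found if it ships with the code you are running.
print(module_available("mm_builder"))
```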

@bozagina
Author


Hi, I was using the inference.py script as shown in the attached image:
image
Additionally, when I tried to use the model weights that had already been downloaded, I encountered the following errors:
image
image

@bingwork
Collaborator

bingwork commented Dec 27, 2024

Ah, I got it. Please update to use the inference.py from the MMAlaya GitHub repository; I checked and it still works. I'm sorry, but the inference.py on HuggingFace is incorrect. Also, please make sure to use `transformers==4.33.0`.
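To confirm the pin took effect, a minimal version check can be written against the installed package metadata (a generic sketch, not code from the thread):

```python
from importlib.metadata import version, PackageNotFoundError

def check_pin(package: str, expected: str) -> bool:
    """Return True if `package` is installed at exactly the `expected` version."""
    try:
        return version(package) == expected
    except PackageNotFoundError:
        return False

# Example: verify the transformers version requested above.
print(check_pin("transformers", "4.33.0"))
```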

@bozagina
Author

When I use the command `CUDA_VISIBLE_DEVICES=0 python infer_mmalaya.py`, it works fine:
image
But when I use the command `CUDA_VISIBLE_DEVICES=0 python infer_mmalaya.py --model_path "/mnt/znzz/william/inference/model_zoo/DataCanvas/MMAlaya"` with the pre-downloaded model weights, I get the following errors:
image
image

@bingwork
Collaborator

It seems the attention.py file is missing from your /root/.cache/huggingface/modules/transformers_modules/MMAlaya directory. You could try running the following copy command to fix the issue:

```shell
# Run this from the directory containing the downloaded MMAlaya model files;
# it copies everything except the two large weight shards into the cache.
cp $(ls | grep -vE 'pytorch_model-00001-of-00002.bin|pytorch_model-00002-of-00002.bin') /root/.cache/huggingface/modules/transformers_modules/MMAlaya
```
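For a portable equivalent of that copy, here is a minimal Python sketch of the same idea: copy every auxiliary file except the weight shards into the cache directory. The directory paths are placeholders for wherever the MMAlaya files actually live.

```python
import shutil
from pathlib import Path

# Names of the large weight shards that should NOT be copied.
WEIGHT_SHARDS = {
    "pytorch_model-00001-of-00002.bin",
    "pytorch_model-00002-of-00002.bin",
}

def copy_aux_files(model_dir: str, cache_dir: str) -> list:
    """Copy every regular file except the weight shards from model_dir
    into cache_dir, creating cache_dir if needed. Returns the sorted
    names of the files that were copied."""
    src = Path(model_dir)
    dst = Path(cache_dir)
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in src.iterdir():
        if f.is_file() and f.name not in WEIGHT_SHARDS:
            shutil.copy2(f, dst / f.name)
            copied.append(f.name)
    return sorted(copied)
```

Usage would be, for example, `copy_aux_files("/path/to/MMAlaya", "/root/.cache/huggingface/modules/transformers_modules/MMAlaya")`, with both paths adjusted to your setup.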
