Great suggestion! We plan to open-source the 3DGS optimization code in the future. In my tests on an A100 GPU, VRAM usage is approximately 26.38 GB, and inference with 50 DDIM denoising steps takes about 3 minutes 30 seconds.
Is it possible to generate 3D models in GLB format? Approximately how much VRAM and inference time are required?