any way to use with wan2gp?

#10
by balukumar

Wan2GP currently has much better VRAM management for LTX-2 than ComfyUI, and it lets my 3060 (12 GB VRAM) + 32 GB RAM run LTX-2 720p videos.

I can't do the same in ComfyUI. Is there any way to use this with Wan2GP?

Not OP, but I keep failing to get this to work. I get:
Cannot copy out of meta tensor; no data! Please use torch.nn.Module.to_empty() instead of torch.nn.Module.to() when moving module from meta to a different device.

Traceback (most recent call last):
  File "/home/USER/Wan2GP/wgp.py", line 6562, in generate_video_error_handler
    generate_video(task, send_cmd, plugin_data=plugin_data, **params)
  File "/home/USER/Wan2GP/wgp.py", line 5288, in generate_video
    wan_model, offloadobj = load_models(model_type, override_profile, **model_kwargs)
  File "/home/USER/Wan2GP/wgp.py", line 3325, in load_models
    wan_model, pipe = model_type_handler.load_model(
  File "/home/USER/Wan2GP/models/ltx2/ltx2_handler.py", line 186, in load_model
    ltx2_model = LTX2(
  File "/home/USER/Wan2GP/models/ltx2/ltx2.py", line 349, in __init__
    self._cache_two_stage_models()
  File "/home/USER/Wan2GP/models/ltx2/ltx2.py", line 385, in _cache_two_stage_models
    self.text_embedding_projection = ledger_1.text_embedding_projection()
  File "/home/USER/Wan2GP/models/ltx2/ltx_pipelines/utils/model_ledger.py", line 383, in text_embedding_projection
    .to(self.device)
  File "/home/USER/miniconda3/envs/wan2gp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1355, in to
    return self._apply(convert)
  File "/home/USER/miniconda3/envs/wan2gp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 915, in _apply
    module._apply(fn)
  File "/home/USER/miniconda3/envs/wan2gp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 942, in _apply
    param_applied = fn(param)
  File "/home/USER/miniconda3/envs/wan2gp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1348, in convert
    raise NotImplementedError(
NotImplementedError: Cannot copy out of meta tensor; no data! Please use torch.nn.Module.to_empty() instead of torch.nn.Module.to() when moving module from meta to a different device.
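For context, the error itself is a generic PyTorch one: the module being moved in model_ledger.py still lives on the "meta" device (shapes only, no weight data), so calling .to(self.device) has nothing to copy. Below is a minimal sketch in plain PyTorch (not Wan2GP code) that reproduces the behavior; the real fix belongs in the loader, which needs to materialize the weights (e.g. to_empty() followed by loading the checkpoint state dict) before, or instead of, calling .to().

```python
# Minimal repro of the meta-tensor error, independent of Wan2GP.
import torch
import torch.nn as nn

# A module built under the "meta" device has parameter shapes but no data.
with torch.device("meta"):
    layer = nn.Linear(4, 4)

# Moving it with .to() fails, because there is no data to copy.
try:
    layer.to("cpu")
except NotImplementedError as e:
    print(e)  # "Cannot copy out of meta tensor; no data! ..."

# to_empty() allocates real but uninitialized storage on the target device;
# the actual weights must still be loaded from a checkpoint afterwards.
layer = layer.to_empty(device="cpu")
print(layer.weight.shape)  # torch.Size([4, 4]), values are uninitialized
```

So this looks like a bug (or an unsupported offload profile) in the LTX-2 loading path rather than something fixable from the UI; it's probably worth reporting it on the Wan2GP issue tracker with this traceback.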
