
fix: correct device and dtype mixup #1433

Open

xiuyuan18 wants to merge 2 commits into modelscope:main from xiuyuan18:fix/device-dtype-mixup

Conversation

@xiuyuan18

Fix the VRAM config fallback logic in class AutoTorchModule, method set_dtype_and_device. The previous implementation assigned computation_dtype to onload_device, offload_device, and preparing_device when they were None, mixing up dtype and device; a sketch of the corrected fallback follows the commit list below.
modified: diffsynth/core/vram/layers.py

fix the assignment logic in AutoTorchModule
modified:   diffsynth/core/vram/layers.py
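
For reference, a minimal sketch of the corrected fallback, assuming the attribute names quoted in the description (onload_device, offload_device, preparing_device, computation_dtype). The actual method body is not shown on this page, so the signature and class skeleton here are approximations, not the real file contents:

```python
class AutoTorchModule:
    def __init__(self):
        # Device slots left unset until set_dtype_and_device runs.
        self.onload_device = None
        self.offload_device = None
        self.preparing_device = None

    def set_dtype_and_device(self, computation_dtype, computation_device):
        self.computation_dtype = computation_dtype
        self.computation_device = computation_device
        # Before the fix, these three fields fell back to computation_dtype,
        # putting a torch.dtype where a device was expected.
        if self.onload_device is None:
            self.onload_device = computation_device
        if self.offload_device is None:
            self.offload_device = computation_device
        if self.preparing_device is None:
            self.preparing_device = computation_device
```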
Contributor

@gemini-code-assist (bot) left a comment


Code Review

This pull request corrects a bug in the set_dtype_and_device method where the offload_device, onload_device, and preparing_device were incorrectly defaulting to computation_dtype instead of computation_device. The review feedback suggests further refining this logic to ensure that if any of the corresponding data types are set to "disk", the device should also default to "disk" to maintain compatibility with the disk offloading mechanisms.

Comment thread: diffsynth/core/vram/layers.py (Outdated)
When offload_dtype is set to "disk", the offload_device should also default to "disk" to ensure that the disk offloading logic in AutoWrappedModule (which checks self.offload_device == "disk") works correctly. The current fallback to computation_device would cause the onload and preparing methods to skip loading from disk, as they rely on this sentinel value to trigger load_from_disk.
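
A hedged sketch of that refinement, where resolve_device is a hypothetical helper introduced only for illustration; the field names follow the discussion above, and the exact integration point inside set_dtype_and_device is an assumption:

```python
def resolve_device(device, dtype, computation_device):
    # Keep an explicitly configured device as-is. Otherwise, when the
    # paired dtype is the "disk" sentinel, fall back to "disk" so that
    # AutoWrappedModule's `self.offload_device == "disk"` check still
    # triggers load_from_disk instead of silently using computation_device.
    if device is not None:
        return device
    return "disk" if dtype == "disk" else computation_device

# Applied to each dtype/device pair in the fallback:
# self.offload_device   = resolve_device(self.offload_device,   self.offload_dtype,   computation_device)
# self.onload_device    = resolve_device(self.onload_device,    self.onload_dtype,    computation_device)
# self.preparing_device = resolve_device(self.preparing_device, self.preparing_dtype, computation_device)
```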

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>