
Conversation

@mengniwang95 (Contributor)

Bug fix

@wenhuach21 (Contributor) left a comment

Nice catch. Could you help add a UT?

@wenhuach21 added this to the 0.8.0 milestone on Oct 10, 2025.
@wenhuach21 (Contributor) commented on the line

if re.search(re.compile(name), layer_name) is not None:

It should be matched and converted to the full layer name.
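A minimal sketch of the matching behavior discussed here, assuming a hypothetical helper name (expand_layer_config) and illustrative layer names; this is not AutoRound's actual API:

import re

def expand_layer_config(layer_config, all_layer_names):
    # Re-key shorthand/regex entries such as "self_attn" under the full layer
    # names they match, so downstream code only sees fully qualified names.
    expanded = {}
    for pattern, cfg in layer_config.items():
        for layer_name in all_layer_names:
            if re.search(re.compile(pattern), layer_name) is not None:
                expanded[layer_name] = cfg  # converted to the full name
    return expanded

layer_names = ["model.layers.0.self_attn.q_proj", "model.layers.0.mlp.gate_proj"]
cfg = {"self_attn": {"bits": 4, "data_type": "mx_fp", "group_size": 32}}
print(expand_layer_config(cfg, layer_names))
# {'model.layers.0.self_attn.q_proj': {'bits': 4, 'data_type': 'mx_fp', 'group_size': 32}}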

"self_attn": {"bits": 4, "data_type": "mx_fp", "act_bits": 4, "act_data_type": "mx_fp", "group_size": 32}
}
autoround = AutoRound(
self.model,
Contributor

Please just pass self.model_name in the future; it is better not to reuse the same model.

@mengniwang95 (Contributor, Author)

Updated.

layer_config=layer_config,
amp=False,
)
autoround.quantize_and_save(self.save_folder, inplace=False)
Contributor

If you call save, delete the folder after the UT.

@mengniwang95 (Contributor, Author)

Done.
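A minimal sketch of what the reworked test could look like after both review comments. The class name, test name, and class attributes (model_name, save_folder) are assumptions, and the constructor arguments simply mirror the snippet above; this is not a verbatim copy of the merged test.

import shutil
import unittest

from auto_round import AutoRound

class TestLayerConfig(unittest.TestCase):
    model_name = "facebook/opt-125m"    # assumed small test model
    save_folder = "./saved_test_model"  # assumed output directory

    def test_regex_layer_config(self):
        layer_config = {
            "self_attn": {"bits": 4, "data_type": "mx_fp", "act_bits": 4,
                          "act_data_type": "mx_fp", "group_size": 32}
        }
        autoround = AutoRound(
            self.model_name,  # pass the model name rather than a shared model object
            layer_config=layer_config,
            amp=False,
        )
        autoround.quantize_and_save(self.save_folder, inplace=False)
        # clean up the saved folder so the UT leaves no artifacts behind
        shutil.rmtree(self.save_folder, ignore_errors=True)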

@mengniwang95 (Contributor, Author)

@wenhuach21 CI has passed; can we merge this PR?

@wenhuach21 merged commit b693a2a into main on Oct 11, 2025 (14 checks passed).
@wenhuach21 deleted the mengniwang95-patch-2 branch on October 11, 2025 at 13:11.