In your paper, you report that CBraMod has 4M parameters. However, when the model is loaded and fine-tuned, the number of trainable parameters is much higher and varies by dataset. For example:
- Shu: 25.5M
- Physio: 46M
- BCICIV2a: 19.1M
- Stress: 25.1M
- SEEDV: 15M
For reference, I use `total_params = sum(p.numel() for p in model.parameters() if p.requires_grad)` to get the number of trainable parameters. Could you explain the discrepancy here? Thank you!
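For reproducibility, here is the counting logic above as a self-contained sketch. It uses a toy `nn.Linear` model rather than CBraMod itself, so the printed count is only illustrative:

```python
import torch.nn as nn

def count_trainable_params(model: nn.Module) -> int:
    """Sum the element counts of all parameters that require gradients,
    i.e. the parameters that would be updated during fine-tuning."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# Toy example (not CBraMod): a 10->4 linear layer has
# 10*4 weights + 4 biases = 44 trainable parameters.
model = nn.Linear(10, 4)
print(count_trainable_params(model))  # 44
```

Note that this counts everything with `requires_grad=True`, so any classification head or other dataset-specific layers added for fine-tuning are included, while frozen backbone parameters are not.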