https://github.com/cavalleria/cavaface.pytorch/blob/e9b9bd8ee06de51649ee202712d673f8e64415e9/backbone/resattnet.py#L79
Is it intentional that the stage1 attention block uses addition for the out_trunk whereas the rest use multiplication?
Other repositories that implement this method appear to use multiplication here, which leads me to believe it is a mistake. However, since this model achieves such good accuracy, I'm tempted to ask whether there was a reason behind it, or whether the other stages should use addition as well.
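For context, here is a minimal sketch contrasting the two combining rules in question. The tensor names, shapes, and the exact form of the additive variant are assumptions for illustration, not copied from resattnet.py; the multiplicative form matches the attention residual learning formula H(x) = (1 + M(x)) * T(x) from the Residual Attention Network paper (Wang et al., 2017).

```python
import torch

# Illustrative stand-ins for the two branch outputs at the cited line;
# names and shapes are assumptions, not taken from resattnet.py.
out_trunk = torch.randn(1, 64, 56, 56)                 # trunk branch output T(x)
out_mask = torch.sigmoid(torch.randn(1, 64, 56, 56))   # soft mask branch M(x), values in (0, 1)

# Additive variant (one plausible reading of what the stage1 block does):
out_add = (1 + out_mask) + out_trunk

# Multiplicative variant, as the other stages reportedly do; this matches
# the paper's attention residual learning formula H(x) = (1 + M(x)) * T(x),
# where the mask gates the trunk features rather than offsetting them.
out_mul = (1 + out_mask) * out_trunk
```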