The bias of a Linear layer is not taken into account for the FLOPs computation, as shown below:
Example:
import torch
import torch.nn as nn
from thop import profile


class Simple_FC_Net(nn.Module):
    def __init__(self, bias=True):
        super(Simple_FC_Net, self).__init__()
        self.fc = nn.Linear(in_features=17, out_features=49, bias=bias)

    def forward(self, x):
        x = self.fc(x)
        return x


input_simple_fc = torch.randn(1, 17)

# Model without bias
print("Model without bias:")
model = Simple_FC_Net(bias=False)
flops, params = profile(model, inputs=(input_simple_fc,))
print("Flops: {}, Params: {}".format(flops, params))

# Model with bias
print("Model with bias:")
model = Simple_FC_Net(bias=True)
flops, params = profile(model, inputs=(input_simple_fc,))
print("Flops: {}, Params: {}".format(flops, params))
Output:
Model without bias:
Register FLOP counter for module Linear(in_features=17, out_features=49, bias=False)
Flops: 1617.0, Params: 833.0
Model with bias:
Register FLOP counter for module Linear(in_features=17, out_features=49, bias=True)
Flops: 1617.0, Params: 882.0
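For in_features=17 and out_features=49, the reported 1617 corresponds to 49 × (17 multiplications + 16 additions) for the matrix product alone. The bias contributes one extra addition per output feature, so the expected count with bias=True would be 1617 + 49 = 1666, yet thop reports 1617 in both cases; only the parameter count changes (833 vs. 882).

As a possible workaround, here is a minimal sketch that assumes thop's documented custom_ops argument and assumes a custom counter accumulates into m.total_ops the same way the built-in counters do; it overrides the Linear counter so the bias additions are included:

import torch
import torch.nn as nn
from thop import profile

def count_linear_with_bias(m, x, y):
    # x is the tuple of layer inputs, y is the layer output.
    # Matrix product: in_features multiplications + (in_features - 1) additions
    # per output element, matching thop's 1617 for the bias-free case.
    total_ops = (2 * m.in_features - 1) * y.numel()
    if m.bias is not None:
        # One extra addition per output element for the bias.
        total_ops += y.numel()
    m.total_ops += torch.DoubleTensor([int(total_ops)])

# Reusing Simple_FC_Net and input_simple_fc from the example above.
model = Simple_FC_Net(bias=True)
flops, params = profile(model, inputs=(input_simple_fc,),
                        custom_ops={nn.Linear: count_linear_with_bias})
print("Flops: {}, Params: {}".format(flops, params))

With such a counter, the model with bias should report 1666.0 FLOPs while the bias-free model stays at 1617.0.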