[Bug] Fix fpn teacher distill (#388)
fix fpn distill
yivona08 committed Dec 25, 2022
1 parent ae1af1d commit bcd6878
Showing 1 changed file with 4 additions and 3 deletions.
```diff
@@ -30,15 +30,16 @@ def loss(
         # If the `override_data` of a delivery is False, the delivery will
         # record the origin data.
         self.distiller.set_deliveries_override(False)
+
+        # Unlike ``SingleTeacherDistill``, teacher will only execute
+        # back + neck, not head, so there will be no loss.
         if self.teacher_trainable:
-            # Unlike ``SingleTeacherDistill``, teacher will only execute
-            # back + neck, not head, so there will be no loss.
             with self.distiller.teacher_recorders, self.distiller.deliveries:
                 _ = self.teacher.extract_feat(batch_inputs)
         else:
             with self.distiller.teacher_recorders, self.distiller.deliveries:
                 with torch.no_grad():
-                    _ = self.teacher(batch_inputs, data_samples, mode='loss')
+                    _ = self.teacher.extract_feat(batch_inputs)

         # If the `override_data` of a delivery is True, the delivery will
         # override the origin data with the recorded data.
```
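The fix makes both branches call `teacher.extract_feat(batch_inputs)` (backbone + neck only), instead of running the full teacher forward with `mode='loss'` in the frozen case. A minimal, self-contained sketch of that control flow is below; `FakeTeacher`, `no_grad`, and `fpn_teacher_forward` are illustrative stand-ins, not mmrazor APIs, and the real code additionally wraps the call in recorder/delivery context managers.

```python
from contextlib import contextmanager


class FakeTeacher:
    """Hypothetical teacher: extract_feat runs backbone + neck only,
    returning FPN feature maps and producing no loss."""

    def __init__(self):
        self.calls = []

    def extract_feat(self, batch_inputs):
        self.calls.append('extract_feat')
        return [f'fpn_level_{i}' for i in range(3)]


@contextmanager
def no_grad():
    # Stand-in for torch.no_grad(); the real code disables autograd here
    # so the frozen teacher accumulates no gradients.
    yield


def fpn_teacher_forward(teacher, batch_inputs, teacher_trainable):
    # Unlike ``SingleTeacherDistill``, the teacher only executes
    # backbone + neck (extract_feat), never the head, so there is no loss.
    if teacher_trainable:
        feats = teacher.extract_feat(batch_inputs)
    else:
        with no_grad():
            feats = teacher.extract_feat(batch_inputs)
    return feats
```

Either way the recorders attached to the neck outputs see the same FPN features; the `teacher_trainable` flag only decides whether gradients flow through the teacher.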
