
Commit ada3bb9

fix: remove duplicated code in TemporalBasicTransformerBlock (huggingface#7212)
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
1 parent b5814c5 commit ada3bb9

File tree

1 file changed (+0, −1 lines)


src/diffusers/models/attention.py

Lines changed: 0 additions & 1 deletion
@@ -440,7 +440,6 @@ def __init__(
 
         # Define 3 blocks. Each block has its own normalization layer.
         # 1. Self-Attn
-        self.norm_in = nn.LayerNorm(dim)
         self.ff_in = FeedForward(
             dim,
             dim_out=time_mix_inner_dim,
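The deleted line was a second, identical `self.norm_in = nn.LayerNorm(dim)` assignment. A minimal sketch (using a hypothetical `Block` class, not the actual diffusers module) of why such a duplicate is harmless to behavior but wasteful: assigning an `nn.Module` attribute twice registers only the last module, so the first `LayerNorm` is constructed and initialized for nothing.

```python
import torch.nn as nn

class Block(nn.Module):
    # Hypothetical illustration, not the diffusers TemporalBasicTransformerBlock.
    def __init__(self, dim):
        super().__init__()
        self.norm_in = nn.LayerNorm(dim)
        # Duplicate: this simply replaces the module registered above,
        # discarding the first LayerNorm after it was already built.
        self.norm_in = nn.LayerNorm(dim)

block = Block(8)
# Despite two assignments, only one "norm_in" submodule is registered.
print(sum(1 for name, _ in block.named_modules() if name == "norm_in"))  # → 1
```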
