[Bug] Type inference of nn.softmax does not reject invalid axis #11684

@wzh99

Description

Expected behavior

The following Relay program should NOT pass type inference:

#[version = "0.0.5"]
def @main(%x: Tensor[(4), float32]) {
  nn.softmax(%x, axis=1)
}

The input tensor %x of nn.softmax has only one dimension, so the valid range of axis is [-1, 1). axis=1 is clearly out of range and should be rejected.
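For comparison (an illustrative check, not part of the report): NumPy uses the same [-ndim, ndim) axis convention and rejects axis=1 on a 1-D array at the point of use:

```python
import numpy as np

x = np.ones(4, dtype=np.float32)  # 1-D, like Tensor[(4), float32]
np.exp(x).sum(axis=0)             # valid: axis 0 is within [-1, 1)
try:
    np.exp(x).sum(axis=1)         # invalid: axis 1 on a 1-D array
except IndexError as e:           # numpy's AxisError subclasses IndexError
    print("rejected:", e)
```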

Actual behavior

This program passes Relay's type inference without any error.

Environment

macOS 12.4. Compiled using Clang 13.1.6 with LLVM support. TVM commit 0df6961.

Steps to reproduce

from tvm import relay, IRModule

x = relay.var('x', shape=(4,))          # 1-D input
y = relay.nn.softmax(x, axis=1)         # axis=1 is invalid for a 1-D tensor
mod = IRModule.from_expr(y)
mod = relay.transform.InferType()(mod)  # succeeds; the invalid axis is not reported

Possible fix

tvm/src/relay/op/nn/nn.cc

Lines 409 to 423 in 0df6961

RELAY_REGISTER_OP("nn.softmax")
    .describe(R"code(Softmax layer.

.. math:: \text{softmax}(x)_i = \frac{exp(x_i)}{\sum_j exp(x_j)}

.. note::
    This operator can be optimized away for inference.

- **data**: The input data
)code" TVM_ADD_FILELINE)
    .set_attrs_type<SoftmaxAttrs>()
    .set_num_inputs(1)
    .add_argument("data", "Tensor", "The input tensor.")
    .set_support_level(1)
    .add_type_rel("Identity", IdentityRel);

In the operator registration of nn.softmax, the type relation is set to IdentityRel. However, nn.softmax carries an axis attribute that IdentityRel never checks.

A possible fix is to implement a new type relation function that validates the axis attribute in SoftmaxAttrs. The same type relation would also apply to nn.fast_softmax and nn.log_softmax, which share the same attribute.
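A minimal Python sketch of the check such a type relation would perform (illustrative only; the function name is hypothetical, and an actual fix would be a C++ type relation registered in nn.cc in place of IdentityRel):

```python
def check_softmax_axis(shape, axis):
    """Reject any axis outside [-ndim, ndim); softmax is shape-preserving."""
    ndim = len(shape)
    if not -ndim <= axis < ndim:
        raise TypeError(
            f"softmax: axis {axis} is out of range for a {ndim}-D input "
            f"(expected axis in [{-ndim}, {ndim}))"
        )
    return shape  # output shape equals input shape, as IdentityRel assigns

check_softmax_axis((4,), 0)     # accepted
check_softmax_axis((4,), -1)    # accepted
# check_softmax_axis((4,), 1)   # raises TypeError: the case in this report
```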
