
[BUG] bf16bf16bf16 Matmul failed #263

@WangJialei-A

Description


https://github.com/intel/graph-compiler/actions/runs/10463928001/job/28976702848?pr=161

Here is a minimal module that reproduces the failure:

module {
  func.func @entry(%arg0: tensor<16x4096xbf16>, %arg1: tensor<4096x4xbf16>) -> tensor<16x4xbf16> attributes {llvm.emit_c_interface} {
    %cst = arith.constant 0.000000e+00 : bf16
    %0 = tensor.empty() : tensor<16x4xbf16>
    %1 = linalg.fill ins(%cst : bf16) outs(%0 : tensor<16x4xbf16>) -> tensor<16x4xbf16>
    %2 = linalg.matmul {cast = #linalg.type_fn<cast_signed>} ins(%arg0, %arg1 : tensor<16x4096xbf16>, tensor<4096x4xbf16>) outs(%1 : tensor<16x4xbf16>) -> tensor<16x4xbf16>
    return %2 : tensor<16x4xbf16>
  }
}
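For context (not part of the original report): a host-side reference for this case can be computed by rounding random inputs to bf16 and accumulating in float32, then comparing against the kernel output with a relaxed tolerance, since K = 4096 reduced-precision accumulations will not match bit-exactly. The sketch below assumes NumPy plus the ml_dtypes package for a bfloat16 dtype; the check() helper and the tolerances are illustrative and are not the project's actual test harness.

# Reference-check sketch; assumes NumPy and ml_dtypes, not the project's test harness.
import numpy as np
import ml_dtypes  # numpy-compatible bfloat16 dtype

rng = np.random.default_rng(0)

# Inputs rounded to bf16, matching the 16x4096 and 4096x4 shapes in the module.
a = rng.standard_normal((16, 4096), dtype=np.float32).astype(ml_dtypes.bfloat16)
b = rng.standard_normal((4096, 4), dtype=np.float32).astype(ml_dtypes.bfloat16)

# Reference result: accumulate in float32, as a bf16 matmul with fp32 accumulation would.
ref = a.astype(np.float32) @ b.astype(np.float32)

def check(result):
    # Compare the compiled kernel's bf16 output against the fp32 reference.
    got = np.asarray(result, dtype=ml_dtypes.bfloat16).astype(np.float32)
    np.testing.assert_allclose(got, ref, rtol=1e-2, atol=1e-2)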


Labels

bug (Something isn't working)
