
About PyTorch execute failure: forward() is missing value for argument 'input'. error #3633

Closed
lincong8722 opened this issue Nov 30, 2021 · 4 comments

Comments

lincong8722 commented Nov 30, 2021

The model starts up successfully, but in the client's test phase I get "PyTorch execute failure: forward() is missing value for argument 'input'". The traceback and my ensemble model config are below. How can I solve it?

Traceback (most recent call last):
File "test.py", line 151, in
responses.append(async_request.get_result())
File "/usr/local/lib/python3.8/dist-packages/tritonclient/http/init.py", line 1474, in get_result
_raise_if_error(response)
File "/usr/local/lib/python3.8/dist-packages/tritonclient/http/init.py", line 64, in _raise_if_error
raise error
tritonclient.utils.InferenceServerException: in ensemble 'pl_video_tag', PyTorch execute failure: forward() is missing value for argument 'input'. Declaration: forward(torch.multimodal.model.multimodal_transformer.___torch_mangle_9591.Multimodal self, Tensor image, Tensor input) -> ((Tensor, Tensor))
Exception raised from checkAndNormalizeInputs at /opt/pytorch/pytorch/aten/src/ATen/core/function_schema_inl.h:234 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0x6c (0x7fdd8470c63c in /opt/tritonserver/backends/pytorch/libc10.so)
frame #1: c10::detail::torchCheckFail(char const*, char const*, unsigned int, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0xfa (0x7fdd846d7a28 in /opt/tritonserver/backends/pytorch/libc10.so)
frame #2: <unknown function> + 0x12401ed (0x7fdd474a01ed in /opt/tritonserver/backends/pytorch/libtorch_cpu.so)
frame #3: torch::jit::GraphFunction::operator()(std::vector<c10::IValue, std::allocator<c10::IValue> >, std::unordered_map<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, c10::IValue, std::hash<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::equal_to<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::allocator<std::pair<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const, c10::IValue> > > const&) + 0x31 (0x7fdd4a04d791 in /opt/tritonserver/backends/pytorch/libtorch_cpu.so)
frame #4: torch::jit::Method::operator()(std::vector<c10::IValue, std::allocator<c10::IValue> >, std::unordered_map<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, c10::IValue, std::hash<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::equal_to<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::allocator<std::pair<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const, c10::IValue> > > const&) + 0x168 (0x7fdd4a060538 in /opt/tritonserver/backends/pytorch/libtorch_cpu.so)
frame #5: <unknown function> + 0x18ad8 (0x7fdd84ffcad8 in /opt/tritonserver/backends/pytorch/libtriton_pytorch.so)
frame #6: <unknown function> + 0x1efd2 (0x7fdd85002fd2 in /opt/tritonserver/backends/pytorch/libtriton_pytorch.so)
frame #7: TRITONBACKEND_ModelInstanceExecute + 0x38a (0x7fdd8500466a in /opt/tritonserver/backends/pytorch/libtriton_pytorch.so)
frame #8: <unknown function> + 0x30c859 (0x7fddc48c7859 in /opt/tritonserver/bin/../lib/libtritonserver.so)
frame #9: <unknown function> + 0x109ec0 (0x7fddc46c4ec0 in /opt/tritonserver/bin/../lib/libtritonserver.so)
frame #10: <unknown function> + 0xd6de4 (0x7fddc4110de4 in /lib/x86_64-linux-gnu/libstdc++.so.6)
frame #11: <unknown function> + 0x9609 (0x7fddc458e609 in /lib/x86_64-linux-gnu/libpthread.so.0)
frame #12: clone + 0x43 (0x7fddc3dfe293 in /lib/x86_64-linux-gnu/libc.so.6)

——————————————————————————————————————————————————

name: "pl_video_tag"
platform: "ensemble"
max_batch_size: 128

input [
    {
        name: "IMAGE"
        data_type: TYPE_UINT8
        dims: [-1]
    }
]

output [
  {
    name: "postp_output0"
    data_type: TYPE_STRING
    dims: [-1]
  }
]

ensemble_scheduling {
    step [
        {
            model_name: "pl_video_tag_prep"
            model_version: -1
            input_map {
                key: "prep_input0"
                value: "IMAGE"
            }
            output_map {
                key: "prep_output0"
                value: "prep_output0"
            }
        },
        {
            model_name: "pl_video_tag_ml"
            model_version: -1
            input_map {
                key: "ml_input__0"
                value: "prep_output0"
            }
            output_map {
                key: "ml_output__0"
                value: "ml_output__0"
            }
        },
        {
            model_name: "pl_video_tag_postp"
            model_version: -1
            input_map {
                key: "postp_input0"
                value: "ml_output__0"
            }
            output_map {
                key: "postp_output0"
                value: "postp_output0"
            }
        }
    ]
}
lincong8722 changed the title from "About Unable to get device UUID error" to "About PyTorch execute failure: forward() is missing value for argument 'input'. error" on Nov 30, 2021
@CoderHam (Contributor)

@lincong8722 it looks like the PyTorch model expects two inputs but only one is given. Can you share the model config for the PyTorch TorchScript model?
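
For reference, the Triton PyTorch backend matches config inputs to forward() arguments positionally via the __<index> suffix, so a two-argument forward needs two inputs declared in the model config, and both must be mapped in the ensemble. A rough sketch of the shape that config would take (the dtypes and dims below are placeholders, not your actual values):

name: "pl_video_tag_ml"
backend: "pytorch"
max_batch_size: 128

input [
    {
        name: "ml_input__0"    # maps to the first forward() argument: image
        data_type: TYPE_FP32   # placeholder dtype
        dims: [-1]             # placeholder dims
    },
    {
        name: "ml_input__1"    # maps to the second forward() argument: input
        data_type: TYPE_INT64  # placeholder dtype
        dims: [-1]             # placeholder dims
    }
]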

@lincong8722 (Author)

@CoderHam Hello, my .pt model prints like this:

RecursiveScriptModule(
  original_name=Multimodal
  (visual): RecursiveScriptModule(
    original_name=VisualTransformer
    (conv1): RecursiveScriptModule(original_name=Conv2d)
    (ln_pre): RecursiveScriptModule(original_name=LayerNorm)
    (transformer): RecursiveScriptModule(
      original_name=Transformer
      (resblocks): RecursiveScriptModule(
        original_name=Sequential
        (0): RecursiveScriptModule(
          original_name=ResidualAttentionBlock
          (attn): RecursiveScriptModule(
            original_name=MultiheadAttention
            (out_proj): RecursiveScriptModule(original_name=_LinearWithBias)
          )
          (ln_1): RecursiveScriptModule(original_name=LayerNorm)
          (mlp): RecursiveScriptModule(
            original_name=Sequential
            (c_fc): RecursiveScriptModule(original_name=Linear)
            (gelu): RecursiveScriptModule(original_name=QuickGELU)
            (c_proj): RecursiveScriptModule(original_name=Linear)
          )
          (ln_2): RecursiveScriptModule(original_name=LayerNorm)
        )
        ... (blocks (1) through (11) omitted here; each is identical in structure to block (0) above) ...
      )
    )
    (ln_post): RecursiveScriptModule(original_name=LayerNorm)
  )
  (transformer): RecursiveScriptModule(
    original_name=Transformer
    (resblocks): RecursiveScriptModule(
      original_name=Sequential
      (0): RecursiveScriptModule(
        original_name=ResidualAttentionBlock
        (attn): RecursiveScriptModule(
          original_name=MultiheadAttention
          (out_proj): RecursiveScriptModule(original_name=_LinearWithBias)
        )
        (ln_1): RecursiveScriptModule(original_name=LayerNorm)
        (mlp): RecursiveScriptModule(
          original_name=Sequential
          (c_fc): RecursiveScriptModule(original_name=Linear)
          (gelu): RecursiveScriptModule(original_name=QuickGELU)
          (c_proj): RecursiveScriptModule(original_name=Linear)
        )
        (ln_2): RecursiveScriptModule(original_name=LayerNorm)
      )
      ... (blocks (1) through (11) omitted here; each is identical in structure to block (0) above) ...
    )
  )
  (token_embedding): RecursiveScriptModule(original_name=Embedding)
  (ln_final): RecursiveScriptModule(original_name=LayerNorm)
)


lincong8722 commented Dec 1, 2021

@CoderHam I printed model.code:

def forward(self,
    image: Tensor,
    input: Tensor) -> Tuple[Tensor, Tensor]:
  _0 = self.logit_scale
  _1 = self.text_projection
  _2 = self.ln_final
  _3 = self.transformer
  _4 = self.positional_embedding
  _5 = self.token_embedding
  _6 = self.visual
  input0 = torch.to(image, torch.device("cuda:0"), 5, False, False, None)
  _7 = (_6).forward1(input0, )
  x = torch.to((_5).forward1(input, ), torch.device("cuda:0"), 5, False, False, None)
  _8 = torch.to(_4, torch.device("cuda:0"), 5, False, False, None)
  x4 = torch.add(x, _8, alpha=1)
  x5 = torch.permute(x4, [1, 0, 2])
  x6 = torch.permute((_3).forward1(x5, ), [1, 0, 2])
  x7 = torch.to((_2).forward1(x6, ), torch.device("cuda:0"), 5, False, False, None)
  _9 = ops.prim.NumToTensor(torch.size(x7, 0))
  _10 = torch.arange(annotate(number, _9), dtype=None, layout=0, device=torch.device("cpu"), pin_memory=False)
  _11 = torch.argmax(input, -1, False)
  _12 = torch.to(_10, dtype=4, layout=0, device=torch.device("cuda:0"), pin_memory=None, non_blocking=False, copy=False, memory_format=None)
  _13 = torch.to(_11, dtype=4, layout=0, device=torch.device("cuda:0"), pin_memory=None, non_blocking=False, copy=False, memory_format=None)
  _14 = annotate(List[Optional[Tensor]], [_12, _13])
  input1 = torch.matmul(torch.index(x7, _14), _1)
  image_features = torch.div(_7, torch.frobenius_norm(_7, [-1], True))
  _15 = torch.frobenius_norm(input1, [-1], True)
  text_features = torch.div(input1, _15)
  logit_scale = torch.exp(_0)
  _16 = torch.mul(logit_scale, image_features)
  _17 = torch.matmul(_16, torch.t(text_features))
  _18 = torch.matmul(torch.mul(logit_scale, text_features), torch.t(image_features))
  return (_17, _18)

So the deployed model's forward does take two inputs:

    def forward(self, image, text):
        image_features = self.encode_image(image)
        text_features = self.encode_text(text)

The model has two inputs, but my inference code only ever passes one; it never uses text_features, like this:

    with torch.no_grad():
        image_features = self.model.encode_image(image)

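Would re-exporting an image-only wrapper work, so that forward() takes a single tensor? A rough sketch of what I mean (ImageEncoder is my own name, and this assumes encode_image is still callable as an exported method on the loaded ScriptModule; otherwise I would need the original model code to re-export):

    import torch

    class ImageEncoder(torch.nn.Module):
        """Wrap the multimodal model so forward() needs only the image."""
        def __init__(self, multimodal):
            super().__init__()
            self.multimodal = multimodal

        def forward(self, image: torch.Tensor) -> torch.Tensor:
            # Only the image branch is used; the text branch is never called.
            return self.multimodal.encode_image(image)

    model = torch.jit.load("model.pt")               # the existing two-input model
    wrapper = torch.jit.script(ImageEncoder(model))  # script the one-input wrapper
    wrapper.save("model_image_only.pt")              # deploy this file instead
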
What should I do?


CoderHam commented Dec 1, 2021

@lincong8722 you should reach out to the people who created the model, as they will be best able to explain this. This is not a Triton issue but rather a PyTorch framework issue. Closing this ticket as such.

CoderHam closed this as completed Dec 1, 2021