
gluon.SymbolBlock cannot import resnet trained with dtype="float16" #11849

wgchang opened this issue Jul 21, 2018 · 5 comments



commented Jul 21, 2018


Cannot load a resnet101 fine-tuned with dtype="float16" (from incubator-mxnet/example/image-classification/symbols/) using the gluon.SymbolBlock.imports method.

Error Message:

AssertionError: Failed loading Parameter 'stage3_unit2_conv2_weight' from saved params: dtype incompatible expected <type 'numpy.float32'> vs saved <type 'numpy.float16'>

Minimal reproducible example

net = gluon.SymbolBlock.imports('resnet-101-symbol.json', ['data', 'softmax_label'], 'resnet-101-0007.params')  # This line gives the error message.
net = gluon.SymbolBlock.imports('resnet-101-symbol.json', ['data', 'softmax_label'])  # Without the params file, the import succeeds.

My Questions

In incubator-mxnet/example/image-classification/symbols/, there is mx.sym.Cast for type conversion.
I fine-tuned resnet101 with dtype="float16", and I need to load this model as a HybridBlock. However, the gluon.SymbolBlock.imports method creates every parameter in the network as float32, so the trained model cannot be loaded.

Here, resnet-101-0007.params was trained with the argument dtype='float16'.
In the resnet-101-symbol.json file, there is a Cast op:
"op": "Cast",
"name": "cast0",
"attrs": {"dtype": "float16"},
"inputs": [[7, 0, 0]]
It seems that gluon.SymbolBlock.imports does not consider the type conversion operator.
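
One way to confirm this is to run type inference on the loaded symbol directly. Below is a minimal sketch, assuming the file name from this issue and a float32 'data' input that the Cast op then converts:

import mxnet as mx

sym = mx.sym.load('resnet-101-symbol.json')
# Infer dtypes from a float32 'data' input; parameters downstream of the
# Cast op should be reported as float16, matching the saved params file.
arg_types, out_types, aux_types = sym.infer_type(data='float32')
for name, dtype in zip(sym.list_arguments(), arg_types):
    print(name, dtype)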

For now, I think I need to load all parameters manually, cast their types, and then save them.
Is there any other solution to this problem?
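
One possible alternative (a sketch, not verified against this exact model) is to import the symbol without the params file, cast the whole Block to float16 first, and only then load the saved parameters:

import mxnet as mx
from mxnet import gluon

# Import the graph only, then align the Block's parameter dtypes with the
# saved float16 params before loading them.
net = gluon.SymbolBlock.imports('resnet-101-symbol.json', ['data', 'softmax_label'])
net.cast('float16')
net.collect_params().load('resnet-101-0007.params', ctx=mx.cpu())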



commented Jul 23, 2018

@sandeep-krishnamurthy Please help to label this issue Gluon



commented Aug 14, 2018

I got the same problem. Are there any updates?



commented Aug 22, 2018

@rahul003 Same problem here; I can't load back saved float16 models.



commented Aug 29, 2018

Using the snippet below:

import mxnet as mx
from mxnet.gluon.model_zoo import vision

ctx = mx.cpu(0)
data = mx.nd.zeros((1, 3, 224, 224), ctx=ctx, dtype='float64')
# The original line here was truncated; a model_zoo resnet34 is assumed.
net_fp32 = vision.resnet34_v2(pretrained=True, ctx=ctx)
net_fp32.cast('float64')  # cast the pretrained fp32 network to the target dtype
net_fp32.hybridize()      # export() requires a hybridized block
pred = net_fp32.forward(data)
net_fp32.export('resnet34_fp16', 0)
print('export fp16 model')

sm = mx.sym.load('resnet34_fp16-symbol.json')
inputs = mx.sym.var('data', dtype='float64')
net_fp16 = mx.gluon.SymbolBlock(sm, inputs)
# This load fails: SymbolBlock created the parameters as fp32.
net_fp16.collect_params().load('resnet34_fp16-0000.params', ctx)
pred = net_fp16.forward(data)

Below are my findings:

  1. Casting worked fine.
  2. Saved parameters are in the correct format (fp64 in my sample code).
  3. sym.load worked fine. If I infer the symbol's types (sym.infer_type(data='float64')), I get the correct inferred type (float64) for all params.

Below is the issue:

  1. When you create mx.gluon.SymbolBlock(sm, inputs), it creates the parameters in the Block, but no dtype is passed when creating them: if there is no parameter to get, a new one is created with the default dtype (fp32). A rough sketch of the fix direction follows this list.
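
Below is a hypothetical illustration of that fix direction (not the actual patch), reusing sm, inputs, and ctx from the snippet above: infer the dtypes from the symbol, then cast each created parameter before loading.

# Hypothetical sketch: align each parameter's dtype with the inferred type.
arg_types, _, aux_types = sm.infer_type(data='float64')
type_map = dict(zip(sm.list_arguments() + sm.list_auxiliary_states(),
                    list(arg_types) + list(aux_types)))
net_fp16 = mx.gluon.SymbolBlock(sm, inputs)
for name, param in net_fp16.collect_params().items():
    if type_map.get(name) is not None:
        param.cast(type_map[name])
net_fp16.collect_params().load('resnet34_fp16-0000.params', ctx)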

I am working on the fix.

@apeforest @ThomasDelteil - FYI



commented Sep 18, 2018

Resolving, as the changes have been merged.
