Describe the bug
Converting a Keras model to TensorRT core dumps without printing any error or warning. The model is DenseNet121 from keras.applications.
Code to reproduce:
import numpy as np
import os

def save_keras_model(model_file):
    from keras.models import Model
    from keras.layers import Dense, Dropout
    import keras.backend as K
    from keras.applications.densenet import DenseNet121
    import keras.layers as layers
    import keras.models as models
    import keras.utils as utils

    class densenet121(object):
        def __init__(self, image_size):
            self.base_model = DenseNet121(input_shape=(image_size, image_size, 3),
                                          include_top=False, pooling='avg',
                                          backend=K,
                                          layers=layers,
                                          models=models,
                                          utils=utils,
                                          weights=None)
            x = Dropout(0.75)(self.base_model.output)
            x = Dense(3, activation='softmax', name='top_layer')(x)
            self.model = Model(self.base_model.input, x)
            print("Densenet121")

    model = densenet121(512).model
    model.save(model_file)

def forward_transfer(model_file):
    import forward
    # 1. Build the engine
    builder = forward.KerasBuilder()
    infer_mode = 'float32'  # infer mode: float32 / float16 / int8_calib / int8
    batch_size = 1
    max_workspace_size = 1 << 32
    builder.set_mode(infer_mode)
    engine = builder.build(model_file, batch_size)
    engine_path = os.path.splitext(model_file)[0] + '.engine'
    engine.save(engine_path)

def test_forward(model_file, inputs):
    import forward
    engine_path = os.path.splitext(model_file)[0] + '.engine'  # fixed: the '.' was missing
    engine = forward.KerasEngine()
    engine.load(engine_path)
    # inputs = np.ones((1, 24, 24, 3))
    outputs = engine.forward([inputs])  # list-type output
    print(outputs)

model_path = 'densenet121.h5'
save_keras_model(model_path)
x = np.ones((1, 512, 512, 3))
forward_transfer(model_path)
test_forward(model_path, x)
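One detail worth checking in the script above: test_forward must load exactly the path forward_transfer saved, and the os.path.splitext construction silently produces a wrong filename if the leading dot is dropped. A quick standalone check (engine_path_for is an illustrative helper, not part of the forward API):

```python
import os

def engine_path_for(model_file):
    # Swap the model file's extension for '.engine'. Note the leading dot;
    # writing + 'engine' instead yields e.g. 'densenet121engine', so the
    # subsequent engine.load() would look for a file that was never saved.
    return os.path.splitext(model_file)[0] + '.engine'

print(engine_path_for('densenet121.h5'))  # densenet121.engine
```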
Environment
TensorRT Version: 7.1.3.4
NVIDIA GPU: T4
NVIDIA Driver Version: 410.104
CUDA Version: 10.2
CUDNN Version: 8.0
Operating System: Ubuntu 18.04
Python Version (if applicable): 3.6.9
Tensorflow Version (if applicable): 1.15.0
PyTorch Version (if applicable): 1.7.0
@ys0232 Sorry for the late reply. We have fixed the problem you mentioned above in the latest commit (1a9516e). You can pull the latest master branch and try your code again.
@yuanzexi Thanks for the quick fix. I have a follow-up question: can we map one TensorFlow operation to several TensorRT operations in this framework? For example, converting "pack" in TensorFlow to "shuffle" + "concatenation" in TensorRT.
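For what it's worth, that decomposition is semantically sound: TensorFlow's pack (tf.stack) is equivalent to inserting a unit dimension on each input (a shuffle/reshape in TensorRT terms) and then concatenating along it. A NumPy sketch of the equivalence (the pack helper below is illustrative, not part of Forward or TensorRT):

```python
import numpy as np

def pack(tensors, axis=0):
    # "Shuffle" step: insert a unit dimension at `axis` on every input.
    expanded = [np.expand_dims(t, axis) for t in tensors]
    # "Concatenation" step: join the reshaped inputs along that axis.
    return np.concatenate(expanded, axis=axis)

a, b = np.zeros((2, 3)), np.ones((2, 3))
print(pack([a, b], axis=0).shape)  # (2, 2, 3), same as np.stack([a, b]).shape
```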