
ioHelper.cpp:66:5: error: ‘onnx’ has not been declared #38

Open
Tramac opened this issue Jun 16, 2021 · 7 comments


Tramac commented Jun 16, 2021

When I compiled the example from posts/TensorRT-introduction, I got the following error:

ioHelper.cpp: In function ‘std::ostream& nvinfer1::operator<<(std::ostream&, nvinfer1::ILogger::Severity)’:
ioHelper.cpp:52:12: warning: enumeration value ‘kVERBOSE’ not handled in switch [-Wswitch]
     switch (severity)
            ^
ioHelper.cpp: In function ‘size_t nvinfer1::readTensorProto(const string&, float*)’:
ioHelper.cpp:66:5: error: ‘onnx’ has not been declared
     onnx::TensorProto tensorProto;
     ^~~~
ioHelper.cpp:67:10: error: ‘tensorProto’ was not declared in this scope
     if (!tensorProto.ParseFromString(data))
          ^~~~~~~~~~~
ioHelper.cpp:67:10: note: suggested alternative: ‘readTensorProto’
     if (!tensorProto.ParseFromString(data))
          ^~~~~~~~~~~
          readTensorProto
In file included from /home/opt/compiler/gcc-8.2/gcc-8.2/include/c++/8.2.0/cassert:44,
                 from /home/work/protobuf/src/google/protobuf/extension_set.h:42,
                 from /home/work/onnx-tensorrt/build/third_party/onnx/onnx/onnx_onnx2trt_onnx-ml.pb.h:33,
                 from /home/work/onnx-tensorrt/build/third_party/onnx/onnx/onnx-ml.pb.h:2,
                 from /home/work/onnx-tensorrt/third_party/onnx/onnx/onnx_pb.h:50,
                 from ioHelper.cpp:32:
ioHelper.cpp:70:12: error: ‘tensorProto’ was not declared in this scope
     assert(tensorProto.has_raw_data());
            ^~~~~~~~~~~
ioHelper.cpp:70:12: note: suggested alternative: ‘readTensorProto’
make: *** [<builtin>: ioHelper.o] Error 1

I have installed onnx-tensorrt and TensorRT successfully, so why can't the compiler find onnx?

My ioHelper.cpp is identical to the one in this repo.
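
For reference, here is a minimal sketch of the part of readTensorProto that fails, reconstructed from the compiler output above (simplified, not the exact repo file; the include path is what I believe ioHelper.cpp:32 resolves to):

// Minimal repro sketch reconstructed from the compiler output above; simplified,
// not the exact repo file. Assumes stock onnx headers, where the generated
// protobuf types live in namespace onnx.
#define ONNX_ML 1
#include <cassert>
#include <string>
#include <onnx/onnx_pb.h>   // what ioHelper.cpp:32 appears to pull in

size_t readTensorProto(const std::string& data, float* /*buffer*/)
{
    onnx::TensorProto tensorProto;            // line 66: 'onnx' has not been declared
    if (!tensorProto.ParseFromString(data))   // protobuf-generated parser
        return 0;
    assert(tensorProto.has_raw_data());       // line 70
    return tensorProto.raw_data().size();     // the real code copies raw_data() into buffer
}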

Any reply would be much appreciated!

@harrism @angererc @nsakharnykh

@mk-nvidia

@Tramac you're missing the onnx library. pip install onnx should install it for you. For more details see https://github.com/onnx/onnx#installation


Tramac commented Jun 23, 2021

@mk-nvidia Thanks for your reply!
The output of pip list is as follows:

Package           Version
----------------- --------
appdirs           1.4.4
dataclasses       0.8
decorator         5.0.9
graphsurgeon      0.4.1
Mako              1.1.4
MarkupSafe        2.0.1
numpy             1.19.5
onnx              1.9.0
onnx-tensorrt     0.1.0
pip               21.1.2
protobuf          3.17.3
pycuda            2020.1
pytools           2021.2.7
setuptools        57.0.0
six               1.16.0
tensorrt          7.0.0.11
torch             1.4.0
typing-extensions 3.10.0.0
uff               0.6.5
wheel             0.36.2

where onnx, onnx-tensorrt, and tensorrt have all been installed successfully.


harrism commented Jun 23, 2021

@mk-nvidia yeah this definitely isn't just a library installation problem. The errors are in one of the files in the present repo, saying onnx is not declared, so I suspect a header has changed between versions.
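
One thing worth checking, though this is only a guess from the include chain in your log: the headers being picked up come from onnx-tensorrt's third_party build, and the generated file name onnx_onnx2trt_onnx-ml.pb.h suggests they were generated with a custom protobuf namespace (onnx2trt_onnx) rather than onnx. If that is the case, something like the sketch below (hypothetical, I haven't tried it) would let the existing onnx:: code compile against those headers:

// Hypothetical sketch, not a verified fix: if the included headers declare the
// generated types under onnx2trt_onnx (as the header name in the include chain
// suggests), an alias after the include makes the existing onnx:: usages resolve.
#define ONNX_ML 1
#include <onnx/onnx_pb.h>          // resolves to the onnx-tensorrt copy in this setup

namespace onnx = onnx2trt_onnx;    // alias so onnx::TensorProto compiles

// Whether the result then links depends on which libonnx_proto is on the link line.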


Tramac commented Jun 23, 2021

@harrism How can I fix this problem? I have no idea where to start.
Looking forward to your reply!


Tramac commented Jun 23, 2021

@harrism The new error:

ioHelper.o: In function `nvinfer1::readTensorProto(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, float*)':
ioHelper.cpp:(.text+0x16b): undefined reference to `onnx::TensorProto::TensorProto()'
ioHelper.cpp:(.text+0x184): undefined reference to `google::protobuf::MessageLite::ParseFromString(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)'
ioHelper.cpp:(.text+0x27b): undefined reference to `onnx::TensorProto::~TensorProto()'
ioHelper.cpp:(.text+0x2b0): undefined reference to `onnx::TensorProto::~TensorProto()'
collect2: error: ld returned 1 exit status
make: *** [<builtin>: simpleOnnx_1] Error 1

where the Makefile is as follows:

CXXFLAGS=-std=c++11 -DONNX_ML=1 -Wall -I$(CUDA_INSTALL_DIR)/include -I$(TENSORRT_LIBRARY_DIR)/include
LDFLAGS=-L$(CUDA_INSTALL_DIR)/lib64 -L$(CUDA_INSTALL_DIR)/lib64/stubs -L/usr/local/lib
LDLIBS=-Wl,--start-group -L$(ONNX_TENSORRT_DIR)/build/third_party/onnx -L$(PROTOBUF_DIR) -lnvonnxparser -lnvinfer -lcudart_static -lonnx -lonnx_proto -lprotobuf -lstdc++ -lm -lrt -ldl -lpthread -Wl,--end-group

HEADERS=${wildcard *.h}
TARGET_SRCS=$(wildcard simpleOnnx*.cpp)
TARGET_OBJS=${TARGET_SRCS:.cpp=.o}
TARGETS=${TARGET_OBJS:.o=}

This error is the same as #34, but that issue hasn't been resolved yet.
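
If it helps, my current reading of the link error, which is only a guess and unverified: ioHelper.o references the types in namespace onnx, while the libonnx_proto built by onnx-tensorrt may export the same types under onnx2trt_onnx, so the mangled names never match. The variant I am considering is to write the helper against an ONNX_NAMESPACE macro and pass the real namespace on the compile line (for example -DONNX_NAMESPACE=onnx2trt_onnx), roughly:

// Sketch only, my own guess and unverified: avoid hard-coding the namespace and
// let the compile line decide, so compile-time references and library symbols
// stay consistent.
#ifndef ONNX_NAMESPACE
#define ONNX_NAMESPACE onnx              // default namespace of a stock onnx build
#endif

#define ONNX_ML 1
#include <onnx/onnx_pb.h>

// Inside readTensorProto the declaration would then become:
//     ONNX_NAMESPACE::TensorProto tensorProto;
// with the rest of the function unchanged.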


harrism commented Jun 23, 2021


I don't know, or I would have already told you. :) This is why I asked @mk-nvidia for help.


Tramac commented Jun 23, 2021

@harrism Thanks for your reply. I will try a different version and post an update here if it gets resolved.
