This repository has been archived by the owner on Jan 24, 2024. It is now read-only.

can't convert from caffe model. #521

Closed
szad670401 opened this issue May 26, 2019 · 1 comment

Comments

szad670401 commented May 26, 2019

My config:

> OPTIONS:
>     Framework: CAFFE
>     SavePath: ./output
>     ResultName: face_r100
>     Config:
>         LaunchBoard: ON
>         Server:
>             ip: 0.0.0.0
>             port: 8888
>         OptimizedGraph:
>             enable: OFF
>             path: ./googlenet.paddle_inference_model.bin.saved
>     LOGGER:
>         LogToPath: ./log/
>         WithColor: ON
> 
> TARGET:
>     CAFFE:
>         # path of fluid inference model
>         Debug: NULL                            # Generally no need to modify.
>         PrototxtPath: ./model/model.prototxt        # The upper path of a fluid inference model.
>         ModelPath: ./model/model.caffmodel        # The upper path of a fluid inference model.
>         NetType:        

Traceback (most recent call last):
  File "converter.py", line 79, in <module>
    graph = Graph(config)
  File "/root/Anakin/tools/external_converter_v2/parser/graph.py", line 26, in __init__
    raise NameError('ERROR: GrapProtoIO not support %s model.' % (config.framework))
NameError: ERROR: GrapProtoIO not support CAFFE model.

@Jayoprell (Collaborator)

Did you set the path of the Caffe proto? You must set the Caffe proto path before converting.
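For reference, a missing Caffe proto path would explain the `GrapProtoIO not support CAFFE model` error above. A minimal sketch of the relevant `TARGET` section is below, assuming the `ProtoPaths` key from the stock converter template; the `caffe.proto` location is a placeholder you would replace with the proto from your own Caffe checkout:

```
TARGET:
    CAFFE:
        # ProtoPaths is assumed from the converter's template config.yaml;
        # point it at the caffe.proto of the Caffe build the model was trained with.
        ProtoPaths:
            - /path/to/caffe/src/caffe/proto/caffe.proto
        PrototxtPath: ./model/model.prototxt
        ModelPath: ./model/model.caffemodel
        NetType:
```

Separately, the `ModelPath` in the pasted config ends in `.caffmodel`, which looks like a typo for `.caffemodel` and is worth double-checking once the proto path is set.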
