Added output_dir argument to export.py (ssd, lpr, vehicle_attributes) #91

Merged
merged 3 commits on Jun 24, 2019
3 changes: 2 additions & 1 deletion tensorflow_toolkit/lpr/README.md
@@ -129,9 +129,10 @@ To run the model via OpenVINO, one has to freeze the TensorFlow graph and
then convert it to the OpenVINO Intermediate Representation (IR) using the Model Optimizer:

```Bash
python3 tools/export.py --data_type FP32 chinese_lp/config.py
python3 tools/export.py --data_type FP32 --output_dir <export_path> chinese_lp/config.py
```

**Default export path**:
`lpr/model/export_<step>/frozen_graph` - path to frozen graph
`lpr/model/export_<step>/IR/<data_type>` - path to converted model in IR format
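
If you drive the export from another script, a thin wrapper can forward the new flag; the sketch below is illustrative (the `export_lpr` helper is not part of the toolkit) and assumes it runs from `tensorflow_toolkit/lpr/`:

```python
import subprocess

def export_lpr(config='chinese_lp/config.py', data_type='FP32', output_dir=None):
    """Run tools/export.py, forwarding --output_dir only when one is given."""
    cmd = ['python3', 'tools/export.py', '--data_type', data_type]
    if output_dir:
        cmd += ['--output_dir', output_dir]  # omitted -> falls back to lpr/model/export_<step>/
    cmd.append(config)
    subprocess.run(cmd, check=True)

export_lpr(output_dir='/tmp/lpr_export')
```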

3 changes: 2 additions & 1 deletion tensorflow_toolkit/lpr/tools/export.py
@@ -29,6 +29,7 @@
def parse_args():
parser = argparse.ArgumentParser(description='Export model in IE format')
parser.add_argument('--data_type', default='FP32', choices=['FP32', 'FP16'], help='Data type of IR')
parser.add_argument('--output_dir', default=None, help='Output directory. Default: <model_dir>/export_<step>')
parser.add_argument('--checkpoint', default=None, help='Default: latest')
parser.add_argument('path_to_config', help='Path to a config.py')
return parser.parse_args()
@@ -71,7 +72,7 @@ def main(_):
raise FileNotFoundError(str(checkpoint))

step = checkpoint.split('.')[-2].split('-')[-1]
output_dir = os.path.join(config.model_dir, 'export_{}'.format(step))
output_dir = args.output_dir if args.output_dir else os.path.join(config.model_dir, 'export_{}'.format(step))

# Freezing graph
frozen_dir = os.path.join(output_dir, 'frozen_graph')
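
The `output_dir` line above is the heart of the PR: an explicit `--output_dir` wins, otherwise the old `export_<step>` default is kept. In isolation the resolution logic looks like this (the function name and sample checkpoint name are illustrative; lpr checkpoints are assumed to look like `model.ckpt-<step>.meta`):

```python
import os

def resolve_output_dir(cli_output_dir, model_dir, checkpoint):
    # lpr parses the step out of names like 'model.ckpt-12000.meta'
    step = checkpoint.split('.')[-2].split('-')[-1]
    # a user-supplied directory overrides the export_<step> default
    return cli_output_dir if cli_output_dir else os.path.join(model_dir, 'export_{}'.format(step))

assert resolve_output_dir(None, 'lpr/model', 'model.ckpt-12000.meta') == os.path.join('lpr/model', 'export_12000')
assert resolve_output_dir('/tmp/lpr_ir', 'lpr/model', 'model.ckpt-12000.meta') == '/tmp/lpr_ir'
```
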
5 changes: 3 additions & 2 deletions tensorflow_toolkit/ssd_detector/README.md
@@ -148,10 +148,11 @@ To run the model via OpenVINO, one has to freeze the TensorFlow graph and
then convert it to the OpenVINO Intermediate Representation (IR) using the Model Optimizer:

```Bash
python3 tools/export.py --data_type FP32 vlp/config.py
python3 tools/export.py --data_type FP32 --output_dir <export_path> vlp/config.py
```

As a result, you'll find these new artifacts under the **default export path**:
- `vlp/model/export_<step>/frozen_graph/` - path to frozen graph
- `vlp/model/export_<step>/IR/<data_type>/` - path to converted model in IR format
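
After the export finishes, a quick check that both artifact directories exist can catch a failed Model Optimizer run early; the helper below is an illustrative sketch, not part of the toolkit:

```python
import os

def check_export(export_path, data_type='FP32'):
    """Verify the layout produced by tools/export.py: frozen graph + IR."""
    for sub in ('frozen_graph', os.path.join('IR', data_type)):
        path = os.path.join(export_path, sub)
        print('{}: {}'.format(path, 'OK' if os.path.isdir(path) else 'MISSING'))

check_export('vlp/model/export_10000')  # or whatever you passed as --output_dir
```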

3 changes: 2 additions & 1 deletion tensorflow_toolkit/ssd_detector/tools/export.py
@@ -28,6 +28,7 @@ def parse_args():
parser = argparse.ArgumentParser(description='Export model in IE format')
parser.add_argument('--model_name', default='vlp')
parser.add_argument('--data_type', default='FP32', choices=['FP32', 'FP16'], help='Data type of IR')
parser.add_argument('--output_dir', default=None, help='Output directory. Default: <model_dir>/export_<step>')
parser.add_argument('--checkpoint', default=None, help='Default: latest')
parser.add_argument('path_to_config', help='Path to a config.py')
return parser.parse_args()
@@ -88,7 +89,7 @@ def main(_):
raise FileNotFoundError(str(checkpoint))

step = checkpoint.split('-')[-1]
output_dir = os.path.join(config.MODEL_DIR, 'export_{}'.format(step))
output_dir = args.output_dir if args.output_dir else os.path.join(config.MODEL_DIR, 'export_{}'.format(step))

# Freezing graph
frozen_dir = os.path.join(output_dir, 'frozen_graph')
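
Because `--output_dir` defaults to `None`, the conditional above only fires when the flag is actually passed. A standalone reproduction of that argparse behavior (not the toolkit's full parser):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--output_dir', default=None, help='Output directory')

print(parser.parse_args([]).output_dir)                           # None -> export_<step> fallback
print(parser.parse_args(['--output_dir', '/tmp/ir']).output_dir)  # '/tmp/ir' overrides the default
```
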
4 changes: 2 additions & 2 deletions tensorflow_toolkit/vehicle_attributes/README.md
@@ -122,10 +122,10 @@ To run the model via OpenVINO, one has to freeze the TensorFlow graph and
then convert it to the OpenVINO Intermediate Representation (IR) using the Model Optimizer:

```Bash
python3 tools/export.py --data_type FP32 cars_100/config.py
python3 tools/export.py --data_type FP32 --output_dir <export_path> cars_100/config.py
```

As a result, you'll find these new artifacts under the **default export path**:
- `vehicle_attributes/model/export_<step>/frozen_graph/` - path to frozen graph
- `vehicle_attributes/model/export_<step>/IR/<data_type>/` - path to converted model in IR format
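
The Model Optimizer emits an `.xml` topology plus a `.bin` weights file into `IR/<data_type>`; listing them is a cheap way to confirm the conversion succeeded (the helper and the sample export path below are illustrative):

```python
import glob
import os

def ir_artifacts(export_path, data_type='FP32'):
    """Return the .xml/.bin pair written by the Model Optimizer."""
    ir_dir = os.path.join(export_path, 'IR', data_type)
    return sorted(glob.glob(os.path.join(ir_dir, '*.xml')) +
                  glob.glob(os.path.join(ir_dir, '*.bin')))

print(ir_artifacts('vehicle_attributes/model/export_20000'))
```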

3 changes: 2 additions & 1 deletion tensorflow_toolkit/vehicle_attributes/tools/export.py
@@ -28,6 +28,7 @@ def parse_args():
parser.add_argument('--mo', default='mo.py', help="Path to model optimizer 'mo.py' script")
parser.add_argument('--mo_config', default='cars_100/mo.yaml', help="Path to a config file for the Model Optimizer")
parser.add_argument('--data_type', default='FP32', choices=['FP32', 'FP16'], help='Data type of IR')
parser.add_argument('--output_dir', default=None, help='Output directory. Default: <model_dir>/export_<step>')
parser.add_argument('--checkpoint', default=None, help='Default: latest')
parser.add_argument('path_to_config', help='Path to a config.py')
return parser.parse_args()
@@ -59,7 +60,7 @@ def main(_):
raise FileNotFoundError(str(checkpoint))

step = checkpoint.split('.')[-1].split('-')[-1]
output_dir = os.path.join(config.model_dir, 'export_{}'.format(step))
output_dir = args.output_dir if args.output_dir else os.path.join(config.model_dir, 'export_{}'.format(step))

# Freezing graph
frozen_dir = os.path.join(output_dir, 'frozen_graph')
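
Note that each tool parses the step out of the checkpoint name slightly differently (lpr uses `split('.')[-2]`, ssd_detector splits on `-` only, and vehicle_attributes uses `split('.')[-1]`), reflecting different checkpoint naming conventions; the `--output_dir` fallback itself is identical in all three. The snippet below contrasts the three parsers on assumed checkpoint names:

```python
# Assumed checkpoint names per tool; only the step suffix matters here.
lpr_ckpt = 'model.ckpt-300.meta'
print(lpr_ckpt.split('.')[-2].split('-')[-1])   # '300' (lpr)

ssd_ckpt = 'model.ckpt-300'
print(ssd_ckpt.split('-')[-1])                  # '300' (ssd_detector)

va_ckpt = 'model.ckpt-300'
print(va_ckpt.split('.')[-1].split('-')[-1])    # '300' (vehicle_attributes)
```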