This repository has been archived by the owner on Aug 30, 2018. It is now read-only.

Create library for converting pytorch to caffe2 and examples #69

Merged
merged 6 commits into onnx:master on Nov 27, 2017

Conversation

houseroad (Member Author):

Based on this example, we can create notebooks that explain how to compare the results and performance.

log = logging.getLogger(__name__)


def run_caffe2_model(init_net, predict_net, inputs):



def run_caffe2_benchmark(init_net, predict_net, warmup_iters, main_iters, layer_details):
workspace.ResetWorkspace()
Reviewer (Member):

Don't directly use caffe2.python.workspace; use onnx_caffe2.workspace.Workspace instead.
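The point of this comment is that caffe2.python.workspace is a process-global store, so blobs from one run leak into the next, while a workspace object scopes its blobs to a single instance. A minimal sketch of that isolation idea, using a stand-in Workspace class rather than the real onnx_caffe2.workspace.Workspace API:

```python
class Workspace:
    """Stand-in workspace: each instance owns its own blob store,
    unlike a module-level workspace whose state is shared globally."""

    def __init__(self):
        self._blobs = {}

    def FeedBlob(self, name, value):
        self._blobs[name] = value

    def FetchBlob(self, name):
        return self._blobs[name]

# Two models can run side by side without clobbering each other.
ws1 = Workspace()
ws2 = Workspace()
ws1.FeedBlob("data", [1, 2, 3])
print("data" in ws2._blobs)  # → False
```

With a scoped workspace, resetting or discarding one model's state cannot disturb another model that happens to run in the same process.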

return init_net, predict_net


def load_caffe2_net(file):
Reviewer (Member):

Maybe move this into helper.py.

return net


def save_caffe2_net(net, file, output_txt=False):
Reviewer (Member):

Same here: move this into helper.py.
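The load/save pair the reviewer wants in helper.py follows the usual protobuf round-trip: SerializeToString on save, ParseFromString on load. A sketch with a stand-in message class (since caffe2_pb2.NetDef may not be importable here); the ".txt" sidecar name is this sketch's own convention:

```python
import os
import tempfile

class FakeNetDef:
    """Stand-in for caffe2_pb2.NetDef with just enough protobuf surface."""
    def __init__(self, payload=b""):
        self.payload = payload
    def SerializeToString(self):
        return self.payload
    def ParseFromString(self, data):
        self.payload = data
    def __str__(self):
        return self.payload.decode()

def save_caffe2_net(net, file, output_txt=False):
    # Binary protobuf is what workspace loading expects.
    with open(file, "wb") as f:
        f.write(net.SerializeToString())
    if output_txt:
        # Optional human-readable dump alongside the binary file.
        with open(file + ".txt", "w") as f:
            f.write(str(net))

def load_caffe2_net(file, net_cls=FakeNetDef):
    net = net_cls()
    with open(file, "rb") as f:
        net.ParseFromString(f.read())
    return net

path = os.path.join(tempfile.mkdtemp(), "predict_net.pb")
save_caffe2_net(FakeNetDef(b"net-bytes"), path)
print(load_caffe2_net(path).payload)  # → b'net-bytes'
```

With the real caffe2_pb2.NetDef passed as net_cls, the same two helpers cover both init_net and predict_net files.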

pytorch_out, expected_decimal)
log.info("The converted Caffe2 model achieves {}-decimal precision."
.format(expected_decimal))
if not compare_performance:
Reviewer (Member):

Split this into a separate benchmark/profile function.

return init_net, predict_net
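The decimal-precision check being logged above is typically done with numpy.testing.assert_almost_equal, which raises unless two arrays agree to a given number of decimal places. A sketch, assuming pytorch_out and caffe2_out are plain numpy arrays:

```python
import numpy as np

def verify_outputs(caffe2_out, pytorch_out, expected_decimal=3):
    """Raise AssertionError unless the outputs agree to expected_decimal places."""
    np.testing.assert_almost_equal(caffe2_out, pytorch_out,
                                   decimal=expected_decimal)

caffe2_out = np.array([1.00012, 2.00034])
pytorch_out = np.array([1.00019, 2.00031])
verify_outputs(caffe2_out, pytorch_out, expected_decimal=3)  # passes
```

If the check passes, a message like the one above can report the achieved precision; tightening expected_decimal makes the same call fail once the element-wise differences exceed the threshold.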

log.info("Starting benchmarking PyTorch.")
for _i in range(warmup_iters):
Reviewer (Member):

Maybe put these into a run_pytorch_benchmark function?

@houseroad houseroad changed the title Create library for converting pytorch to caffe2 and examples [WIP] Create library for converting pytorch to caffe2 and examples Nov 17, 2017
houseroad (Member Author):

Refactoring the code; I will move most of the functions to onnx_caffe2/helper.py and move the example to the tutorial repo.

@houseroad houseroad changed the title [WIP] Create library for converting pytorch to caffe2 and examples Create library for converting pytorch to caffe2 and examples Nov 17, 2017

ws.RunNetOnce(predict_net)

output_names = predict_net.external_output
output_values = [ws.FetchBlob(name) for name in output_names]
return ws, namedtupledict('Outputs', output_names)(*output_values)


def name_inputs(onnx_model, inputs):
Reviewer (Member):

You should have the names of the blobs in predict_net.external_input, so maybe just pass the list directly to c2_native_run_net?

houseroad (Member Author):

init_net contains the fake input, so the assertion len(uninitialized) == len(inputs) fails. It's better to pass a dict.

Reviewer (Member):

What I meant is to accept the list directly in c2_native_run_net() and put this code there.
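Both suggestions converge on the same one-liner: zip the net's ordered external_input names with the positional inputs to build the feed dict. A sketch with a stand-in net object (real NetDef protos expose external_input as an ordered list of names):

```python
from types import SimpleNamespace

def name_inputs(predict_net, inputs):
    """Pair each positional input with its blob name from the net."""
    input_names = list(predict_net.external_input)
    # The net may list more external inputs (e.g. weights fed by init_net)
    # than the caller passes; zip stops at the shorter sequence.
    assert len(input_names) >= len(inputs)
    return dict(zip(input_names, inputs))

net = SimpleNamespace(external_input=["data", "conv1_w", "conv1_b"])
print(name_inputs(net, [[0.5, 0.25]]))  # → {'data': [0.5, 0.25]}
```

Accepting a plain list in c2_native_run_net() and doing this pairing internally keeps callers from having to know blob names, while the dict form stays available for the fake-input case raised above.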

def benchmark_pytorch_model(model, inputs, training=False, warmup_iters=3,
main_iters=10, verbose=False):
for _i in range(warmup_iters):
ts = time.time()
Reviewer (Member):

No need for timing in the warmup loop.

houseroad (Member Author):

Nice catch.

ts = time.time()
model(*inputs)
te = time.time()
total_pytorch_time = te - ts + total_pytorch_time
Reviewer (Member):

Use += here. :)

houseroad (Member Author):

Sure. :-)

model(*inputs)
te = time.time()
total_pytorch_time = te - ts + total_pytorch_time
log.info("The PyTorch model execution time per iter is {} milliseconds, "
Reviewer (Member):

Also return the time?

houseroad (Member Author):

Good point!
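Folding the three review comments together (skip timing during warmup, use +=, return the measured time) gives a helper along these lines; benchmark_model and its milliseconds-per-iteration return value are this sketch's own conventions, not the PR's final API:

```python
import time

def benchmark_model(model, inputs, warmup_iters=3, main_iters=10):
    """Run warmup untimed, then time main_iters calls; return ms per iteration."""
    for _ in range(warmup_iters):
        model(*inputs)              # no timing needed during warmup
    total_time = 0.0
    for _ in range(main_iters):
        ts = time.time()
        model(*inputs)
        te = time.time()
        total_time += te - ts       # += instead of the longhand form
    per_iter_ms = total_time / main_iters * 1000
    return per_iter_ms              # return the time so callers can compare runs

ms = benchmark_model(lambda x: [v * 2 for v in x], ([1, 2, 3],))
print("%.4f ms per iteration" % ms)
```

Returning the number instead of only logging it lets a notebook put the PyTorch and Caffe2 timings side by side, which is exactly the comparison the PR description proposes.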

init_net = load_caffe2_net(init_file)
predict_net = load_caffe2_net(predict_file)

# Prepare the inputs for Caffe2.
Reviewer (Member):

This should be handled by passing an option to Caffe2Backend.

houseroad (Member Author):

Yep.

dzhulgakov (Member) approved:

Looks good, ship it!



ws.RunNetOnce(init_net)
ws.CreateNet(predict_net)
results = ws.BenchmarkNet(predict_net.name, warmup_iters, main_iters, layer_details)
return results[0]
Reviewer (Member):

Add del ws to free up the memory explicitly.
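The del ws suggestion works because CPython frees an object as soon as its last reference disappears: dropping the workspace name right after BenchmarkNet has returned its scalar result releases the blob memory immediately, instead of holding it until the function's locals go away. A stand-in sketch (not the caffe2 API) that makes the effect observable with weakref:

```python
import weakref

class Workspace:
    """Stand-in for a caffe2 workspace holding large blobs."""
    def __init__(self):
        self.blobs = {"weights": bytearray(1024)}

def run_benchmark():
    ws = Workspace()
    result = 42.0            # imagine results = ws.BenchmarkNet(...); results[0]
    probe = weakref.ref(ws)
    del ws                   # drop the only reference: blobs are freed now
    assert probe() is None   # the workspace object is already gone
    return result

print(run_benchmark())  # → 42.0
```

For real workspaces backed by GPU or large CPU tensors, this early release matters when several benchmark runs happen in one process.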

@houseroad houseroad merged commit db94c12 into onnx:master Nov 27, 2017
3 participants