
[deeplab] How to output AP (Average precision) value during eval ? #8570

@cclo-astri

Description


Hi all,

I am benchmarking some of the latest deep learning network models, but I found that some models only output AP (Average Precision) / AR (Average Recall) over various IoU ranges.

Does DeepLabV3 also support outputting such figures during evaluation?

Or have I misunderstood the relationship between mIoU and AP?

Thanks.
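For context on the mIoU side of the question: semantic-segmentation evaluation (the kind DeepLab performs) typically scores dense per-pixel label maps with mean Intersection-over-Union, whereas AP/AR over IoU thresholds are COCO-style detection/instance-segmentation metrics computed over per-object predictions. A minimal sketch of the mIoU computation (illustrative only, not the DeepLab codebase's own eval code) is:

```python
import numpy as np

def mean_iou(pred, gt, num_classes):
    """Mean IoU over classes for two integer label maps.

    Classes absent from both prediction and ground truth are
    skipped so they do not distort the average.
    """
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        if union > 0:
            ious.append(inter / union)
    return float(np.mean(ious))

# Toy 2x3 label maps with 3 classes (hypothetical data).
pred = np.array([[0, 0, 1],
                 [1, 1, 2]])
gt   = np.array([[0, 0, 1],
                 [1, 2, 2]])
print(mean_iou(pred, gt, num_classes=3))  # (1 + 2/3 + 1/2) / 3
```

Because mIoU is computed once over all pixels rather than per detected object with confidence scores, there is no precision/recall curve to average, which is why semantic-segmentation eval pipelines generally do not report AP/AR.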
