icevision-score

Scoring software used in the IceVision competition.

During the online stage, version v0.1.5 was used.

Input file formats

For ground truth we use exactly the same format as in the annotations repository.

For solutions we use a slightly different TSV (tab-separated values) file format. It contains all detected traffic signs of the given classes on the provided frame sequences. A solution file must contain a header and the following fields:

  • frame: sequence + frame number, e.g. 2018-02-13_1418_left/000032.
  • xtl, ytl, xbr, ybr: bounding box coordinates, integer or float. Note: xtl must be smaller than xbr and ytl must be smaller than ybr.
  • class: traffic sign code. Valid values are: 2.1, 2.4, 3.1, 3.24, 3.27, 4.1, 4.2, 5.19, 5.20, 8.22.
  • temporary: is sign temporary (has a yellow background)? Valid values: true and false. Can be omitted.
  • data: associated sign data. UTF-8 encoded string. Can be omitted.

Some examples can be found in the examples/ folder.
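For illustration, the first lines of a solution file could look like this (fields are separated by tabs; the coordinates, classes, and data values below are made up and are not taken from the files in examples/):

frame	xtl	ytl	xbr	ybr	class	temporary	data
2018-02-13_1418_left/000032	1601	652	1657	708	5.19	false	
2018-02-13_1418_left/000032	1024	512	1060	548	3.24	false	40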

Scoring methodology

During the offline stage, participants can detect all traffic sign classes defined by the Russian traffic code.

Bounding boxes with an area smaller than 100 pixels are ignored during evaluation. A detection is considered successful if IoU is greater than or equal to 0.3 and the bounding box has a correct class or superclass code. If a sign is detected twice, the detection with the smaller IoU is counted as a false positive. Each false positive or incorrect detection results in a penalty of 2 points.
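As a rough illustration (not the project's actual implementation), IoU for two axis-aligned boxes given as (xtl, ytl, xbr, ybr) can be computed like this in Rust; the BBox type and helper names are made up for this sketch:

// Hypothetical sketch: IoU between two axis-aligned bounding boxes.
struct BBox { xtl: f64, ytl: f64, xbr: f64, ybr: f64 }

impl BBox {
    // Area is zero for degenerate or empty boxes.
    fn area(&self) -> f64 {
        (self.xbr - self.xtl).max(0.0) * (self.ybr - self.ytl).max(0.0)
    }
}

fn iou(a: &BBox, b: &BBox) -> f64 {
    // Intersection rectangle; empty if the boxes do not overlap.
    let inter = BBox {
        xtl: a.xtl.max(b.xtl),
        ytl: a.ytl.max(b.ytl),
        xbr: a.xbr.min(b.xbr),
        ybr: a.ybr.min(b.ybr),
    };
    let i = inter.area();
    let u = a.area() + b.area() - i;
    if u > 0.0 { i / u } else { 0.0 }
}

For example, a 10x10 ground truth box and a 10x10 detection shifted by 5 pixels in both directions give IoU = 25/175 ≈ 0.14, which is below the 0.3 threshold, so such a detection would not match.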

The score for a true positive is calculated as (1 + k1 + k2 + k3)*s, where: s is the "base" score, k1 is the coefficient for detecting the sign code, k2 for detecting associated data, and k3 for detecting a temporary sign. If (1 + k1 + k2 + k3) < 0, the detection score is set to 0.

If IoU > 0.85, s = 1. Otherwise it is calculated using the following equation: s = ((IoU - 0.3)/0.55)^0.25.
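A minimal sketch of this base-score rule (assuming detections with IoU below 0.3 have already been filtered out as non-matches):

// Hypothetical sketch of the base score s described above.
fn base_score(iou: f64) -> f64 {
    if iou > 0.85 {
        1.0
    } else {
        // IoU = 0.3 maps to 0.0 and IoU = 0.85 maps to 1.0
        ((iou - 0.3) / 0.55).powf(0.25)
    }
}

For instance, base_score(0.6) ≈ 0.86 and base_score(0.3) = 0.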

k1 is calculated as follows. For sign codes with two components (e.g. "1.25"):

  • If only one component is detected (e.g. "1"), k1=-0.7
  • If both components are detected (e.g. "1.25"), k1=0

For sign codes with three components (e.g. "5.19.1"):

  • If only one component is detected (e.g. "5"), k1=-0.7
  • If two components are detected (e.g. "5.19"), k1=-0.2
  • If all three components are detected (e.g. "5.19.1"), k1=0

If the manual annotation used "8" as the sign code, then k1=0 for all detections whose class code begins with "8".
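A hedged sketch of the k1 rules above, comparing dot-separated components of the annotated and detected codes (the function name and the fallback behaviour are assumptions of this sketch; the authoritative logic lives in src/):

// Hypothetical sketch of k1, given the annotated and detected class codes.
fn k1(annotated: &str, detected: &str) -> f64 {
    // Special case: an annotation of just "8" accepts any detection starting with "8".
    if annotated == "8" && detected.starts_with('8') {
        return 0.0;
    }
    let ann: Vec<&str> = annotated.split('.').collect();
    let det: Vec<&str> = detected.split('.').collect();
    // Count leading components that agree, e.g. "5.19" vs "5.19.1" agree on 2.
    let matched = ann.iter().zip(det.iter()).take_while(|(a, d)| a == d).count();
    match (ann.len(), matched) {
        (2, 2) | (3, 3) => 0.0,
        (3, 2) => -0.2,
        (_, 1) => -0.7,
        // Anything else (e.g. no matching components) is treated as the worst case here.
        _ => -0.7,
    }
}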

If the detection provides associated sign data and it is equal to the annotated data, k2=2. If the data differs, k2=-0.5. If the annotation uses "NA" for associated data, k2=0 regardless of the data provided in the detection.

If the detection does not provide any information about the sign being temporary (empty string in the "temporary" field), k3=0. If the sign is annotated as temporary and the detection is correct ("true" in the "temporary" field), k3=1. If the sign is not annotated as temporary and the detection is correct ("false" in the "temporary" field), k3=0. If the detection is incorrect about the sign being temporary, k3=-0.5.
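The same kind of sketch for k2, k3, and the combined per-detection score with the clamping to zero from the formula above (the option handling for omitted fields is an assumption of this sketch, not the project's API):

// Hypothetical sketches of k2, k3, and the per-detection score.
fn k2(annotated_data: &str, detected_data: Option<&str>) -> f64 {
    match detected_data {
        None => 0.0,                               // data field omitted in the detection
        Some(_) if annotated_data == "NA" => 0.0,  // annotation carries no usable data
        Some(d) if d == annotated_data => 2.0,     // associated data matches
        Some(_) => -0.5,                           // associated data is wrong
    }
}

fn k3(annotated_temporary: bool, detected_temporary: Option<bool>) -> f64 {
    match detected_temporary {
        None => 0.0,                               // "temporary" field left empty
        Some(d) if d == annotated_temporary => {
            if annotated_temporary { 1.0 } else { 0.0 }
        }
        Some(_) => -0.5,                           // wrong guess about the temporary status
    }
}

fn detection_score(s: f64, k1: f64, k2: f64, k3: f64) -> f64 {
    let m = 1.0 + k1 + k2 + k3;
    if m < 0.0 { 0.0 } else { m * s }
}

For example, a true positive with s = 0.916, k1 = 0, k2 = 0 and k3 = -0.5 contributes (1 - 0.5) * 0.916 ≈ 0.458 points.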

The final score is computed as the sum of all true positive scores minus all penalties.

For the exact algorithm, please refer to the code.

Building

Scoring software is written in Rust, so you'll need to grab a Rust installation in order to compile it. In general, we use the latest stable release of the Rust compiler, but older versions may work as well.

$ git clone https://github.com/icevision/score
$ cd score/
$ cargo build --release
$ ./target/release/icevision-score --help
icevision-score 0.2.2
Artyom Pavlov <newpavlov@gmail.com>
IceVision competition scoring software

USAGE:
    icevision-score [FLAGS] <ground_truth> <solution>

FLAGS:
    -h, --help       Prints help information
    -V, --version    Prints version information
    -v, --verbose    Enable verbose report

ARGS:
    <ground_truth>    Path to a directory with ground truth TSV files.
    <solution>        Path to a solution TSV file.

Usage example

For ground truth we use some files from the annotations repository.

$ ./target/release/icevision-score examples/ground_truth/ examples/good.tsv
Total score:    1.249
Total penalty:  0.000
Per class results:
Class   Score   Penalty
4.1     0.791   0.000
2.4     0.458   0.000

To enable the verbose report, use the --verbose flag:

$ ./target/release/icevision-score --verbose examples/ground_truth/ examples/good.tsv

frame: 2018-02-13_1418_left/000033
score   xtl     ytl     xbr     ybr     class   s       k1  k2  k3
0.791   1774    896     1847    979     4.1     0.989   -20 0   0
0.458   1643    895     1771    973     2.4     0.916   0   0   -50

===========================

Total score:    1.249
Total penalty:  0.000
Per class results:
Class   Score   Penalty
4.1     0.791   0.000
2.4     0.458   0.000

License

Licensed under either of

  • Apache License, Version 2.0 (LICENSE-APACHE)
  • MIT license (LICENSE-MIT)

at your option.

Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.
