peteryuX/tensorflow-GHM-loss

This is a simple TensorFlow implementation of the loss weights from "Gradient Harmonized Single-stage Detector", published at AAAI 2019 (oral).

Original paper (arXiv): https://arxiv.org/abs/1811.05181


Proposed GHM Function

You can compute the GHM weights with the get_ghm_weight() function in tf_ghm_loss.py, then use these weights to re-weight your loss terms as described in the paper. A brief description of the function is given below:

def get_ghm_weight(predict, target, valid_mask, bins=10, alpha=0.75,
                   dtype=tf.float32, name='GHM_weight'):
    """Get Gradient Harmonizing Mechanism (GHM) weights.

    This is an implementation of the GHM weights described
    in https://arxiv.org/abs/1811.05181.

    Args:
        predict:
            The prediction of the categories branch, in [0, 1].
            - shape [batch_num, category_num].
        target:
            The target of the categories branch, in {0, 1}.
            - shape [batch_num, category_num].
        valid_mask:
            The valid mask, 0 where the sample is ignored, in {0, 1}.
            - shape [batch_num, category_num].
        bins:
            The number of bins for the region approximation.
        alpha:
            The moving-average parameter.
        dtype:
            The dtype used for all operations.
        name:
            The name scope of the operation.

    Returns:
        weights:
            The beta value of each sample, as described in the paper.
    """

Toy Demo ☕

The demo setting is as below:

  • prediction: [1., 0., 0.5, 0.]
  • target: [1., 0., 0., 1.]

You can find more details in tf_ghm_loss.py.
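
Under the hood, the demo repeatedly evaluates the weight op so that the moving-average bin statistics get updated. A rough sketch of such a loop, assuming a TF1-style session (see tf_ghm_loss.py for the exact code):

import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()
from tf_ghm_loss import get_ghm_weight

predict = tf.constant([[1., 0., 0.5, 0.]])
target = tf.constant([[1., 0., 0., 1.]])
valid_mask = tf.ones_like(predict)  # no ignored samples

weights = get_ghm_weight(predict, target, valid_mask)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print('update 1 times: ', sess.run(weights))
    for _ in range(98):  # each evaluation updates the moving average
        sess.run(weights)
    print('update 100 times: ', sess.run(weights))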

Run

python tf_ghm_loss.py

Output

update 1 times:  [[0.5        0.5        0.72727275 0.72727275]]
update 100 times:  [array([[0.20000002, 0.20000002, 0.40000004, 0.40000004]], dtype=float32)]
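
The converged values follow directly from the paper's definitions: each sample's gradient norm is g_i = |p_i - p*_i|, the gradient density GD(g) is the number of samples in the bin around g divided by the bin width, and the weight is beta_i = N / GD(g_i). Worked out for the toy inputs:

g    = |predict - target| = [0, 0, 0.5, 1]
# bins=10 gives a bin width of 0.1; two samples share the first bin.
GD   = [2, 2, 1, 1] / 0.1 = [20, 20, 10, 10]
beta = N / GD = 4 / [20, 20, 10, 10] = [0.2, 0.2, 0.4, 0.4]

The first update prints different values because the bin statistics are smoothed by the moving average (alpha=0.75) and need several evaluations to settle.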

Relevant materials 🍺
