Distinguish actual loss and estimated loss #139


(original issue on GitLab)

opened by Joseph Weston (@jbweston) at 2017-07-25T15:30:25.483Z

Currently we calculate the loss as the maximum of the loss over all the "intervals". Intervals are determined by the points (x, y) in the data, and y values are interpolated for points whose y value is None.

This means that the loss as currently calculated is based on the expected value of the loss once all the points have been evaluated.

The per-interval expected loss is the right measure to compare when deciding where to put subsequent points, but it is the wrong measure to use when deciding whether to terminate a calculation. If one starts with 2 points and then adds a billion points in between with None for the y values, the value reported by loss() is tiny, even though there is no actual data.
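A minimal, self-contained sketch of the problem (hypothetical code, not the library's actual implementation; it uses interval width as a stand-in for the loss function):

```python
# Hypothetical illustration: why the loss computed over interpolated
# (pending) points is misleading as a termination criterion.

def interval_losses(xs):
    """Loss of each interval, here simply its width (a toy 1D loss)."""
    xs = sorted(xs)
    return [b - a for a, b in zip(xs, xs[1:])]

evaluated = [0.0, 1.0]                          # points actually computed
pending = [i / 1000 for i in range(1, 1000)]    # requested, y still None

# Estimated loss: pending points are treated as if already evaluated,
# so the reported loss becomes tiny even though no new data exists.
estimated_loss = max(interval_losses(evaluated + pending))

# Actual loss: computed only over points with real y values.
actual_loss = max(interval_losses(evaluated))

print(estimated_loss)  # ~0.001
print(actual_loss)     # 1.0
```

The estimated loss is the right quantity for choosing the next point, while the actual loss is the right quantity for a stopping condition.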
