
(Learner1D) add possibility to use the direct neighbors in the loss #20

Closed
basnijholt opened this issue Dec 19, 2018 · 2 comments

@basnijholt
Member

(original issue on GitLab)

opened by Jorn Hoofwijk (@Jorn) at 2018-10-25T18:39:41.849Z

This allows cool things like taking the second derivative into account.
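
To see why the direct neighbours give access to the second derivative, here is a minimal sketch (my own illustration, not taken from the attached notebook): with three samples, the non-uniform central finite difference estimates f'' at the middle point, and the area of the triangle spanned by the three points is proportional to that estimate.

from operator import itemgetter

def second_derivative_estimate(pts):
    # Non-uniform central finite difference for f'' at the middle of
    # three points pts = [(x0, y0), (x1, y1), (x2, y2)].
    (x0, y0), (x1, y1), (x2, y2) = sorted(pts, key=itemgetter(0))
    h1, h2 = x1 - x0, x2 - x1
    return 2 * (h2 * y0 - (h1 + h2) * y1 + h1 * y2) / (h1 * h2 * (h1 + h2))

# Sanity check on f(x) = x**2, where f'' == 2 everywhere:
print(second_derivative_estimate([(0.0, 0.0), (0.1, 0.01), (0.3, 0.09)]))  # -> 2.0

For these three points the triangle area equals |f''_estimate| * h1 * h2 * (h1 + h2) / 4, so a loss built on that area is effectively a curvature measure.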

I have created a demo notebook and some plots showing what it may look like:

[image: plots from the demo notebook]

The method is described in this document: second_derivative.pdf

And you can find the notebook to play with here: adaptive_vs_homogenous.ipynb

@basnijholt
Member Author

originally posted by Anton Akhmerov (@anton-akhmerov) at 2018-10-25T19:14:37.496Z on GitLab

@Jorn impressive write-up and a really cool proposal!

@basnijholt
Member Author

originally posted by Bas Nijholt (@basnijholt) at 2018-10-25T19:54:53.532Z on GitLab

As we already discussed this week, this is super awesome!

I took the relevant code (the notebook has a lot of NameErrors) and sped it up a bit:

from bisect import bisect_left

import numpy as np

import adaptive
from adaptive.learner.learner1D import default_loss
from adaptive.learner.learnerND import volume

adaptive.notebook_extension()


def simple_runner(learner, goal):
    while not goal(learner):
        x = learner.ask(1)[0][0]
        y = learner.function(x)
        learner.tell(x, y)

        # Hack to also update the losses of neighbouring intervals,
        # since a new point also changes the loss of the intervals
        # next to the one it splits.
        sorted_data = sorted(learner.data)
        index = bisect_left(sorted_data, x)
        xs = [sorted_data[i] for i in range(index - 1, index + 3)
              if 0 <= i < len(sorted_data)]
        if len(xs) > 2:
            for i in range(len(xs) - 1):
                ival = xs[i], xs[i + 1]
                learner._update_interpolated_loss_in_interval(*ival)


def loss_of_multi_interval(xs, function_values):
    # Average area of the triangles spanned by consecutive triples
    # of points; a large area signals strong curvature.
    pts = [(x, function_values[x]) for x in xs]
    N = len(pts) - 2
    return sum(volume(pts[i:i + 3]) for i in range(N)) / N


def triangle_loss(interval, scale, function_values):
    _default_loss = default_loss(interval, scale, function_values)
    x_left, x_right = interval
    data = sorted(function_values)
    index = bisect_left(data, x_left)
    # The interval's endpoints plus up to one direct neighbour on each side.
    xs = [data[i] for i in range(index - 1, index + 3) if 0 <= i < len(data)]
    dx = x_right - x_left
    if len(xs) <= 2:
        # Not enough neighbours yet; fall back to the interval size.
        return dx
    loss = loss_of_multi_interval(xs, function_values)
    return loss**0.5 + 0.02 * _default_loss + 0.02 * dx


def f5(x):
    # A narrow peak around x = 0.3.
    return np.exp(-(x - 0.3)**2 / 0.1**3)


learner = adaptive.Learner1D(f5, (-1, 1), loss_per_interval=triangle_loss)
simple_runner(learner, goal=lambda l: l.npoints > 1000)

learner.plot()
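
For comparison (my addition, not part of the original comment), the same function can be run with the built-in default loss, reusing simple_runner from above:

learner_default = adaptive.Learner1D(f5, (-1, 1))  # uses default_loss
simple_runner(learner_default, goal=lambda l: l.npoints > 1000)
learner_default.plot()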
