
Update backend: x^y instead of abs(x)^y, and high-precision constants #191

Merged · 9 commits merged into master on Sep 11, 2022

Conversation

MilesCranmer
Owner

@MilesCranmer MilesCranmer commented Sep 10, 2022

This updates the backend to v0.12, which includes a couple of changes:

  1. Use functions that return NaN on branch cuts instead of taking abs (issue #109; SymbolicRegression.jl#123 by @johanbluecreek). This changes operators from, e.g., abs(x)^y to x^y. Expressions using these operators will now simply avoid regions where their output is NaN, which prevents the search from exploiting weird functional forms one might not wish to use.
  2. Generalize expressions to have arbitrary constant types (SymbolicRegression.jl#119), which enables arbitrary-precision constants. Now, when you use PySRRegressor(precision=64), the constants will also be stored in 64-bit precision. As a side effect, this also speeds up evaluation, since the expressions are now type stable.
  3. Fix for datasets with zero variance. This lets you fit a dataset with a single row, which is useful if you want to find a symbolic approximation to a number (Ramanujan-style), as described in SymbolicRegression.jl#127 (express a number as a function of given constants).
  4. This also sets the default parsimony to 0.0. Choosing any nonzero default makes assumptions about the scale of the dataset, so it is better to leave it off by default. The search will still find simple expressions, but it will not be as biased towards them, especially when the baseline loss is large.
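A minimal sketch of why point 1 matters, using NumPy rather than PySR itself: under plain floating-point semantics, a negative base raised to a fractional exponent is NaN, so a candidate expression that wanders onto the branch cut is now rejected instead of being silently evaluated as abs(x)^y.

```python
import numpy as np

x, y = -2.0, 0.5

# Old behaviour: abs(x)^y is defined everywhere, even where x^y is not.
old_style = np.abs(x) ** y

# New behaviour: x^y is NaN for x < 0 with fractional y,
# so the search simply avoids this region.
with np.errstate(invalid="ignore"):
    new_style = np.power(x, y)

print(old_style)             # sqrt(2) = 1.4142135623730951
print(np.isnan(new_style))   # True
```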

@MilesCranmer
Owner Author

For example, here is how you can find the result of pi/sqrt(2) with PySR:

import math
from pysr import PySRRegressor

model = PySRRegressor(
    binary_operators="+ * / -".split(" "),
    unary_operators=["sqrt"],
    # High-precision constants:
    precision=64,
    # Heavily penalize constants, so the search uses only the given inputs:
    complexity_of_constants=100,
    niterations=1000,
    # With constants effectively disabled, skip mutating/optimizing them:
    weight_mutate_constant=0.0,
    should_optimize_constants=False,
    # Favor simpler expressions:
    parsimony=0.001,
)

model.fit(
    [[2, math.pi, math.e]], [2.22144146908], variable_names=["_2", "_pi", "_e"]
)

At the end, you should be able to run:

model.sympy()
# _pi/sqrt(_2)
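As a sanity check on the target value, independent of PySR, pi/sqrt(2) does match the constant passed to fit above:

```python
import math

target = 2.22144146908  # the y value given to model.fit
value = math.pi / math.sqrt(2)

print(value)  # 2.221441469079183
assert abs(value - target) < 1e-9
```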

@MilesCranmer
Owner Author

Finding approximations to pi :)

import math
from pysr import PySRRegressor

model = PySRRegressor(
    binary_operators="+ * / -".split(" "),
    unary_operators=["sqrt", "square"],
    niterations=1000,
    # High precision:
    precision=64,
    # Favor simpler expressions:
    parsimony=0.003,
    # Disable constants:
    complexity_of_constants=100,
    # Prevent redundant computation:
    weight_mutate_constant=0.0,
    should_optimize_constants=False,
)

model.fit(
    [[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]],
    [math.pi],
    variable_names=["_1", "_2", "_3", "_4", "_5", "_6", "_7", "_8", "_9", "_10"],
)
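One family of expressions a search like this can land on (shown here as a plain-math check, not actual PySR output) is sqrt(2) + sqrt(3), a classic approximation to pi accurate to roughly 0.15%:

```python
import math

approx = math.sqrt(2) + math.sqrt(3)

print(approx)                           # ≈ 3.14626
print(abs(approx - math.pi) / math.pi)  # relative error ≈ 0.0015
```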

@MilesCranmer MilesCranmer merged commit d843a3c into master Sep 11, 2022
@MilesCranmer MilesCranmer deleted the backend-update branch November 4, 2022 18:02