Objective cell memory leak? #154

Open
pusolito opened this issue Aug 19, 2022 · 26 comments

Comments

@pusolito

pusolito commented Aug 19, 2022

It seems as though the cells for the objective will grow, but not shrink, when edit variables are added/suggested/removed. More specifically, adding an edit variable for an equality leads to the objective picking up 2 cells. These cells are then reduced down to 1, but that cell is not removed from the objective when the edit variable is removed. Instead, the added cell's value is decremented. This means the objective's cell count will grow with repeated add/remove calls. Is there something I'm missing?
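For reference, this is the kind of cycle I'm describing (a minimal sketch against the C++ kiwi API; the numbers are arbitrary):

```cpp
#include <kiwi/kiwi.h>

int main()
{
    kiwi::Solver solver;
    kiwi::Variable width( "width" );

    // Repeated add/suggest/remove of an edit variable; the concern above is
    // that the objective row accumulates cells across iterations.
    for( int i = 0; i < 1000; ++i )
    {
        solver.addEditVariable( width, kiwi::strength::strong );
        solver.suggestValue( width, 100.0 + i );
        solver.updateVariables();
        solver.removeEditVariable( width );
    }
    return 0;
}
```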

@sccolbert
Member

sccolbert commented Aug 19, 2022 via email

@pusolito
Author

Thanks for the quick reply! 🙏

In my case, I need to update the edit variables whenever an object "tries" to adjust. I could discard the solver each time, but I'm hoping to gain a bit of speed by keeping it around and not rebuilding the constraints each time something adjusts. I also see the same growth for any equals constraint actually. So this is independent of edit variables. I just happened to notice it there b/c mine contained equality constraints.
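For concreteness, the pattern I'm aiming for looks roughly like this (a sketch using the C++ kiwi API; my real code builds many more constraints):

```cpp
#include <kiwi/kiwi.h>

int main()
{
    // Build the solver and its constraints once...
    kiwi::Solver solver;
    kiwi::Variable x( "x" );
    solver.addConstraint( x >= 0 );
    solver.addEditVariable( x, kiwi::strength::strong );

    // ...then on every "adjust" pass only suggest new values instead of
    // rebuilding the whole system.
    for( double target : { 10.0, 20.0, 30.0 } )
    {
        solver.suggestValue( x, target );
        solver.updateVariables();
    }
    return 0;
}
```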

Do you have recommendations for how to do the cleanup correctly? I am actually working with a port of this lib that I have full code control over.

@pusolito
Author

For a single equality constraint, 2 error entries are added to the objective, but only 1 is removed when that constraint is removed.

@sccolbert
Member

sccolbert commented Aug 19, 2022 via email

@sccolbert
Member

sccolbert commented Aug 19, 2022 via email

@pusolito
Author

Interesting. Thanks for clarifying. As you suggested, I'll follow up with more context on my use case to get more guidance.

@pusolito
Author

So the leak was actually a bug in the port. But I'll still provide more details of my use case, since I'm trying to improve performance while doing more dynamic updates to the constraints/edits.

@pusolito
Author

pusolito commented Oct 20, 2022

Ok, I know it's been a while. But I made a lot of good progress on optimizing my solution in the general cases. Now I'm thinking about a slightly different problem. The implementation is very efficient when updating variables, which makes sense. But I have cases where it would be great to express the idea of a "read-only" variable. This would work great in UI layout cases, where you'd like to have a relationship with a parent element that is dynamic, as in it can change over time, yet the solver never tries to update it unless you explicitly suggest a value for it. So essentially, "read-only" edit variables.

An example of this would be: child.left + child.width == parent.width / 2, where you want to avoid the solver ever changing parent.width, but you want the ability to set/suggest it over time.

Right now I solve this by removing constraints and re-adding them as follows, but this is inefficient.

+ child.left + child.width == 100
- child.left + child.width == 100
+ child.left + child.width == 200
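In solver terms that amounts to something like the following sketch (child_left and child_width stand in for my real variables):

```cpp
#include <kiwi/kiwi.h>

int main()
{
    kiwi::Solver solver;
    kiwi::Variable child_left( "child.left" ), child_width( "child.width" );

    // Original relation: child.left + child.width == 100
    kiwi::Constraint oldCn = child_left + child_width == 100;
    solver.addConstraint( oldCn );
    solver.updateVariables();

    // Only the constant changed, but the whole constraint has to be
    // removed and a new instance added.
    kiwi::Constraint newCn = child_left + child_width == 200;
    solver.removeConstraint( oldCn );
    solver.addConstraint( newCn );
    solver.updateVariables();
    return 0;
}
```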

Any suggestions for how you might address this or evolve the solver to handle it?

@MatthieuDartiailh
Member

As @sccolbert suggested in one of his earlier comments, you could use an edit variable with a required strength. As a consequence, the solver won't change it, but you will be able to.

@pusolito
Author

Edit variables can't actually have a strength of required:

throw BadRequiredStrength();
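A quick sketch of the behaviour (a strong edit variable is accepted, a required one throws):

```cpp
#include <kiwi/kiwi.h>
#include <iostream>

int main()
{
    kiwi::Solver solver;
    kiwi::Variable x( "x" );

    // A strong edit variable is accepted...
    solver.addEditVariable( x, kiwi::strength::strong );

    // ...but asking for required strength is rejected up front.
    kiwi::Variable y( "y" );
    try
    {
        solver.addEditVariable( y, kiwi::strength::required );
    }
    catch( const kiwi::BadRequiredStrength& )
    {
        std::cout << "required edit strength is rejected" << std::endl;
    }
    return 0;
}
```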

@sccolbert
Member

sccolbert commented Oct 25, 2022 via email

@pusolito
Author

I wondered about that, and would be interested in using that ability if possible. Separately, I am exploring a method like suggestValue that updates the constant for a constraint. So far this seems to work.

@Qix-
Contributor

Qix- commented Oct 26, 2022

Maybe we can experiment with removing that restriction.

That would be incredible.

@MatthieuDartiailh
Member

I think this was forbidden because, if abused, it can easily lead to poor performance. @sccolbert, any memory of why you forbade it?

Lifting the restriction in itself would be easy, but it would call for some extra documentation on when it should be used. Would anybody be interested in making a PR?

@sccolbert
Member

sccolbert commented Oct 28, 2022 via email

@pusolito
Author

As mentioned, I created a new method for updating a constraint. It looks basically like this (though I'm working in Kotlin, so apologies if the port back to C++ isn't quite right):

void updateConstant( const Constraint& oldCn, const Constraint& newCn )
{
    auto cn_it = m_cns.find( oldCn );
    if( cn_it == m_cns.end() )
        throw UnknownConstraint( oldCn );

    DualOptimizeGuard guard( *this );

    // Re-key the tracked constraint so later calls see the new instance.
    Tag tag( cn_it->second );
    m_cns.erase( cn_it );
    m_cns[ newCn ] = tag;

    double delta = newCn.expression().constant() - oldCn.expression().constant();

    // Check first if the positive error variable is basic.
    auto row_it = m_rows.find( tag.marker );
    if( row_it != m_rows.end() )
    {
        if( row_it->second->add( -delta ) < 0.0 )
            m_infeasible_rows.push_back( row_it->first );
        return;
    }

    // Check next if the negative error variable is basic.
    row_it = m_rows.find( tag.other );
    if( row_it != m_rows.end() )
    {
        if( row_it->second->add( delta ) < 0.0 )
            m_infeasible_rows.push_back( row_it->first );
        return;
    }

    // Otherwise update each row where the error variables exist.
    for( const auto& rowPair : m_rows )
    {
        double coeff = rowPair.second->coefficientFor( tag.marker );
        if( coeff != 0.0 &&
            rowPair.second->add( delta * coeff ) < 0.0 &&
            rowPair.first.type() != Symbol::External )
            m_infeasible_rows.push_back( rowPair.first );
    }
}
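A minimal usage sketch, assuming a public Solver::updateConstant wrapper that forwards to the method above (this method is my addition, not part of upstream kiwi):

```cpp
#include <kiwi/kiwi.h>

int main()
{
    kiwi::Solver solver;
    kiwi::Variable left( "left" ), width( "width" );

    kiwi::Constraint oldCn = left + width == 100;
    solver.addConstraint( oldCn );
    solver.updateVariables();

    // Only the constant changed: re-key the tracked constraint and shift
    // the tableau by the delta instead of removing and re-adding.
    kiwi::Constraint newCn = left + width == 200;
    solver.updateConstant( oldCn, newCn );   // hypothetical method from above
    solver.updateVariables();
    return 0;
}
```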

This seems to work fine for constraints with required strength, but it doesn't have the desired result for those with lower strength. Any suggestions? My assumption was that this would work generally, given that suggestValue does a similar thing.

@pusolito
Author

pusolito commented Jan 8, 2023

Any suggestion for what could be wrong with the approach above?

@MatthieuDartiailh
Member

Sorry @pusolito, I have had no time to look into it. I will try to have a look by the end of the month (do not hesitate to ping me).

@MatthieuDartiailh
Member

Looking at what you are doing, I am confused by the fact that you pass a new constraint instead of simply a new constant value. In particular, you fully remove the old constraint and add the new one, which seems at odds with your intent. Did you try simply updating the constant without removing/adding constraints?

@pusolito
Author

The reason for this is that I'm tracking the list of constraints outside of the Solver for a layout manager. Those constraints are checked for changes whenever the UI layout needs to be recalculated. I then do a diff and add/remove/update any constraints that have changed. This means I need the Solver's internal map of constraints to reflect what is tracked outside. Updating the row directly without removing the old constraint and adding the new one would mean the Solver throws unknown-constraint errors when subsequent calls come in related to the new constraint.

@MatthieuDartiailh
Member

I am confused because this deviates from your original use case, which was about modifying the constant of a constraint. Here you are attempting to fully remove a constraint which, as @sccolbert has described earlier in this thread, is expensive, and I don't think you are fully cleaning it up in your current code.

Why is updating a constraint constant not sufficient anymore?

@pusolito
Author

This is not really different from my original use case. I still essentially want to modify the constant of a constraint, but it is more complex than that. I'm using the Solver as part of a larger constraint layout manager. You can see more in this project (repo).

In my case, the layout manager recomputes constraints (actually regenerates them) whenever part of the UI needs recomposition (i.e. something moves or resizes). This means I have a list of brand new constraint instances and need to diff them against the previous tracked list. Upon finding a difference, I update the Solver. This sometimes means removing or adding new constraints.

My use case is trying to speed up the cases where the difference in a constraint is simply a constant value. I'd like to just update the constraint in that case. But I still need the Solver to reflect the new constraint after the update, so that subsequent operations on the Solver don't fail with an unknown-constraint error.

For example, let's say you setup the following constraints in my framework. The parent.width and parent.right resolve to constant values since parent properties in constraints are non-modifiable by default.

layout = constrain(view1, view2) { v1, v2 -> 
    v1.width eq parent.width
    v2.right  eq parent.right - 10
}

However, this block is re-evaluated every time the parent width or height changes, so the underlying manager will see its constraint list change each time. It then does a diff to see if anything has been added/deleted/updated (these blocks can have conditional logic since they are just lambdas, etc.). In this case, it will see that 2 constraints have been deleted and 2 added. But the changes can also be seen to differ only by constants.

This is the case where I'd like to modify the internal Solver constraint constants instead of doing an expensive remove then add. However, I still currently need the Solver to be left with constraints that reflect the latest state. That is, the new constraints (which are not equivalent to the former ones) are assumed by the Solver once the manager is done with a pass.

@MatthieuDartiailh
Member

Thanks for the detailed explanation.

Looking at it this way, to me at least, it looks like it may be easier to convince your manager that the latest built constraint is not always the relevant one. Since you diff anyway, couldn't you determine the relevant constraints from the diff result?

One thing that bugs me, though, is that an edit variable is basically a constraint with an == and a strength. Out of curiosity, when you say your function does not work for strengths that are not required, is it for == constraints or for <= and >=? If the latter, we could try to dig deeper, since that case does not exist for edit variables.
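To make the comparison concrete, a rough sketch of the two spellings in the C++ API (values are purely illustrative):

```cpp
#include <kiwi/kiwi.h>

int main()
{
    kiwi::Variable width( "width" );

    // (a) Edit variable: cheap to re-suggest on every pass.
    kiwi::Solver a;
    a.addEditVariable( width, kiwi::strength::strong );
    a.suggestValue( width, 640.0 );
    a.updateVariables();

    // (b) Roughly equivalent equality constraint at the same strength:
    //     changing the constant means removing and re-adding it.
    kiwi::Solver b;
    b.addConstraint( ( width == 640.0 ) | kiwi::strength::strong );
    b.updateVariables();
    return 0;
}
```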

BTW I am still open to allowing required variable edits if that helps (but in this case I doubt it)

@Qix-
Contributor

Qix- commented Jan 19, 2023

BTW I am still open to allowing required variable edits if that helps (but in this case I doubt it)

Not to derail the conversation too much, but isn't it the case that some_var == 10 is also discouraged? Wouldn't required variable edits speed things up?

@MatthieuDartiailh
Member

I honestly do not remember off the top of my head. I think the concern here is that this kind of indirection increases the size of the system to solve and could lead to bad performance if abused, which is also a valid concern for required edit variables. If we allow it, we have to be clear about the trade-off in the documentation.

@pusolito
Author

@MatthieuDartiailh, I think the issue shows up whenever I have a constraint with a strength less than required. So I don't think it matters if the constraint is ==, <=, or >=. But I'll confirm.
