Gi rewrite #121

Closed
ljwolf opened this issue Jun 26, 2020 · 0 comments · Fixed by #133

The G_Local statistic currently assumes that all weights are equal in its __crand() method. This holds when the input weights are binary, or when the input weights are row-standardized from equally-weighted input. It is not valid when the input weights are row-standardized but not constant (e.g., perimeter-weighted contiguity or any kernel weighting). While the vast majority of users use binary G_Local, this would be good to fix.
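A small numeric illustration of why the equal-weight assumption breaks (plain numpy, not the esda internals): row-standardizing equal binary weights yields a single constant nonzero value per matrix, but row-standardizing unequal (kernel-like) weights does not. The matrices here are made up for illustration.

```python
import numpy as np

# Equal (binary) contiguity weights: after row standardization,
# every nonzero entry takes the same value.
binary = np.array([[0, 1, 1],
                   [1, 0, 1],
                   [1, 1, 0]], dtype=float)
rs_binary = binary / binary.sum(axis=1, keepdims=True)

# Unequal (e.g. kernel or perimeter-weighted) weights: after row
# standardization, nonzero entries differ, so __crand()'s
# equal-weights assumption no longer holds.
kernel = np.array([[0.0, 0.9, 0.1],
                   [0.9, 0.0, 0.5],
                   [0.1, 0.5, 0.0]])
rs_kernel = kernel / kernel.sum(axis=1, keepdims=True)

print(np.unique(rs_binary[rs_binary > 0]))  # a single value
print(np.unique(rs_kernel[rs_kernel > 0]))  # several distinct values
```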

So, to simplify the calc method and correct this in the __crand implementation, @sjsrey and I have drawn up a roadmap for the desired new behavior:

  • change the star=False arg to star=None
  • use the w however it is provided if star=None. This means that, by default, the statistic will:
    • detect if there is a diagonal value in w.sparse
    • if so, treat it as Gi* and otherwise compute Gi.
  • if star=False, the input w will have its diagonal zeroed. If transform='R', then the weights will be re-transformed after overwriting the diagonal.
  • if star=True, then we need some more information:
    • if w has a diagonal, use it.
    • if w has no diagonal, then examine the input weights' transform:
      • if the weights are binary, add one to the diagonal
      • otherwise, warn that the statistic is going to be Gi, not Gi*, and give directions on how to update the w with a self-weight for Gi*.
  • if star is a float, use it as the self-weight if the w doesn't have one