Add support for changesets #111
Conversation
    ostr << "}";
    return ostr.str();
  }
};
We now have basically the same code repeated for osm_nwr_id_t, tile_id_t, and osm_changeset_id_t. Is there any reasonable way to deduplicate?
Not as reasonable as I'd have liked (i.e., hiding it away in a .cpp file), but I've reduced it to a macro in fb6ad22, which at least means there's only one copy of the code in the human-readable sense.
Right now MAX_NODES is defined in only … With openstreetmap/openstreetmap-website#1259 we have a PR to change the max changeset upload size in the API, and it has been different in the past due to an off-by-one error. Will either of these cause problems?
Do we have a plan for how to test this, ideally with a canary deployment? Is there a better place to discuss that?
Well spotted! It seems they're completely unused and I've removed them in 0697e91.
Yes. The only place in the code that the max changeset size is used is when figuring out if a changeset is closed, and it really ought to be configurable. I'll think about how to add that with minimal pain. As for the off-by-one errors, I think the best way to test that is with a set of tests for all the different corner cases. Help with adding more tests would be greatly appreciated.
My plan was to have as comprehensive a suite of tests as possible. We can certainly manually do a canary deployment, but I'd prefer to be confident ahead of time that cgimap is doing the right thing because it's passing the tests.
I think we have the HTTP status, headers, and other similar things in agreement between the rails port and cgimap, so it's just changeset logic we should need to worry about. But comprehensive tests aren't enough: last time, with the relation full call, we had a comprehensive set of tests and I had tested it for performance, but when it was pointed at ramoth the combination of the database statistics, PostgreSQL version, and statistics tuning led to it doing sequential scans.
Agreed, and I think we can try to do a canary deployment. Comprehensive tests are necessary, but not sufficient, and I don't think the unit tests are anything like comprehensive at the moment. Given you're asking about …
Well, that was more about the question of how we're handling changesets with > 50k changes.
This adds support for changesets and changeset discussions to cgimap.
@tomhughes, @pnorman, your comments would be most appreciated, especially if you think there are any test cases which need to be added to cover corner cases or to ensure compatibility in behaviour with the rails code.