Consider the following case:
d3.scale.linear().domain([0, 10]).range([0, 1, 2, 3, 4]);
The cardinalities of the domain and range arrays don't match, so D3 falls back to a bilinear interpolation, with the domain endpoints being (0, 10) and the range endpoints being (0, 1). You can see this in d3/src/scale/bilinear.js (snippet shown):
var u = uninterpolate(domain[0], domain[1]),
    i = interpolate(range[0], range[1]);
I wonder if a more intuitive fallback strategy would be to instead have the end points for the range be the first and last elements of the array. I understand that this is still a somewhat arbitrary solution to an improper use of the API (e.g., what if the array contains random values?), but I suspect that it will lead to a more "expected" result in a large portion of the cases.
In general, I prefer to document behavior with invalid inputs (even if that behavior is undefined) rather than coding defensively against it, since the latter tends to be burdensome and is never enough to defend against all possible invalid inputs. In this case there is no check that the cardinalities of the domain and range match, but there is a check on whether the smaller of the two is greater than two; since your domain only has two elements, you're getting bilinear interpolation.
Fair enough. Then the documentation is the appropriate target for this. Forgive me as I'm new to this process; do I fork the docs, update, and submit a pull request? Or is this something you take care of as the repo maintainer?
You can just click the Edit button on the wiki. Thanks for the help!