Reduce AutoDateFormatter precision when possible
Previously, the AutoDateFormatter could choose a format with second or microsecond precision even when the ticks were significantly coarser than that. The resulting extra precision looks odd and can clutter the display (especially the long microsecond format). This commit changes the default scale-to-format dictionary, which now works as follows:

- Use microsecond precision when the ticks are less than a second apart.
- Use second precision when the ticks are seconds apart.
- Use minute precision when the ticks are minutes or hours apart.
- Use day, month, or year precision when the ticks are days or more apart (unchanged).

Note that there is no point in displaying only the hour when the ticks are hours apart, since it would not be immediately clear that a time is being displayed. Adding the (technically superfluous) :00 for the minutes makes it immediately obvious that a time is being displayed, which is why minute precision is also used when the ticks are hours apart.

While updating the documentation for this change, the dictionary was also changed to use symbolic constants instead of hardcoded numbers, which should make the intention clearer.

Closes: matplotlib#4808
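The precision rules above can be sketched as a standalone threshold lookup. This is an illustrative, self-contained model of how AutoDateFormatter's `scaled` dictionary (tick spacing in days mapped to a strftime format) behaves after the change, not matplotlib's actual implementation; the constant names mirror the symbolic constants the commit mentions but are assumptions here.

```python
# Symbolic constants standing in for hardcoded numbers (names assumed).
HOURS_PER_DAY = 24
MINUTES_PER_DAY = HOURS_PER_DAY * 60
SEC_PER_DAY = MINUTES_PER_DAY * 60
MUSECONDS_PER_DAY = SEC_PER_DAY * 1_000_000

# Tick spacing (in days) -> strftime format, coarse to fine.
scaled = {
    365.0: '%Y',                              # ticks years apart
    30.0: '%b %Y',                            # ticks months apart
    1.0: '%b %d %Y',                          # ticks days apart
    1.0 / MINUTES_PER_DAY: '%H:%M',           # ticks minutes OR hours apart
    1.0 / SEC_PER_DAY: '%H:%M:%S',            # ticks seconds apart
    1.0 / MUSECONDS_PER_DAY: '%H:%M:%S.%f',   # ticks less than a second apart
}

def pick_format(tick_spacing_days):
    """Return the coarsest format whose threshold the tick spacing meets."""
    for threshold in sorted(scaled, reverse=True):
        if tick_spacing_days >= threshold:
            return scaled[threshold]
    # Finer than every threshold: fall back to the finest format.
    return scaled[min(scaled)]
```

With hour-spaced ticks (spacing 1/24 day), the lookup now lands on `'%H:%M'` rather than an hour-only format, so the trailing `:00` signals that a time is being displayed.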