
Set threshold to account for too many duplicate or nan values in DateTimeFormatDataCheck #3883


Merged: 7 commits merged into main from set_threshold_for_regularizing on Dec 13, 2022

Conversation

@ParthivNaresh (Contributor) commented Dec 10, 2022

Currently, if a time_index is chosen that has duplicate or NaN values, those values are dropped prior to the creation of an alias_dict in Woodwork, which iterates over all identified frequencies in the time series to determine the most likely one.

The side effect is that, in a dataset of length 20,000 whose time_index is just 50 consecutive daily datetime values repeated 400 times, we currently return DATETIME_HAS_UNEVEN_INTERVALS along with an action code to fix the data.

This time_index should not be considered for regularization, since regularizing it would reduce the dataset from 20,000 observations to 50: all duplicates and NaNs would be dropped, and the remaining datetime values would already have a frequency of 1 day. This is more of a multi-series concern, which is outside the scope of this data check.
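
Below is a minimal sketch of this scenario, assuming pandas and the evalml DateTimeFormatDataCheck API; the datetime_column parameter name and the shape of the returned messages follow the test snippet quoted later in this thread and may differ between versions:

```python
import numpy as np
import pandas as pd

from evalml.data_checks import DateTimeFormatDataCheck

# 50 consecutive daily dates, each repeated 400 times -> 20,000 rows total.
dates = pd.date_range("2021-01-01", periods=50, freq="D").repeat(400)
X = pd.DataFrame({"dates": dates, "feature": np.arange(len(dates))})
y = pd.Series(np.zeros(len(dates)))

# Dropping NaNs and duplicates collapses the column back to 50 values, which is
# why regularizing this time_index would shrink the dataset from 20,000 rows to 50.
assert X["dates"].dropna().drop_duplicates().shape[0] == 50

# Prior to this change, the data check reported DATETIME_HAS_UNEVEN_INTERVALS
# here, along with an action code to regularize the data.
messages = DateTimeFormatDataCheck(datetime_column="dates").validate(X, y)
print([m["code"] for m in messages if "code" in m])  # message dict shape is assumed
```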

I've added a check so that if 25% of the time_index consists solely of duplicate and NaN values, DATETIME_NO_FREQUENCY_INFERRED is returned instead of DATETIME_HAS_UNEVEN_INTERVALS. I'm open to changing this threshold or making it a passable argument; it was more or less arbitrary.
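
A minimal sketch of the threshold check, assuming pandas; the function and constant names here are illustrative rather than the exact implementation merged in this PR:

```python
import pandas as pd

# Illustrative constant; the PR description uses a 25% threshold.
NAN_DUPLICATE_THRESHOLD = 0.25

def too_many_duplicates_or_nans(time_index: pd.Series) -> bool:
    """Return True if NaN and duplicate entries make up at least 25% of the column."""
    n_nan = time_index.isna().sum()
    # Count duplicates among the non-NaN values only, so NaNs are not double counted.
    n_duplicate = time_index.dropna().duplicated().sum()
    return (n_nan + n_duplicate) / len(time_index) >= NAN_DUPLICATE_THRESHOLD

# When this returns True, the data check reports DATETIME_NO_FREQUENCY_INFERRED
# instead of DATETIME_HAS_UNEVEN_INTERVALS.
```

With the repeated daily index from the sketch above, 19,950 of the 20,000 values are duplicates, so this check trips well past the 25% threshold.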

codecov bot commented Dec 10, 2022

Codecov Report

Merging #3883 (5f0cd2d) into main (24a01ae) will increase coverage by 0.1%.
The diff coverage is 100.0%.

@@           Coverage Diff           @@
##            main   #3883     +/-   ##
=======================================
+ Coverage   99.7%   99.7%   +0.1%     
=======================================
  Files        346     346             
  Lines      36284   36304     +20     
=======================================
+ Hits       36147   36167     +20     
  Misses       137     137             
Impacted Files                                           Coverage Δ
evalml/data_checks/datetime_format_data_check.py          100.0% <100.0%> (ø)
...ta_checks_tests/test_datetime_format_data_check.py     100.0% <100.0%> (ø)


@ParthivNaresh marked this pull request as ready for review December 12, 2022 15:41

assert result[2]["code"] == "DATETIME_HAS_UNEVEN_INTERVALS"

X.iloc[0, -1] = None
@ParthivNaresh (Contributor, Author) commented:
Changing this value crosses the threshold into DATETIME_NO_FREQUENCY_INFERRED

@eccabay (Contributor) left a comment:

Since the threshold was arbitrary, I think it certainly wouldn't hurt to make it a passable argument. Regardless, looks great!

@christopherbunn (Contributor) left a comment:

Makes sense to me! +1 to @eccabay's comment on making it a passable argument or even just an internal global variable.

@chukarsten merged commit 408eb9b into main on Dec 13, 2022
@chukarsten deleted the set_threshold_for_regularizing branch on December 13, 2022 15:23
@christopherbunn mentioned this pull request on Jan 3, 2023