[ML] Enabling mml estimation in data recognizer module setup #64900
Conversation
Pinging @elastic/ml-ui (:ml)
x-pack/plugins/ml/server/models/data_recognizer/data_recognizer.ts
LGTM ⚡
Code LGTM, but I don't see unit or functional tests 👀 It'd be nice to have a basic unit test for it.
Tested and LGTM
@darnautov functional tests added in de1f4a4
@pheyos added you as a reviewer of the tests.
prefix: 'pf2_',
indexPatternName: 'kibana_sample_data_logs',
startDatafeed: false,
estimateModelMemory: true,
probably worth omitting this parameter to validate that `estimateModelMemory` is enabled by default
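The suggestion above can be sketched as a small test of default handling. This is a hypothetical illustration, not the actual Kibana test helpers: the `ModuleSetupRequest` shape and `applyDefaults` function are stand-ins for the server-side logic that fills in omitted parameters.

```typescript
// Illustrative shape of the module setup request body (not the real Kibana type).
interface ModuleSetupRequest {
  prefix: string;
  indexPatternName: string;
  startDatafeed?: boolean;
  estimateModelMemory?: boolean; // expected to default to true on the server
}

// Stand-in for the server applying defaults to an incoming request.
function applyDefaults(req: ModuleSetupRequest): Required<ModuleSetupRequest> {
  return {
    prefix: req.prefix,
    indexPatternName: req.indexPatternName,
    startDatafeed: req.startDatafeed ?? false,
    estimateModelMemory: req.estimateModelMemory ?? true, // the default under test
  };
}

// Omitting estimateModelMemory should resolve to true, which is exactly
// what a test without the explicit parameter would validate.
const resolved = applyDefaults({
  prefix: 'pf3_',
  indexPatternName: 'kibana_sample_data_logs',
});
console.log(resolved.estimateModelMemory); // true
```

A test written this way fails if the default ever silently changes, which an explicit `estimateModelMemory: true` in the request would mask.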
}: {
body: {
jobs: Job[];
};
it would be better to provide a return type for the `getAnomalyDetectionJob` method
I agree, but that would touch other files which use it, so it should be done in a follow-up refactor.
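The review point above can be sketched as follows. The `Job` and `GetJobsResponse` shapes here are illustrative stand-ins, not the actual Kibana types; the idea is only that an explicit `Promise<...>` return type documents the contract and lets the compiler check every caller.

```typescript
// Illustrative job shape (the real Kibana Job type is larger).
interface Job {
  job_id: string;
  analysis_config: { bucket_span: string };
}

// Illustrative response shape for the jobs lookup.
interface GetJobsResponse {
  count: number;
  jobs: Job[];
}

// Declaring the return type explicitly, instead of letting it be inferred
// (or fall back to any), surfaces shape mismatches at compile time.
async function getAnomalyDetectionJob(jobId: string): Promise<GetJobsResponse> {
  // Stand-in for the real Elasticsearch client call.
  return {
    count: 1,
    jobs: [{ job_id: jobId, analysis_config: { bucket_span: '15m' } }],
  };
}
```

The cost the reply mentions is real: once the return type is declared, every file that consumed the previously loose type must compile against the stricter one, which is why it belongs in a separate refactor.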
@elasticmachine merge upstream
…github.com:jgowdyelastic/kibana into enabling-mml-estimation-in-data-recognizer-modules
@elasticmachine merge upstream
💚 Build Succeeded
Uptime test change LGTM!
…#64900) * [ML] Enabling mml estimation in data recognizer module setup * small refactor * adding functional tests * increasing uptime test timeout * tiny refactor * checking for default setting * testing flakey uptime test * catching errors in mml estimation * lowering timeout * ensuring data is present for ML tests * adding await Co-authored-by: Elastic Machine <elasticmachine@users.noreply.github.com>
…#65141) * [ML] Enabling mml estimation in data recognizer module setup * small refactor * adding functional tests * increasing uptime test timeout * tiny refactor * checking for default setting * testing flakey uptime test * catching errors in mml estimation * lowering timeout * ensuring data is present for ML tests * adding await Co-authored-by: Elastic Machine <elasticmachine@users.noreply.github.com>
Enables the model memory limit estimation by default for the `/api/ml/modules/setup` endpoint.

Also fixes an issue where the module's `query` was being used both for retrieving the index time range and for the estimation. The module's `query` is used for the "recognising" aspect of this feature and is not appropriate for mml estimation. Instead, the queries from each datafeed are used, unless there is a common time field, in which case a general `match_all` is used.

Part of #48510
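The per-datafeed fallback described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the actual `data_recognizer` implementation: the `Datafeed` shape, `buildEstimationQueries` name, and the `hasCommonTimeField` flag are all hypothetical.

```typescript
// Illustrative datafeed shape: each datafeed carries its own query.
interface Datafeed {
  job_id: string;
  query: object;
}

// Chooses the query used for mml estimation per job: when a common time
// field exists, a single general match_all suffices for every job;
// otherwise each job uses its own datafeed query rather than the
// module-level "recognizer" query, which is not appropriate for estimation.
function buildEstimationQueries(
  datafeeds: Datafeed[],
  hasCommonTimeField: boolean
): Record<string, object> {
  if (hasCommonTimeField) {
    const matchAll = { match_all: {} };
    return Object.fromEntries(datafeeds.map((d) => [d.job_id, matchAll]));
  }
  return Object.fromEntries(datafeeds.map((d) => [d.job_id, d.query]));
}
```

The design point is that estimation queries must match what each datafeed will actually search, so reusing the recognizer's module-level query could over- or under-count the data volume per job.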
Also fixes an issue with the uptime ML integration functional tests where the index data hadn't been loaded before the tests started.
Unit or functional tests were updated or added to match the most common scenarios
This was checked for breaking API changes and was labeled appropriately