[DOCS] Updates terms in machine learning datafeed APIs (#44883)
lcawl committed Jul 26, 2019
1 parent 3305a2f commit c4f9ef9
Showing 10 changed files with 115 additions and 108 deletions.
15 changes: 8 additions & 7 deletions docs/java-rest/high-level/ml/delete-datafeed.asciidoc
@@ -4,29 +4,30 @@
:response: AcknowledgedResponse
--
[id="{upid}-delete-datafeed"]
=== Delete Datafeed API
=== Delete datafeed API

Deletes an existing datafeed.

[id="{upid}-{api}-request"]
==== Delete Datafeed Request
==== Delete datafeed request

A +{request}+ object requires a non-null `datafeedId` and can optionally set `force`.

["source","java",subs="attributes,callouts,macros"]
---------------------------------------------------
include-tagged::{doc-tests-file}[{api}-request]
---------------------------------------------------
<1> Use to forcefully delete a started datafeed;
this method is quicker than stopping and deleting the datafeed.
Defaults to `false`.
<1> Use to forcefully delete a started datafeed. This method is quicker than
stopping and deleting the datafeed. Defaults to `false`.

include::../execution.asciidoc[]

[id="{upid}-{api}-response"]
==== Delete Datafeed Response
==== Delete datafeed response

The returned +{response}+ object indicates the acknowledgement of the request:
["source","java",subs="attributes,callouts,macros"]
---------------------------------------------------
include-tagged::{doc-tests-file}[{api}-response]
---------------------------------------------------
<1> `isAcknowledged` was the deletion request acknowledged or not
<1> `isAcknowledged` was the deletion request acknowledged or not.
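
For orientation, a minimal sketch of this API with the 7.x high-level REST client; the client setup and the `datafeed-total-requests` ID are assumptions for illustration, not part of the documented snippet.

["source","java"]
--------------------------------------------------
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.client.core.AcknowledgedResponse;
import org.elasticsearch.client.ml.DeleteDatafeedRequest;

public class DeleteDatafeedExample {
    public static void deleteDatafeed(RestHighLevelClient client) throws Exception {
        // Build the request for an existing datafeed ID (hypothetical name).
        DeleteDatafeedRequest request = new DeleteDatafeedRequest("datafeed-total-requests");
        // Force deletion of a started datafeed instead of stopping it first.
        request.setForce(true);

        // Execute synchronously; an asynchronous variant also exists.
        AcknowledgedResponse response =
            client.machineLearning().deleteDatafeed(request, RequestOptions.DEFAULT);

        // isAcknowledged() reports whether the deletion request was acknowledged.
        System.out.println("Deleted: " + response.isAcknowledged());
    }
}
--------------------------------------------------
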
27 changes: 14 additions & 13 deletions docs/java-rest/high-level/ml/put-datafeed.asciidoc
@@ -4,25 +4,24 @@
:response: PutDatafeedResponse
--
[id="{upid}-{api}"]
=== Put Datafeed API
=== Put datafeed API

The Put Datafeed API can be used to create a new {ml} datafeed
in the cluster. The API accepts a +{request}+ object
Creates a new {ml} datafeed in the cluster. The API accepts a +{request}+ object
as a request and returns a +{response}+.

[id="{upid}-{api}-request"]
==== Put Datafeed Request
==== Put datafeed request

A +{request}+ requires the following argument:

["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-request]
--------------------------------------------------
<1> The configuration of the {ml} datafeed to create
<1> The configuration of the {ml} datafeed to create.

[id="{upid}-{api}-config"]
==== Datafeed Configuration
==== Datafeed configuration

The `DatafeedConfig` object contains all the details about the {ml} datafeed
configuration.
@@ -33,10 +32,10 @@ A `DatafeedConfig` requires the following arguments:
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-config]
--------------------------------------------------
<1> The datafeed ID and the job ID
<2> The indices that contain the data to retrieve and feed into the job
<1> The datafeed ID and the {anomaly-job} ID.
<2> The indices that contain the data to retrieve and feed into the {anomaly-job}.

==== Optional Arguments
==== Optional arguments
The following arguments are optional:

["source","java",subs="attributes,callouts,macros"]
@@ -49,7 +48,8 @@ include-tagged::{doc-tests-file}[{api}-config-set-chunking-config]
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-config-set-frequency]
--------------------------------------------------
<1> The interval at which scheduled queries are made while the datafeed runs in real time.
<1> The interval at which scheduled queries are made while the datafeed runs in
real time.

["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
@@ -72,8 +72,9 @@ The window must be larger than the Job's bucket size, but smaller than 24 hours,
and span less than 10,000 buckets.
Defaults to `null`, which causes an appropriate window span to be calculated when
the datafeed runs.
The default `check_window` span calculation is the max between `2h` or `8 * bucket_span`.
To explicitly disable, pass `DelayedDataCheckConfig.disabledDelayedDataCheckConfig()`.
The default `check_window` span calculation is the max between `2h` or
`8 * bucket_span`. To explicitly disable, pass
`DelayedDataCheckConfig.disabledDelayedDataCheckConfig()`.

["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
@@ -101,4 +102,4 @@ default values:
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-response]
--------------------------------------------------
<1> The created datafeed
<1> The created datafeed.
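
Putting the pieces together, a hedged sketch of creating a datafeed with the 7.x high-level REST client; the datafeed ID, job ID, index name, and optional settings are placeholders chosen for illustration.

["source","java"]
--------------------------------------------------
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.client.ml.PutDatafeedRequest;
import org.elasticsearch.client.ml.PutDatafeedResponse;
import org.elasticsearch.client.ml.datafeed.DatafeedConfig;
import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.index.query.QueryBuilders;

public class PutDatafeedExample {
    public static void createDatafeed(RestHighLevelClient client) throws Exception {
        // Required: the datafeed ID, the anomaly job ID, and the source indices.
        DatafeedConfig config =
            new DatafeedConfig.Builder("datafeed-total-requests", "total-requests")
                .setIndices("server-metrics")
                // Optional: search query, query delay, and query frequency.
                .setQuery(QueryBuilders.matchAllQuery())
                .setQueryDelay(TimeValue.timeValueMinutes(1))
                .setFrequency(TimeValue.timeValueSeconds(30))
                .build();

        PutDatafeedRequest request = new PutDatafeedRequest(config);
        PutDatafeedResponse response =
            client.machineLearning().putDatafeed(request, RequestOptions.DEFAULT);

        // The response carries the configuration of the created datafeed.
        System.out.println("Created datafeed: " + response.getResponse().getId());
    }
}
--------------------------------------------------
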
13 changes: 6 additions & 7 deletions docs/java-rest/high-level/ml/start-datafeed.asciidoc
@@ -4,14 +4,13 @@
:response: StartDatafeedResponse
--
[id="{upid}-{api}"]
=== Start Datafeed API
=== Start datafeed API

The Start Datafeed API provides the ability to start a {ml} datafeed in the cluster.
It accepts a +{request}+ object and responds
with a +{response}+ object.
Starts a {ml} datafeed in the cluster. It accepts a +{request}+ object and
responds with a +{response}+ object.

[id="{upid}-{api}-request"]
==== Start Datafeed Request
==== Start datafeed request

A +{request}+ object is created referencing a non-null `datafeedId`.
All other fields are optional for the request.
@@ -20,9 +19,9 @@ All other fields are optional for the request.
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-request]
--------------------------------------------------
<1> Constructing a new request referencing an existing `datafeedId`
<1> Constructing a new request referencing an existing `datafeedId`.

==== Optional Arguments
==== Optional arguments

The following arguments are optional.

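
A brief, hedged sketch of starting a datafeed with the high-level client; the datafeed ID and the bounded start and end times are illustrative assumptions only.

["source","java"]
--------------------------------------------------
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.client.ml.StartDatafeedRequest;
import org.elasticsearch.client.ml.StartDatafeedResponse;

public class StartDatafeedExample {
    public static void startDatafeed(RestHighLevelClient client) throws Exception {
        // Reference an existing datafeed by its non-null ID (hypothetical name).
        StartDatafeedRequest request = new StartDatafeedRequest("datafeed-total-requests");
        // Optional bounds: analyze data from this start time up to this end time.
        request.setStart("2019-07-01T00:00:00Z");
        request.setEnd("2019-07-26T00:00:00Z");

        StartDatafeedResponse response =
            client.machineLearning().startDatafeed(request, RequestOptions.DEFAULT);

        // isStarted() reports whether the datafeed was started.
        System.out.println("Started: " + response.isStarted());
    }
}
--------------------------------------------------
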
24 changes: 13 additions & 11 deletions docs/java-rest/high-level/ml/update-datafeed.asciidoc
@@ -4,14 +4,13 @@
:response: PutDatafeedResponse
--
[id="{upid}-{api}"]
=== Update Datafeed API
=== Update datafeed API

The Update Datafeed API can be used to update a {ml} datafeed
in the cluster. The API accepts a +{request}+ object
Updates a {ml} datafeed in the cluster. The API accepts a +{request}+ object
as a request and returns a +{response}+.

[id="{upid}-{api}-request"]
==== Update Datafeed Request
==== Update datafeed request

A +{request}+ requires the following argument:

@@ -22,7 +21,7 @@ include-tagged::{doc-tests-file}[{api}-request]
<1> The updated configuration of the {ml} datafeed

[id="{upid}-{api}-config"]
==== Updated Datafeed Arguments
==== Updated datafeed arguments

A `DatafeedUpdate` requires an existing non-null `datafeedId` and
allows updating various settings.
@@ -31,12 +30,15 @@ allows updating various settings.
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-config]
--------------------------------------------------
<1> Mandatory, non-null `datafeedId` referencing an existing {ml} datafeed
<2> Optional, set the datafeed Aggregations for data gathering
<3> Optional, the indices that contain the data to retrieve and feed into the job
<1> Mandatory, non-null `datafeedId` referencing an existing {ml} datafeed.
<2> Optional, set the datafeed aggregations for data gathering.
<3> Optional, the indices that contain the data to retrieve and feed into the
{anomaly-job}.
<4> Optional, specifies how data searches are split into time chunks.
<5> Optional, the interval at which scheduled queries are made while the datafeed runs in real time.
<6> Optional, a query to filter the search results by. Defaults to the `match_all` query.
<5> Optional, the interval at which scheduled queries are made while the
datafeed runs in real time.
<6> Optional, a query to filter the search results by. Defaults to the
`match_all` query.
<7> Optional, the time interval behind real time that data is queried.
<8> Optional, allows the use of script fields.
<9> Optional, the `size` parameter used in the searches.
@@ -55,4 +57,4 @@
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-response]
--------------------------------------------------
<1> The updated datafeed
<1> The updated datafeed.
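
A hedged sketch of updating a datafeed with the high-level client; the datafeed ID, indices, and query are placeholders, and the builder usage is an assumption based on the update settings listed above.

["source","java"]
--------------------------------------------------
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.client.ml.PutDatafeedResponse;
import org.elasticsearch.client.ml.UpdateDatafeedRequest;
import org.elasticsearch.client.ml.datafeed.DatafeedUpdate;
import org.elasticsearch.index.query.QueryBuilders;

public class UpdateDatafeedExample {
    public static void updateDatafeed(RestHighLevelClient client) throws Exception {
        // Mandatory: the ID of an existing datafeed (hypothetical name).
        DatafeedUpdate update = new DatafeedUpdate.Builder("datafeed-total-requests")
            // Optional: change the source indices and the search query.
            .setIndices("server-metrics", "server-metrics-archive")
            .setQuery(QueryBuilders.termQuery("status", "ok"))
            .build();

        UpdateDatafeedRequest request = new UpdateDatafeedRequest(update);
        // The response type is the same as for the put datafeed API.
        PutDatafeedResponse response =
            client.machineLearning().updateDatafeed(request, RequestOptions.DEFAULT);

        System.out.println("Updated datafeed: " + response.getResponse().getId());
    }
}
--------------------------------------------------
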
24 changes: 13 additions & 11 deletions docs/reference/ml/anomaly-detection/apis/delete-datafeed.asciidoc
@@ -15,29 +15,31 @@ Deletes an existing {dfeed}.

`DELETE _ml/datafeeds/<feed_id>`

[[ml-delete-datafeed-prereqs]]
==== {api-prereq-title}

* If {es} {security-features} are enabled, you must have `manage_ml` or
`manage` cluster privileges to use this API. For more information, see
{stack-ov}/security-privileges.html[Security privileges].

[[ml-delete-datafeed-desc]]
==== {api-description-title}

NOTE: Unless the `force` parameter is used, the {dfeed} must be stopped before it can be deleted.
NOTE: Unless the `force` parameter is used, the {dfeed} must be stopped before
it can be deleted.

[[ml-delete-datafeed-path-parms]]
==== {api-path-parms-title}

`feed_id` (required)::
(string) Identifier for the {dfeed}
`feed_id`::
(Required, string) Identifier for the {dfeed}.

[[ml-delete-datafeed-query-parms]]
==== {api-query-parms-title}

`force`::
(boolean) Use to forcefully delete a started {dfeed}; this method is quicker than
stopping and deleting the {dfeed}.

[[ml-delete-datafeed-prereqs]]
==== {api-prereq-title}

You must have `manage_ml`, or `manage` cluster privileges to use this API.
For more information, see {stack-ov}/security-privileges.html[Security privileges].
(Optional, boolean) Use to forcefully delete a started {dfeed}; this method is
quicker than stopping and deleting the {dfeed}.

[[ml-delete-datafeed-example]]
==== {api-examples-title}
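
As an illustrative sketch only, not the documented example, the endpoint can be exercised from Java with the low-level REST client; the datafeed name is hypothetical.

["source","java"]
--------------------------------------------------
import org.elasticsearch.client.Request;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

public class DeleteDatafeedRestExample {
    public static void deleteDatafeed(RestClient lowLevelClient) throws Exception {
        // DELETE _ml/datafeeds/<feed_id>, optionally forcing deletion of a started datafeed.
        Request request = new Request("DELETE", "/_ml/datafeeds/datafeed-total-requests");
        request.addParameter("force", "true");

        Response response = lowLevelClient.performRequest(request);
        System.out.println(response.getStatusLine());
    }
}
--------------------------------------------------
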
21 changes: 10 additions & 11 deletions docs/reference/ml/anomaly-detection/apis/preview-datafeed.asciidoc
@@ -15,6 +15,13 @@ Previews a {dfeed}.

`GET _ml/datafeeds/<datafeed_id>/_preview`

[[ml-preview-datafeed-prereqs]]
==== {api-prereq-title}

* If {es} {security-features} are enabled, you must have `monitor_ml`, `monitor`,
`manage_ml`, or `manage` cluster privileges to use this API. For more
information, see {stack-ov}/security-privileges.html[Security privileges].

[[ml-preview-datafeed-desc]]
==== {api-description-title}

@@ -25,19 +32,11 @@ structure of the data that will be passed to the anomaly detection engine.
[[ml-preview-datafeed-path-parms]]
==== {api-path-parms-title}

`datafeed_id` (required)::
(string) Identifier for the {dfeed}

[[ml-preview-datafeed-prereqs]]
==== {api-prereq-title}

If {es} {security-features} are enabled, you must have `monitor_ml`, `monitor`,
`manage_ml`, or `manage` cluster privileges to use this API. For more
information, see
{stack-ov}/security-privileges.html[Security privileges].
`datafeed_id`::
(Required, string) Identifier for the {dfeed}.

[[ml-preview-datafeed-security]]
==== Security Integration
==== Security integration

When {es} {security-features} are enabled, the {dfeed} query is previewed using
the credentials of the user calling the preview {dfeed} API. When the {dfeed}
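
As a hedged sketch, previewing a datafeed from the low-level Java REST client; the datafeed name is hypothetical.

["source","java"]
--------------------------------------------------
import org.apache.http.util.EntityUtils;
import org.elasticsearch.client.Request;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

public class PreviewDatafeedRestExample {
    public static void previewDatafeed(RestClient lowLevelClient) throws Exception {
        // GET _ml/datafeeds/<datafeed_id>/_preview returns a sample of the documents
        // that would be passed to the anomaly detection engine.
        Request request = new Request("GET", "/_ml/datafeeds/datafeed-total-requests/_preview");
        Response response = lowLevelClient.performRequest(request);
        System.out.println(EntityUtils.toString(response.getEntity()));
    }
}
--------------------------------------------------
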
31 changes: 16 additions & 15 deletions docs/reference/ml/anomaly-detection/apis/put-datafeed.asciidoc
@@ -15,11 +15,18 @@ Instantiates a {dfeed}.

`PUT _ml/datafeeds/<feed_id>`

[[ml-put-datafeed-prereqs]]
==== {api-prereq-title}

* You must create an {anomaly-job} before you create a {dfeed}.
* If {es} {security-features} are enabled, you must have `manage_ml` or `manage`
cluster privileges to use this API. See
{stack-ov}/security-privileges.html[Security privileges].

[[ml-put-datafeed-desc]]
==== {api-description-title}

You must create a job before you create a {dfeed}. You can associate only one
{dfeed} to each job.
You can associate only one {dfeed} to each {anomaly-job}.

IMPORTANT: You must use {kib} or this API to create a {dfeed}. Do not put a {dfeed}
directly to the `.ml-config` index using the Elasticsearch index API.
@@ -29,8 +36,8 @@ IMPORTANT: You must use {kib} or this API to create a {dfeed}. Do not put a {df
[[ml-put-datafeed-path-parms]]
==== {api-path-parms-title}

`feed_id` (required)::
(string) A numerical character string that uniquely identifies the {dfeed}.
`feed_id`::
(Required, string) A numerical character string that uniquely identifies the {dfeed}.
This identifier can contain lowercase alphanumeric characters (a-z and 0-9),
hyphens, and underscores. It must start and end with alphanumeric characters.

@@ -56,12 +63,13 @@ IMPORTANT: You must use {kib} or this API to create a {dfeed}. Do not put a {df
bucket spans, or, for longer bucket spans, a sensible fraction of the bucket
span. For example: `150s`.

`indices` (required)::
(array) An array of index names. Wildcards are supported. For example:
`indices`::
(Required, array) An array of index names. Wildcards are supported. For example:
`["it_ops_metrics", "server*"]`.

`job_id` (required)::
(string) A numerical character string that uniquely identifies the job.
`job_id`::
(Required, string) A numerical character string that uniquely identifies the
{anomaly-job}.

`query`::
(object) The {es} query domain-specific language (DSL). This value
@@ -90,13 +98,6 @@ IMPORTANT: You must use {kib} or this API to create a {dfeed}. Do not put a {df
For more information about these properties,
see <<ml-datafeed-resource>>.

[[ml-put-datafeed-prereqs]]
==== {api-prereq-title}

If {es} {security-features} are enabled, you must have `manage_ml`, or `manage`
cluster privileges to use this API. For more information, see
{stack-ov}/security-privileges.html[Security privileges].

[[ml-put-datafeed-security]]
==== Security integration

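
As an illustrative sketch only, a datafeed can be created against this endpoint from the low-level Java REST client; the IDs and index name are placeholders, and the body uses the required `job_id` and `indices` properties plus an optional `query`.

["source","java"]
--------------------------------------------------
import org.elasticsearch.client.Request;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

public class PutDatafeedRestExample {
    public static void putDatafeed(RestClient lowLevelClient) throws Exception {
        // PUT _ml/datafeeds/<feed_id> with the required job_id and indices,
        // plus an optional match_all query.
        Request request = new Request("PUT", "/_ml/datafeeds/datafeed-total-requests");
        request.setJsonEntity(
            "{"
                + "  \"job_id\": \"total-requests\","
                + "  \"indices\": [\"server-metrics\"],"
                + "  \"query\": { \"match_all\": {} }"
                + "}");

        Response response = lowLevelClient.performRequest(request);
        System.out.println(response.getStatusLine());
    }
}
--------------------------------------------------
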