docs: fixing documentation #836

Merged 5 commits on May 14, 2024
Changes from 3 commits
2 changes: 1 addition & 1 deletion Dockerfile.tests
@@ -13,7 +13,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#
FROM ubuntu:latest
FROM ubuntu:22.04

RUN mkdir -p /work/tests
RUN mkdir -p /work/test-results/functional
4 changes: 2 additions & 2 deletions docs/cim_compliance.md
@@ -29,15 +29,15 @@ There are two ways to generate the CIM Compliance report:
- Append the following to [any one of the commands](how_to_use.md#test-execution) used for executing the test cases:

```console
--cim-report <file_name.md
--cim-report <file_name.md>
```

**2. Generating the report using the test results stored in the junit-xml file**

- Execute the following command:

```console
cim-report <junit_report.xml<report.md
cim-report <junit_report.xml> <report.md>
```
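
For example, assuming the JUnit results are stored in `junit_report.xml` and the report should be written to `cim_compliance_report.md` (both file names are illustrative):

```console
cim-report junit_report.xml cim_compliance_report.md
```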

## Report Generation Troubleshooting
12 changes: 6 additions & 6 deletions docs/cim_tests.md
@@ -41,7 +41,7 @@ To generate test cases only for CIM compatibility, append the following marker t
```


#### Testcase Assertions:

- There should be at least 1 event mapped with the dataset.
- Each required field should be extracted in all the events mapped with the datasets.
@@ -100,13 +100,13 @@ To generate test cases only for CIM compatibility, append the following marker t
- Plugin gets a list of fields whose extractions are defined in props using addon_parser.
- By comparing we obtain a list of fields whose extractions are not allowed but defined.

**5. Testcase to check that eventtype is not be mapped with multiple datamodels.**
**5. Testcase to check that eventtype is not mapped with multiple datamodels.**


**Workflow:**

- Parsing tags.conf it already has a list of eventtype mapped with the datasets.
- Using SPL we check that each eventtype is not be mapped with multiple datamodels.
- Using SPL we check that each eventtype is not mapped with multiple datamodels.

## Testcase Troubleshooting

@@ -122,14 +122,14 @@ If all the above conditions are satisfied, further analysis of the test is requi
For every CIM validation test case there is a defined structure for the stack trace.

```text
AssertionError: <<error_message
AssertionError: <<error_message>>
Source | Sourcetype | Field | Event Count | Field Count | Invalid Field Count | Invalid Values
-------- | --------------- | ------| ----------- | ----------- | ------------------- | --------------
str | str | str | int | int | int | str

Search = <Query
Search = <Query>

Properties for the field :: <field_name
Properties for the field :: <field_name>
type= Required/Conditional
condition= Condition for field
validity= EVAL conditions
26 changes: 13 additions & 13 deletions docs/field_tests.md
@@ -33,7 +33,7 @@ To generate test cases only for knowledge objects, append the following marker t
```

Testcase verifies that there are events mapped with source/sourcetype.
Here <stanza is the source/sourcetype that is defined in the stanza.
Here &lt;stanza&gt; is the source/sourcetype that is defined in the stanza.

**Workflow:**

@@ -43,12 +43,12 @@ To generate test cases only for knowledge objects, append the following marker t
**2. Fields mentioned under source/sourcetype should be extracted**

```python
test_props_fields[<stanza::field::<fieldname>]
test_props_fields[<stanza>::field::<fieldname>]
```

Testcase verifies that the field should be extracted in the source/sourcetype.
Here <stanza is the source/sourcetype that is defined in the stanza and
<fieldname is the name of a field which is extracted under source/sourcetype.
Here &lt;stanza&gt; is the source/sourcetype that is defined in the stanza and
&lt;fieldname&gt; is the name of a field which is extracted under source/sourcetype.
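
As a purely illustrative example, for a hypothetical sourcetype stanza `vendor:product:log` with an extracted field `src`, the generated test ID would take the form:

```console
test_props_fields[vendor:product:log::field::src]
```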

**Workflow:**

@@ -62,8 +62,8 @@ To generate test cases only for knowledge objects, append the following marker t
```

Testcase verifies that the field should not have "-" (dash) or "" (empty) as a value.
Here <stanza is the source/sourcetype that is defined in the stanza and
<fieldname is name of field which is extracted under source/sourcetype.
Here &lt;stanza&gt; is the source/sourcetype that is defined in the stanza and
&lt;fieldname&gt; is name of field which is extracted under source/sourcetype.

**Workflow:**

@@ -90,7 +90,7 @@ To generate test cases only for knowledge objects, append the following marker t

- While parsing the conf file when the plugin finds one of EXTRACT, REPORT, LOOKUP
the plugin gets the list of fields extracted and generates a test case.
- For all the fields in the test case it generates a single SPL search query including the stanza and asserts event_count 0.
- For all the fields in the test case it generates a single SPL search query including the stanza and asserts event_count > 0.
- This verifies that all the fields are extracted in the same event.
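
A simplified sketch of the kind of search this produces, assuming a hypothetical stanza `vendor:product:log` with extracted fields `src` and `dest` (not the plugin's exact query):

```console
search sourcetype="vendor:product:log" src=* dest=*
```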

**5. Events should be present in each eventtype**
@@ -104,7 +104,7 @@ To generate test cases only for knowledge objects, append the following marker t

**Workflow:**

- For each eventtype mentioned in eventtypes.conf plugin generates an SPL search query and asserts event_count 0 for the eventtype.
- For each eventtype mentioned in eventtypes.conf plugin generates an SPL search query and asserts event_count > 0 for the eventtype.

**6. Tags defined in tags.conf should be applied to the events.**

@@ -113,13 +113,13 @@ To generate test cases only for knowledge objects, append the following marker t
```

Test case verifies that there are events mapped with the tag.
Here <tag_stanza is a stanza mentioned in tags.conf and <tag is an individual tag
Here &lt;tag_stanza&gt; is a stanza mentioned in tags.conf and &lt;tag&gt; is an individual tag
applied to that stanza.

**Workflow:**

- In tags.conf for each tag defined in the stanza, the plugin generates a test case.
- For each tag, the plugin generates a search query including the stanza and the tag and asserts event_count 0.
- For each tag, the plugin generates a search query including the stanza and the tag and asserts event_count > 0.

**7. Search query should be present in each savedsearches.**

@@ -133,7 +133,7 @@ To generate test cases only for knowledge objects, append the following marker t
**Workflow:**

- In savedsearches.conf for each stanza, the plugin generates a test case.
- For each stanza mentioned in savedsearches.conf plugin generates an SPL search query and asserts event_count 0 for the savedsearch.
- For each stanza mentioned in savedsearches.conf plugin generates an SPL search query and asserts event_count > 0 for the savedsearch.

## Testcase Troubleshooting

@@ -150,8 +150,8 @@ If all the above conditions are satisfied, further analysis of the test is requi
For every test case failure, there is a defined structure for the stack trace.

```text
AssertionError: <<error_message
Search = <Query
AssertionError: <<error_message>>
Search = <Query>
```

Get the search query from the stack trace and execute it on the Splunk instance and verify which specific type of events are causing failure.
45 changes: 22 additions & 23 deletions docs/how_to_use.md
@@ -20,7 +20,7 @@ There are three ways to execute the tests:
Run pytest with the add-on, in an external Splunk deployment

```bash
pytest --splunk-type=external --splunk-app=<path-to-addon-package --splunk-data-generator=<path to pytest-splunk-addon-data.conf file --splunk-host=<hostname --splunk-port=<splunk-management-port --splunk-user=<username --splunk-password=<password --splunk-hec-token=<splunk_hec_token
pytest --splunk-type=external --splunk-app=<path-to-addon-package> --splunk-data-generator=<path to pytest-splunk-addon-data.conf file> --splunk-host=<hostname> --splunk-port=<splunk-management-port> --splunk-user=<username> --splunk-password=<password> --splunk-hec-token=<splunk_hec_token>
```
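
For illustration, a concrete invocation might look like the following, where the host, credentials, token, and paths are placeholder values rather than defaults:

```console
pytest --splunk-type=external --splunk-app=package --splunk-data-generator=tests/data/pytest-splunk-addon-data.conf --splunk-host=splunk.example.com --splunk-port=8089 --splunk-user=admin --splunk-password='Chang3d!' --splunk-hec-token=11111111-2222-3333-4444-555555555555
```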

**2. Running tests with docker splunk**
@@ -101,6 +101,7 @@ services:
SPLUNK_APP_ID: ${SPLUNK_APP_ID}
SPLUNK_APP_PACKAGE: ${SPLUNK_APP_PACKAGE}
SPLUNK_VERSION: ${SPLUNK_VERSION}
platform: linux/amd64
ports:
- "8000"
- "8088"
@@ -120,6 +121,7 @@ services:
SPLUNK_APP_ID: ${SPLUNK_APP_ID}
SPLUNK_APP_PACKAGE: ${SPLUNK_APP_PACKAGE}
SPLUNK_VERSION: ${SPLUNK_VERSION}
platform: linux/amd64
hostname: uf
ports:
- "9997"
@@ -132,13 +134,10 @@
volumes:
- ${CURRENT_DIR}/uf_files:${CURRENT_DIR}/uf_files

volumes:
splunk-sc4s-var:
external: false
```
</details>
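
The compose file above references several environment variables; one way to supply them (the variable names come from the file, the values here are only assumptions) is to export them before starting the tests:

```console
export SPLUNK_VERSION=latest
export SPLUNK_APP_ID=TA_Example
export SPLUNK_APP_PACKAGE=package
export CURRENT_DIR=$(pwd)
```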

<details>
<details id="conftest">
<summary>Create conftest.py file</summary>

```
@@ -184,7 +183,7 @@ def docker_services_project_name(pytestconfig):
Run pytest with the add-on, using the following command:

```bash
pytest --splunk-type=docker --splunk-data-generator=<path to pytest-splunk-addon-data.conf file
pytest --splunk-type=docker --splunk-data-generator=<path to pytest-splunk-addon-data.conf file>
```
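
For instance, assuming the data-generation conf file lives at `tests/data/pytest-splunk-addon-data.conf` (an illustrative path):

```console
pytest --splunk-type=docker --splunk-data-generator=tests/data/pytest-splunk-addon-data.conf
```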

The tool assumes the Splunk Add-on is located in a folder "package" in the project root.
@@ -209,15 +208,15 @@ The tool assumes the Splunk Add-on is located in a folder "package" in the proje

```bash
pytest --splunk-type=external # Whether you want to run the addon with docker or an external Splunk instance
--splunk-app=<path-to-addon-package # Path to Splunk app package. The package should have the configuration files in the default folder.
--splunk-host=<hostname # Receiver Splunk instance where events are searchable.
--splunk-port=<splunk_management_port # default 8089
--splunk-user=<username # default admin
--splunk-password=<password # default Chang3d!
--splunk-forwarder-host=<splunk_forwarder_host # Splunk instance where forwarding to receiver instance is configured.
--splunk-hec-port=<splunk_forwarder_hec_port # HEC port of the forwarder instance.
--splunk-hec-token=<splunk_forwarder_hec_token # HEC token configured in forwarder instance.
--splunk-data-generator=<pytest_splunk_addon_conf_path # Path to pytest-splunk-addon-data.conf
--splunk-app=<path-to-addon-package> # Path to Splunk app package. The package should have the configuration files in the default folder.
--splunk-host=<hostname> # Receiver Splunk instance where events are searchable.
--splunk-port=<splunk_management_port> # default 8089
--splunk-user=<username> # default admin
--splunk-password=<password> # default Chang3d!
--splunk-forwarder-host=<splunk_forwarder_host> # Splunk instance where forwarding to receiver instance is configured.
--splunk-hec-port=<splunk_forwarder_hec_port> # HEC port of the forwarder instance.
--splunk-hec-token=<splunk_forwarder_hec_token> # HEC token configured in forwarder instance.
--splunk-data-generator=<pytest_splunk_addon_conf_path> # Path to pytest-splunk-addon-data.conf
```

> **_NOTE:_**
@@ -243,10 +242,10 @@ There are 3 types of tests included in pytest-splunk-addon are:
3. To generate test cases only for index time properties, append the following marker to pytest command:

```console
-m splunk_indextime --splunk-data-generator=<Path to the conf file
-m splunk_indextime --splunk-data-generator=<Path to the conf file>
```

For detailed information on index time test execution, please refer {ref}`here <index_time_tests`.
For detailed information on index time test execution, please refer [here](./index_time_tests.md).

- To execute all the searchtime tests together, i.e both Knowledge objects and CIM compatibility tests,
append the following marker to the pytest command:
@@ -262,19 +261,19 @@ The following optional arguments are available to modify the default settings in
1. To search for events in a specific index, user can provide following additional arguments:

```console
--search-index=<index
--search-index=<index>

Splunk index in which the events will be searched while testing. Default value: "*, _internal".
```

2. To increase/decrease time interval and retries for flaky tests, user can provide following additional arguments:

```console
--search-retry=<retry
--search-retry=<retry>

Number of retries to make if there are no events found while searching in the Splunk instance. Default value: 0.

--search-interval=<interval
--search-interval=<interval>

Time interval to wait before retrying the search query. Default value: 0.
```
@@ -297,7 +296,7 @@ The following optional arguments are available to modify the default settings in
- **Addon related errors:** To suppress these, the user can create a file with the list of strings and provide the file in the **--ignore-addon-errors** param during test execution.

```console
--ignore-addon-errors=<path_to_file
--ignore-addon-errors=<path_to_file>
```

- Sample strings in the file.
Expand Down Expand Up @@ -328,7 +327,7 @@ The following optional arguments are available to modify the default settings in
- Default value for this parameter is *store_new*

```console
--event-file-path=<path_to_file
--event-file-path=<path_to_file>
```

- Path to tokenized events file
@@ -380,7 +379,7 @@ The following optional arguments are available to modify the default settings in

**3. Setup test environment before executing the test cases**

If any setup is required in the Splunk/test environment before executing the test cases, implement a fixture in {ref}`conftest.py <conftest_file`.
If any setup is required in the Splunk/test environment before executing the test cases, implement a fixture in [conftest.py](#conftest).

```python
@pytest.fixture(scope="session")
16 changes: 8 additions & 8 deletions docs/index_time_tests.md
@@ -14,15 +14,15 @@
### Prerequisites

- `pytest-splunk-addon-data.conf` file which contains all the required data for
executing the tests. The conf file should follow the specifications as mentioned {ref}`here <conf_spec`.
executing the tests. The conf file should follow the specifications as mentioned [here](./sample_generator.md#pytest-splunk-addon-dataconfspec).

______________________________________________________________________


To generate test cases only for index time properties, append the following marker to pytest command:

```console
-m splunk_indextime --splunk-data-generator=<Path to the conf file
-m splunk_indextime --splunk-data-generator=<Path to the conf file>
```

> **_NOTE:_** --splunk-data-generator should contain the path to *pytest-splunk-addon-data.conf*,
@@ -55,7 +55,7 @@ To generate test cases only for index time properties, append the following mark
- This test case will not be generated if there are no key fields specified for the event.
- Key field can be assigned to a token using the field property, i.e. `token.n.field = <KEY_FIELD>`

Testcase assertions:
#### Testcase Assertions:

- There should be at least 1 event with the sourcetype and host.
- The values of the key fields obtained from the event
@@ -72,7 +72,7 @@ To generate test cases only for index time properties, append the following mark

- Execute the SPL query in a Splunk instance.

- Assert the test case results as mentioned in {ref}`testcase assertions<test_assertions_key_field`.
- Assert the test case results as mentioned in [testcase assertions](#testcase-assertions).

**2. Test case for _time property:**

@@ -141,8 +141,8 @@ If all the above conditions are satisfied, further analysis of the test is requi
For every test case failure, there is a defined structure for the stack trace.

```text
AssertionError: <<error_message
Search = <Query
AssertionError: <<error_message>>
Search = <Query>
```

Get the search query from the stack trace and execute it on the Splunk instance and verify which specific type of events are causing failure.
@@ -229,9 +229,9 @@ Get the search query from the stack trace and execute it on the Splunk instance

- No test would be generated to test Key Fields for that particular stanza, and thus it won't be correctly tested.

8. When do I assign token.\<n.field = \<field_name to test the Key Fields for an event?
8. When do I assign token.&lt;n&gt;.field = &lt;field_name&gt; to test the Key Fields for an event?

- When there props configurations written in props to extract any of the field present in Key Fields list, you should add `token.<n.field = <field_name` to the token for that field value.
- When there are configurations written in props to extract any of the fields present in the Key Fields list, you should add `token.<n>.field = <field_name>` to the token for that field value.
- Example: For this sample, there is a report written in props that extracts `127.0.0.1` as `src`,

8 changes: 4 additions & 4 deletions docs/sample_generator.md
@@ -56,7 +56,7 @@ host_prefix = {{host_prefix}}
- If the value is event, the host field should be provided for a token using "token.<n\>.field = host".

**input_type = modinput | scripted_input | syslog_tcp | file_monitor | windows_input | uf_file_monitor | default**
-

- The input_type used in addon to ingest data of a sourcetype used in stanza.
- The way with which the sample data is ingested in Splunk depends on Splunk. The most similar ingesting approach is used for each input_type to get accurate index-time testing.
- In input_type=uf_file_monitor, universal forwarder will use file monitor to read event and then it will send data to indexer.
@@ -143,7 +143,7 @@ The following replacementType -> replacement values are supported

- "n" is a number starting at 0, and increasing by 1.
- For static, the token will be replaced with the value specified in the replacement setting.
- For timestamp, the token will be replaced with the strptime specified in the replacement setting. Strptime directive: <https://docs.python.org/2/library/datetime.html#strftime-and-strptime-behavior\>
- For timestamp, the token will be replaced with the strptime specified in the replacement setting. Strptime directive: [https://docs.python.org/2/library/datetime.html#strftime-and-strptime-behavior](https://docs.python.org/2/library/datetime.html#strftime-and-strptime-behavior)
- For random, the token will be replaced with a randomly picked type-aware value
- For all, For each possible replacement value, a new event will be generated and the token will be replaced with it. The configuration can be used where a token replacement contains multiple templates/values and all of the values are important and should be ingested at least once. The number of events will be multiplied by the number of values in the replacement. For example, if sample contains 3 lines & a token replacement has list of 2 values, then 6 events will be generated. For a replacement if replacementType='all' is not supported, then be default plugin will consider replacementType="random".
- For file, the token will be replaced with a random value retrieved from a file specified in the replacement setting.
@@ -174,8 +174,8 @@ The following replacementType -> replacement values are supported

- For <replacement file name\>, the token will be replaced with a random line in the replacement file.

- Replacement file name should be a fully qualified path (i.e. \$SPLUNK_HOME/etc/apps/windows/samples/users.list).
- Windows separators should contain double forward slashes "\\" (i.e. \$SPLUNK_HOME\\etc\\apps\\windows\\samples\\users.list).
- Replacement file name should be a fully qualified path (i.e. $SPLUNK_HOME/etc/apps/windows/samples/users.list).
- Windows separators should contain double forward slashes "\\" (i.e. $SPLUNK_HOME\\etc\\apps\\windows\\samples\\users.list).
- Unix separators will work on Windows and vice-versa.
- Column numbers in mvfile references are indexed at 1, meaning the first column is column 1, not 0.
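
As an illustration only (the stanza name, token index, and file path are assumed, not taken from a real add-on), a file-based replacement in `pytest-splunk-addon-data.conf` could look like:

```console
[sample_file.log]
token.0.token = ##USER##
token.0.replacementType = file
token.0.replacement = $SPLUNK_HOME/etc/apps/windows/samples/users.list
```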
