2 changes: 1 addition & 1 deletion .github/workflows/cla.yaml
@@ -21,7 +21,7 @@ jobs:
path-to-signatures: ".github/signatures/version1/cla.json"
path-to-document: "https://github.com/splunk/addonfactory-test-releaseci/blob/main/CLA.md" # e.g. a CLA or a DCO document
# branch should not be protected
branch: "master"
branch: "main"
allowlist: dependabot
#below are the optional inputs - If the optional inputs are not given, then default values will be taken
#remote-organization-name: enter the remote organization name where the signatures should be stored (Default is storing the signatures in the same repository)
2 changes: 1 addition & 1 deletion .github/workflows/release-notes.yaml
@@ -15,7 +15,7 @@ jobs:
git fetch --prune --unshallow --tags
- uses: snyk/release-notes-preview@v1.6.1
with:
releaseBranch: master
releaseBranch: main
env:
GITHUB_PR_USERNAME: ${{ github.actor }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
46 changes: 0 additions & 46 deletions build.py

This file was deleted.

6 changes: 6 additions & 0 deletions docs/api_reference/addon_parser.rst
@@ -34,3 +34,9 @@ TransformsParser
.. automodule:: standard_lib.addon_parser.transforms_parser
:members:
:show-inheritance:

SavedsearchesParser
~~~~~~~~~~~~~~~~~~~
.. automodule:: standard_lib.addon_parser.savedsearches_parser
:members:
:show-inheritance:
8 changes: 4 additions & 4 deletions docs/cim_tests.rst
@@ -7,7 +7,7 @@ Overview
The CIM tests verify the compatibility of the add-on with the CIM Data Models (based on Splunk_SA_CIM 4.15.0).
An add-on is said to be CIM compatible if it fulfils the following two criteria:

1. The add-on extracts all the fields with valid values, which are marked as required by the `Data Model Definitions <https://github.com/splunk/pytest-splunk-addon/tree/master/pytest_splunk_addon/standard_lib/data_models>`_.
1. The add-on extracts all the fields with valid values, which are marked as required by the `Data Model Definitions <https://github.com/splunk/pytest-splunk-addon/tree/main/pytest_splunk_addon/standard_lib/data_models>`_.
2. No event from the add-on is mapped to more than one data model.

---------------------
@@ -34,7 +34,7 @@ Test Scenarios
**Workflow:**

* Plugin parses tags.conf to get a list of tags for each eventtype.
* Plugin parses all the `supported datamodels <https://github.com/splunk/pytest-splunk-addon/tree/master/pytest_splunk_addon/standard_lib/data_models>`_.
* Plugin parses all the `supported datamodels <https://github.com/splunk/pytest-splunk-addon/tree/main/pytest_splunk_addon/standard_lib/data_models>`_.
* Then it gets a list of the datasets mapped with an eventtype.
* Generates a test case for each eventtype (a simplified mapping sketch follows this list).
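
The mapping step can be pictured with a small sketch; the dataset names, tag sets, and the subset check below are illustrative assumptions rather than the plugin's exact implementation:

.. code-block:: python

    # Hypothetical data: tags parsed from tags.conf for one eventtype, and the
    # tags each (illustrative) dataset requires.
    eventtype_tags = {"authentication", "default"}
    dataset_required_tags = {
        "Authentication": {"authentication"},
        "Network_Traffic": {"network", "communicate"},
    }

    # A dataset is considered mapped when all of its required tags are present
    # on the eventtype.
    mapped = [
        name
        for name, required in dataset_required_tags.items()
        if required.issubset(eventtype_tags)
    ]
    assert mapped == ["Authentication"]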

@@ -80,11 +80,11 @@ Test Scenarios

**Workflow:**

* Plugin collects the list of not_allowed_in_search fields from mapped datasets and `CommonFields.json <https://github.com/splunk/pytest-splunk-addon/blob/master/pytest_splunk_addon/standard_lib/cim_tests/CommonFields.json>`_.
* Plugin collects the list of not_allowed_in_search fields from mapped datasets and `CommonFields.json <https://github.com/splunk/pytest-splunk-addon/blob/main/pytest_splunk_addon/standard_lib/cim_tests/CommonFields.json>`_.
* Using a search query, the test case verifies whether the not_allowed_in_search fields are populated in the search results.

.. note::
`CommonFields.json <https://github.com/splunk/pytest-splunk-addon/blob/master/pytest_splunk_addon/standard_lib/cim_tests/CommonFields.json>`_ contains fields which are automatically provided by asset and identity correlation features of applications like Splunk Enterprise Security.
`CommonFields.json <https://github.com/splunk/pytest-splunk-addon/blob/main/pytest_splunk_addon/standard_lib/cim_tests/CommonFields.json>`_ contains fields which are automatically provided by asset and identity correlation features of applications like Splunk Enterprise Security.

**4. Testcase for all not_allowed_in_props fields**

17 changes: 16 additions & 1 deletion docs/field_tests.rst
@@ -15,6 +15,7 @@ Overview
5. Eval
6. Eventtypes
7. Tags
8. Savedsearches

--------------------------------

@@ -121,7 +122,21 @@ Test Scenarios
**Workflow:**

* In tags.conf for each tag defined in the stanza, the plugin generates a test case.
* For each tag, the plugin generates a search query including the stanza and the tag and asserts event_count > 0
* For each tag, the plugin generates a search query including the stanza and the tag and asserts event_count > 0.

**7. Search query should be present in each savedsearch.**

.. code-block:: python

test_savedsearches[<savedsearch_stanza>]

The test case verifies that the search defined in savedsearches.conf generates valid search results.
Here, <savedsearch_stanza> is a stanza in the savedsearches.conf file.

**Workflow:**

* For each stanza in savedsearches.conf, the plugin generates a test case.
* For each stanza, the plugin generates an SPL search query and asserts event_count > 0 for the savedsearch (an illustrative example follows this list).
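
For illustration, a minimal sketch of a parsed stanza and the query the plugin would run for it; the stanza name, search string, and dispatch window below are hypothetical:

.. code-block:: python

    # Hypothetical savedsearches.conf stanza as parsed by the plugin
    savedsearch = {
        "stanza": "example_saved_search",
        "search": "index=_internal sourcetype=splunkd | stats count",
        "dispatch.earliest_time": "-1h",
        "dispatch.latest_time": "now",
    }

    # The dispatch times are appended to the first pipe segment and the whole
    # query is wrapped in a `search` command before execution, roughly:
    #   search index=_internal sourcetype=splunkd earliest_time = -1h latest_time = now | stats count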

Testcase Troubleshooting
------------------------
4 changes: 2 additions & 2 deletions docs/how_to_use.rst
@@ -279,9 +279,9 @@ Extending pytest-splunk-addon

How can this be achieved:

- Make json representation of the data models, which satisfies this `DataModelSchema <https://github.com/splunk/pytest-splunk-addon/blob/master/pytest_splunk_addon/standard_lib/cim_tests/DatamodelSchema.json>`_.
- Make json representation of the data models, which satisfies this `DataModelSchema <https://github.com/splunk/pytest-splunk-addon/blob/main/pytest_splunk_addon/standard_lib/cim_tests/DatamodelSchema.json>`_.
- Provide the path to the directory having all the data models by adding ``--splunk_dm_path path_to_dir`` to the pytest command
- The test cases will now be generated for the data models provided to the plugin and not for the `default data models <https://github.com/splunk/pytest-splunk-addon/tree/master/pytest_splunk_addon/standard_lib/data_models>`_.
- The test cases will now be generated for the data models provided to the plugin and not for the `default data models <https://github.com/splunk/pytest-splunk-addon/tree/main/pytest_splunk_addon/standard_lib/data_models>`_.

.. raw:: html

14 changes: 13 additions & 1 deletion docs/release_history.rst
@@ -8,7 +8,19 @@ Release History

The best way to track the development of pytest-splunk-addon is through `the GitHub Repo <https://github.com/splunk/pytest-splunk-addon/>`_.

1.3.14
1.4.0
""""""""""""""""""""""""""
**Changes:**

* Plugin now generates and executes tests to validate savedsearches defined in savedsearches.conf.

**Known Issues:**

* Event ingestion through SC4S via UDP port
* Fields for modular regular expressions are not extracted in the plugin.


1.3.15
""""""""""""""""""""""""""
**Changes:**

3 changes: 3 additions & 0 deletions pytest_splunk_addon/plugin.py
@@ -37,6 +37,9 @@ def pytest_configure(config):
"markers",
"splunk_searchtime_fields_eventtypes: Test search time eventtypes only",
)
config.addinivalue_line(
"markers", "splunk_searchtime_fields_savedsearches: Test search time savedsearches only"
)
config.addinivalue_line(
"markers", "splunk_searchtime_cim: Test CIM compatibility only"
)
17 changes: 17 additions & 0 deletions pytest_splunk_addon/standard_lib/addon_parser/__init__.py
@@ -18,6 +18,7 @@
from .props_parser import PropsParser
from .tags_parser import TagsParser
from .eventtype_parser import EventTypeParser
from .savedsearches_parser import SavedSearchParser

LOGGER = logging.getLogger("pytest-splunk-addon")

@@ -37,6 +38,7 @@ def __init__(self, splunk_app_path):
self._props_parser = None
self._tags_parser = None
self._eventtype_parser = None
self._savedsearch_parser = None

@property
def app(self):
@@ -62,6 +64,12 @@ def eventtype_parser(self):
            self._eventtype_parser = EventTypeParser(self.splunk_app_path, self.app)
        return self._eventtype_parser

    @property
    def savedsearch_parser(self):
        if not self._savedsearch_parser:
            self._savedsearch_parser = SavedSearchParser(self.splunk_app_path, self.app)
        return self._savedsearch_parser

    def get_props_fields(self):
        """
        Parse the props.conf and yield all supported fields
@@ -88,3 +96,12 @@ def get_eventtypes(self):
            generator of list of eventtypes
        """
        return self.eventtype_parser.get_eventtypes()

    def get_savedsearches(self):
        """
        Parse the App configuration files & yield savedsearches

        Yields:
            generator of list of savedsearches
        """
        return self.savedsearch_parser.get_savedsearches()
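
A minimal usage sketch of the new API; the add-on path is hypothetical and the app must be loadable by splunk_appinspect:

    from pytest_splunk_addon.standard_lib.addon_parser import AddonParser

    parser = AddonParser("/path/to/Splunk_TA_example")  # hypothetical add-on path
    for savedsearch in parser.get_savedsearches():
        # Each item is a dict carrying the stanza name, the search string and the
        # dispatch window, with parser defaults filled in where the conf is silent.
        print(savedsearch["stanza"], savedsearch["search"])
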
52 changes: 52 additions & 0 deletions pytest_splunk_addon/standard_lib/addon_parser/savedsearches_parser.py
@@ -0,0 +1,52 @@
# -*- coding: utf-8 -*-
"""
Provides savedsearches.conf parsing mechanism
"""


class SavedSearchParser(object):
    """
    Parses savedsearches.conf and extracts savedsearches

    Args:
        splunk_app_path (str): Path of the Splunk app
        app (splunk_appinspect.App): Object of Splunk app
    """

    def __init__(self, splunk_app_path, app):
        self.app = app
        self.splunk_app_path = splunk_app_path
        self._savedsearches = None

    @property
    def savedsearches(self):
        try:
            if not self._savedsearches:
                self._savedsearches = self.app.get_config("savedsearches.conf")
            return self._savedsearches
        except OSError:
            # savedsearches.conf is optional; the add-on may not ship one
            return None

    def get_savedsearches(self):
        """
        Parse the App configuration files & yield savedsearches

        Yields:
            generator of list of savedsearches
        """
        if not self.savedsearches:
            return None
        for stanza in self.savedsearches.sects:
            savedsearch_sections = self.savedsearches.sects[stanza]
            # Defaults used when the stanza omits a key or leaves it empty
            savedsearch_container = {
                "stanza": stanza,
                "search": 'index = "main"',
                "dispatch.earliest_time": "0",
                "dispatch.latest_time": "now",
            }
            empty_values = ("None", "", " ")
            for key in savedsearch_sections.options:
                # Override a default only when the conf provides a usable value
                if key in ("search", "dispatch.earliest_time", "dispatch.latest_time"):
                    value = savedsearch_sections.options[key].value
                    if value not in empty_values:
                        savedsearch_container[key] = value
            yield savedsearch_container
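
To make the defaulting behaviour concrete, here is the dict that would be yielded for a hypothetical stanza that leaves "search" empty and omits both dispatch keys; the hard-coded defaults are kept:

    # savedsearches.conf (hypothetical):
    #
    #   [empty_search_stanza]
    #   search =
    #
    # get_savedsearches() would yield:
    expected = {
        "stanza": "empty_search_stanza",
        "search": 'index = "main"',          # default kept: the conf value is empty
        "dispatch.earliest_time": "0",       # default: key absent from the stanza
        "dispatch.latest_time": "now",       # default: key absent from the stanza
    }
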
15 changes: 15 additions & 0 deletions pytest_splunk_addon/standard_lib/fields_tests/test_generator.py
@@ -41,6 +41,7 @@ def generate_tests(self, fixture):
* splunk_app_searchtime_negative
* splunk_app_searchtime_eventtypes
* splunk_app_searchtime_tags
* splunk_app_searchtime_savedsearches

Args:
fixture(str): fixture name
Expand All @@ -54,6 +55,8 @@ def generate_tests(self, fixture):
yield from self.generate_tag_tests()
elif fixture.endswith("eventtypes") :
yield from self.generate_eventtype_tests()
elif fixture.endswith("savedsearches"):
yield from self.generate_savedsearches_tests()

def generate_field_tests(self, is_positive):
"""
@@ -143,6 +146,18 @@ def generate_eventtype_tests(self):
id="eventtype::{stanza}".format(**each_eventtype)
)

    def generate_savedsearches_tests(self):
        """
        Generate test case for savedsearches

        Yields:
            pytest.params for the test templates
        """
        for each_savedsearch in self.addon_parser.get_savedsearches():
            yield pytest.param(
                each_savedsearch, id="{stanza}".format(**each_savedsearch)
            )

def _contains_classname(self, fields_group, criteria):
"""
Check if the field_group dictionary contains the classname
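
A small sketch of what the new generator yields; the parser output is stubbed with a hypothetical stanza:

    import pytest

    # Hypothetical item, shaped like AddonParser.get_savedsearches() output
    savedsearches = [
        {
            "stanza": "alert_failed_logins",
            "search": "index=_internal | stats count",
            "dispatch.earliest_time": "-1h",
            "dispatch.latest_time": "now",
        }
    ]

    # Mirrors generate_savedsearches_tests(): one pytest.param per stanza,
    # with the stanza name used as the test id.
    params = [pytest.param(item, id="{stanza}".format(**item)) for item in savedsearches]
    assert params[0].id == "alert_failed_logins"
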
50 changes: 50 additions & 0 deletions pytest_splunk_addon/standard_lib/fields_tests/test_templates.py
@@ -264,3 +264,53 @@ def test_eventtype(
f"No result found for the search.\nsearch={search}\n"
f"interval={splunk_search_util.search_interval}, retries={splunk_search_util.search_retry}"
)


    @pytest.mark.splunk_searchtime_fields
    @pytest.mark.splunk_searchtime_fields_savedsearches
    def test_savedsearches(
        self,
        splunk_search_util,
        splunk_ingest_data,
        splunk_setup,
        splunk_searchtime_fields_savedsearches,
        record_property,
        caplog,
    ):
        """
        Tests that each savedsearch in savedsearches.conf executes and returns results.

        Args:
            splunk_search_util (fixture):
                Fixture to create a simple connection to Splunk via SplunkSDK
            splunk_searchtime_fields_savedsearches (fixture):
                Fixture containing list of savedsearches
            record_property (fixture):
                Used to add user properties to test report
            caplog (fixture):
                Access and control log capturing

        Returns:
            Asserts whether test case passes or fails.
        """
        search_query = splunk_searchtime_fields_savedsearches["search"]
        earliest_time = splunk_searchtime_fields_savedsearches["dispatch.earliest_time"]
        latest_time = splunk_searchtime_fields_savedsearches["dispatch.latest_time"]

        # Append the dispatch window to the first pipe segment of the query
        temp_search_query = search_query.split("|")
        temp_search_query[0] += " earliest_time = {0} latest_time = {1} ".format(
            earliest_time, latest_time
        )
        search_query = "|".join(temp_search_query)

        search = f"search {search_query}"

        self.logger.info(f"Search: {search}")

        result = splunk_search_util.checkQueryCountIsGreaterThanZero(
            search,
            interval=splunk_search_util.search_interval,
            retries=splunk_search_util.search_retry,
        )

        record_property("search", search)
        assert result, (
            f"No result found for the search.\nsearch={search}\n"
            f"interval={splunk_search_util.search_interval}, retries={splunk_search_util.search_retry}"
        )
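
The dispatch-time splicing above can be traced with a small worked example; the saved search itself is hypothetical:

    search_query = "index=_internal sourcetype=splunkd | stats count by sourcetype"
    earliest_time, latest_time = "-1h", "now"

    # Same transformation as in test_savedsearches: the time modifiers are added
    # to the first pipe segment only, then the query is re-joined and wrapped in
    # an explicit `search` command.
    temp_search_query = search_query.split("|")
    temp_search_query[0] += " earliest_time = {0} latest_time = {1} ".format(
        earliest_time, latest_time
    )
    search = "search " + "|".join(temp_search_query)

    # -> "search index=_internal sourcetype=splunkd  earliest_time = -1h latest_time = now | stats count by sourcetype"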