This repository has been archived by the owner on Feb 3, 2023. It is now read-only.

Commit
Merge pull request #216 from guilhemmarchand/testing
Version 1.2.30
guilhemmarchand committed Jan 2, 2021
2 parents 210bbff + 7cdd1a7 commit 170704f
Showing 45 changed files with 5,897 additions and 420 deletions.
5 changes: 3 additions & 2 deletions .gitignore
@@ -6,15 +6,16 @@
# local metadata
*/metadata/local.meta

# Custom lookup table
*/lookups/my_frameid_lookup.csv
# lookups dir
*/lookups

# Compiled Python files
*.pyo
*.pyc

# Backup files
*~
*/default.old*

# pycharm
.idea
2 changes: 1 addition & 1 deletion build.sh
@@ -12,7 +12,7 @@ rm -f *.tgz
find . -name "*.pyc" -type f | xargs rm -f
find . -name "*.py" -type f | xargs chmod go-x
find trackme/lib -name "*.py" -type f | xargs chmod a-x
tar -czf ${app}_${version}.tgz --exclude=trackme/local --exclude=trackme/metadata/local.meta --exclude=trackme/lookups/lookup_file_backups --exclude='./.*' --exclude='.[^/]*' --exclude="._*" trackme
tar -czf ${app}_${version}.tgz --exclude=trackme/local --exclude=trackme/metadata/local.meta --exclude=trackme/lookups/lookup_file_backups --exclude=trackme/default.old* --exclude='./.*' --exclude='.[^/]*' --exclude="._*" trackme
echo "Wrote: ${app}_${version}.tgz"

exit 0
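The tar invocation above excludes local configuration, lookup backups, and hidden files from the package. A hypothetical Python sketch of the same exclusion logic using the standard `tarfile` module (the pattern list is simplified and the app/version values are placeholders, not the script's actual resolved values):

```python
import tarfile
import fnmatch
import os

# Simplified version of the build.sh --exclude patterns
EXCLUDES = [
    "trackme/local",
    "trackme/metadata/local.meta",
    "trackme/lookups/lookup_file_backups",
    "trackme/default.old*",
    ".*",   # hidden files, applied to every path component
    "._*",  # macOS resource forks
]

def is_excluded(path):
    """Return True if the relative path matches any exclude pattern."""
    parts = path.split(os.sep)
    for pattern in EXCLUDES:
        # match the pattern itself, or anything below an excluded directory
        if fnmatch.fnmatch(path, pattern) or fnmatch.fnmatch(path, pattern + "/*"):
            return True
        # dotfile patterns apply to every path component, like tar's '.[^/]*'
        if pattern.startswith(".") and any(fnmatch.fnmatch(p, pattern) for p in parts):
            return True
    return False

def build_package(app="trackme", version="1.2.30"):
    """Create the release tarball, skipping excluded members."""
    name = "{}_{}.tgz".format(app, version)
    with tarfile.open(name, "w:gz") as tgz:
        tgz.add("trackme", filter=lambda ti: None if is_excluded(ti.name) else ti)
    return name
```

The `filter` callable on `TarFile.add` returns `None` to drop a member, which mirrors tar's `--exclude` behaviour without shelling out.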
Binary file modified docs/img/first_steps/img_data_sampling002.png
Binary file modified docs/img/first_steps/img_data_sampling_create_custom1.png
Binary file modified docs/img/first_steps/img_data_sampling_create_custom2.png
Binary file added docs/img/postman_screen.png
Binary file added docs/img/postman_screen2.png
2 changes: 2 additions & 0 deletions docs/index.rst
@@ -62,6 +62,8 @@ No matter the purpose of your Splunk deployment, trackMe will easily become an
- Monitoring and insight visibility about your indexes, sourcetypes, events and metrics
- General data activity monitoring and detection of Zombie data
- Continuous and automated data quality assessment
- PII data detection with custom regular expression based rules and data sampling
- And many more!

Overview:
=========
21 changes: 21 additions & 0 deletions docs/releasenotes.rst
@@ -1,6 +1,27 @@
Release notes
#############

Version 1.2.30
==============

**CAUTION:**

This is a new main release branch. TrackMe 1.2.x requires the deployment of the following dependencies:

- Semicircle Donut Chart Viz, Splunk Base: https://splunkbase.splunk.com/app/4378
- Splunk Machine Learning Toolkit, Splunk Base: https://splunkbase.splunk.com/app/2890
- Splunk Timeline - Custom Visualization, Splunk Base: https://splunkbase.splunk.com/app/3120

TrackMe requires a summary index (defaults to trackme_summary) and a metric index (defaults to trackme_metrics):
https://trackme.readthedocs.io/en/latest/configuration.html

- Feature - Issue #210 - new REST API endpoints for Elastic Sources / Logical Groups / Data Sampling / Tags Policies / Lagging Classes / Lagging Classes Metrics
- Feature - Issue #212 - Data sampling - Allows defining exclusive rules for data sampling custom models; this can be used when a regex must not be matched, such as when detecting PII data automatically
- Feature - Issue #214 - Data sampling - Allows defining a custom number of records to be sampled on a per data source basis
- Feature - Issue #215 - Data Hosts - Support for priority based lagging classes
- Fix - Data sampling - The clear state and run sampling action would fail when actioned on a data source for which data sampling had not yet run at least once; includes fixes and UI improvements for data sampling
- Change - Issue #213 - knowledge objects default permissions - Review of the app related KVstores default permissions, fixing missing collections and transforms

Version 1.2.29
==============

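The release notes above require a summary index and a metric index with the documented default names. A minimal indexes.conf sketch, assuming those defaults; the paths follow standard Splunk conventions and are illustrative rather than sizing recommendations:

```ini
# Minimal sketch -- index names match the documented TrackMe defaults;
# retention and sizing are left at Splunk defaults here.
[trackme_summary]
homePath   = $SPLUNK_DB/trackme_summary/db
coldPath   = $SPLUNK_DB/trackme_summary/colddb
thawedPath = $SPLUNK_DB/trackme_summary/thaweddb

[trackme_metrics]
homePath   = $SPLUNK_DB/trackme_metrics/db
coldPath   = $SPLUNK_DB/trackme_metrics/colddb
thawedPath = $SPLUNK_DB/trackme_metrics/thaweddb
# a metric index is declared with datatype = metric
datatype   = metric
```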
1,556 changes: 1,342 additions & 214 deletions docs/rest_api_reference.rst

Large diffs are not rendered by default.

218 changes: 210 additions & 8 deletions docs/userguide.rst

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion trackme/app.manifest
@@ -5,7 +5,7 @@
"id": {
"group": null,
"name": "trackme",
"version": "1.2.29"
"version": "1.2.30"
},
"author": [
{
42 changes: 7 additions & 35 deletions trackme/bin/trackme_rest_handler_ack.py
@@ -277,6 +277,9 @@ def post_ack_enable(self, request_info, **kwargs):

try:

# create a record
record = '{"object": "' + object_value + '", "object_category": "' + object_category_value + '", "ack_expiration": "' + str(ack_expiration) + '", "ack_state": "' + str(ack_state) + '", "ack_mtime": "' + str(ack_mtime) + '"}'

# Insert the record
collection_audit.data.insert(json.dumps({
"time": str(current_time),
@@ -285,7 +288,7 @@
"change_type": "enable ack",
"object": str(object_value),
"object_category": "data_source",
"object_attrs": json.dumps({"object": object_value, "object_category": object_category_value, "ack_expiration": str(ack_expiration), "ack_state": str(ack_state), "ack_mtime": str(ack_mtime)}, index=1),
"object_attrs": json.dumps(json.loads(record), indent=1),
"result": "N/A",
"comment": str(update_comment)
}))
Expand All @@ -296,7 +299,7 @@ def post_ack_enable(self, request_info, **kwargs):
}

return {
"payload": json.dumps(collection.data.query(query=str(query_string)), indent=1),
"payload": json.dumps(json.loads(record), indent=1),
'status': 200 # HTTP status code
}
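The hunk above replaces a broken `json.dumps(..., index=1)` call (the pretty-printing keyword is `indent`, not `index`) with a build-then-reparse pattern: the record is assembled as a JSON string, then parsed back and re-serialized for the audit entry and the response payload. A small standalone sketch of that round trip, with placeholder values standing in for the handler's variables:

```python
import json

# Placeholder values standing in for the handler's variables
object_value = "network:pan:traffic"
object_category_value = "data_source"
ack_expiration = 1609545600
ack_state = "active"
ack_mtime = 1609459200

# Building the record as a dict avoids the handler's manual string
# concatenation and guarantees valid JSON
record = json.dumps({
    "object": object_value,
    "object_category": object_category_value,
    "ack_expiration": str(ack_expiration),
    "ack_state": str(ack_state),
    "ack_mtime": str(ack_mtime),
})

# Round trip: parse the JSON string back and re-serialize pretty-printed,
# as the patched handler does for its response payload
payload = json.dumps(json.loads(record), indent=1)
```

Note that `json.dumps(..., index=1)` raises a `TypeError` because `dumps` has no `index` parameter, which is what the original code would have hit.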

@@ -415,40 +418,9 @@ def post_ack_disable(self, request_info, **kwargs):

else:

# Insert the record
collection.data.insert(json.dumps({"object": object_value,
"object_category": object_category_value,
"ack_expiration": str(ack_expiration),
"ack_state": str(ack_state),
"ack_mtime": str(ack_mtime)}))

# Record an audit change
import time
current_time = int(round(time.time() * 1000))
user = "nobody"

try:

# Insert the record
collection_audit.data.insert(json.dumps({
"time": str(current_time),
"user": str(user),
"action": "success",
"change_type": "disable ack",
"object": str(object_value),
"object_category": "data_source",
"object_attrs": json.dumps({"object": object_value, "object_category": object_category_value, "ack_expiration": str(ack_expiration), "ack_state": str(ack_state), "ack_mtime": str(ack_mtime)}, index=1),
"result": "N/A",
"comment": str(update_comment)
}))

except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}

# There is no ack currently for this object, return HTTP 200 with a message
return {
"payload": json.dumps(collection.data.query(query=str(query_string)), indent=1),
"payload": "There is no active acknowledgment for the entity object: " + str(object_value) + ", object_category: " + str(object_category_value),
'status': 200 # HTTP status code
}

134 changes: 134 additions & 0 deletions trackme/bin/trackme_rest_handler_data_hosts.py
@@ -428,6 +428,140 @@ def post_dh_enable_monitoring(self, request_info, **kwargs):
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}

# Reset by object name
def post_dh_reset(self, request_info, **kwargs):

# By data_host
data_host = None
query_string = None

# Retrieve from data
resp_dict = json.loads(str(request_info.raw_args['payload']))
data_host = resp_dict['data_host']

# Update comment is optional and used for audit changes
try:
update_comment = resp_dict['update_comment']
except Exception as e:
update_comment = "API update"

# Define the KV query
query_string = '{ "data_host": "' + data_host + '" }'

# Get splunkd port
entity = splunk.entity.getEntity('/server', 'settings',
namespace='trackme', sessionKey=request_info.session_key, owner='-')
splunkd_port = entity['mgmtHostPort']

try:

# Data collection
collection_name = "kv_trackme_host_monitoring"
service = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection = service.kvstore[collection_name]

# Audit collection
collection_name_audit = "kv_trackme_audit_changes"
service_audit = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection_audit = service_audit.kvstore[collection_name_audit]

# Get the current record
# Note: the record is returned as an array; since we search for a specific record, we expect only one record

try:
record = collection.data.query(query=str(query_string))
key = record[0].get('_key')

except Exception as e:
key = None

# Render result
if key is not None and len(key)>2:

# Update the record
collection.data.update(str(key), json.dumps({
"object_category": record[0].get('object_category'),
"data_host": record[0].get('data_host'),
"data_last_lag_seen": record[0].get('data_last_lag_seen'),
"data_last_ingestion_lag_seen": record[0].get('data_last_ingestion_lag_seen'),
"data_eventcount": record[0].get('data_eventcount'),
"data_first_time_seen": record[0].get('data_first_time_seen'),
"data_last_time_seen": record[0].get('data_last_time_seen'),
"data_last_ingest": record[0].get('data_last_ingest'),
"data_max_lag_allowed": record[0].get('data_max_lag_allowed'),
"data_lag_alert_kpis": record[0].get('data_lag_alert_kpis'),
"data_monitored_state": record[0].get('data_monitored_state'),
"data_monitoring_wdays": record[0].get('data_monitoring_wdays'),
"data_override_lagging_class": record[0].get('data_override_lagging_class'),
"data_host_state": record[0].get('data_host_state'),
"data_tracker_runtime": record[0].get('data_tracker_runtime'),
"data_previous_host_state": record[0].get('data_previous_host_state'),
"data_previous_tracker_runtime": record[0].get('data_previous_tracker_runtime'),
"data_host_alerting_policy": record[0].get('data_host_alerting_policy'),
"OutlierMinEventCount": record[0].get('OutlierMinEventCount'),
"OutlierLowerThresholdMultiplier": record[0].get('OutlierLowerThresholdMultiplier'),
"OutlierUpperThresholdMultiplier": record[0].get('OutlierUpperThresholdMultiplier'),
"OutlierAlertOnUpper": record[0].get('OutlierAlertOnUpper'),
"OutlierTimePeriod": record[0].get('OutlierTimePeriod'),
"OutlierSpan": record[0].get('OutlierSpan'),
"isOutlier": record[0].get('isOutlier'),
"enable_behaviour_analytic": record[0].get('enable_behaviour_analytic'),
"latest_flip_state": record[0].get('latest_flip_state'),
"latest_flip_time": record[0].get('latest_flip_time'),
"priority": record[0].get('priority')
}))

# Record an audit change
import time
current_time = int(round(time.time() * 1000))
user = "nobody"

try:

# Insert the record
collection_audit.data.insert(json.dumps({
"time": str(current_time),
"user": str(user),
"action": "success",
"change_type": "reset data",
"object": str(data_host),
"object_category": "data_host",
"object_attrs": str(json.dumps(collection.data.query_by_id(key), indent=1)),
"result": "N/A",
"comment": str(update_comment)
}))

except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}

return {
"payload": json.dumps(collection.data.query_by_id(key), indent=1),
'status': 200 # HTTP status code
}

else:
return {
"payload": 'Warn: resource not found ' + str(query_string),
'status': 404 # HTTP status code
}

except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
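The new `post_dh_reset` handler above builds its KV store filter by string concatenation (`'{ "data_host": "' + data_host + '" }'`) and then checks for a matching `_key`. A hedged alternative sketch of those two steps, serializing the filter with `json.dumps` so a host name containing quotes or backslashes cannot produce an invalid query (the function names are illustrative, not part of the handler):

```python
import json

def build_kvstore_query(data_host):
    """Serialize the KV store filter instead of concatenating strings,
    so special characters in the host name stay valid JSON."""
    return json.dumps({"data_host": data_host})

def extract_key(records):
    """Mirror the handler's pattern: the query returns a list and we
    expect a single matching record; return None when nothing matched."""
    try:
        return records[0].get("_key")
    except (IndexError, AttributeError):
        return None
```

With this helper, `extract_key(collection.data.query(query=build_kvstore_query(data_host)))` replaces both the concatenated query string and the bare `record[0].get('_key')` lookup.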

# Update priority by object name
def post_dh_update_priority(self, request_info, **kwargs):

