This repository has been archived by the owner on Feb 3, 2023. It is now read-only.

Merge pull request #125 from guilhemmarchand/testing
Version 1.2.21
guilhemmarchand committed Sep 1, 2020
2 parents 6733bec + c91c538 commit 5100e0f
Showing 55 changed files with 4,686 additions and 1,082 deletions.
7 changes: 4 additions & 3 deletions README.md
Original file line number Diff line number Diff line change
@@ -10,10 +10,11 @@
- Provides a powerful user interface to manage activation states, configuration and quickly troubleshoot availability failures
- Analyses and detects lack of data and lagging performance of data sources and hosts within your Splunk deployment
- Behaviour analytics with outlier detection based on machine learning calculations
- Behaviour analytics with data sampling and event format recognition: monitors raw events to detect event format changes or anomalies
- Records and investigates historical status changes, as well as administrator changes (audit of flipping states and changes)
- Easy administration from A to Z via the graphical user interface
- No matter the purpose of your Splunk deployment, TrackMe will easily become an essential piece of it, even providing efficient answers to PCI and compliance requirements
- Never again let your team be the last to discover what "empty" and "no results found" mean!
- Keep things under your control and be the first to know when data is not available, and get alerted before your users get back to you!

![screenshot1](./docs/img/screenshots_main/img001.png)

@@ -35,11 +36,11 @@ No administrator should be informed of an issue in the data flow by the customer

With the massive number of data sources, this easily becomes a painful and time-consuming activity; this application aims to drastically help you with these tasks.

This tiny application provides a handy user interface associated with a simple but efficient data discovery, state and alerting workflow.
TrackMe provides a handy user interface associated with an efficient data discovery, state and alerting workflow.

Made by Splunk admins for Splunk admins, the TrackMe application provides powerful builtin features to monitor and administer your data sources the easy way!

## Use case for TrackMe?
## Use cases for TrackMe?

No matter the purpose of your Splunk deployment, TrackMe will easily become an essential and positive piece of your Splunk journey:

45 changes: 38 additions & 7 deletions docs/FAQ.rst
@@ -6,7 +6,7 @@ FAQ
What is the "data name" useful for?
-----------------------------------

See :ref:`Data Sources tracking concept and features`
See :ref:`data sources tracking and features`

In the context of data sources, the field **"data_name"** represents the unique identifier of the data source.

@@ -17,7 +17,7 @@ The data_name unique identifier is used in different parts of the application, s

**What are the numbers in the "lag summary" column?**

See :ref:`Data Sources tracking concept and features`
See :ref:`data sources tracking and features`

The field **"lag summary (lag event / lag ingestion)"** is exposed within the UI to summarise the two key metrics handled by TrackMe to monitor the Splunk data.
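As a conceptual illustration of these two metrics (a sketch only, not TrackMe's actual implementation; the function and field semantics are assumptions for clarity):

```python
import time

def lag_metrics(latest_event_time, latest_index_time, now=None):
    """Conceptual sketch of the two metrics behind "lag summary".

    - lag event: seconds between now and the latest event timestamp (_time)
    - lag ingestion: seconds between the latest event timestamp and the
      moment it was indexed (_indextime)

    These definitions are illustrative assumptions, not TrackMe's code.
    """
    now = time.time() if now is None else now
    lag_event = now - latest_event_time
    lag_ingestion = latest_index_time - latest_event_time
    return round(lag_event), round(lag_ingestion)

# Example: event stamped 300 s ago, indexed 60 s after it occurred
print(lag_metrics(1_000_000, 1_000_060, now=1_000_300))  # (300, 60)
```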

@@ -143,8 +143,39 @@ How to deal with sourcetypes that are emitting data occasionally or sporadically

There are no easy answers to this question, however:

- From a data source perspective, what matters is monitoring the data from a pipeline point of view, which in TrackMe terms means making sure you have a data source that corresponds to this unique data flow
- From a data host perspective, strictly monitoring every single sourcetype linked to a given host would not deliver the value one might expect, especially because many sourcetypes generate data sporadically depending on the circumstances
- On the contrary, what matters and provides value is being able to detect global failures of hosts (endpoints, or whatever you call these) in a way that does not generate noise and alert fatigue
- This is why the data host design takes into consideration the data globally sent on a per-host basis; TrackMe provides many different features (allowlist / blocklist, etc.) to manage use cases with the level of granularity required
- Finally, from the data host perspective, outlier detection is a powerful feature that provides the capability to detect a significant change in the data volume, for example when a major sourcetype has stopped being emitted
- The default concept of data sources tracking relies on entities broken down per index and sourcetype; this can be extended easily using the Elastic sources feature to fulfil any kind of requirement and make sure that a data source represents the data pipeline
- The data hosts tracking feature provides the vision broken down on a per-host basis (using the Splunk host Metadata)
- TrackMe does not replace the knowledge you have regarding the way you are ingesting data into Splunk; instead, it provides various features and options you can use to configure what should raise an alert or not, and how
- The basic configuration for data tracking relates to the latency and the delta in seconds between the latest time data was indexed in Splunk and now
- In addition, the volume Outliers feature allows automatically detecting behaviour changes in the volume of data indexed in Splunk for a given sourcetype
- In most cases, you should focus on the most valuable and important sourcetypes; TrackMe provides different levels of features (allowlists / blocklists) to automatically exclude data of low interest, and the priority feature allows granular definition of the importance of an entity
- A sourcetype that appears very occasionally in Splunk might be something that you need to track carefully; if it is, you need to define the thresholds accordingly, and TrackMe provides different options to do so, for instance on a per data source basis
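The per-data-source threshold idea above can be sketched as follows (a minimal illustration; the names, values and configuration shape are assumptions, not TrackMe's internals):

```python
# Hypothetical per-data-source lag-threshold overrides, illustrative only.
DEFAULT_MAX_LAG_SEC = 3600  # default: alert after 1 hour without data

overrides = {
    "netops:pan:traffic": 86400,  # sporadic source: tolerate up to a day
}

def is_in_alert(data_name, seconds_since_last_event):
    """Return True when a data source exceeded its allowed lag."""
    threshold = overrides.get(data_name, DEFAULT_MAX_LAG_SEC)
    return seconds_since_last_event > threshold

print(is_in_alert("netops:pan:traffic", 7200))  # False: within its 1-day budget
print(is_in_alert("linux:syslog", 7200))        # True: exceeds the 1-hour default
```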

What is the purpose of the enable / disable button?
---------------------------------------------------

The purpose of the enable / disable button is to provide a way to disable the monitoring of an entity, without removing it from the collections entirely.

There are different aspects to consider:

- Sometimes there are sourcetypes you do not really care about; you can use allowlisting / blocklisting, or simply disable them
- When an entity is disabled, the value of the field "data_monitored_state" is set to false (the default is true when the entity is initially discovered)
- The UI by default filters on entities that are effectively being monitored; you can show disabled entities by using the "Filter monitored_state:" filter form, or by looking at the lookup content manually
- Out-of-the-box alerts do not take disabled entities into consideration
- Various other parts of the application will stop considering these disabled entities as well; for instance, metrics will no longer be generated for them
- When an entity is disabled, all information is preserved; if you re-enable a disabled entity, TrackMe simply starts considering it again and refreshes its state and other actions automatically
- You should consider disabling entities rather than deleting them if they are actively generating data to Splunk and cannot easily be excluded by allowlisting / blocklisting
- The reason is that if you delete an active entity in temporary deletion mode, it will be re-added very quickly (as soon as the trackers capture activity for it), and in permanent mode it would be re-added after a certain period of time
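The default UI filtering on "data_monitored_state" can be sketched as below (a minimal illustration; the sample records are invented and not taken from TrackMe's collections):

```python
# Minimal sketch of the default "monitored only" view; records are illustrative.
entities = [
    {"data_name": "firewall:pan:traffic", "data_monitored_state": "true"},
    {"data_name": "lab:debug:scratch", "data_monitored_state": "false"},
]

def monitored_entities(entities):
    """Keep only entities whose monitoring is enabled (the UI default view)."""
    return [e for e in entities if e["data_monitored_state"] == "true"]

print([e["data_name"] for e in monitored_entities(entities)])
# ['firewall:pan:traffic']
```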

What's the difference between disabled and (permanently) deleted?
-----------------------------------------------------------------

The deletion of entities is explained in detail in :ref:`Deletion of entities`.

In short, the purpose of the permanent deletion is to prevent an entity from being discovered again after it is deleted.

To achieve this, when an entity is permanently deleted, the value of the field "change_type" is set to "delete permanent"; when the entity is temporarily deleted, the value is set to "delete temporary".

Then, the tracker reports which perform data discovery use a filter to exclude entities that have been permanently deleted, such that even if an entity is still actively sending data to Splunk, TrackMe will ignore it automatically as long as the audit record is available (by default, audit records are purged after 90 days).

The UI does not provide a function to undo a permanent deletion; however, updating or purging the audit record manually would allow an entity to be re-created after it was permanently deleted.
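The discovery exclusion described above can be sketched as follows (a conceptual illustration only; the record shapes and field names are assumptions, not TrackMe's audit schema):

```python
def discover(candidates, audit_records):
    """Sketch of discovery filtering: entities with an active
    "delete permanent" audit record are excluded from re-discovery,
    while "delete temporary" records do not block it."""
    blocked = {
        r["object"]
        for r in audit_records
        if r["change_type"] == "delete permanent"
    }
    return [c for c in candidates if c not in blocked]

audit = [
    {"object": "old:source:gone", "change_type": "delete permanent"},
    {"object": "temp:source", "change_type": "delete temporary"},
]
print(discover(["old:source:gone", "temp:source", "new:source"], audit))
# ['temp:source', 'new:source']
```

Once the "delete permanent" audit record is purged (after 90 days by default), the entity drops out of `blocked` and would be re-discovered if it is still sending data.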
1 change: 1 addition & 0 deletions docs/draw.io/data sampling.drawio
@@ -0,0 +1 @@
<mxfile host="Electron" modified="2020-08-31T22:20:19.989Z" agent="5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/537.36 (KHTML, like Gecko) draw.io/13.6.2 Chrome/83.0.4103.122 Electron/9.2.0 Safari/537.36" etag="MylreLz4yhoyBKOapxX1" version="13.6.2" type="device"><diagram id="C5RBs43oDa-KdzZeNtuy" name="Page-1">7VvbcqM4EP0aPyYFCDB+zG2yl8zU7mZ3M36UQWBtADlCju18/UggGWQYx46xwbWpqcqgu9SnT6u7lQzATbK8p3A2/UoCFA8sI1gOwO3AskwDePw/UbMqaoaOUVREFAeyU1nxiN+QGilr5zhAmdaRERIzPNMrfZKmyGdaHaSULPRuIYn1VWcwQrWKRx/G9donHLCprDXdUdnwC8LRVC7tWcOiIYGqszxJNoUBWVSqwN0A3FBCWPGVLG9QLISn5PL06+opfnh273/7M3uB/1z//ve3fy+Kyb7sM2R9BIpS9uGpycPXv9xnB0TmS/gwhk93gW1eWPJobKXkhQIuPlkklE1JRFIY35W115TM0wCJWQ1eKvs8EDLjlSav/A8xtpK6AOeM8KopS2LZipaYfa98j8VUl44s3S7lzHlhpQopo6vv1UJllCiWw/KSGrej6KSIMzKnPtrSD0gNhjRCbItcpcoIYVbUUAJzj0iC+CZ5B4piyPCrrqtQqny07lfCyj8ksnsokNz1K4zncqVHfgImWA0ZFOeGySzGaVTThhJrAdZiihl6nMFcRAtuL3RcQ5IyCbrJD3sdxTDLJA4Zo+R5zUDRe00n0RziOL4hMaH5siCAyAv99bBKi+t7aBLuDewrogwtt0KhWk2nGCLtneVJLBcV66FMwrRiOGzjcPQadcn55OjPOLqNe+9y1G6bo3LoHwSnrFQo4OkKBYwNRSk2Kkdt6Mp6Gx9Xn1GN/GN+Hzep1AOc8MtfUwMY4yjl3z6HBnEKXgsiYX67XsmGBAdBoXEow29wks8nQJ2J8+QndK4Hzm0jzNvUvcbYtYsgF9Fu4SYmXxiXhjfUZK906EBIlUHdmFVNQMIwQ0fB0uvSFJT0H1da3jMFJfvHGvl7ZQqGXV3XjbtRrnOFst9IjxnrtchYW2eW1QpfT8bPOnADy42Z9I00/NyXOVENF1nOvSvewTRmy7KRf0Ush4rNaZqpySZUNaBXlGP2RTXxbRdLFe11h25Kksk8e9+ZO5Z/Zbkb/pVb96+8BvfKO5Z7Zdqf/tVeRtXe0aiaoFdW1a5xM2OEIrFUGgj5UOIjHq/wDRekkmERqnPofxcUAc/VSdt5UGR2GhUZH2Kt7gqZp2XtsM7abbmCnpB2WCNtihZiIcZQMhM3Kk75jxQtxbe/8j/pKuhpb4Sc3dO108ilf5fssZMOrkJXGezR6NID+iyFyajlHepzue/PdeQchnluSQyztZjIuDR0MreUwtAvdPPSHOpTHC9IAqDTq/v8shjmrq8OVuvW5SCjD+oud6/zGEoxW0lkgMNoqhIgG3b2eKw0649E7eQuQkITmHtqWb4HjgEO+T9B5vNJWzj2xiXYddrCOptnobatYUMk0yyh1h94DnOB67GMTxIhllyLuBKJHAR6xWSefcYyP/c+O49llIZUgAxwll8NlhEiyOaVrJLKMWUMst7AOYG+F4AmOC1g205wTDu6EZh0H5pan6HpfgbY2jUBbPXrWc06tyDSavNhzRo6I415h76Fn/zxG3Sb8j2/uHFnooLOfl2teTtujaj9jhsP/5WVQ0nonYqEVj2mbydMvEpJAmMhiwAxLrLzCg89s2/hYT3aCClCbxVfVHqoFOIM5eG5gkCIqieeauj5yG8MPCaeYzv7G83dIR2BHQIPcEpPVS3W8IT9PqRG
XyANHOQFdhOknjUBrntESIej0wUfvFj+aUBhfcs/sAB3PwA=</diagram></mxfile>
Binary file modified docs/img/first_steps/img002.png
Binary file modified docs/img/first_steps/img008.png
Binary file added docs/img/first_steps/img020_data_sampling.png
Binary file added docs/img/first_steps/img_data_sampling001.png
Binary file added docs/img/first_steps/img_data_sampling002.png
Binary file added docs/img/first_steps/img_data_sampling_audit.png
Binary file modified docs/img/identity_card1.png
Binary file modified docs/img/identity_card2.png
Binary file modified docs/img/identity_card3.png
Binary file modified docs/img/identity_card4.png
Binary file modified docs/img/identity_card5.png
Binary file modified docs/img/identity_card6.png
Binary file removed docs/img/identity_card7.png
Binary file added docs/img/img_data_sampling_main_red.png
Binary file modified docs/img/logical_groups_example6.png
Binary file added docs/img/mindmaps/data_sampling_main.png
Binary file added docs/img/tags_filter.png
Binary file added docs/img/tags_img001.png
Binary file added docs/img/tags_img002.png
Binary file added docs/img/tags_img002bis.png
Binary file added docs/img/tags_img003.png
Binary file added docs/img/tags_img004.png
Binary file added docs/img/tags_img005.png
Binary file added docs/img/tags_img006.png
7 changes: 4 additions & 3 deletions docs/index.rst
@@ -12,11 +12,12 @@ Welcome to the Splunk TrackMe application documentation
- Provides a powerful user interface to manage activation states, configuration and quickly troubleshoot availability failures
- Analyses and detects lack of data and lagging performance of data sources and hosts within your Splunk deployment
- Behaviour analytics with outlier detection based on machine learning calculations
- Behaviour analytics with data sampling and event format recognition: monitors raw events to detect event format changes or anomalies
- Create elastic sources for any kind of custom monitoring requirement based on tstats / raw / mstats / from searches
- Records and investigates historical status changes, as well as administrator changes (audit of flipping states and changes)
- Easy administration from A to Z via the graphical user interface
- No matter the purpose of your Splunk deployment, TrackMe will easily become an essential piece of it, even providing efficient answers to PCI and compliance requirements
- Never again let your team be the last to discover what "empty" and "no results found" mean!
- Keep things under your control and be the first to know when data is not available, and get alerted before your users get back to you!

.. image:: img/screenshots_main/img001.png
:alt: img001.png
@@ -48,11 +49,11 @@ No administrator should be informed of an issue in the data flow by the customer

With the massive number of data sources, this easily becomes a painful and time-consuming activity; this application aims to drastically help you with these tasks.

This tiny application provides a handy user interface associated with a simple but efficient data discovery, state and alerting workflow.
TrackMe provides a handy user interface associated with an efficient data discovery, state and alerting workflow.

Made by Splunk admins for Splunk admins, the TrackMe application provides powerful builtin features to monitor and administer your data sources the easy way!

**Use case for TrackMe?**
**Use cases for TrackMe?**

No matter the purpose of your Splunk deployment, TrackMe will easily become an essential and positive piece of your Splunk journey:

24 changes: 24 additions & 0 deletions docs/releasenotes.rst
@@ -1,6 +1,30 @@
Release notes
#############

Version 1.2.21
==============

**CAUTION:**

This is a new main release branch; TrackMe 1.2.x requires the deployment of the following dependencies:

- Semicircle Donut Chart Viz, Splunk Base: https://splunkbase.splunk.com/app/4378
- Splunk Machine Learning Toolkit, Splunk Base: https://splunkbase.splunk.com/app/2890

TrackMe requires a summary index (defaults to trackme_summary) and a metric index (defaults to trackme_metrics):
https://trackme.readthedocs.io/en/latest/configuration.html
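As an illustration, minimal definitions for these two indexes could look like the sketch below (paths and retention are assumptions to adapt to your environment; `datatype = metric` marks the metrics index):

```ini
# indexes.conf — illustrative sketch only; adjust to your environment
[trackme_summary]
homePath   = $SPLUNK_DB/trackme_summary/db
coldPath   = $SPLUNK_DB/trackme_summary/colddb
thawedPath = $SPLUNK_DB/trackme_summary/thaweddb

[trackme_metrics]
homePath   = $SPLUNK_DB/trackme_metrics/db
coldPath   = $SPLUNK_DB/trackme_metrics/colddb
thawedPath = $SPLUNK_DB/trackme_metrics/thaweddb
datatype   = metric
```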

- Feature: Introducing a new very hot feature! Data sampling and event format recognition is a new workflow that monitors event format behaviour by processing automated samples of the data sources over time; builtin rules are provided and can be extended with custom rules to handle any custom data format
- Feature: Introducing the new tags capability; you can now add tags to data sources: tags are keywords which can be set per data source to provide new filtering capabilities
- Fix: When using a custom Splunk URI path (root_endpoint in web.conf), internal calls to splunkd made by the UI can fail if splunkd does not accept the root context and only accepts the custom root context
- Fix: When creating new dedicated elastic sources, if the search result name exceeds 100 characters, the creation of the new source fails silently
- Fix: Shortened the default naming convention used for new Elastic Sources tracker names
- Fix: The limit of the list function used in stats capped Elastic shared data sources at 99 sources maximum; fixed with an alternative, improved syntax
- Fix: For Elastic shared sources, if the first source is a raw search, the addition of the "search" keyword in the first pipeline fails under some conditions
- Change: Automatically join the acknowledgement comment in the acknowledgement screen
- Change: Defined a time to live for scheduled reports (dispatch.ttl) to reduce overhead in the dispatch directory
- Change: Automatically assign a 1-minute time window when creating Elastic dedicated trackers

Version 1.2.20
==============

Expand Down
