select and install Apache log analyzer #90
Comments
Would it be far-fetched to consider Splunk as an overarching, enterprise-wide solution? Harvesting logs is a primary function of Splunk, and it has other capabilities as well, e.g. metrics analytics and platform health monitoring. It would certainly boost our site maintenance capabilities and robustness. https://www.splunk.com/en_us/download/splunk-enterprise.html
I have no particular inclination in this regard. We do already have an instance of Jenkins running, primarily for container health monitoring.
https://docs.splunk.com/Documentation/Splunk/7.2.4/Overview/AboutSplunkEnterprise The index: transforms machine-generated data (e.g. log files) into a searchable index of events that can then be analyzed by customizable fields. The search: a powerful, concise query language; you can use regular expressions to query for things like error conditions, or for normal operational events such as users logging into the server.
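To make the idea concrete, the kind of regex-driven query described above can be sketched in plain Python against raw Tomcat log lines. This is only an illustration of the concept, not Splunk's actual search language; the log lines and message formats below are invented for the example.

```python
import re

# Invented sample of Tomcat/catalina log lines, for illustration only.
log_lines = [
    "12-Mar-2020 10:15:01 INFO  [main] User 'jdoe' logged in",
    "12-Mar-2020 10:15:07 ERROR [http-8080-1] NullPointerException in servlet",
    "12-Mar-2020 10:16:42 INFO  [main] User 'asmith' logged in",
]

# An "error condition" query and a "normal operational event" query.
error_re = re.compile(r"\bERROR\b")
login_re = re.compile(r"User '(\w+)' logged in")

errors = [line for line in log_lines if error_re.search(line)]
logins = [m.group(1) for line in log_lines if (m := login_re.search(line))]

print(len(errors))  # count of error events
print(logins)       # users who logged in
```

In Splunk itself these would be search expressions over the indexed events rather than a hand-written loop, but the extraction of "customizable fields" (here, the username) works on the same principle.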
Splunk for analytics: "Splunk can read this unstructured, semi-structured or rarely structured data. After reading the data, it allows to search, tag, create reports and dashboards on these data. With the advent of big data, Splunk is now able to ingest big data from various sources, which may or may not be machine data and run analytics on big data. So, from a simple tool for log analysis, Splunk has come a long way to become a general analytical tool for unstructured machine data and various forms of big data." from:
Splunk distributed deployment topology: if we elect to choose Splunk, we would need to install a forwarder instance of it on the production web server (Guardian). That instance would forward the raw log data to the search/index head/manager instance of Splunk (a singleton). "Indexers and search heads are built from Splunk Enterprise instances that you configure to perform the specialized function of indexing or search management, respectively. Each indexer and search head is a separate instance that usually resides on its own machine" (e.g. on my machine, your machine, and any NIH stakeholder's machine, such as Julie, who originally requested this capability). from:
So as a first iteration (a simple local deployment): after installing Splunk 8 on Guardian using the first link I commented above, follow the ten-minute tutorial below. You will end up able to Splunk the production log files through the UI this Splunk search instance exposes; except, instead of the tutorial's downloadable datasource, point it at the real live datasource, i.e. the Tomcat logfile you want to analyze.
Version-wise, I don't think we would need the Splunk 8 Enterprise version for Guardian (a good thing, because it is not available for Macs). The standard free version 8 should give us all we need (GUI, indexer, network/system monitor, admin console with analytics and visualization, search window); plus it's free! Please try this @eichmann or @alexisgraves, and let me know if it succeeds (sorry, I don't have a login to Guardian).
After this is done, I am thinking the reporting capabilities of this single instance of Splunk may satisfy the requirement that stakeholders (Julie, the original requester, and others) be able to request and receive reports (e.g. of hits, logins, alerts). It will still provide a helpful, thorough, but effortless tool for analyzing the large volume of logfiles that have accumulated on Guardian for error conditions, performance issues, and metrics. Splunk dashboards would also provide invaluable one-stop checking of the health and performance of the website, and enable diagnostics.
Phase II - configuring a mutually remote forwarder/receiver: https://docs.splunk.com/Documentation/Forwarder/8.0.2/Forwarder/HowtoforwarddatatoSplunkEnterprise
Step one - configure Splunk to receive. Is port 9997 available (% lsof -i -P -n | grep 9997)? My laptop will receive data on its port 9997, so start the receiver on port 9997. The splunk enable listen command creates a [splunktcp] stanza in inputs.conf; for example, if you set the port to 9997, it creates the stanza [splunktcp://9997]. Alternatively, to enable receiving, add a [splunktcp] stanza that specifies the receiving port (9997 in this example).
Step two - install a forwarder on Guardian. The universal forwarder installation packages are available for download from splunk.com. To install the universal forwarder on Mac OS X, install it from the Finder, then open a Terminal window and start the universal forwarder: https://docs.splunk.com/Documentation/Forwarder/8.0.2/Forwarder/Starttheuniversalforwarder
Step three - configure the universal forwarder to send data to the Splunk Enterprise indexer. This procedure details a basic configuration (for additional configuration options, see Configure the universal forwarder), and sets the forwarder to monitor the Apache log files.
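Roughly, the receiver/forwarder pairing above comes down to three small .conf fragments. This is a sketch, not a tested configuration: the log path and the receiver hostname (receiver.example.org) are placeholders you would replace with Guardian's actual Apache log location and the indexer's real address.

On the receiving instance, inputs.conf gets the listening stanza mentioned above:

```
[splunktcp://9997]
disabled = 0
```

On the forwarder (Guardian), outputs.conf points at the receiver, and inputs.conf monitors the log files:

```
[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
server = receiver.example.org:9997
```

```
[monitor:///var/log/apache2/access_log]
disabled = 0
```

The same result can be reached from the forwarder's CLI instead of editing the files directly, per the forwarder documentation linked above.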
We have multiple shortlinks (Apache aliases that redirect to various servers for onboarding, etc.). Julie is looking to be able to report hits on the shortlinks.
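As a rough sketch of what a shortlink-hit report needs from the access log, here is plain Python counting requests per shortlink path across Apache combined-log-format lines. The sample lines and the shortlink names (/onboard, /docs) are invented for the example; in practice Splunk's search and reporting would do this over the real logs.

```python
import re
from collections import Counter

# Invented sample lines in Apache combined log format.
access_log = [
    '10.0.0.1 - - [12/Mar/2020:10:15:01 -0500] "GET /onboard HTTP/1.1" 302 0 "-" "Mozilla/5.0"',
    '10.0.0.2 - - [12/Mar/2020:10:15:09 -0500] "GET /onboard HTTP/1.1" 302 0 "-" "Mozilla/5.0"',
    '10.0.0.3 - - [12/Mar/2020:10:16:30 -0500] "GET /docs HTTP/1.1" 302 0 "-" "curl/7.64"',
    '10.0.0.4 - - [12/Mar/2020:10:17:02 -0500] "GET /index.html HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]

# Hypothetical set of shortlink aliases we want hit counts for.
shortlinks = {"/onboard", "/docs"}

# Pull the request path out of the quoted request line.
request_re = re.compile(r'"(?:GET|POST) (\S+) HTTP/')

hits = Counter(
    m.group(1)
    for line in access_log
    if (m := request_re.search(line)) and m.group(1) in shortlinks
)
print(hits.most_common())  # [('/onboard', 2), ('/docs', 1)]
```

Non-shortlink traffic (like /index.html above) is ignored, which matches the ask: a report of hits on the shortlinks specifically, not overall traffic.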