Home
The following steps will get you up and running with a full Elastic Stack 6.1 instance (including syslog server), so you can visualise data from your Palo Alto firewall(s) using the files in this project.
For an explanation of the Elastic Stack, see the README.md on this project or www.elastic.co/elk-stack.
In this tutorial we'll be using a fresh Ubuntu Server 16.04 LTS, but the Elastic Stack can be installed on any flavour of Linux, or Windows. With minimal tweaking, the core files in this project can be adapted to work on an Elastic install on any OS, or be integrated with an existing Elastic installation.
I'll be feeding in syslog entries from a PA-220 firewall using PAN-OS 8.0.3. The syslog format hasn't varied hugely, so you should be able to achieve results from any PAN-OS 7.x device also. Panorama devices will also work for exporting syslogs, though you may want to tweak the visualisations slightly for multi-device support.
To kick off, create a fresh VM on your virtualisation platform of choice, and install Ubuntu. I've used the 16.04 LTS Server ISO from here.
One important point when working with Linux behind a Palo Alto Networks firewall - ensure SSL Decryption is disabled for your new VM. Most of the install steps below will fail if SSL decryption is in place, unless you have explicitly added the PANW signing certificate to the local keystore. Since that's beyond the scope of this tutorial, I've just placed an override rule for my new VM above the primary SSL Decryption policy.
Once the install process is completed, configure your networking. You'll want a static IP address, and to open the following firewall ports;
- tcp/80 to access the web interface
- udp/5514 to ingest the syslogs
You can follow the instructions here to set a static IP, and here to open some firewall ports.
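As a quick sketch, assuming you're using ufw (Ubuntu's default firewall frontend), the two ports can be opened like so;

```shell
# Open the web interface (nginx) and the LogStash syslog listener
sudo ufw allow 80/tcp
sudo ufw allow 5514/udp

# Enable ufw if it isn't already, then verify the rules
sudo ufw enable
sudo ufw status
```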
Once your Ubuntu install is complete and networking is set up, we can get started.
Java is required for the LogStash install to come later, so we'll get this out of the way first. Open a Terminal and follow along at home;
sudo add-apt-repository -y ppa:webupd8team/java
sudo apt-get update
sudo apt-get -y install oracle-java8-installer
At this point you'll be asked to agree to the Oracle Java license agreement. After agreeing, you'll see the installation continue... and then fail. As of this writing, the webupd8 team's repo still points at Java 8u151, which is out of date. You can patch that with the following;
cd /var/lib/dpkg/info
sudo sed -i 's|JAVA_VERSION=8u151|JAVA_VERSION=8u161|' oracle-java8-installer.*
sudo sed -i 's|PARTNER_URL=http://download.oracle.com/otn-pub/java/jdk/8u151-b12/e758a0de34e24606bca991d704f6dcbf/|PARTNER_URL=http://download.oracle.com/otn-pub/java/jdk/8u161-b12/2f38c3b165be4555a1fa6e98c45e0808/|' oracle-java8-installer.*
sudo sed -i 's|SHA256SUM_TGZ="c78200ce409367b296ec39be4427f020e2c585470c4eed01021feada576f027f"|SHA256SUM_TGZ="6dbc56a0e3310b69e91bb64db63a485bd7b6a8083f08e48047276380a0e2021e"|' oracle-java8-installer.*
sudo sed -i 's|J_DIR=jdk1.8.0_151|J_DIR=jdk1.8.0_161|' oracle-java8-installer.*
With that out of the way, let's resume the Java install;
sudo apt-get install oracle-java8-installer
I've cribbed the short version of these instructions from the comprehensive documentation at the elastic.co site, including
- https://www.elastic.co/guide/en/elasticsearch/reference/6.1/deb.html
- https://www.elastic.co/guide/en/kibana/current/deb.html
- https://www.elastic.co/guide/en/logstash/current/installing-logstash.html
If you want to run this project on a different platform, or encounter issues with the instructions below, that documentation is the best place to start.
This step will be broken down into the following;
- Install common components
- Install Elastic Search
- Install Kibana
- Install LogStash
Common Components
cd ~
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list
ElasticSearch
sudo apt-get update && sudo apt-get install elasticsearch
sudo /bin/systemctl daemon-reload
sudo /bin/systemctl enable elasticsearch.service
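Note that enabling the service doesn't start it. Start ElasticSearch now, and (after giving it a minute or so to come up) confirm it answers on localhost;

```shell
sudo systemctl start elasticsearch.service

# ElasticSearch replies with a small JSON banner when it's up
curl http://localhost:9200
```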
Kibana
sudo apt-get update && sudo apt-get install kibana
sudo /bin/systemctl daemon-reload
sudo /bin/systemctl enable kibana.service
sudo systemctl start kibana.service
LogStash
sudo apt-get update && sudo apt-get install logstash
sudo systemctl enable logstash.service
The Elastic Stack isn't particularly secure out of the box. You can opt to purchase X-Pack from elastic.co, which offers a whole suite of security options, but for smaller deployments we can provide some basic security quite easily.
ElasticSearch
The first step is to lock down ElasticSearch so it only accepts requests from localhost;
sudo nano /etc/elasticsearch/elasticsearch.yml
Locate the line "#network.host: 192.168.0.1", and change it to "network.host: localhost"
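With that change in place, the Network section of elasticsearch.yml should look something like this (comments abbreviated);

```yaml
# ---------------------------------- Network ----------------------------------
# Bind only to the loopback interface; nginx will proxy external requests
network.host: localhost
```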
nginx
The next step is to install nginx and configure it as a reverse proxy to the Kibana server on port 5601, using htpasswd for simple authentication. Anything more advanced than basic user/pass will require additional work (or X-Pack from elastic.co), but for basic network device monitoring this will suffice.
sudo apt-get install nginx apache2-utils
Create a user for authentication. Replace 'admin' in the command below to use a custom username. To add additional users, repeat the command without the -c flag (-c creates the file, overwriting any existing entries);
sudo htpasswd -c /etc/nginx/htpasswd.users admin
And lastly, open up the nginx configuration file
sudo nano /etc/nginx/sites-available/default
Replace the contents of this file with the contents of the "nginx-default" file from this project.
NOTE: ensure you replace '<host-or-ip-address>' with the hostname or IP address of your server.
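The "nginx-default" file in this project is the authoritative version; for reference, a minimal equivalent reverse-proxy block looks roughly like this (server_name is the placeholder you must change);

```nginx
server {
    listen 80;
    server_name <host-or-ip-address>;

    # Prompt for the credentials created with htpasswd above
    auth_basic "Restricted Access";
    auth_basic_user_file /etc/nginx/htpasswd.users;

    # Proxy everything through to Kibana on localhost
    location / {
        proxy_pass http://localhost:5601;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
```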
Now we get to the fun stuff.
Configure your PAN-OS firewall to send syslog messages to your new VM, on port udp/5514. The firewall should send at least the following logs;
- Traffic
- Threat (includes Wildfire & URL)
- System
- Config
Log format should be BSD, with the facility set to LOG_USER.
There are detailed guides available on the Palo Alto Networks documentation site: Configure Syslog Monitoring and Assign the Log Forwarding Profile.
Step-by-Step
These instructions are for PAN-OS 8. The process is slightly different for PAN-OS 7, but fundamentally the same. Check the Palo Alto Networks website for instructions relevant to PAN-OS 7
The procedure for configuring syslog output on PAN-OS firewalls involves 4 steps;
- Create Syslog Server Profile
- Select Device > Server Profiles > Syslog
- Click Add and set the profile name (e.g. Syslog-Default)
- Then click 'Add' in the Servers tab and fill in the fields
- Name: Choose any name
- Server: IP address or FQDN of your Ubuntu VM
- Transport: UDP
- Port: 5514
- Format: BSD
- Facility: LOG_USER
- Click OK
- Add the Syslog Server to a Log Forwarding Profile
- Select Objects > Log Forwarding
- Create a new Log Forwarding Profile (or edit an existing one)
- For each of the following log types, click 'Add', enter a name, and select your syslog profile;
- Threats
- Wildfire
- Traffic
- URL
- Click OK
- Apply that Log Shipping Object to Security Policies
- Select Policies > Security and edit the rule(s)
- Select Actions and select the Log Forwarding profile you created/edited
- Set the Profile Type to Profiles or Group, and then select the security profiles or Group Profile required to trigger log generation and forwarding for:
- Threat logs — Traffic must match any security profile assigned to the rule
- WildFire Submission logs — Traffic must match a WildFire Analysis profile assigned to the rule
- For Traffic logs, select Log At Session End
- Click OK to save the rule
- Repeat for each rule you want to send logs to the Elastic Stack
- Ensure that System & Config logs are sent
- Select Device > Log Settings
- For both System and Configuration, click 'Add', set a name, select the Syslog server profile, and click OK.
Note: if you're wondering why we've configured 6 log types but there are only 4 log streams in this project, it's because the Threat syslog contains URL & Wildfire entries in addition to Threats.
Triggering Logs
Once completed, you'll need to trigger some traffic for all 4 log types. This is important, as the Index Pattern in Kibana cannot be created until syslog messages are received.
- Traffic logs should ship pretty quickly, provided you have correctly applied the Log Shipping Object to one or more active Security Policies
- Threat logs may have to be manually triggered (unless your environment is frequently attacked). Palo Alto Networks have a Live article that is useful here.
- To send a Config syslog message, you'll need to make another change to the firewall. This may seem counter-intuitive, but the config change that enables syslog forwarding doesn't itself get sent as a syslog message. So make another arbitrary change to the firewall, and commit that.
- The second commit should also trigger a System log message. If not, try instructing the firewall to update its antivirus, app or wildfire signatures.
Everything we have done so far is vanilla Elastic Stack & Palo Alto Firewall syslog. If you've encountered difficulties, there's many resources on the web to help you out.
Now we get into the project-specific implementation.
Firstly edit the PAN-OS.conf file from this project, and set your timezone.
NOTE: It's critical the time zone is set (& spelt) correctly for your area. If using a distributed install with Panorama, set the timezone to the timezone of the Panorama device
PAN-OS.conf
Once the timezone is set, copy this file to /etc/logstash/conf.d/
sudo cp PAN-OS.conf /etc/logstash/conf.d/PAN-OS.conf
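For orientation, PAN-OS.conf has the shape of a standard LogStash pipeline: a udp input on 5514, filters that parse the PAN-OS fields, and an ElasticSearch output. The snippet below is an illustrative sketch only (the field name and timezone value are placeholders; the project's PAN-OS.conf is authoritative), with the timezone line being the one you must edit;

```conf
input {
  udp {
    port => 5514        # matches the firewall's syslog server profile
    type => "syslog"
  }
}

filter {
  # ...the project's parsing of the PAN-OS fields happens here...
  date {
    # Set this to the timezone of your firewall (or Panorama)
    timezone => "Europe/London"
    match => [ "GenerateTime", "yyyy/MM/dd HH:mm:ss" ]
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
}
```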
Index Templates
Then install the two index templates from this project, which were created by shadow-box
curl -XPUT http://<your-server>:9200/_template/panos-traffic?pretty -H 'Content-Type: application/json' -d @traffic_template_mapping-v1.json
curl -XPUT http://<your-server>:9200/_template/panos-threat?pretty -H 'Content-Type: application/json' -d @threat_template_mapping-v1.json
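You can confirm both templates loaded by querying ElasticSearch;

```shell
# Should list both panos-* templates (empty output means the PUTs failed)
curl -s 'http://<your-server>:9200/_cat/templates/panos-*?v'
```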
LogStash
Start the logstash server
sudo service logstash start
It takes a minute or two for the LogStash server to start, then syslog messages should start appearing. If not, check the status of the logstash service and look for any errors;
sudo systemctl status logstash
If errors are found, check the syntax of your /etc/logstash/conf.d/PAN-OS.conf
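LogStash can also validate the configuration for you without processing any events, which is quicker than watching the service logs;

```shell
# Parse the pipeline config under /etc/logstash/conf.d and exit,
# reporting any syntax errors
sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit
```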
When LogStash starts receiving syslog messages from your firewall, it will ingest, transform and store them in ElasticSearch. They will then be exposed in the Kibana web interface for viewing, searching & visualising.
When Kibana starts seeing new ElasticSearch indexes, it needs to create an 'Index Pattern' so it can understand and visualise the data.
Login to Kibana
Open a browser to http://<host-or-ip-address> to load the Kibana web interface so we can create these patterns. Note: you will need the username & password you configured in Step 3 to log in.
Click 'Management > Index Patterns' and you should see a screen similar to the below;
Create the Index Patterns
Create each of the 4 required index patterns, in the order presented below;
- panos-traffic
- panos-threat
- panos-system
- panos-config
For each index, set the Time Filter field name to '@timestamp'
Once the index patterns are created, edit the 'panos-traffic' pattern. Locate the Bytes, BytesReceived & BytesSent fields. For each, click the 'edit' button on the right-hand side and set the format to 'Bytes'. This allows Kibana to automatically scale the displayed values to B, KB, MB, GB etc. as appropriate.
Confirm Log Flow
Click 'Discover' in the left-hand menu, and confirm logs are flowing into ElasticSearch, based on each of the 4 Index Patterns you have just created
As a last step, we will import the pre-packaged visualisations included in this project. These are by no means a complete demonstration of what can be accomplished with this setup, but they provide an effective and eye-catching way of looking at your PANW device health & status
I'd encourage everyone who has read this far, to go further and look at the Elastic documentation for visualisations, so they can create their own. But, to get you started;
Click 'Management > Saved Objects' and import Saved Object files (in this order);
- searches-base.json
- visualisations-base.json
- dashboards-base.json
This will import 66 visualisations, each of which creates a single chart/graph based on a single index pattern & filter. When viewing each visualisation, you can set a time frame and additional filters on top of the built-in one.
Since a single visualisation might be pretty, but isn't too useful, we've also imported 9 pre-built dashboards (collections of related visualisations). These dashboards also include some Saved Searches, for raw log retrieval.
You can click on a slice/bar/column in a dashboard to set a filter, which will apply to all objects on that dashboard. This helps you easily narrow down the culprit (user, IP, file, etc) for any logged event
That's it!
Click 'Dashboards' in the left-hand menu, and select one of the dashboards to view. I suggest starting with 'Overview', and then checking out the others based on what is flagged in the overview dashboard.
Remember you may need to let the firewall ingest data for 30-60 minutes before there's enough for the visualisations to be useful