Windows ELK Stack & Palo Alto Firewall

I’ve seen a few tutorials around the web showing how to set up an ELK stack for Palo Alto firewalls. I’m not a Linux guru, and our whole environment at work is primarily Windows servers, so I wanted to stick with what I know, but I haven’t found any tutorials on setting up an ELK stack on a Windows box. I’ll post a brief walkthrough of how I accomplished this below.

My final product: the left side is PRTG, monitoring the bandwidth for each of our locations and both firewalls, as well as any down sensors; the right side is my Kibana setup. This is displayed on a 55″ 4K TV:

[Screenshot: the wall-mounted dashboard, PRTG on the left and Kibana on the right]

Some other sources I used as references and combined info from to get this working:
https://www.ulyaoth.net/resources/tutorial-install-logstash-and-kibana-on-a-windows-server.34/
http://operational.io/elk-stack-for-network-operations-reloaded/
https://anderikistan.com/2016/03/26/elk-palo-alto-networks/
https://exorcimist.wordpress.com/2015/05/07/kibana-logstash-elasticsearch-for-palalto/

ELK is short for Elasticsearch, Logstash, and Kibana. The way I have my Palo Alto box set up to get data flowing to my server is:

Firewall –> Logstash –> Elasticsearch –> Kibana

I export all my logs via Palo Alto’s syslog export to the IP address of my ELK server. Once everything is set up correctly, Logstash collects the syslog data from the firewall, using a config file we’ll create that tells it how to parse that data. Elasticsearch is the engine that stores and indexes the parsed logs, and Kibana is the front-end dashboard that visualizes everything. The only things we really need to set up are a couple of files for Logstash and an index pattern in Kibana.
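Once Logstash is running later on, a simple way to confirm data is actually arriving while you troubleshoot is a temporary stdout output in logstash.conf. This prints every received event to the console; it’s a debugging sketch only, not part of the final config:

    # Temporary debug output -- add alongside the elasticsearch output in
    # logstash.conf, then remove it once you can see events flowing.
    output {
      stdout { codec => rubydebug }
    }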

What you’ll need to download:

  • Java JDK
  • Elasticsearch
  • Logstash
  • Kibana
  • MaxMind’s GeoLiteCity database (GeoLiteCity.gz)

Now for installing everything:

  • Install Java JDK
  • You need to set the system variable “JAVA_HOME”. To do this, open System Properties, go to the Advanced tab, and add a new environment variable called “JAVA_HOME” containing the full Windows path to the JDK install (see the command example after this list).
  • You now need to unzip Elasticsearch, Logstash, and Kibana. To keep things clean and organized, I unzipped them into the following structure:
    • C:\elk\elasticsearch
    • C:\elk\kibana
    • C:\elk\logstash
  • Logstash requires a config file to run correctly and understand what to do with the data. It also requires a JSON file that serves as an index template. Finally, for geographical data and the map feature in Kibana, you need the GeoLiteCity file you downloaded above.
  • Extract the GeoLiteCity.gz file, and you will get a “GeoLiteCity.dat” file. Place that in the root of the logstash folder, here: C:\elk\logstash\GeoLiteCity.dat
  • Create a file called “logstash.conf” in “C:\elk\logstash\bin”. Insert this code in the config file and save it: http://pastebin.com/EcnfxFg5
  • Create a file called “elasticsearch-template.json” in “C:\elk\logstash”. Insert this code into the json file you just created: http://pastebin.com/uwbv3yLd
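As a reference for the JAVA_HOME step above, the same thing can be done from an elevated command prompt instead of the System Properties dialog. The JDK path here is just an example; substitute your actual install directory:

    rem Set JAVA_HOME machine-wide (takes effect for newly started processes).
    setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_102" /M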

At this point, if all the paths are correct, you should be able to start everything up. (If you look at the logstash.conf file, it points to the JSON template at the end, so those paths must be accurate.)
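For reference, the logstash.conf from the pastebin link follows the standard three-stage Logstash pipeline. Here is a simplified sketch of that structure — not the full config; the grok patterns that actually parse the PAN-OS syslog format are in the pastebin file, and option names can vary between Logstash versions:

    input {
      tcp {
        port => 5513
        type => "syslog"
      }
    }
    filter {
      # The real config uses grok/csv patterns here to split out the
      # PAN-OS fields, plus a geoip filter pointed at
      # C:/elk/logstash/GeoLiteCity.dat for the Kibana map feature.
      # (Forward slashes avoid escaping issues in Windows paths.)
    }
    output {
      elasticsearch {
        hosts    => ["localhost:9200"]
        index    => "palo-firewall-%{+YYYY.MM.dd}"
        template => "C:/elk/logstash/elasticsearch-template.json"
      }
    }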

  • You should now be able to start up everything. Open Elasticsearch and Kibana by running their BAT files:
    • C:\elk\elasticsearch\bin\elasticsearch.bat
    • C:\elk\kibana\bin\kibana.bat
  • To launch Logstash correctly, open a command prompt in the directory “C:\elk\logstash\bin” and type the command
    logstash -f logstash.conf
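If you get tired of opening three windows by hand, a small batch file can launch everything at once. A minimal sketch (start-elk.bat is a made-up name, and the paths assume the C:\elk layout above):

    @echo off
    rem Launch Elasticsearch, Kibana, and Logstash, each in its own window.
    start "Elasticsearch" C:\elk\elasticsearch\bin\elasticsearch.bat
    start "Kibana" C:\elk\kibana\bin\kibana.bat
    rem Logstash must run from its bin directory so it finds logstash.conf.
    start "Logstash" /D C:\elk\logstash\bin cmd /k logstash -f logstash.conf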

You should now have all three programs running without a problem. If you run into issues, it’s most likely because you didn’t set JAVA_HOME correctly or, more likely, because something is off in logstash.conf or the JSON file. Check the paths in the configuration files to make sure everything is right.


I’m sending my Palo Alto syslog over TCP port 5513. If you want to change this to UDP or a different port number, update the beginning of the logstash.conf file to reflect that, as in the example below.
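For example, switching the input from TCP to UDP is just a matter of editing the input block; the port must match whatever you configured in the Palo Alto syslog server profile:

    input {
      udp {
        port => 5513        # match the port your firewall sends to
        type => "syslog"
      }
    }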


You can now go to http://localhost:5601 and Kibana should load. If your Palo Alto box is sending syslog data to the ELK server and all three programs are running, you just have to add the index pattern:

palo-firewall-*

Once you add the index pattern above, select ‘@timestamp’ as the time field and you should see the following:

[Screenshot: Kibana index pattern settings]
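If the index pattern doesn’t match anything, a quick way to confirm that indices are actually being created is to ask Elasticsearch directly; curl is shown here, but the same URL works in a browser:

    curl http://localhost:9200/_cat/indices?v

You should see daily indices whose names start with palo-firewall-. If the list is empty, go back and check the Logstash side.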

Advanced Configuration (Services & Java Heap)

Once you’ve confirmed everything is set up and you’re receiving data, you may need to tweak things a bit to get good performance. Our environment is pretty big (16,000 users total, though not all on the internet at the same time), and the default install would choke and crash before lunchtime, so I’ve had to change some things. First, let’s set Elasticsearch to run as a service and update the Java memory settings. Then we’ll install Logstash and Kibana as services.

  • Navigate to C:\elk\elasticsearch\bin
  • Run the command “service install”
  • Run the command “service manager”
  • Set the Java Virtual Machine path to your JRE install directory with the jvm.dll (mine was ‘C:\Program Files\Java\jre1.8.0_102\bin\server\jvm.dll’); your version may differ, so look for a similar path.
  • Update the memory pool to 50% of the actual RAM in your system. My server has 24GB of RAM, so I’ve set the memory pool to 12GB.
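If you’d rather script this than click through the service manager GUI, the Elasticsearch releases this guide is based on (the 1.x/2.x line) read an ES_HEAP_SIZE environment variable at service-install time. Treat this as an assumption and verify against your version:

    rem 12g = half of this server's 24GB of RAM, per the guidance above.
    rem setx makes the value permanent; set makes it visible to this session.
    setx ES_HEAP_SIZE 12g /M
    set ES_HEAP_SIZE=12g
    cd C:\elk\elasticsearch\bin
    service install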

[Screenshot: Elasticsearch service manager settings]

Elasticsearch is now set up as a service with more memory for Java. This has stopped our server from crashing multiple times a day under the load. Now let’s set up Logstash and Kibana as services.

  • Download NSSM @ http://nssm.cc/release/nssm-2.24.zip
  • Extract it, then run the command “nssm install X”, where X is the name of the service you want to install. We’ll set one up for Logstash and one for Kibana.
  • Run:
    nssm install Logstash
  • Setup the path and arguments that point to the config file.
  • Run:
    nssm install Kibana

    Set up the path, then add dependencies on “Logstash” and “elasticsearch-service-x64”. This makes sure the Elasticsearch and Logstash services are both running before Kibana is allowed to start.
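The same setup can be scripted instead of clicking through NSSM’s GUI. A sketch, with the caveat that you should double-check the parameter syntax against your NSSM version’s documentation:

    nssm install Logstash C:\elk\logstash\bin\logstash.bat
    nssm set Logstash AppParameters "-f logstash.conf"
    nssm set Logstash AppDirectory C:\elk\logstash\bin

    nssm install Kibana C:\elk\kibana\bin\kibana.bat
    nssm set Kibana DependOnService elasticsearch-service-x64 Logstash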

[Screenshots: NSSM settings for the Logstash and Kibana services]

At this point, you should have all three running as services, with the memory allocated for Java.

Bonus: Redirect Port 5601 to 80 and Enable Basic Authentication

If you want to make things simpler by accessing the server via port 80, and enable authentication so others can’t get in and snoop, follow these steps:

  • Download NGINX here: https://nginx.org/en/download.html
  • Extract to C:\elk\nginx
  • Edit the file “C:\elk\nginx\conf\nginx.conf” and replace with this code: http://pastebin.com/jzAhJdqe
  • In the same directory, make a file called “.htpasswd”
  • Generate a username and password via this site: http://www.htaccesstools.com/htpasswd-generator/
  • Insert the generated code into the .htpasswd file.
  • Start NGINX, and you should be able to access the site via http://localhost with authentication.
  • If you get HTTP 500 errors, it could be security rights on the password file; check the rights of the user that NGINX is running under.
  • There are tutorials online on how to setup NGINX as a service as well.
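For reference, the heart of that pastebin config is a standard NGINX reverse proxy with basic authentication. A simplified sketch of the relevant server block (this sits inside the http block of the full nginx.conf):

    server {
        listen 80;
        location / {
            # Prompt for the credentials stored in .htpasswd, then
            # forward authenticated requests to Kibana on port 5601.
            auth_basic           "Restricted";
            auth_basic_user_file C:/elk/nginx/conf/.htpasswd;
            proxy_pass           http://localhost:5601;
        }
    }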

Bonus: Automatically Delete Old Indices

My server only has 150GB of free space, but I don’t need to look at old data anyway; we really only care about what’s going on that day, since we can always go into the firewall and run reports on older activity. I wanted a way to delete older indices automatically, so here is what I’m doing:

curator --config C:\elk\curator\curator.yml C:\elk\curator\action.yml
  • The command specifies the config file location, then the action file to run. People normally keep these in their user folder somewhere, but I like keeping all my files in one location.
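For reference, here is a minimal pair of Curator files matching this setup. This is a sketch in the Curator 4 style; the prefix matches the palo-firewall-* indices from earlier, and the 7-day retention is just an example value:

    # C:\elk\curator\curator.yml -- connection settings
    client:
      hosts:
        - 127.0.0.1
      port: 9200
    logging:
      loglevel: INFO

    # C:\elk\curator\action.yml -- delete matching indices older than 7 days
    actions:
      1:
        action: delete_indices
        description: "Delete palo-firewall indices older than 7 days"
        options:
          ignore_empty_list: True
        filters:
          - filtertype: pattern
            kind: prefix
            value: palo-firewall-
          - filtertype: age
            source: name
            direction: older
            timestring: '%Y.%m.%d'
            unit: days
            unit_count: 7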



Comments (2)

  1. Krishna
     122 days ago

     Hi, your article is nice and easy to digest. Can you help me understand how to manually clean indexes and old data from both Logstash and Kibana without Curator? Is there a command to do this in Windows?

     • admin
       121 days ago

       I’m sure you could script a batch file that would delete the indices from the Logstash folder, but Curator was made for this purpose.
