
Question regarding running Splunk in a PCI environment

mohitvohra109
Explorer

Hi all,

I'm evaluating Splunk for log management with respect to PCI DSS compliance and have a couple of questions about how it runs, hence sharing them here:

  1. How does Splunk capture information from different sources? Does it use an agent that runs on each server / data source?
  2. Do we need to install Splunk on each in-scope application / server, or can we simply install it on a client PC?

I'm essentially looking for a solution that can be run from the client machine itself, letting me extract data out of the in-scope servers using custom queries / search patterns. I'm not sure whether this can be achieved using Splunk, but since many users are already relying on it, I thought you might have an answer to these two questions.

Many thanks,

Mohit Vohra.

1 Solution

ftk
Motivator

Let me first answer your specific questions:

  1. Splunk can capture data via a number of methods. The most common options include an agent (a forwarder) on each source server, syslog, remote WMI queries, indexing log files on UNC shares, and custom scripted inputs; see the sketch after this list. There is a lot of good information in the Splunk docs.

  2. Depending on how you want to get your data into your Splunk indexer, both methods can work, but in the end it depends on your QSA's interpretation of the DSS.
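
To make option one concrete, here is a minimal sketch of a file-monitoring input in inputs.conf on a forwarder or indexer; the log path and the "pci" index name are assumptions you would adapt to your own environment:

    # inputs.conf -- monitor a syslog file and route it to a dedicated index
    [monitor:///var/log/messages]
    sourcetype = syslog
    index = pci
    disabled = 0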

Now that we have that out of the way, let's talk a bit more about Splunk and PCI.

Yes, Splunk works well for several of PCI's requirements (they even have a whitepaper on this), specifically log management, log auditing, file integrity monitoring, daily review, and reporting. The way Splunk works is that you aggregate all your logs to one or more indexers, and then search, alert, and report on the aggregated data. If for some reason you have PANs or CVVs in your logs, Splunk can mask them with very little configuration.
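
As a sketch of that masking, props.conf supports sed-style rewrites at index time via SEDCMD. The sourcetype name and the regex below are simplified assumptions (real PANs run 13 to 19 digits), so treat this as a starting point rather than a drop-in rule:

    # props.conf -- mask all but the last four digits of a 16-digit PAN before indexing
    [my_app_logs]
    SEDCMD-mask-pan = s/\b\d{12}(\d{4})\b/XXXXXXXXXXXX\1/g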

Splunk is not a small client application; it is a server application that can scale to enterprise proportions. Running it on a client PC will not suffice: you will need a dedicated server (or at least a VM). Remember the PCI requirement of one primary function per server.

I recommend you start your approach as follows: install an evaluation copy of Splunk and index some data. Get familiar with the software and the interface, and get a feel for search, reports, and even alerts. Index a representative sample of your data and use that as a starting point for estimating the size of your production installation. Engage a PCI QSA (if your PSP level is high enough) and run your plans by them; engaging Splunk sales can help as well. Design your Splunk installation adhering to PCI's guidelines, get it up and running, locked down, and indexing. Then have your QSA do their audit, and hope for the best 😉
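
To make the search step concrete, here is a minimal example of the kind of daily-review search you might schedule for PCI Requirement 10; the index, sourcetype, and match string are assumptions:

    index=pci sourcetype=linux_secure "Failed password" earliest=-1d@d latest=@d
    | stats count by host, user
    | sort - count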


mohitvohra109
Explorer

I tried playing around with outputcsv at the end, but what I was looking for was a way to set up a schedule that emails the results in CSV format. I accomplished that using the table command followed by the sendemail command with the option inline=false; then I saved the search and scheduled it, so that worked for me.
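
In case it helps anyone else, the saved search looks roughly like this; the recipient address, index, and field names are placeholders for my actual values:

    index=pci sourcetype=syslog earliest=-1d@d latest=@d
    | table _time, host, process, message
    | sendemail to="pci-review@example.com" subject="Daily PCI log export" sendresults=true inline=false format=csv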


ftk
Motivator

Hmm, do you mean that you want to export certain data from Splunk into CSV? You can do that by crafting your search and then appending | outputcsv at the end; no custom script needed.
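
For example, something along these lines should drop a daily_errors.csv under $SPLUNK_HOME/var/run/splunk/csv/ on the search head (the search terms and filename here are just placeholders):

    index=pci sourcetype=syslog error
    | table _time, host, message
    | outputcsv daily_errors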

mohitvohra109
Explorer

Just wanted to add:
The manual element was in copying and pasting the file from one server to the central syslog server. Basically there were two syslog servers (one primary, one DR); I would take the resultant CSV from the DR server, merge it with the CSV on the primary server, and do some filtering to reflect only the latest day's data, which I would then copy into my spreadsheet directly.
Now with Splunk, my concern is whether I'd have to rebuild a custom script for it, as that would mean more time is required for this.


mohitvohra109
Explorer

I now have Splunk installed on two servers; these will act as central syslog servers that aggregate the data, which can then be analyzed.

Prior to this I had created a shell script to parse through the log data and create a CSV file on each server, and then used that file in my spreadsheet. So there was still a manual element in my process, which Splunk can be used to overcome.

Area of concern:
I can see in the Splunk admin guide that we can use custom scripts to parse the data; see the sketch below. Does that mean I'll have to modify my existing script to run it on Splunk?
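
For reference, the scripted-input mechanism I'm referring to looks roughly like this in inputs.conf; the script path, interval, and sourcetype are hypothetical stand-ins for my setup:

    # inputs.conf -- run a script on a schedule and index whatever it prints to stdout
    [script://$SPLUNK_HOME/etc/apps/search/bin/parse_logs.sh]
    interval = 300
    sourcetype = my_custom_csv
    index = pci
    disabled = 0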
