Collecting Data from ServiceNow in Splunk


Let's discuss how to collect data from your ServiceNow instance in Splunk. First, what is ServiceNow? ServiceNow is a maker of service management software that can be on-prem or in the cloud. Organizational use of ServiceNow ranges from standard IT help desk ticketing systems to legal service management. These organizations may want to collect data from their ServiceNow instance for security auditing or operational awareness of their deployment. ServiceNow exposes a REST API that can be used to extract this data.
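To make the REST API concrete, here is a minimal sketch of how a query against ServiceNow's Table API could be constructed. The instance name, table, and filter values below are hypothetical placeholders, and the add-on builds these requests for you; this is only to illustrate what the add-on is doing under the hood.

```python
from urllib.parse import urlencode

# Hypothetical instance name -- substitute your own.
INSTANCE = "acme-dev"

def build_table_api_url(table, limit=1000, query=None):
    """Build a GET URL for ServiceNow's Table API.

    The Table API lives at /api/now/table/<table>. The sysparm_limit
    parameter caps the number of records returned per request, and
    sysparm_query applies an encoded filter.
    """
    params = {"sysparm_limit": limit}
    if query:
        params["sysparm_query"] = query
    return (f"https://{INSTANCE}.service-now.com/api/now/table/{table}"
            f"?{urlencode(params)}")

# Example: pull sysevent records created after a given date.
url = build_table_api_url("sysevent", limit=100,
                          query="sys_created_on>2016-10-01 00:00:00")
print(url)
```

In practice you would issue this request with basic authentication against your instance; the add-on handles authentication, pagination, and checkpointing for you.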


In our example scenario, an organization has replaced their legacy IT support ticketing system with ServiceNow. One of their requirements is to collect security-relevant data in order to audit and monitor their ServiceNow installation, which is hosted in the cloud. They currently have three environments: dev, test, and prod, and need to collect data from all three. Below we'll cover strategies on how to collect this data.

The Add-on

The add-on we'll use to collect the data from the ServiceNow instances is the "Splunk Add-on for ServiceNow." Note that this add-on is built and supported by Splunk and is compatible with Splunk v6.5 and CIM 4.4. The add-on can be used both to collect data and to access ServiceNow via workflow actions. The capability we care about in this scenario is data collection, which the add-on performs by querying the relevant ServiceNow REST API endpoints.


As we said above, there are three separate environments from which we need to collect data. To collect data from ServiceNow, you install the add-on on a full instance of Splunk, such as a heavy forwarder or search head, that is configured to send its data to the indexer(s). The challenge lies in collecting data from multiple environments: the add-on only allows one ServiceNow instance to be queried, so querying more than one instance requires installing and configuring multiple copies of the add-on. In our experience, we were not able to get more than one instance of the ServiceNow add-on installed and running independently on the same server. Because of this, a copy of the add-on has to be installed on separate Splunk instances, each of which is configured to pull from a different environment. Initially, this can be done locally on each instance before being deployed via the Splunk deployment server. This solves the main challenge we were facing.
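Concretely, the layout might look like the following. The hostnames are hypothetical, and the exact stanza and attribute names can vary by add-on version (the add-on's setup page writes the real values for you), so treat this purely as an illustration of "one forwarder per environment":

```ini
# On each heavy forwarder:
# $SPLUNK_HOME/etc/apps/Splunk_TA_snow/local/service_now.conf
# (stanza/attribute names illustrative; use the add-on's setup page)

# Forwarder 1 pulls from the dev environment:
[snow]
url = https://acme-dev.service-now.com

# Forwarder 2 carries the same file pointing at https://acme-test.service-now.com,
# and forwarder 3 points at https://acme-prod.service-now.com.
```

Each forwarder then sends its collected events on to the shared indexer tier, so all three environments end up searchable in one place.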

Time Range

Note: By default, the add-on will collect data starting from one year ago. Depending on when the ServiceNow instance was deployed, that could be a lot of data to collect into Splunk. Time range and data collection interval settings can be enabled per input or for all inputs. To enable the default set of inputs, modify the "[snow]" stanza in the "inputs.conf" file. If you wish to change the time range, modify the "since_when" attribute for the input, as shown below for the "sysevent" table.

    [snow://sysevent]
    duration = 120
    since_when = 10/1/2016 00:00:00

To estimate how long a historical data pull will take to catch up, or to determine an appropriate collection interval, note that the add-on has a hard-coded limit of 1,000 records per data pull. This should definitely be taken into consideration when pulling the data into Splunk, as you want to make sure that you keep up with the data volume for a particular input. Below is a calculation you can use to determine the correct interval range for your deployment:

Max # Records Retrieved per Collection = 1000
# Seconds per Day = 86400

Minimum # of Collections > (# Events per Day / Max # Records Retrieved per Collection )

Largest Interval Between Collections < ( # Seconds per Day / Minimum # of Collections )

For example, if you are collecting 200,000 events per day for a particular data input, then the minimum # of collections you'll need is 200 (200,000 / 1,000). The largest amount of time you can go between data collections is 432 seconds (86,400 / 200). As long as the interval for this input is less than 432, then you will be able to keep up with volume for this data source.
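The arithmetic above can be wrapped in a small helper. The function name is ours, not part of the add-on; it simply encodes the two formulas so you can plug in your own daily event volume:

```python
def largest_collection_interval(events_per_day, max_records_per_pull=1000):
    """Return the largest interval (in seconds) between collections
    that still keeps up with the daily event volume for one input."""
    seconds_per_day = 86400
    # Each pull retrieves at most max_records_per_pull records, so we
    # need at least this many collections per day:
    min_collections = events_per_day / max_records_per_pull
    # Spreading those collections evenly over the day gives the
    # largest allowable interval between them:
    return seconds_per_day / min_collections

# 200,000 events/day -> at most 432 seconds between collections.
print(largest_collection_interval(200_000))  # 432.0
```

Set the input's interval below this value (with some headroom for bursts) and the add-on will keep pace with the source.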

Hopefully, this post has helped you determine a strategy for collecting data from your ServiceNow instances. As always, Happy Splunking!
