Integrating External Asset Databases with the Splunk App for Enterprise Security


Overview

In this post I'd like to cover an approach for integrating an external asset database with the Splunk App for Enterprise Security (ES).  This post is relevant whether you're just starting out with ES or have used it for a while and want to improve the integration of your asset information with the application. 

For those wondering what an assets list is in the context of ES, it's a list containing information (such as ip, hostname, and owner) about the IT assets in an organization (such as servers, workstations, and network devices).  The information in the assets list serves two main purposes:

1.   Correlate asset information from different log sources.  For example, if an ip for an asset is seen in one log and the hostname for it in another, ES is able to use the assets information to correlate the two separate events together, which can be extremely powerful.

2.  Provide additional information about the asset to the users of the app.  For example, if the location and owner of the asset are populated in the assets list, they can appear in the notable event detail on the "Incident Review" page.  This additional information can save an analyst the time of doing a separate lookup to find the information related to an IP, since it's immediately presented to them. 

As you can see, the assets list is an extremely important data input into ES.  Ideally, this information is as up to date as possible and is collected automatically.  There's a lot of flexibility in this process and room for customization.  Below is one approach to automating the integration of this list into ES. 

Automating Asset List Integration

At a high level, the approach for automating the population of the assets list is as follows:

  1. Extract the data from an assets database and index it in Splunk
  2. Run a scheduled search to read the indexed information and output it to a lookup file

The file is then available for ES to use in creating its merged assets list and can be used in the app.  Note that this approach assumes you have the DB Connect app set up on a Splunk instance in your environment that is configured to forward data to your indexers.  Now, let's cover this approach in detail.

1.   Extract the Data from the Asset DB

a.  Create an index on your Splunk indexers to store the assets information.  In this example, we'll call our index "function1_assets." 

Note that it's also possible to create a search that extracts the information from the assets db and outputs it to the lookup file directly.  In this solution, however, we want to persist the data retrieved from the asset database in a Splunk index and then query it later from within Splunk. 
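For reference, the direct approach (skipping the index) can be done with DB Connect's "dbxquery" search command.  The sketch below assumes a DB Connect connection named "asset_db" and a table named "assets" — both hypothetical names for illustration:

```
| dbxquery connection="asset_db" query="SELECT ip, mac, nt_host, dns, owner FROM assets"
| outputlookup function1_server_assets.csv
```

The trade-off is that you lose the indexed history of the asset data, which can be useful for troubleshooting and for seeing how assets change over time.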

b.  In the DB Connect app, create a database connection to query the asset db.

c. Create a database input that uses that connection. 

As you'll see, it's important to use a good naming convention in order to organize the different components of the approach and make it easier to manage the configuration of multiple asset sources.  Note that we're saving this input in the "SA-IdentityManagement" app, which is the app that hosts the asset lookup tables and other related configuration.  It's not necessary to store it in this app, but doing so helps with organization.

Continue setting up your DB Connect input as you normally would. 

The following is the list of expected fields in the Assets table:

ip,mac,nt_host,dns,owner,priority,lat,long,city,country,bunit,category,pci_domain,is_expected,should_timesync,should_update,requires_av
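To make the expected format concrete, here's a sample row using those fields.  All of the values below are hypothetical, purely for illustration:

```
ip,mac,nt_host,dns,owner,priority,lat,long,city,country,bunit,category,pci_domain,is_expected,should_timesync,should_update,requires_av
10.1.2.50,00:25:bc:42:f4:21,WEB01,web01.example.com,jsmith,high,37.78,-122.41,San Francisco,USA,sales,server,trust,1,1,1,1
```

Note that the boolean-style fields (is_expected, should_timesync, should_update, requires_av) are expressed as 1 or 0.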

In your query, it's a good idea to rename the fields from the assets database to the names above so they are indexed that way.  This will make the lookup-populating search simpler later.
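A SQL sketch of that rename, done with column aliases in the DB Connect input query.  The source column and table names here (ip_address, host_name, server_assets, and so on) are hypothetical — substitute the actual names from your asset database:

```sql
SELECT ip_address     AS ip,
       mac_address    AS mac,
       host_name      AS nt_host,
       fqdn           AS dns,
       asset_owner    AS owner,
       asset_priority AS priority
FROM server_assets
```

Fields aliased this way are indexed under the expected names, so the populating search only has to rename whatever couldn't be aliased at the source.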

In our example we set the search to run every hour. 

We used the following settings in the "Metadata" section:

1. index = "function1_assets"

2. sourcetype = "function1_server_assets"  We specify "server" in the sourcetype so we can distinguish these records from other asset integrations we'll be doing.  We'll be creating separate assets lists in ES later.

3.  source = function1_asset_db

Every hour, the input extracts the server list from our organization's asset database and indexes that data in Splunk.  In this solution, the entire contents of the assets db are collected and ingested into Splunk.  

2. Create the Lookup Populating Search

Now that we have our asset information indexed in Splunk, we'll next want to create a scheduled search that periodically extracts the information from the index and puts it into an assets list that ES can use. 

a.  Create the Splunk search query.  Below is a sample search query for pulling information from the data that is indexed:

index=function1_assets sourcetype=function1_server_assets 
| rename asset_owner AS owner 
| eval is_expected = if(priority=="low",0,1) 
| table ip,mac,nt_host,dns,owner,priority,lat,long,city,country,bunit,category,pci_domain,is_expected,should_timesync,should_update,requires_av 
| outputlookup function1_server_assets.csv

A few things to note about this query:

  • As an example, a "rename" is done that translates a database field name, in this case "asset_owner," to the name of the field expected in the Assets table, in this case "owner."  This can be done for any fields that were indexed with a name other than the "expected" name.
  • The "eval is_expected" statement is an example of building logic into the query that sets values for fields in the lookup table based on other field values.  In this case, we're setting the "is_expected" flag to true (1) for any asset whose priority is anything other than "low."
  • The search outputs the results into a file called "function1_server_assets.csv."  This filename must match the corresponding file name that appears in the assets list definition you configure in ES. 
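For that last point, ES references the file through a lookup definition.  A minimal sketch of what that definition looks like in transforms.conf is below — the stanza name "function1_server_assets_lookup" is a hypothetical choice; what matters is that the filename matches the one used in outputlookup:

```
# transforms.conf in SA-IdentityManagement (local)
[function1_server_assets_lookup]
filename = function1_server_assets.csv
```

You then reference this lookup when adding the asset source in the ES asset configuration, so the merge process picks it up.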

b.  Once the query has been tested and the results look correct, go to "Settings" > "Searches, reports, and alerts" and create a new search.

Make sure to create the search in the "SA-IdentityManagement" app so that the .csv file is created in the "lookups" directory within that app.  Also, when scheduling the search, have it execute about 10 minutes after the DB Connect input executes, in order to give the input enough time to complete and index the data.  So if the DB Connect input is set to execute at the top of every hour, the populating search can be executed 10 minutes after the hour. 
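That staggered schedule can also be expressed directly in savedsearches.conf.  The sketch below is one way to do it, assuming a search named "Populate - function1_server_assets" (a hypothetical name); the 70-minute dispatch window gives a small overlap so no records are missed between runs:

```
# savedsearches.conf in SA-IdentityManagement (local)
[Populate - function1_server_assets]
search = index=function1_assets sourcetype=function1_server_assets \
| rename asset_owner AS owner \
| eval is_expected = if(priority=="low",0,1) \
| table ip,mac,nt_host,dns,owner,priority,lat,long,city,country,bunit,category,pci_domain,is_expected,should_timesync,should_update,requires_av \
| outputlookup function1_server_assets.csv
enableSched = 1
cron_schedule = 10 * * * *
dispatch.earliest_time = -70m
dispatch.latest_time = now
```

With the DB Connect input on "0 * * * *" and this search on "10 * * * *", the lookup file refreshes hourly, shortly after each new extraction is indexed.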

You have now automated the extraction of information from your asset database into the Splunk App for Enterprise Security.

Thanks for reading and happy Splunking!
