Splunk Search

DBConnect Acceleration

MartinMcNutt
Communicator

I am currently going through an exercise where we are trying to leverage Splunk for reporting against our Remedy (helpdesk) ticketing system. People seem happy with the POC reports despite Splunk's poor PDF exporting.

The problem I am having is that the reporting SQL boxes are not very powerful, and they have had performance issues when running searches. I am trying to reduce the cost of running the same query over and over for different users. Since this is ticket data, I am not ready to ingest it into indexes.

  • I have tried outputting all the results to CSV; inputcsv works most of the time.
  • DB Connect acceleration doesn't seem to be an option.
  • I have not tried ingestion, because the data changes while tickets are open.

Are there any other suggestions? The goal is to do something like Microsoft SQL Server Reporting Services, where the data is cached for X amount of time and users only ever see cached results.

Thanks

hogan24
Path Finder

Create a saved search and schedule it to run approximately every 30 minutes (or whatever interval works for you).

This will cache the results of the query and make the results available for reference using the 'loadjob' command. If your saved search is called 'DBQuerySearch' you could do something like this:

| loadjob savedsearch="<searchOwner>:<appName>:DBQuerySearch"

Using this method, you can run the database query only once per your chosen interval, but serve up the results over and over to multiple users. Hope that helps. Thanks!

http://docs.splunk.com/Documentation/Splunk/6.2.4/SearchReference/Loadjob
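For reference, the scheduled search behind that loadjob could be defined in savedsearches.conf roughly like this (a sketch only: the stanza name, cron schedule, connection name, and SQL are placeholders, and the dbquery command assumes DB Connect v1; DB Connect v2 uses dbxquery instead):

[DBQuerySearch]
search = | dbquery "remedy" "SELECT * FROM HPD_Help_Desk"
enableSched = 1
cron_schedule = */30 * * * *
dispatch.ttl = 2100

Setting dispatch.ttl a bit longer than the 30-minute interval keeps each run's results on disk until the next run completes, so loadjob always has a finished artifact to read.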


hogan24
Path Finder

Ack, just realized how long ago this question was posted. Hopefully someone will still find this helpful. Cheers!


araitz
Splunk Employee

We don't have anything like that available in DB Connect today, but I'll try to capture that as an enhancement request for the future. Really though, I think that results caching and the like is something that should be handled at the database level, rather than the app level.
