
Splunk App for Web Analytics 1.61: How do the kvstore lookups and data model work over time?

cgullach
New Member

Hello,

The issue I am having is with generating the WA_sessions lookup over a long period of time. In my environment, 29 sites have been sending logs to Splunk for a couple of years now. I want to be able to query session information from at least year-to-date, but when I try to generate the WA_sessions lookup for any span greater than a month, I get the following errors in search.log:

ERROR KVStorageProvider - An error occurred during the last operation ('saveBatchData', domain: '2', code: '4'): Failed to read 4 bytes from socket within 300000 milliseconds.
ERROR KVStoreLookup - KV Store output failed with code -1 and message '[ "{ \"ErrorMessage\" : \"Failed to read 4 bytes from socket within 300000 milliseconds.\" }" ]'

There are 3,500,000+ events when querying year-to-date. I have tried defragmenting the Splunk server's drives and cleaning the KV stores, but I keep getting the same errors.
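In case it helps with troubleshooting: since the error happens during saveBatchData, one thing that might be worth experimenting with (an assumption on my part, not a confirmed fix) is lowering the KV store batch-save limits in limits.conf on the search head, so that each individual write finishes well within the 300000 ms socket timeout:

    # limits.conf -- smaller batch saves mean more, shorter KV store
    # write operations; the values below are illustrative only, check
    # limits.conf.spec for the defaults in your Splunk version
    [kvstore]
    max_documents_per_batch_save = 5000
    max_size_per_batch_save_mb = 25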

What I would like to know is what happens over time to the WA_pages and WA_sessions KV stores, as well as to the Web data model. For example, what happens to the WA_sessions KV store each time the lookup is generated manually, and do old sessions get removed as the automated job runs? If I could split up the generation of sessions into smaller spans and then combine them, that could help me work around the issue. Do you have any other suggestions for how I could resolve it?

Thank you in advance,
Chris


jbjerke_splunk
Splunk Employee

Hi Chris

Sorry for taking so long to reply.

There is a known issue with incorporating large quantities of historical data in the Web Analytics app. The app works by generating sessions with a scheduled search and storing them in the KV store. After that search has finished, the session information is added to a data model, and the KV store is purged so that it only contains the most recent data that has not yet been added to the data model. After this initial load, the data held in the KV store therefore stays very small.
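If you want to observe that behaviour, a simple check (the exact fields stored in WA_sessions are an assumption here, so substitute whatever your collection actually contains) is to compare the row count in the KV store with what has made it into the data model, running these as two separate searches:

    | inputlookup WA_sessions
    | stats count AS kv_rows

    | tstats count AS dm_events from datamodel=Web

After the scheduled search and purge have run, kv_rows should stay small while dm_events keeps growing. Note that the tstats search assumes the Web data model is accelerated, which the app normally takes care of.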

The problem lies in the initial backfill load, which can lead to the timeout issues you have encountered. You might succeed by running the backfill in smaller timespans (one month at a time?) and waiting for the data to be added to the data model before running the next month.
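As a rough sketch of what that could look like (the index name and the session-generation logic are placeholders here, since the real logic lives in the app's scheduled search), each backfill run would be bounded to one month and appended to the lookup:

    index=web earliest=-3mon@mon latest=-2mon@mon
    | <session generation logic from the app's scheduled search>
    | outputlookup WA_sessions append=true

Then wait for the data model acceleration to pick up that month before moving the earliest/latest window forward one month and running again.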

Let me know how you get along.

j
