Splunk Search

How to edit my search to calculate values inside JsonArray

georg_koch
Engager

Hi *,

I'm having some trouble with Splunk stats functions :).
I have a JSON array event like the one below, and I need to sum all counts grouped by status.
When I run the search below, I get the result shown. What is wrong?

Result

sum(responces{}.count):
    2xx = 2160
    4xx = 405

Json Event

{ "responces": [
        { "count": 19, "status": "2xx" },
        { "count": 7,  "status": "2xx" },
        { "count": 1,  "status": "2xx" },
        { "count": 1,  "status": "2xx" },
        { "count": 4,  "status": "2xx" },
        { "count": 1,  "status": "2xx" },
        { "count": 2,  "status": "2xx" },
        { "count": 1,  "status": "2xx" },
        { "count": 1,  "status": "2xx" },
        { "count": 1,  "status": "2xx" },
        { "count": 40, "status": "2xx" },
        { "count": 19, "status": "2xx" },
        { "count": 1,  "status": "2xx" },
        { "count": 11, "status": "2xx" },
        { "count": 1,  "status": "2xx" },
        { "count": 10, "status": "2xx" },
        { "count": 1,  "status": "4xx" },
        { "count": 2,  "status": "4xx" },
        { "count": 12, "status": "4xx" }
] }

Search

search XYZ | spath input=json | stats sum(responces{}.count) by responces{}.status

I use Splunk enterprise
Splunk-Version 6.4.0
Splunk-Build f2c836328108


martin_mueller
SplunkTrust
SplunkTrust

I'm guessing you've got the count and status fields extracted as multi-value fields, i.e., one event has many count values and many status values.
Treated that way, stats loses the connection between a count and its status, so it computes a kind of cross product, yielding far higher values:

sum(count) = 135
number of 4xx status values: 3 --> 3*135 = 405
number of 2xx status values: 16 --> ...you get the picture
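Not SPL, but a quick Python sketch of the arithmetic described above, using the values from the event in the question. Once the pairing is lost, each status occurrence is combined with the sum of all counts:

```python
# Counts and statuses as two unrelated multi-value lists (pairing lost),
# taken from the JSON event in the question.
counts = [19, 7, 1, 1, 4, 1, 2, 1, 1, 1, 40, 19, 1, 11, 1, 10, 1, 2, 12]
statuses = ["2xx"] * 16 + ["4xx"] * 3

total = sum(counts)  # 135: the sum over ALL counts, regardless of status
for s in ("2xx", "4xx"):
    # Each occurrence of a status value pairs with the full sum -> cross product
    print(s, statuses.count(s) * total)  # 2xx -> 2160, 4xx -> 405
```

This reproduces exactly the inflated numbers from the question's result (2160 and 405).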

Essentially, you're telling Splunk "this event has count values, sum those up - and please do that for each status value in the event". There's no way for Splunk to guess that you actually have 19 events nested in there.
The key issue is how your JSON is structured - ideally, your source would produce each aggregation individually, so one event with one set of counts for each status.

Barring that, you can split up the event in search like this:

search XYZ | spath input=json responces{} output=responses | mvexpand responses | spath input=responses | stats sum(count) by status
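For intuition, here's a rough Python analog of what that fixed search does (illustrative only, with a condensed sample of the event's array): mvexpand turns each array element into its own row, so each count stays attached to its status before the per-status sum.

```python
from collections import defaultdict

# Condensed sample of the "responces" array from the question's event
responces = [
    {"count": 19, "status": "2xx"},
    {"count": 7,  "status": "2xx"},
    {"count": 1,  "status": "4xx"},
    {"count": 2,  "status": "4xx"},
    {"count": 12, "status": "4xx"},
]

totals = defaultdict(int)
for r in responces:               # like mvexpand: one row per array element
    totals[r["status"]] += r["count"]   # like stats sum(count) by status

print(dict(totals))               # {'2xx': 26, '4xx': 15}
```

Because each element is expanded into its own row first, the count/status pairing is preserved and the sums come out right.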
