Splunk Search

How to extract two multi-valued field values in pairs and operate on each pair separately?

nisha12345
New Member

For example: I want to plot a graph of mytime vs. perc from the sample data below. Hence I need mytime and perc in two separate fields, in 30 rows. (I need them in separate rows because I have to trim the last 3 zeroes from mytime and then convert it to a readable time.) Also, I need to do all of this in a Splunk query (I don't want to touch any props or transforms files).
Please help! Thank you!

"data":[{"mytime":1490572800000,"perc":0.9940119760479041},{"mytime":1490659200000,"perc":0.9965156794425087},{"mytime":1490745600000,"perc":0.6004728132387707},{"mytime":1490832000000,"perc":0.9732246798603027},{"mytime":1490918400000,"perc":0.8205128205128205},{"mytime":1491004800000,"perc":0.6938300349243306},{"mytime":1491091200000,"perc":0.7467760844079718},{"mytime":1491177600000,"perc":0.8126463700234192},{"mytime":1491264000000,"perc":0.9976470588235294},{"mytime":1491350400000,"perc":0.9988262910798122},{"mytime":1491436800000,"perc":0.9593175853018373},{"mytime":1491523200000,"perc":0.9434954007884363},{"mytime":1491609600000,"perc":0.6474442988204456},{"mytime":1491696000000,"perc":0.9529964747356052},{"mytime":1491782400000,"perc":0.9534883720930233},{"mytime":1491868800000,"perc":0.991869918699187},{"mytime":1491955200000,"perc":0.9953216374269006},{"mytime":1492041600000,"perc":0.9953488372093023},{"mytime":1492128000000,"perc":0.9988425925925926},{"mytime":1492214400000,"perc":0.6813953488372093},{"mytime":1492300800000,"perc":0.9929824561403509},{"mytime":1492387200000,"perc":0.9907407407407407},{"mytime":1492473600000,"perc":0.9311551925320887},{"mytime":1492560000000,"perc":0.9965034965034965},{"mytime":1492646400000,"perc":0.9883720930232558},{"mytime":1492732800000,"perc":0.9875156054931336},{"mytime":1492819200000,"perc":0.9906542056074766},{"mytime":1492905600000,"perc":0.9881093935790726},{"mytime":1492992000000,"perc":0.9964830011723329},{"mytime":1493078400000,"perc":0.9848308051341891}]}


niketn
Legend

The following is a run-anywhere search based on the sample data provided. The first two lines mimic the data using makeresults and eval and can be ignored:

| makeresults
| eval jsonData="{\"data\":[{\"mytime\":1490572800000,\"perc\":0.9940119760479041},{\"mytime\":1490659200000,\"perc\":0.9965156794425087},{\"mytime\":1490745600000,\"perc\":0.6004728132387707},{\"mytime\":1490832000000,\"perc\":0.9732246798603027},{\"mytime\":1490918400000,\"perc\":0.8205128205128205},{\"mytime\":1491004800000,\"perc\":0.6938300349243306},{\"mytime\":1491091200000,\"perc\":0.7467760844079718},{\"mytime\":1491177600000,\"perc\":0.8126463700234192},{\"mytime\":1491264000000,\"perc\":0.9976470588235294},{\"mytime\":1491350400000,\"perc\":0.9988262910798122},{\"mytime\":1491436800000,\"perc\":0.9593175853018373},{\"mytime\":1491523200000,\"perc\":0.9434954007884363},{\"mytime\":1491609600000,\"perc\":0.6474442988204456},{\"mytime\":1491696000000,\"perc\":0.9529964747356052},{\"mytime\":1491782400000,\"perc\":0.9534883720930233},{\"mytime\":1491868800000,\"perc\":0.991869918699187},{\"mytime\":1491955200000,\"perc\":0.9953216374269006},{\"mytime\":1492041600000,\"perc\":0.9953488372093023},{\"mytime\":1492128000000,\"perc\":0.9988425925925926},{\"mytime\":1492214400000,\"perc\":0.6813953488372093},{\"mytime\":1492300800000,\"perc\":0.9929824561403509},{\"mytime\":1492387200000,\"perc\":0.9907407407407407},{\"mytime\":1492473600000,\"perc\":0.9311551925320887},{\"mytime\":1492560000000,\"perc\":0.9965034965034965},{\"mytime\":1492646400000,\"perc\":0.9883720930232558},{\"mytime\":1492732800000,\"perc\":0.9875156054931336},{\"mytime\":1492819200000,\"perc\":0.9906542056074766},{\"mytime\":1492905600000,\"perc\":0.9881093935790726},{\"mytime\":1492992000000,\"perc\":0.9964830011723329},{\"mytime\":1493078400000,\"perc\":0.9848308051341891}]}"
| rex field=jsonData "\{\"mytime\":(?<Time>\d{10})000,\"perc\":(?<perc>[\d.]+)\}," max_match=0
| eval RawData=mvzip(Time,perc)
| table RawData
| mvexpand RawData
| eval arrRawData=split(RawData,",")
| eval Time=mvindex(arrRawData,0)
| eval perc=mvindex(arrRawData,1)
| fields - arrRawData
| fieldformat Time=strftime(tonumber(Time),"%m/%d/%Y %H:%M:%S")

1) The rex command applies a regular expression to extract the mytime and perc values. max_match=0 applies the regular expression repeatedly against the same event; if you know an upper bound on the number of pairs, set it to that maximum (e.g. 10) instead.
\d{10} extracts only the first 10 digits of the epoch time, dropping the trailing milliseconds.

2) mvzip pairs up the two multivalue fields, matching the 1st mytime with the 1st perc, the 2nd with the 2nd, and so on.

3) mvexpand breaks the multivalue field into one row per value.

4) split separates each pair on the comma (,) delimiter so that mvindex can pull out the Time and perc fields.

5) fieldformat displays the time in a human-readable format while retaining the underlying epoch value in the Time field.
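The mvzip → mvexpand → split pattern from steps 2) through 4) can be seen in isolation in this tiny run-anywhere sketch (the two sample Time/perc values are made up for illustration):

```
| makeresults
| eval Time=split("1490572800,1490659200", ","), perc=split("0.99,0.60", ",")
``` this builds two 2-value multivalue fields ```
| eval RawData=mvzip(Time, perc)
``` pairs values positionally: "1490572800,0.99" and "1490659200,0.60" ```
| mvexpand RawData
``` one row per pair ```
| eval arrRawData=split(RawData, ",")
| eval Time=mvindex(arrRawData, 0), perc=mvindex(arrRawData, 1)
| fields - arrRawData
| fieldformat Time=strftime(tonumber(Time), "%m/%d/%Y %H:%M:%S")
```

Each of the two resulting rows carries a single Time/perc pair, which is the shape needed for charting.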

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

nisha12345
New Member

Thank you for the response, niketnilay.
This works perfectly, but I need to perform this operation on the latest log (every day there will be a new log with different values of mytime and perc). Could you please suggest a solution for that?
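(One possible approach, not from the thread: since Splunk returns search results in reverse time order by default, head 1 keeps only the most recent event before the same extraction pipeline runs. The index and sourcetype names below are placeholders you would replace with your own.)

```
index=myindex sourcetype=mysourcetype
| head 1
| rex field=_raw "\{\"mytime\":(?<Time>\d{10})000,\"perc\":(?<perc>[\d.]+)\}," max_match=0
| eval RawData=mvzip(Time, perc)
| mvexpand RawData
| eval Time=mvindex(split(RawData, ","), 0), perc=mvindex(split(RawData, ","), 1)
| fieldformat Time=strftime(tonumber(Time), "%m/%d/%Y %H:%M:%S")
```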


adonio
Ultra Champion

Hello there,
I see the [ ] around the data. Is the sample shared in your question a single event, or is each line an event?
Should I take the [ ] into account, or are they there by accident? If I simplify the data, it looks like this:
"data":[{....},{....},{....}, ..... ,{.....}]}
Is that the right format?


nisha12345
New Member

Hello adonio, yes, the data is in the right format. The sourcetype contains each log in the format "data":[{....},{....},{....}, ..... ,{.....}]}. However, I have to work with only the latest log.
