Splunk Search

Renaming identical field names for different events

himynamesdave
Contributor

Splunk newbie here...

In my indexed data I have two separate events for latitude and longitude, both of which store the reading in a field named "value", e.g.:

{name:longitude timestamp:1362061286.802 value:-83.237328}
{name:latitude timestamp:1362061286.791 value:42.291935}

I'm trying to plot the lat / long locations on a map as a single event, matched on timestamp.

The most efficient way I know to do this in Splunk is to rename the "value" field in each lat / long event AS "x" (long) / "y" (lat) before plotting, e.g.:

sourcetype="json-openxc" longitude | rename value AS x
sourcetype="json-openxc" latitude | rename value AS y

I have two questions:

1) how do I format a search string to run these commands in a single search to output x / y at the same time?

2) is this the most efficient way to solve this problem?


lguinn2
Legend

This is a follow-on response to the comments on another answer: How to format the lat/long events into a single event... (It was too long for a comment!)

As @rsennett_splunk points out, you should probably be telling Splunk that your events span multiple lines, and how to determine the start/end of events. Splunk calls that "linebreaking" and it is done as your data is parsed and indexed.

Assume that your inputs.conf has something like this:

[monitor:///your/file/name/here]
sourcetype = json-openxc

In props.conf, you could put

[json-openxc]
KV_MODE=JSON
SHOULD_LINEMERGE=true
BREAK_ONLY_BEFORE=a_regular_expression_that_matches_something_on_the_first_line
MAX_EVENTS=1500
TRUNCATE=0

This might make analyzing and using your data much easier.

If you want more help with this, you should probably open another question. Also, you could take a look at

Configure event linebreaking

lguinn2
Legend

I would probably do it this way:

sourcetype="json-openxc"
| eval x = if(match(_raw,"longitude"), value, null())
| eval y = if(match(_raw,"latitude"), value, null())

This sets the x and y values using just one search, which will be much more efficient than using a subsearch.
However, you still have the issue of combining the two events to get a single event with both x and y; I don't know your data well enough to do that for certain. You will definitely need a timestamp, though, so here is how to get one if you haven't configured Splunk to extract it for you. I have also bucketed the time to whole seconds, to eliminate the millisecond differences between events:

| eval _time = timestamp
| bucket _time span=1s

Here is one method for combining the events and displaying them:

sourcetype="json-openxc"
| eval x = if(match(_raw,"longitude"), value, null())
| eval y = if(match(_raw,"latitude"), value, null())
| eval _time = timestamp
| bucket _time span=1s
| stats list(x) as x list(y) as y by _time
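
Since the end goal is a map, once each row has both x and y you could hand them to geostats, Splunk's built-in command for aggregating events into map-ready bins. A hedged sketch building on the same eval/bucket approach (latfield and longfield are standard geostats parameters; first() is used here instead of list() so each row carries a single value per field):

sourcetype="json-openxc"
| eval x = if(match(_raw,"longitude"), value, null())
| eval y = if(match(_raw,"latitude"), value, null())
| eval _time = timestamp
| bucket _time span=1s
| stats first(x) as x first(y) as y by _time
| geostats latfield=y longfield=x count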

rsennett_splunk
Splunk Employee

dave: see lisa's answer below. If you give us more data we can be more specific... but you pretty much want to look at the docs for props.conf to tell Splunk where to break before, where to break after, and where to find the timestamp... meaning what comes before it, like the word "longitude" (use a regex that specifies the word so it picks only THAT line and ignores the next one).

With Splunk... the answer is always "YES!". It just might require more regex than you're prepared for!

himynamesdave
Contributor

This is interesting.

Just to clarify: what you are suggesting is to format the lat / long events as a single event, based on their matching timestamps, before indexing?

Would I do this using custom fields?

http://docs.splunk.com/Documentation/Splunk/6.0/Data/Configureindex-timefieldextraction

Sorry if this is obvious, I'm just finding my feet with Splunk 🙂


rsennett_splunk
Splunk Employee

Since Dave is new to Splunk, I'm sort of wondering if this is really a multiline event to begin with... The two individual lines aren't properly formed JSON (missing commas and quotation marks), which is why the fields aren't coming out with their full relationship intact. If the data were re-indexed using the first timestamp - starting with longitude, ending with latitude - with the fields simply extracted, you'd be done with it, and could then search at will based on the timestamp.
Although we don't know if there is more to the data, or if re-indexing is even an option... just something that occurred to me...
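
For illustration only: here is what the two lines would look like as valid JSON, plus a hypothetical merged event (the "position" name is invented here) that would carry both readings at once:

{"name": "longitude", "timestamp": 1362061286.802, "value": -83.237328}
{"name": "latitude", "timestamp": 1362061286.791, "value": 42.291935}
{"name": "position", "timestamp": 1362061286.802, "longitude": -83.237328, "latitude": 42.291935}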

With Splunk... the answer is always "YES!". It just might require more regex than you're prepared for!

somesoni2
SplunkTrust
SplunkTrust

Try the following.
1. The timestamps for these two events differ only in the millisecond part, so for each second you get one such pair. You can try this:

sourcetype="json-openxc" name=longitude
| eval timestamp=strftime(timestamp,"%y-%m-%d %H:%M:%S")
| stats first(value) as x by timestamp
| appendcols [search sourcetype="json-openxc" name=latitude
    | eval timestamp=strftime(timestamp,"%y-%m-%d %H:%M:%S")
    | stats first(value) as y by timestamp]

2. You can replace appendcols with join:

sourcetype="json-openxc" name=longitude
| eval timestamp=strftime(timestamp,"%y-%m-%d %H:%M:%S")
| rename value as x
| join timestamp [search sourcetype="json-openxc" name=latitude
    | eval timestamp=strftime(timestamp,"%y-%m-%d %H:%M:%S")
    | rename value as y]

3. Or, if you don't want to join/appendcols, you can do it using stats as well:

sourcetype="json-openxc" name=longitude OR name=latitude
| eval timestamp=strftime(timestamp,"%y-%m-%d %H:%M:%S")
| stats values(value) as value by timestamp
| eval x=mvindex(value,0)
| eval y=mvindex(value,1)
| fields timestamp, x, y
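
One caveat with the mvindex approach: values() returns a lexicographically sorted multivalue, so which slot holds latitude vs. longitude depends on how the numbers happen to sort as strings. A variant that keys on the name field instead, using eval expressions inside stats (a sketch, untested against this data):

sourcetype="json-openxc" name=longitude OR name=latitude
| eval timestamp=strftime(timestamp,"%y-%m-%d %H:%M:%S")
| stats first(eval(if(name="longitude", value, null()))) as x first(eval(if(name="latitude", value, null()))) as y by timestamp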