I have a search that works, but I've recently discovered that my events are recorded in two separate log files, sometimes as duplicates in each, sometimes as unique events in a single log.
The events have unique IDs in them, and I'd like to use those to get a distinct count and filter out the duplicates.
The original search was essentially this:
FieldChangedId | chart COUNT(eval(FieldName)) by Site, FieldName
going after an event that looks like this:
"Site":4303,
"DocumentId":99,
"FieldChangedId":161,
"FieldName":"LastLocation",
The search I have generates counts that include duplicate events because of the logging issue.
The FieldChangedId will be unique per unique event, so I'm thinking a dc() of some kind on that field is how I would lose my duplicate log events. I need to express the data by Site and by FieldName, but I'm stuck on how to get the distinct count in there AND also make the chart.
Have you tried dedup?
FieldChangedId | dedup FieldChangedId | chart COUNT(eval(FieldName)) by Site, FieldName
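Alternatively, the distinct count you mentioned can go directly into the chart command instead of deduplicating first. A sketch (not tested against your data):

FieldChangedId | chart dc(FieldChangedId) by Site, FieldName

Here dc() counts each unique FieldChangedId only once per Site/FieldName pair, so duplicate log entries for the same event don't inflate the counts.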
How can I refine this search string to grab events for the whole year, and add other Splunk commands to break them into common 'buckets' with counts for each type of error, without duplicate error types?
sourcetype=was_prod source="/srs/*Automation" "error"
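As a rough starting point until you get answers on a new question: an earliest=-1y time modifier restricts the search to the past year, and a stats count grouped on whatever field identifies the error type collapses duplicates into one bucket per type. The field name error_type below is hypothetical; substitute whatever field your events actually contain:

sourcetype=was_prod source="/srs/*Automation" "error" earliest=-1y | stats count by error_type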
@belamg This question is more than 3 years old with an accepted answer so you're unlikely to get many responses. Please post a new question.
Have you tried dedup?
FieldChangedId | dedup FieldChangedId | chart COUNT(eval(FieldName)) by Site, FieldName
That did it. When I tested this out, I also found that I'd typed the end of the FieldChangedId field as "ID", so... derp.
Thanks much.