Getting Data In

Delete line with all hex 0, i.e. \x00

65pony
Explorer

We have a very strange file where the first line has hundreds of \x00 values.
For example, the following repeated about 50 times:

\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00

I want to get rid of this line, but I do not seem to be having any luck with my transforms.
Here are the regexes I have tried, with no luck... any suggestions?

REGEX = ^0x00+ , \x00 , ^\x00  
DEST_KEY = queue
FORMAT = nullQueue
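For reference, a nullQueue transform only takes effect if props.conf points at it. A minimal sketch of the pairing (the stanza name drop_nul_line and the sourcetype your_sourcetype are hypothetical placeholders):

```
# transforms.conf
[drop_nul_line]
REGEX = ^\x00+
DEST_KEY = queue
FORMAT = nullQueue

# props.conf
[your_sourcetype]
TRANSFORMS-nulls = drop_nul_line
```

If the TRANSFORMS- line is missing from props.conf, the transform never runs, regardless of how the regex is written.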

rturk
Builder

Hi 65pony,

There are quite a few posted questions on this topic, and a common suggestion is to confirm the character set (CHARSET) of the file being indexed. I had a similar issue with some logs, where a single 27MB archive would chew through an entire 50GB license due to the \x00 values... not ideal.

I got around this by having the following props.conf:

[nice_sqlTrace]
CHARSET         = UTF-16LE
SHOULD_LINEMERGE    = false

So see if you can confirm the CHARSET of the file to be indexed and set that accordingly.

Let me know how you get on 🙂
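One quick way to check the charset before setting it in props.conf is to peek at the file's first bytes. This is a sketch, not Splunk-specific; the assumption is that UTF-16 files usually start with a byte-order mark (BOM), and UTF-16-encoded ASCII shows up as each character followed by a \x00 byte:

```python
def guess_charset(path):
    """Inspect the first bytes of a file and guess its encoding."""
    with open(path, "rb") as f:
        head = f.read(64)
    if head.startswith(b"\xff\xfe"):
        return "UTF-16LE"
    if head.startswith(b"\xfe\xff"):
        return "UTF-16BE"
    if b"\x00" in head:
        # NUL bytes without a BOM often indicate UTF-16 without a BOM
        return "possibly UTF-16 without a BOM"
    return "likely UTF-8 or ASCII"
```

If this reports UTF-16LE, setting CHARSET = UTF-16LE as above should stop the \x00 noise at index time.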


lukejadamec
Super Champion

Log on to the system that generates this garbage, look at the file, and post the redacted contents of the problem event.
If the actual log contains this, then talk to the developers. Otherwise, it is like trying to hack the logs.


jonuwz
Influencer

Have you tried \\x00 ?
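For what it's worth, in PCRE-style regex engines \x00 does match a literal NUL character, so the pattern itself is sound once it reaches the regex engine. A quick sanity check in Python:

```python
import re

# A line made entirely of NUL characters, like the one in the file
line = "\x00" * 28

# \x00 in the pattern matches a literal NUL; fullmatch confirms
# the entire line consists of NULs
print(bool(re.fullmatch(r"\x00+", line)))  # True
```

If the pattern works here but not in the .conf file, the issue is likely how the config layer parses the backslash before the regex engine ever sees it.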
