crcSalt is not working

thisissplunk
Builder

I ingested SQL ERRORLOGs and SQLAGENT logs with my forwarder but didn't have props.conf set up correctly, so the events showed up as binary (hex).

I now have props.conf set up correctly and want to reingest. To do that, I set a new crcSalt string in inputs.conf. However, Splunk is still not reingesting the files.

What gives?

props:

[sql_error]
NO_BINARY_CHECK = true
CHARSET = UTF-16LE

inputs:

[batch:///data/sql/.../ERRORLOG*]
move_policy = sinkhole
index = sql
disabled = false
crcSalt = please_reingest
sourcetype = sql_error


gjanders
SplunkTrust
SplunkTrust

Assuming you restarted the forwarder after changing crcSalt, that should have triggered re-ingestion. However, it might be easier to just force the forwarder to re-index the one file.

The btprobe command can be used to remove an individual entry from the fishbucket, as per this answer. The line below is quoted from the linked answer:

splunk cmd btprobe -d $SPLUNK_HOME/var/lib/splunk/fishbucket/splunk_private_db --file $FILE --reset
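
For example, with the placeholders filled in (the file path here is illustrative, since the path in the question is truncated):

splunk cmd btprobe -d $SPLUNK_HOME/var/lib/splunk/fishbucket/splunk_private_db --file /data/sql/ERRORLOG --reset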

Also, if you can put the file on an indexer or heavy forwarder, you could oneshot the file.
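
A minimal sketch of the oneshot, reusing the index and sourcetype from the question (the exact file path is an assumption):

splunk add oneshot /data/sql/ERRORLOG -sourcetype sql_error -index sql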

jkat54
SplunkTrust
SplunkTrust

Two questions:

  1. If your forwarder is "controlled" via a deployment server... does the serverclass require a splunkd restart? If not, restart Splunk on the forwarder (see the serverclass.conf sketch after this list).

  2. Has move_policy always been set to "sinkhole"? If so, then the old data has probably been erased at the source and there's nothing new to ingest.
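
For point 1, a minimal serverclass.conf sketch with the restart flag enabled (the server class and app names here are hypothetical):

[serverClass:sql_forwarders:app:sql_inputs]
restartSplunkd = true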

jkat54
SplunkTrust
SplunkTrust

@thisissplunk any updates on this?


thisissplunk
Builder

No updates. Due to time constraints, I just pointed the files at a new index. Next time this happens, I'll try the suggestions people have posted here since then.


jkat54
SplunkTrust
SplunkTrust

Ok thanks for the reply. Can you close this question or answer it with your response above and mark it as the answer?

Thanks in advance!


mwdbhyat
Builder

You are doing a batch upload, which destructively uploads the file once (deletes it from disk after upload). So that same file shouldn't still be there, in which case you wouldn't need crcSalt, as there would be nothing to reingest?

Are you using batch because the files are too big for a file monitor?
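
If the files do stick around and a non-destructive input is acceptable, a monitor stanza might look like this sketch (the path is illustrative, since the original one is truncated; crcSalt = <SOURCE> mixes each file's full path into the CRC, so files with identical headers are tracked separately):

[monitor:///data/sql/ERRORLOG*]
index = sql
sourcetype = sql_error
crcSalt = <SOURCE>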

Another option would be to clean the fishbucket, or you could just oneshot the data in again. Here is a link to a few other things you could try:

https://answers.splunk.com/answers/72562/how-to-reindex-data-from-a-forwarder.html
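
One heavy-handed approach often quoted alongside that link (verify against it) is to reset the entire fishbucket with Splunk stopped, which forces everything the instance monitors to be re-indexed:

splunk clean eventdata -index _thefishbucket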


thisissplunk
Builder

I checked and the files are there, interestingly enough.

I'm using batch jobs because these are one-shot upload dumps. Nothing else will ever be placed into this index. It's specific data for a specific analysis task.

I can't delete the fishbucket because it would start reingesting everything else in this small Splunk instance.

I was trying to figure out how to clean a specific file in the fishbucket, but I couldn't find instructions.

