I have a situation where I'd like to duplicate some or all events going to one index into another.
The only point at which I can touch the data is as it hits the indexers. I can't use another heavy forwarder to do the duplication in flight.
In reading the docs, I've come up with this, but I think I'm missing something fundamental.
At a basic level, below is roughly what I want:
props.conf
[mydupesourcetype]
TRANSFORMS-duplicate = original_index, duplicate_index
transforms.conf
[original_index]
FORMAT = indexa
REGEX = (.)
DEST_KEY = _MetaData:Index
[duplicate_index]
REGEX = mydupesourcetype
FORMAT = indexb
SOURCE_KEY = MetaData:Sourcetype
DEST_KEY = _MetaData:Index
http://docs.splunk.com/Documentation/Splunk/6.4.0/Forwarding/Routeandfilterdatad
The props and transforms above would never work as intended: index-time transforms don't clone events, they only rewrite metadata, so the duplicate_index stanza would simply rename the index on the original event.
You can use collect to copy existing data to another index, or you can use CLONE_SOURCETYPE to clone new data as it comes in (but it has to be cloned to a different sourcetype).
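As a rough sketch of the CLONE_SOURCETYPE approach (untested; the stanza names, the cloned sourcetype name mydupesourcetype_copy, and the index name are placeholders, so check them against the props.conf/transforms.conf specs for your version):

props.conf
# Clone events of the original sourcetype, then route the clones
[mydupesourcetype]
TRANSFORMS-clone = clone_events
[mydupesourcetype_copy]
TRANSFORMS-route = route_clone_to_indexb

transforms.conf
# The clone re-enters the pipeline under the new sourcetype;
# the original event continues on unchanged
[clone_events]
REGEX = .
CLONE_SOURCETYPE = mydupesourcetype_copy
# Only the clones match this stanza, so only they are sent to indexb
[route_clone_to_indexb]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = indexb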
CLONE_SOURCETYPE, hmm, haven't seen that one before. I'll have to investigate that one.
Hi Lucas K,
a quick thought here: how about using a collect search to duplicate some events into the new index, or, if you want to duplicate all events, copying the buckets?
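For example, a scheduled search along these lines could copy recent events over (the index names and time range are just placeholders for illustration; also note that by default collect writes events with sourcetype=stash unless you override it):

index=indexa sourcetype=mydupesourcetype earliest=-15m@m latest=@m | collect index=indexb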
cheers, MuS
Thanks! Unfortunately that doesn't get me the "like for like" proof I'm after, and it would also be a large resource hog: the search time exceeds the time frame, so it can't keep up with the incoming data.
It's a non-trivial volume of data, hence the entire requirement to do it on the way in.
I think I'm going to just pull the trigger on it and pray this fixes the issue.