Hi
I have indexed logs from more than nine months ago in the default directory: $SPLUNK_DB\dbcustom1\db
I would like to run a task every month that moves all logs older than three months to $SPLUNK_DB\dbcustom1\colddb
I would appreciate a clue, please.
Regards.
Jorge
Have a look at this article, which nicely explains the Splunk data bucket life cycle and the various factors involved in a bucket's movement to the next stage.
http://wiki.splunk.com/Deploy:BucketRotationAndRetention
My guess would be to adjust maxDataSize/maxHotSpanSecs to roll hot buckets to warm, and then tune maxWarmDBCount and maxHotBuckets so that hot and warm buckets together hold only 90 days' worth of data before rolling out to cold.
e.g.
maxHotSpanSecs = 86400 (daily roll of hot to warm)
maxHotBuckets = 3
maxWarmDBCount = 87 (3 hot + 87 warm = 90 buckets, i.e. 90 days' worth of data)
NOTE: Ensure that maxTotalDataSizeMB and frozenTimePeriodInSecs are set sufficiently high so that cold buckets do not get rolled over to frozen.
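Put together as an indexes.conf stanza, the suggestion above might look like this. This is only a sketch: the index name custom1 and the concrete size/retention values are assumptions, not taken from the original posts.

```ini
# Hypothetical indexes.conf stanza -- index name "custom1" is assumed.
# 3 hot buckets + 87 warm buckets, each spanning at most one day,
# keeps roughly 90 days of data in hot/warm before rolling to cold.
[custom1]
homePath   = $SPLUNK_DB/dbcustom1/db
coldPath   = $SPLUNK_DB/dbcustom1/colddb
thawedPath = $SPLUNK_DB/dbcustom1/thaweddb
maxHotSpanSecs = 86400
maxHotBuckets  = 3
maxWarmDBCount = 87
# Keep these high enough that cold buckets are not rolled to frozen
# (example values, adjust to your retention needs):
frozenTimePeriodInSecs = 188697600
maxTotalDataSizeMB     = 500000
```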
Hi somesoni2
Thanks for your answer.
I'm testing your example with the following parameters:
maxHotSpanSecs = 3600
maxHotBuckets=3
maxWarmDBCount = 21
After 15 hours I started getting results, and I have a comment.
What I mean is that once the total bucket count is reached, buckets are sent to colddb. But what if Splunk's service is restarted within a maxHotSpanSecs window? If I'm not wrong, a new hot bucket is created on restart, so this would affect the total bucket count.
thanks
Regards
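One way to sanity-check how buckets are actually rolling is to look at the bucket directory names on disk: warm and cold bucket directories are named db_<newestTime>_<oldestTime>_<id>, where the times are epoch seconds of the events inside. A rough sketch (the bucket name below is a made-up example, not one from this thread):

```shell
#!/bin/sh
# Sketch: estimate a bucket's time span from its directory name.
# Warm/cold bucket directories are named db_<newest>_<oldest>_<id>,
# where <newest>/<oldest> are epoch times of the newest/oldest events.
# This bucket name is a fabricated example for illustration.
bucket="db_1388534400_1388448000_42"

newest=$(echo "$bucket" | cut -d_ -f2)
oldest=$(echo "$bucket" | cut -d_ -f3)

echo "newest event epoch: $newest"
echo "oldest event epoch: $oldest"
echo "span in hours: $(( (newest - oldest) / 3600 ))"
# prints: span in hours: 24
```

Counting the db_* directories under $SPLUNK_DB\dbcustom1\db versus $SPLUNK_DB\dbcustom1\colddb over time would show whether restarts are inflating the bucket count as suspected.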