I want to move some events from one indexer to another, for a particular period of time. I saw that there are importtool and exporttool commands in $SPLUNK_HOME/bin. How do I use them?

asked 26 May '11, 18:39 by mataharry

edited 01 Dec '11, 11:08 by Steve G. ♦
2 Answers:

How to selectively export/import data from one indexer to another.

Here is an example for the defaultdb index (the main index), with $SPLUNK_HOME = /opt/splunk and a time period from April 10th 00:00 to April 11th 00:00 GMT (equivalent to 1302393600 to 1302480000 in epoch time).

1 - roll the hot buckets to warm on the source indexer

cd /opt/splunk/bin
./splunk _internal call /data/indexes/defaultdb/roll-hot-buckets -auth admin:changeme
(specify the correct index name and your admin password)

2 - identify the buckets containing data for your time period.

The dates are in the bucket directory name as epoch time (UTC), in reverse chronological order: db_<newestevent>_<oldestevent>_<bucketid>. You can use an epoch-time converter to check.

example :
db_1305913172_1301920239_29 contains data for the period
from 1301920239 = Mon, 04 Apr 2011 12:30:39 GMT
to   1305913172 = Fri, 20 May 2011 17:39:32 GMT
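Instead of an online converter, GNU date can decode the timestamps directly. A minimal sketch, using the example bucket name above (the -d @epoch syntax assumes GNU coreutils, i.e. Linux):

```shell
# Decode the two epoch timestamps embedded in a bucket directory name.
# Assumes GNU date; on macOS/BSD use `date -u -r <epoch>` instead.
name=db_1305913172_1301920239_29          # example bucket from above
newest=$(echo "$name" | cut -d_ -f2)      # most recent event: 1305913172
oldest=$(echo "$name" | cut -d_ -f3)      # oldest event:      1301920239
date -u -d "@$oldest" '+oldest: %a, %d %b %Y %H:%M:%S GMT'
date -u -d "@$newest" '+newest: %a, %d %b %Y %H:%M:%S GMT'
```

This prints the "Mon, 04 Apr 2011 12:30:39 GMT" and "Fri, 20 May 2011 17:39:32 GMT" values shown in the example.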

3 - export the events for the index and the period you need

usage : exporttool db_directory exportfile [-et <earliest_time_utc>] [-lt <latest_time_utc>] [-csv] [export_search]
example :
cd /opt/splunk/bin
./splunk cmd exporttool /opt/splunk/var/lib/splunk/defaultdb/db/db_1305913172_1301920239_29/  /myexportpath/export1.csv  -et 1302393600 -lt 1302480000 -csv
If needed, you can also add a search as the last parameter. Check that an export file was created. Repeat for each bucket containing data for the period, changing the export file name each time. If you want to run the export over all the buckets, use a loop command.
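The "loop over all the buckets" idea can be sketched like this. The paths, time range, and output naming all come from the example above and are placeholders for your own deployment:

```shell
# Sketch: export every bucket that overlaps the target period.
ET=1302393600                                  # earliest time (epoch UTC)
LT=1302480000                                  # latest time (epoch UTC)
DB=/opt/splunk/var/lib/splunk/defaultdb/db     # source index db directory
OUT=/myexportpath                              # destination for CSV files

for bucket in "$DB"/db_*_*_*; do
    [ -d "$bucket" ] || continue               # nothing matched the glob
    name=$(basename "$bucket")
    newest=$(echo "$name" | cut -d_ -f2)       # most recent event in bucket
    oldest=$(echo "$name" | cut -d_ -f3)       # oldest event in bucket
    # skip buckets that end before ET or start after LT
    if [ "$newest" -lt "$ET" ] || [ "$oldest" -gt "$LT" ]; then
        continue
    fi
    /opt/splunk/bin/splunk cmd exporttool "$bucket" \
        "$OUT/export_$name.csv" -et "$ET" -lt "$LT" -csv
done
```

Naming each export file after its bucket keeps the files distinct and makes it easy to trace an export back to its source bucket.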

4 - import each file into the new indexer, in the proper destination index

usage : importtool <database_path> <csv_file>
example :
cd /opt/splunk/bin
./splunk cmd importtool /opt/splunk/var/lib/splunk/defaultdb/db /myexportpath/export1.csv
"Successfully imported 71615 events into the bucket.
Please ensure this bucket resides in a valid index and restart Splunk to recognize the new events."
Restart so that Splunk detects the new data and recalculates the metadata.
example :
./splunk restart
Perform recovery now? [y/n] y
    Recovering (across all data)...
    bucket=opt/splunk/var/lib/splunk/defaultdb/db/db_1306285067_1305920377_54 count mismatch tsidx=2525 source-metadata=2524, repairing...
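If step 3 produced several CSV files, a loop can import them all before the single restart. This mirrors the export sketch; the paths are the example values, and the glob assumes the export_<bucketname>.csv naming used there:

```shell
# Sketch: import every CSV produced in step 3 into the destination index.
DB=/opt/splunk/var/lib/splunk/defaultdb/db     # destination index db dir
for csv in /myexportpath/export_*.csv; do
    [ -f "$csv" ] || continue                  # nothing matched the glob
    /opt/splunk/bin/splunk cmd importtool "$DB" "$csv"
done
```

Run ./splunk restart once after the loop, as in the example above. (Note the correction in the next answer: on some versions importtool expects the full bucket directory rather than the index's db directory.)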


answered 26 May '11, 18:43 by yannK

edited 27 May '11, 11:42

Great Post!

A couple of corrections for the import step (at least with 4.2.5):

  • add the bucket dir in the import line, thus:

./splunk cmd importtool /opt/splunk/var/lib/splunk/defaultdb/db/db_1305913172_1301920239_29 /myexportpath/export1.csv
  • after restart, I wasn't prompted; perhaps there's a new fsck that happens automatically (you'll see the recovery occur in splunkd.log)


answered 13 Dec '11, 01:23 by bchen ♦


Copyright © 2005-2014 Splunk Inc. All rights reserved.