Hi,
I have been looking into how to export events from one index, modify the data (the original events contain wrong values), and then import the modified events into another index. By following the accepted answer at https://answers.splunk.com/answers/25174/how-to-export-import-events-from-indexes.html, I managed to export bucket data and make the modifications, but when I tried to import the data I got an error:
./splunk cmd importtool /opt/splunk/var/lib/splunk/test_index/db path/to/exported_events.csv
Output:
Using logging configuration at /opt/splunk/etc/log-cmdline.cfg.
ERROR IndexConfig - Asked to check if idx= is an index with a remote storage, but that index does not exist on the system or is disabled
Successfully imported 333 events into bucket.
Please ensure this bucket resides in a valid index and restart Splunk to recognize the new events.
Although it says that 333 events were imported, the index is still empty when I check it. Restarting Splunk didn't help either. I have tried the same thing a couple more times with newly created indexes, but each time I get the same error:
ERROR IndexConfig - Asked to check if idx= is an index with a remote storage, but that index does not exist on the system or is disabled
and end up with an empty index.
Is there something that I have missed here?
Oh, fixed it.
In my particular case it was just a permissions issue: the importtool command runs as the "splunk" user, and my .csv files were owned by root with 600 permissions. Make sure the splunk user has read access to the .csv files.
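A quick way to fix that from the shell (the CSV file name here is just an example, substitute your own exported file; if you'd rather not make it world-readable, chown it to the splunk user instead):

```shell
# Example file name -- substitute your exported CSV.
CSV=exported_events.csv
[ -f "$CSV" ] || touch "$CSV"   # placeholder so this sketch runs standalone

# A root-owned file with mode 600 is unreadable by the splunk user.
# Make it readable (alternatively: chown splunk:splunk "$CSV"):
chmod 644 "$CSV"

# Confirm the mode bits -- should show -rw-r--r--
ls -l "$CSV"
```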
So, a command like this should work:
/opt/splunk/bin/splunk cmd importtool /[splunk_hot_data_dir]/[indexname]/db/[new.bucket.name] exported.indexname.db.file.csv
Remember also that you need to specify a new bucket directory inside the "db" dir (you can create it manually, or Splunk will create it for you).
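Putting it all together, the sequence looks roughly like this. All the paths and names are examples: adjust SPLUNK_HOME, the index name, and the new bucket directory name for your system (the importtool invocation is guarded so the sketch is harmless on a box without Splunk installed):

```shell
# Example paths -- adjust for your environment.
SPLUNK_HOME="${SPLUNK_HOME:-/opt/splunk}"
INDEX_DB="$SPLUNK_HOME/var/lib/splunk/test_index/db"
NEW_BUCKET="$INDEX_DB/import_bucket_1"   # any fresh directory inside db/

# 1. Create the new bucket directory (Splunk can also create it for you).
mkdir -p "$NEW_BUCKET"

# 2. Run importtool against the new bucket, then restart Splunk so the
#    imported events are picked up. Guarded so this only runs where the
#    splunk binary actually exists.
if [ -x "$SPLUNK_HOME/bin/splunk" ]; then
    "$SPLUNK_HOME/bin/splunk" cmd importtool "$NEW_BUCKET" exported_events.csv
fi
```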