I need to get a rough idea of disk space requirements before I start forwarding logs to a Splunk instance. Each indexed line will average 320 characters, and I'll be indexing around 500,000 lines a day.
My assumptions are 1 byte per character, ignoring the space Splunk needs for indexes and other overhead. That works out to 320 × 500,000 = 160,000,000 bytes, or about 160 MB per day.
Would you say that's semi-accurate or totally off the mark?
The general rule of thumb I've been taught is to take your raw data size and figure about 50% of that on disk, including the indexes: compression shrinks the raw data significantly, while the index files add some size back.
Of course, this is only a rule of thumb, so YMMV. The best way to find out is simply to index a day's or a week's worth of representative data and see how large the resulting files are on disk; the actual compression ratio and index size can vary significantly.
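If it helps, here's a quick back-of-the-envelope calculation in Python applying that 50% factor to your numbers. The 0.5 factor is just the rule of thumb above, not a measured value, so treat the output as a ballpark only:

    # Rough Splunk disk estimate. The 0.5 on-disk factor is the
    # rule-of-thumb assumption above, not a measured compression ratio.
    BYTES_PER_CHAR = 1
    CHARS_PER_LINE = 320
    LINES_PER_DAY = 500_000
    ON_DISK_FACTOR = 0.5  # compressed raw data plus index files

    raw_bytes_per_day = BYTES_PER_CHAR * CHARS_PER_LINE * LINES_PER_DAY
    disk_bytes_per_day = raw_bytes_per_day * ON_DISK_FACTOR

    print(f"raw:  {raw_bytes_per_day / 1e6:.0f} MB/day")   # ~160 MB/day
    print(f"disk: {disk_bytes_per_day / 1e6:.0f} MB/day")  # ~80 MB/day

So under those assumptions you'd budget roughly 80 MB per day on disk for your 160 MB of raw data.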
Thanks! I don't have ready access to a Splunk instance, but that ballpark estimate should do for now.