Getting Data In

Hardware Requirements for Evaluation

carbin
New Member

We need to request a server from Network Operations in order to perform an evaluation, and the request must include hard disk requirements. The other requirements are easy to find, but as far as hard disk space goes I found no useful information. We have approximately 100 Windows servers in varying roles (app servers, web servers, etc.) and a handful of firewalls.

I need to put my hardware request in today and can find no easy way to determine this. I just need a starting number for HDD sizing. We understand that after running for a period we can determine sizing for a production installation, but what is the minimum HDD size for our trial?


DMohn
Motivator

A good starting point for sizing the hardware is here: http://splunk-sizing.appspot.com/

A Windows server can send something like 500MB of data a day, depending on what you want to collect (Windows Event Logs, performance data, application logs). Then you need to decide what retention time you need, whether you want to use tsidx reduction, whether you have more than one data volume (e.g. SSD for hot/warm storage, HDD for cold), etc.

A rough estimate for your case (100 servers) with a minimum retention time of 30 days would result in a space requirement of about 750 GB in total for the index partition. On top of that, you should allow at least 150 GB of space for the Splunk application itself, including possible data models, etc.
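As a back-of-envelope check (assuming the commonly cited ~50% on-disk compression for indexed data, which is a rule of thumb rather than a guarantee):

    100 servers x 500 MB/day         = ~50 GB of raw data per day
    50 GB/day x 30 days retention    = ~1.5 TB of raw data
    1.5 TB x ~0.5 compression ratio  = ~750 GB on disk for the indexes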


nickhills
Ultra Champion

It very much depends on how much data you intend to index and how long you want to store it for.

If you are evaluating, I would suggest you set your indexes to a relatively small size - perhaps 25GB.
This will let you add all of your servers and the logs you want to collect, and evaluate what sort of retention that gives you.

You may find that 25GB provides you 6 months, or 6 days, or 6 hours - it all depends on what you choose to collect.

I would create an os index (separate ones for Windows/Linux if you have both), a web server index, and an app server index, and set them all to 25GB each (see the indexes.conf sketch below).
Leave another 50GB spare on the volume for _internal data, and then add 50% headroom.
A dedicated 200GB - 300GB data volume, in addition to your OS volume, is therefore probably a good starting point.
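As a rough sketch, that could look something like the following in indexes.conf. The index names and paths here are only examples; maxTotalDataSizeMB is specified in MB, so 25GB is roughly 25600:

    # cap the os index at ~25 GB total (maxTotalDataSizeMB is in MB)
    [os]
    homePath   = $SPLUNK_DB/os/db
    coldPath   = $SPLUNK_DB/os/colddb
    thawedPath = $SPLUNK_DB/os/thaweddb
    maxTotalDataSizeMB = 25600

Repeat the same stanza for the web and app indexes, adjusting the names and paths accordingly.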

Assuming you're installing on Linux, a good idea (to start with) is to have the data volume mounted as /opt/splunk, so that when you perform the install it's on its own volume.
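For example, assuming a spare block device at /dev/sdb (just a placeholder name) and an XFS filesystem, something like:

    # create a filesystem on the dedicated data volume
    mkfs.xfs /dev/sdb
    # mount it where Splunk will be installed
    mkdir -p /opt/splunk
    mount /dev/sdb /opt/splunk
    # make the mount persistent across reboots
    echo '/dev/sdb  /opt/splunk  xfs  defaults  0 0' >> /etc/fstab

Then install Splunk into /opt/splunk as usual and all of its data lives on the dedicated volume.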

If my comment helps, please give it a thumbs up!