Getting Data In

What will be the size of the indexed data if I send 50GB of raw data to Splunk?

splunker12er
Motivator

I send 50 GB of raw data daily from my machines to Splunk for indexing.
What will be the size of the data after it has been indexed?

Will it still be 50 GB, or will the indexed data take up less space?

MuS
Legend

Hi splunker12er,

It all depends on your raw data, but as a rule of thumb, compression to roughly 30-50% of the raw size is normal. You can check this for your own indexes with the following search:

 | dbinspect index=YOURINDEX
 | fields state,id,rawSize,sizeOnDiskMB 
 | stats sum(rawSize) AS rawTotal, sum(sizeOnDiskMB) AS diskTotalinMB
 | eval rawTotalinMB=(rawTotal / 1024 / 1024) | fields - rawTotal
 | eval compression=tostring(round(diskTotalinMB / rawTotalinMB * 100, 2)) + "%"
 | table rawTotalinMB, diskTotalinMB, compression
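
Once you know your actual compression ratio from the search above, you can do a quick back-of-the-envelope estimate with makeresults. This is just a sketch; the 50 GB/day from your question and the 50% compression figure below are placeholder values, so swap in your own numbers:

 | makeresults
 | eval raw_GB_per_day=50, compression_pct=50
 | eval indexed_GB_per_day=round(raw_GB_per_day * compression_pct / 100, 1)
 | eval indexed_GB_per_month=round(indexed_GB_per_day * 30, 1)
 | table raw_GB_per_day, compression_pct, indexed_GB_per_day, indexed_GB_per_month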

cheers,

MuS
