
Why is the Splunk for Palo Alto Networks app consuming all my disk space, and how can I prevent this?

bcdatacomm
Explorer

The Splunk for Palo Alto Networks app seems to be consuming all my disk space. I have the index set to a max size of 168 GB, but there also seems to be a data model that consumes about 171 GB of space. Is there any way to prevent this from using all our space?

bcdatacomm
Explorer

The problem actually seems to be that the app creates another database called datamodel_summary, and the datamodel_summary is larger than the actual database.

/pan_logs# du -h -d 1
49G ./colddb
4.0K ./thaweddb
171G ./datamodel_summary
116G ./db
335G .

Why is datamodel_summary so large and how can I make it smaller?
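
To see which accelerated model is actually eating that space, the summaries normally sit under datamodel_summary in per-bucket <bucket>/<guid>/DM_<app>_<model> subdirectories, so something along these lines gives a rough breakdown (the install path below is illustrative; sizes from plain du -s are in KB):

cd /opt/splunk/var/lib/splunk/pan_logs/datamodel_summary
# biggest individual per-bucket summaries last
du -sh */*/DM_* | sort -h | tail -20
# rough total per accelerated data model, largest first
du -s */*/DM_* | awk -F'/DM_' '{kb[$2]+=$1} END {for (m in kb) print kb[m], "DM_" m}' | sort -rn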

bcdatacomm
Explorer

I've changed the acceleration period to only 7 days based on this: http://answers.splunk.com/answers/136089/how-to-manage-datamodel-acceleration-storage-tstatshomepath...

But can I delete the current datamodel_summary directory and then have Splunk rebuild it?
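
For reference, the range of accelerated summaries is controlled per data model in datamodels.conf, which is presumably what the linked answer changes behind the scenes. A minimal sketch, where the app directory and the stanza name are placeholders and must match the app's accelerated data model:

# $SPLUNK_HOME/etc/apps/SplunkforPaloAltoNetworks/local/datamodels.conf
# (app directory and stanza name below are placeholders)
[pan_firewall]
acceleration = true
# keep only the last 7 days of accelerated summaries
acceleration.earliest_time = -7d

As for rebuilding: rather than deleting datamodel_summary by hand, the cleaner route is usually to toggle acceleration off and back on, or use the Rebuild action under Settings > Data models, and let Splunk drop and regenerate the summaries itself.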


bkondakindi
Path Finder

If you set the index properties in indexes.conf for those indexes, Splunk will clean out data on the schedule you configure. Data is moved from hot/warm to cold buckets and eventually frozen based on your settings, so you never get disk warnings.

Time cheatsheet:

7776000 seconds = 90 days
3888000 seconds = 45 days
86400 seconds = 1 day

[default]
frozenTimePeriodInSecs = 7776000
# 25 GB
homePath.maxDataSizeMB = 25000
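
Applied to the index from this thread rather than to [default], a sketch using the numbers mentioned above (roughly 168 GB total, 90-day retention) might look like:

[pan_logs]
# cap hot + warm + cold buckets at about 168 GB (172032 MB)
maxTotalDataSizeMB = 172032
# freeze (delete, unless coldToFrozenDir is set) events older than 90 days
frozenTimePeriodInSecs = 7776000

Note that the accelerated summaries in datamodel_summary are sized separately from the raw buckets, which is presumably why the index cap alone didn't stop them from growing; the acceleration.earliest_time setting shown earlier is what reins those in.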
