Dashboards & Visualizations

Need to generate sparkline charts of daily index disk usage for each index in a table.

sajeeshpn
New Member

Hi,

I would like to generate a sparkline chart for each index in a table. The sparkline should show the index's daily disk usage, so that I can see how the disk usage of each index develops over a period of time.

Could someone please help me figure out the search string for this?

Thanks,
Sajeesh

1 Solution

PPape
Contributor

UPDATED ANSWER:

Hi Sajeesh,
you could try this:

index=_introspection sourcetype=splunk_disk_objects component=Indexes data.name=*
| eval data_birth_date = if(isnotnull('data.bucket_dirs.cold.event_min_time'), 'data.bucket_dirs.cold.event_min_time', 'data.bucket_dirs.home.event_min_time')
| eval data_age_days = round((_time - data_birth_date) / 86400, 0)
| eval data.total_capacity = if(isnotnull('data.total_capacity'), 'data.total_capacity', 500000)
| eval disk_usage = round('data.total_size', 2)
| eval disk_capacity = round('data.total_capacity' / 1024, 2) 
| stats max(disk_usage) as disk_usage sparkline(max(disk_usage), 1d) as trend by data.name

Does this fit your needs?

ORIGINAL ANSWER:
Hi Sajeesh,
you could try this:

index=_internal source=*license_usage.log type=Usage
| eval MB = b/1024/1024
| chart sum(MB) as MB sparkline(sum(MB), 1d) as trend by idx

Does this fit your needs?
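If you also want the actual per-day values rather than only the sparkline, one option is a daily timechart over the same _introspection data used in the updated answer (a minimal sketch; the 1d span and MB rounding are assumptions to adjust as needed):

index=_introspection sourcetype=splunk_disk_objects component=Indexes data.name=*
| eval disk_usage = round('data.total_size', 2)
| timechart span=1d max(disk_usage) by data.name

This gives one column per index with the daily maximum disk usage in MB, which can be charted directly instead of being rendered as a sparkline in a table.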


sajeeshpn
New Member

Thank you, this answer is similar to what I expected.


sajeeshpn
New Member

Will each index's disk usage be the same as the values ('b') derived from license_usage.log?

How about using "| dbinspect" for each index, getting the 'sizeOnDiskMB' value for each day, and plotting that?
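For reference, dbinspect reports the current state of each bucket rather than a daily history, so on its own it only yields a point-in-time size per index. A minimal sketch (assuming all indexes are of interest):

| dbinspect index=*
| stats sum(sizeOnDiskMB) as sizeOnDiskMB by index

Turning that into a daily trend would require running it on a schedule and saving the results (for example into a summary index), which is why the _introspection data in the reply below is the simpler route.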


PPape
Contributor

Hi Sajeesh,
you are right, the licensed volume is not the same as the volume on disk.

This should do it:

index=_introspection sourcetype=splunk_disk_objects component=Indexes data.name=*
| eval data_birth_date = if(isnotnull('data.bucket_dirs.cold.event_min_time'), 'data.bucket_dirs.cold.event_min_time', 'data.bucket_dirs.home.event_min_time')
| eval data_age_days = round((_time - data_birth_date) / 86400, 0)
| eval data.total_capacity = if(isnotnull('data.total_capacity'), 'data.total_capacity', 500000)
| eval disk_usage = round('data.total_size', 2)
| stats max(disk_usage) as disk_usage sparkline(max(disk_usage), 1d) as trend by data.name

The search above measures in MB. If you want your data in GB, change this line: "| eval disk_usage = round('data.total_size', 2)" to "| eval disk_usage = round('data.total_size' / 1024, 2)".
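For example, with the GB conversion applied and trimmed to just the size-related lines, the search would look like this (a minimal variant of the search above, not a different approach):

index=_introspection sourcetype=splunk_disk_objects component=Indexes data.name=*
| eval disk_usage = round('data.total_size' / 1024, 2)
| stats max(disk_usage) as disk_usage sparkline(max(disk_usage), 1d) as trend by data.name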
