Questions in topic: "calculations"
https://answers.splunk.com/answers/topics/single/161915.html
The latest questions for the topic "calculations"

Dew Point Calculation
https://answers.splunk.com/answers/774479/dew-point-calculation.html
I am trying to produce or calculate the Dew Point in Celsius of data in two separate indexes.
I believe the official Dew Point formula is Td = T - ((100 - RH) / 5).
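A sketch of how this formula could be applied across two indexes (the index names, the `location` join field, and the `T`/`RH` field names are assumptions; a stats-based merge is usually preferred over `join`):

```
index=temperature_idx OR index=humidity_idx
| stats latest(T) as T latest(RH) as RH by location
| eval Td = T - ((100 - RH) / 5)
| table location T RH Td
```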
I basically want to use this formula to produce the Dew Point, combining relative humidity from one index with temperature from another index.
Tags: splunk-enterprise, calculations
Posted: Tue, 01 Oct 2019 12:22:13 GMT by adrianrepublic

Dynamic Threshold Calculation in splunk alert
https://answers.splunk.com/answers/746170/dynamic-threshold-calculation-in-splunk-alert.html
I have market data feed indexing into splunk.
The logs look like following -
Security: "HDFC", FIELDS: {"PRICE", "ASK", "HIGH"}, receivedTime: <time-string>
Security "YESBANK", FIELDS= {"PRICE", "HIGH"}, receivedTime: <time-string>
Security: "HDFC", FIELDS: {"ASK", "HIGH"}, receivedTime: <time-string>
Security: "HDFC", FIELDS: {"PRICE"}, receivedTime: <time-string>
**Security:** a single-value field
**FIELDS:** a *multi-value field*
**receivedTime:** string; can be different from _time
- There are close to 5000 Securities logging daily.
- It's about 10 GB of license usage per day, so it's a large number of events.
**We want to calculate the SECURITY:FIELD pairs that are logging less frequently than their usual input frequency.**
So, for a SECURITY:FIELD pair:
diff_time = receivedTime (previous) - receivedTime (current)
This diff_time varies from one SECURITY:FIELD pair to another. Some log every second; others log only once a day.
The challenge is to come up with an alert (or alerts) that dynamically calculates the optimum frequency (diff_time) for each SECURITY:FIELD pair and then compares it with its current value.
Now let's say we assume that the optimum frequency is the average of the last 7 input intervals of the same SECURITY:FIELD pair.
In order to calculate this value I would have to run the query over the last 7 days (because some pairs log only once a day), and with that amount of data plus the mvexpand command, this is not viable.
**How do you suggest I achieve this goal? Please suggest an algorithm for it.**
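One possible algorithm, sketched with illustrative index and field names: a `by` clause in stats splits a multivalue field into one row per value, so the per-pair baseline can be computed without mvexpand. A scheduled search could write this baseline to a summary index so the alert itself only scans a short window:

```
index=market_feed
| stats count as events range(_time) as span max(_time) as last_seen by Security, FIELDS
| eval avg_gap = span / max(events - 1, 1)
| eval silence = now() - last_seen
| where silence > 3 * avg_gap
| table Security FIELDS avg_gap silence
```

The `3 *` tolerance factor is an assumption; tune it per your data.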
- I can't use a lookup table because of its size: a large burst of input data could bring down the whole Splunk deployment if the lookup table grows wildly.
Tags: alerts, search-help, outlier, calculation, calculations
Posted: Thu, 16 May 2019 17:27:05 GMT by iparitosh

Performing calculations on multi-valued fields
https://answers.splunk.com/answers/735262/performing-calculations-on-multi-valued-fields.html
Hello, I am trying to perform calculations on multiple fields.
I am working with data in the format of Key='value1,value2,value3,value4' which can contain anywhere from 1 to 4 values.
**Input:**
height='32,12,14,13' or width='32'
as well as variance='3.24e-2,4.23e+3,1.12e-4,1.01e-3'
**Query:**
```
index=myindex sourcetype=mysourcetype
| transaction source
| foreach *
[eval <>=if(match('<>', "[\d.,e+-]+"), '<>', '')
| eval total=0 | eval count=1
| eval <>=replace(<>,"\'","")
| makemv delim="," '<>'
| mvexpand '<>'
| foreach <>
[eval total_<<FIELD>>=total + <<FIELD>>
| eval count=count + 1]
| eval avg_<>=<>/count]
| table *, total, count
| transpose include_empty=false 10
```
**Expected Output:**
height=17.75
width=32
variance=1.0575e+3
**Actual Output:**
height=032,12,14,13, width=32
variance=03.24e-2,4.23e+3,1.12e-4,1.01e-3
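For reference, one per-field approach that avoids transaction/mvexpand entirely, splitting each field and letting stats average every value of the resulting multivalue field. This is a sketch; it assumes stats aggregation over a multivalue field considers each value separately and that Splunk parses the exponent notation as numeric:

```
index=myindex sourcetype=mysourcetype
| eval height   = split(replace(height, "'", ""), ",")
| eval width    = split(replace(width, "'", ""), ",")
| eval variance = split(replace(variance, "'", ""), ",")
| stats avg(height) as height avg(width) as width avg(variance) as variance
```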
Thank you in advance for any help provided!
Tags: splunk-enterprise, multi-value, multivalued, calculations, multi-valued
Posted: Fri, 22 Mar 2019 13:44:48 GMT by ztayluh

Calculate percentage b/n 2 counted numbers
https://answers.splunk.com/answers/725956/calculate-percentage-bn-2-counted-numbers.html
Hello everyone,
I'm trying to calculate the % of overdue items and print the result for every month. It looks like I'm completely stuck with the query so any help will be greatly appreciated.
Here's what I'm trying to achieve:
1. Select entries with SEVERITY 4 and 5 for the last month
2. Calculate the difference in days between the two dates to get the overdue days
3. Count only items that are 30+ days overdue
4. Count the total number of items
5. Calculate the percentage
6. Print it in a table and get the % for every month for comparison
7. Possibly visualize the results in an area chart
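The numbered steps above could be sketched in a single pass (field names are taken from the attempt below; the month grouping and 30-day threshold follow the steps, but this is an unverified sketch):

```
index=something (SEVERITY=4 OR SEVERITY=5)
| eval duration = round((LAST_FOUND_DATETIME - FIRST_FOUND_DATETIME) / 86400)
| eval overdue = if(duration > 30, 1, 0)
| timechart span=1mon sum(overdue) as Overdue count as Total
| eval percent = round(Overdue / Total * 100, 2)
```

The timechart output can be rendered directly as an area chart.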
...........
index=something (SEVERITY=4 OR SEVERITY=5) earliest=-4w@w latest=now
| eval start=FIRST_FOUND_DATETIME| eval end=LAST_FOUND_DATETIME| eval duration = round((end-start)/86400)
| stats count, values(round) AS Overdue
| where round>30
| stats count as Total
| eval percent_difference=((Overdue/Total)*100)
| table percent_difference
Tags: percentage, calculations, calculating
Posted: Thu, 14 Feb 2019 12:18:22 GMT by swimena

Is the following calculation possible?
https://answers.splunk.com/answers/719914/is-the-following-calculation-possible.html
I'm currently generating an AvgTime of processing cycles in a thread within a 5 min duration and writing these out to a log similar to this
[PrepareEvents, DispatchAll]
PrepareEvents samples Avg: 2757ns; Median: 1411ns; Max: 1533433ns; Total Events: 277138; Total Items: 314155
I want to perform the following calculation so I can find out how many nanoseconds, on average, I've spent processing cycles in the 5-minute duration:
avgTime * Total Items * 100 / (5 min in nanos)
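A sketch assuming the log line format shown above (the `rex` pattern and source name are illustrative). Five minutes is 5 * 60 * 1,000,000,000 ns:

```
source=my_cycle_log "samples Avg"
| rex "Avg: (?<avg_ns>\d+)ns.*Total Items: (?<total_items>\d+)"
| eval pct_of_window = (avg_ns * total_items * 100) / (5 * 60 * 1000000000)
| table _time avg_ns total_items pct_of_window
```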
Can I do this in Splunk?
Tags: splunk-enterprise, table, duration, math, calculations
Posted: Thu, 31 Jan 2019 10:38:15 GMT by luckyman80

how to combine/merge multiple generic fields/columns in one field/column with average calculation per generic field/column values in Splunk?
https://answers.splunk.com/answers/705941/how-to-combinemerge-multiple-generic-fieldscolumns.html
hi,
I have following situation in splunk (see picture below).
![Actual Situation][1]
I need following pattern in Splunk (see picture below).
![Target Solution][2]
I have different generic columns where the last part of the column name (the suffix) is dynamic and unknown. I need to combine/merge these generic columns into one target column. Within the target column I want to calculate the average per generic field. I think the pictures explain the situation very well.
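Without the exact column names, one common pattern for this shape of problem is untable + rex + stats: flatten the generic columns into rows, strip the dynamic suffix, then average. The `host` row key, the `value_` prefix and the `_` suffix separator are assumptions for illustration:

```
index=myindex sourcetype=mysourcetype
| table host value_*
| untable host column value
| rex field=column "_(?<suffix>[^_]+)$"
| stats avg(value) as avg_value by host
```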
[1]: /storage/temp/259621-ist.png
[2]: /storage/temp/259622-goal.png
Tags: splunk-enterprise, fields, columns, calculations
Posted: Wed, 12 Dec 2018 08:51:52 GMT by AlexHoller

How do you calculate time difference between multiple events that aren't in chronological order?
https://answers.splunk.com/answers/693820/how-do-you-calculate-time-difference-between-multi.html
I have 6 events. Each one has a timestamp, and I have extracted the time of each into a new field using eval. But now I am not able to create the time difference between event6-event1 or event4-event3 as I need.
I do not want to use the transaction command, as I would need to write multiple searches, and I am trying to solve this in search.
I am at a point where my last search line is
| table Fourm_step_1_Time Fourm_step_2_Time Fourm_step_3_Time Fourm_step_4_Time Fourm_step_5_Time Fourm_step_6_Time
results are
0 0
0 0
0 0
0 0
0 123435453
1234545433 0
so on
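Since the step times land on different rows, one search-only option is to collapse them onto a single row per form with stats, then eval the differences. The `form_id` grouping field is an assumption; `max()` simply picks the non-zero value of each step field:

```
base search
| stats max(Fourm_step_1_Time) as t1 max(Fourm_step_3_Time) as t3
        max(Fourm_step_4_Time) as t4 max(Fourm_step_6_Time) as t6 by form_id
| eval diff_6_1 = t6 - t1
| eval diff_4_3 = t4 - t3
```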
@somesoni2
Tags: splunk-enterprise, eval, time-difference, calculations, multiple-events
Posted: Thu, 18 Oct 2018 18:24:08 GMT by puneetkharbanda

How do I calculate median values for the column for 7 weeks?
https://answers.splunk.com/answers/683384/how-do-i-calculate-median-values-for-the-column-fo.html
Dear all,
There are two columns with data: `time` (time scale in steps of 10 minutes) and `val` (amount of transactions).
I need to calculate median values (med_val) for the `val` column over 7 weeks. For example, for the point 12.04.2018 15:00:00, med_val = `median`(`val`) over 7 points: 5.04.2018 15:00:00, 29.03.2018 15:00:00, 22.03.2018 15:00:00, 15.03.2018 15:00:00, 8.03.2018 15:00:00, 1.03.2018 15:00:00, 22.02.2018 15:00:00. That is, the median over the same time point on the same day of the week for each of the 7 preceding weeks. If there is no data for a point, we consider that 0 transactions were performed.
The best that I could come up with is:
| timechart span=10m median(val) | timewrap 1w series=exact
Are there any good solutions?
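Building on the timewrap idea above, one sketch: wrap 7 weeks into parallel series, fill the gaps with 0, then unpivot and take the median per 10-minute point. The exact series field naming depends on the timewrap version, so treat the untable step as an assumption to verify:

```
index=transactions earliest=-7w@w
| timechart span=10m sum(val) as val
| timewrap 1w series=short
| fillnull value=0
| untable _time series val
| stats median(val) as med_val by _time
```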
Thanks in advance!
Tags: splunk-enterprise, calculations, median
Posted: Mon, 27 Aug 2018 08:44:17 GMT by belts

How to calculate percentage change?
https://answers.splunk.com/answers/678639/how-to-calculate-percentage-change.html
Hello,
I have 2 fields, current_value and previous_value. How do I calculate the increase or decrease percentage based on the 2 values?
Tags: percentage, calculations
Posted: Wed, 01 Aug 2018 17:58:49 GMT by knalla

calculate percentage from the results of two reports
https://answers.splunk.com/answers/673502/calculate-percentage-from-the-results-of-two-repor.html
I'm having a difficult time calculating a percentage based on two reports (searches).
Search 1
| inputlookup mydata.csv
| where category= "category" AND assignment_group="myteam" AND yyyy_mm="$Datedropdown$"
| stats count as myteamstotal
Search 2
| inputlookup mydata.csv
| where category= "category" AND yyyy_mm="$Datedropdown$"
| stats count as total
I'd like to calculate a percentage
| eval percent=(myteamstotal/total)*100
I tried this and got 0 as a result:
| inputlookup GCC-SNOW_01012018_12312018.csv
| where category= "category" AND assignment_group="myteam" AND yyyy_mm="$Datedropdown$"
| stats count as myteamstotal
| where category= "category" AND yyyy_mm="$Datedropdown$"
| stats count as total
| eval percent=(myteamstotal/total)*100
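A single-pass variant avoids chaining two stats commands (in the attempt above, the second `where` can never match, because after the first `stats` only `myteamstotal` exists in the pipeline). Counting both totals at once with an eval-filtered count:

```
| inputlookup mydata.csv
| where category="category" AND yyyy_mm="$Datedropdown$"
| stats count as total count(eval(assignment_group=="myteam")) as myteamstotal
| eval percent = round(myteamstotal / total * 100, 2)
```

Note the `==` inside `count(eval(...))`: that context takes an eval expression, not a search clause.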
How can I calculate the percentage?
Tags: splunk-enterprise, percentage, calculations
Posted: Thu, 19 Jul 2018 10:21:45 GMT by jdlocklin526

Calculation in Dashboard
https://answers.splunk.com/answers/643668/calculation-in-dashboard.html
Hello everyone,
I am completely new to Splunk. I have a dashboard requirement where I need to display the difference of the first column of numbers every 15 minutes. The log is updating every 10-15 minutes in real time. For example, the first record in the graph should be 99 (1100-1001):
2018-04-21 00:00:00 INFO Switching to next trail file due to EOF
1001 abc done on 2018-04-21 00:45:38
1100 agc done on 2018-04-21 00:55:30
1200 ybc done on 2018-04-21 01:05:39
...................................
.....................................
2400 bcd done on 2018-04-21 10:40:38
Please assist in how I can achieve this.
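One sketch: extract the leading number with rex, then difference it in event order with delta, which subtracts the previous event's value. The source name and rex pattern are illustrative:

```
source=my_trail_log "done on"
| rex "^(?<seq>\d+)\s"
| sort 0 _time
| delta seq as increase
| table _time seq increase
```

With ascending time order, the row for 1100 gets increase = 99, matching the expected first record.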
Thanks,
Naomi
Tags: splunk-enterprise, splunk, calculations
Posted: Tue, 24 Apr 2018 01:18:01 GMT by naomibn

Calculation of percentage based on the results from 3 different searches with a common value
https://answers.splunk.com/answers/624900/calculation-of-percentage-based-on-the-results-fro.html
I have calculated a % from 3 different searches, and I am getting the result perfectly fine.
source="log-ura" "Flag Finalizacao" NOT Finalizacao=3 AND NOT Finalizacao=4 AND NOT Finalizacao=6 AND NOT Finalizacao=7 AND NOT Finalizacao=11 AND NOT Finalizacao=13 | stats count(Finalizacao) as Ignored |
join date [search source="log-ura" "Flag Finalizacao" Finalizacao=* | stats count(Finalizacao) as TotalFlagCount] | table Ignored TotalFlagCount |
join date [search source="log-ura" "Flag Finalizacao" Finalizacao=3 OR Finalizacao=4 | stats count(Finalizacao) as Together]
| eval ATHlíquido=(Together/(TotalFlagCount-Ignored))*100 | table ATHlíquido
Now I need to calculate the % based on the telephone dialing codes. I am trying to extract dialing code(DDD) and Counts from 3 different searches and need to apply a math formula for each DDD code.
source="log-ura" "Flag Finalizacao" NOT Finalizacao=3 AND NOT Finalizacao=4 AND NOT Finalizacao=6 AND NOT Finalizacao=7 AND NOT Finalizacao=11 AND NOT Finalizacao=13 | dedup _raw | eval Ignored.PhoneDDD = substr(PhNumber, 1, 2) | stats count(Finalizacao) as Ignored by Ignored.PhoneDDD |
join date [search source="log-ura" "Flag Finalizacao" Finalizacao=* | dedup _raw | eval TotalFlagCount.PhoneDDD = substr(PhNumber, 1, 2)| stats count(Finalizacao) as TotalFlagCount by TotalFlagCount.PhoneDDD ] |
join date [search source="log-ura" "Flag Finalizacao" Finalizacao=3 OR Finalizacao=4 | dedup _raw | eval Together.PhoneDDD = substr(PhNumber, 1, 2)| stats count(Finalizacao) as Together by Together.PhoneDDD ] | table Ignored.PhoneDDD Ignored TotalFlagCount.PhoneDDD TotalFlagCount Together.PhoneDDD Together
However, it's not working.
Put another way:
search 1 --> DDD, countA
search 2 --> DDD, countB
search 3 --> DDD, countC
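One way to avoid the three joins entirely (the `join date` clauses above have no `date` field to key on) is a single stats pass per DDD with eval-filtered counts, then the formula. This is a sketch based on the searches above:

```
source="log-ura" "Flag Finalizacao" Finalizacao=*
| dedup _raw
| eval DDD = substr(PhNumber, 1, 2)
| stats count as TotalFlagCount
        count(eval(Finalizacao!=3 AND Finalizacao!=4 AND Finalizacao!=6 AND Finalizacao!=7 AND Finalizacao!=11 AND Finalizacao!=13)) as Ignored
        count(eval(Finalizacao==3 OR Finalizacao==4)) as Together
        by DDD
| eval pct = (Together / (TotalFlagCount - Ignored)) * 100
```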
I need to apply the formula (countA-CountB)/CountC for each DDD. Can someone help me figure out where I am going wrong?
Tags: splunk-enterprise, calculations
Posted: Fri, 09 Mar 2018 18:05:48 GMT by klchandrakanth

How to Calculate Risks score by assigning a risk score that also uses the total amount of scanned hosts and how many are vulnerable from Nessus results?
https://answers.splunk.com/answers/623603/how-to-calculate-risks-score-by-assigning-a-risk-s.html
Hello,
I'm quite new to splunk, and probably this can be done more efficiently. I have a search that uses Nessus reports to show a table with the vulnerabilities classified by severity, I need to apply a custom formula to assign a risk score that also uses the total amount of scanned hosts and how many are vulnerable. There are two separate searches that obtain the vulnerable and total hosts and I tried to use a Join command.
This is the formula I wrote for the risk score:
index=XXXX sourcetype="nessus:scan" name="XXXXXX" NOT severity=informational| dedup plugin_family plugin_name host-ip ports{}.port ports{}.protocol ports{}.transport | chart count over plugin_family by severity |rename critical as Critical, high as High, medium as Medium, low as Low, plugin_family as Name | stats sum(*) as * | join userhandle [search index=XXXX sourcetype="nessus:scan" name=XXXXX NOT severity=informational | dedup host-ip | chart count as "Vuln hosts"| stats list(Vuln hosts) as VH] | join userhandle [search index=XXXXXX sourcetype="nessus:scan" name=XXXXX | dedup host-ip | chart count as "Total hosts" | stats list(Total hosts) as TH]| fillnull Critical, High |eval Name="XXXXX" | eval RiskValue=/Risk value formula/ | fields Name, Critical, High, Medium, Low, RiskValue
It works, but it's quite long, and as a new assignment I need to be able to generate a histogram of the risk value for the past 6 months. Using `append` and specifying the dates (`earliest=-6mon` and so forth) returns incorrect data, and the search itself becomes quite big; I feel this could cause performance issues in Splunk if we display the histogram on a dashboard. I need guidance on how to improve this search and display the histogram.
Thanks in advance.
Tags: splunk-enterprise, nessus, calculations, score
Posted: Wed, 28 Feb 2018 15:00:57 GMT by ivan128

What is the best way to calculate date and time span?
https://answers.splunk.com/answers/620441/what-is-the-best-way-to-calculate-date-and-time-sp.html
Hello
What I am trying to do is calculate dates and spans.
So I have a date called "Date Due" and a field "SLA". What I am trying to do is take "Date Due", add "SLA", compare to the current date, and give how many days past due it currently is.
Currently I'm using this, but the results aren't spot on:
index=fp_dev_tsv "BO Type"="assessments" | rename "BO ID" as id| convert timeformat="%Y-%m-%d %H:%M:%S.%6N" mktime("Step Date Started") AS starttime mktime("Step Date Completed") AS endtime mktime("Step Due Date") AS cumulativeDueDate mktime("Step Actual Due Date") AS actualDueDate
|eval dueDateRange=mvrange(actualDueDate,now(),86400)
|convert ctime(dueDateRange) timeformat="%+"
| eval pastDueDays =mvcount(mvfilter(NOT match(dueDateRange,"(Sun|Sat).*")))
This doesn't take into account the SLA field; it just counts the days from actualDueDate until now. That part is correct, BUT I need to add in SLA and then compare.
Here's a sample table:
| Name | Past Due Step Name | Past Due Step Due Date | SLA for Past Due Step |
|---|---|---|---|
| General Name | Info 1 | 2018-02-01 20:38:10.154000 | 3 |
| General Name | Info 2 | 2018-02-10 20:38:10.154000 | 10 |
| General Name | Info 3 | 2018-03-08 20:38:10.154000 | 5 |
| General Name | Info 4 | 2018-03-15 20:38:10.154000 | 5 |
| General Name | Info 5 | 2018-03-22 20:38:10.154000 | 5 |
So what I need to do is check each steps due date, add SLA and then compare to todays date.
Any idea how I can achieve this?
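A sketch of the core date arithmetic: parse the due date, push it out by SLA days, then measure against now. Weekend exclusion via mvrange/mvfilter, as in the search above, can be layered on top; the timeformat matches the sample data:

```
| eval dueEpoch = strptime('Step Due Date', "%Y-%m-%d %H:%M:%S.%6N")
| eval slaDeadline = dueEpoch + (SLA * 86400)
| eval daysPastDue = max(round((now() - slaDeadline) / 86400), 0)
```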
Thanks for the help!
Tags: splunk-enterprise, search-language, datetime, calculations
Posted: Wed, 21 Feb 2018 17:14:13 GMT by tkwaller_2

Calculate and display jump in connections
https://answers.splunk.com/answers/557146/calculate-and-display-jump-in-connections.html
Hi all,
I have written a search that will list the "average daily connections" originating from a source ip address, as well as listing the "actual daily connections" from a source ip address.
I was wondering how I would write a statement at the end to only return results where the "actual daily connections" is 3 times (3*) the "average daily connections".
For example:
| where "Actual_Daily_Connections" >= 3*("Average_Daily_Connections")
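That is essentially it; the usual catch is the quoting. In `where`, double quotes make string literals, so the statement above compares two strings rather than the fields. With bare (or single-quoted) field names it works as intended:

```
| where Actual_Daily_Connections >= 3 * Average_Daily_Connections
```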
Any assistance would be greatly appreciated :)
Tags: splunk-cloud, calculations
Posted: Mon, 24 Jul 2017 11:18:18 GMT by MikeElliott

Calculations on fields with multiplier abbreviations
https://answers.splunk.com/answers/542883/calculations-on-fields-with-multiplier-abbriviatio.html
Any ideas on how to handle this? I am imagining a horrible if/string statement, but are there any other ideas?
I have a field "bytes", and any of the following could be values:
bytes=0
bytes=345
bytes=456K
bytes=789M
bytes=20G
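One compact alternative to a long if-chain is case() mapping the suffix to a multiplier. A sketch; it assumes binary units (1K = 1024) and only the K/M/G suffixes shown above:

```
| eval mult = case(match(bytes, "K$"), 1024,
                   match(bytes, "M$"), 1048576,
                   match(bytes, "G$"), 1073741824,
                   true(), 1)
| eval bytes_norm = tonumber(replace(bytes, "[KMG]$", "")) * mult
```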
I would like to chart (or otherwise perform math functions on) these values, so I need a means to normalise them into a common format, either bytes or KB.
Tags: field, calculations
Posted: Fri, 26 May 2017 18:49:38 GMT by nickhillscpl

How to calculate changes over multiple columns
https://answers.splunk.com/answers/541497/how-to-calculate-changes-over-multiple-column.html
I have the following output from my base search:
![alt text][1]
[1]: /storage/temp/204574-image.png
It shows a cumulative value at each sampling time for each interface. Is there a good way to calculate the change at each sampling time for each interface and show the result in a similar table format?
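One sketch that keeps the table shape: carry the previous row's values along with streamstats, then difference every interface column with foreach. The `eth*` wildcard is illustrative, and whether streamstats accepts a wildcarded rename this way should be verified on your version:

```
| sort 0 _time
| streamstats current=f window=1 last(*) as prev_*
| foreach eth* [ eval delta_<<MATCHSTR>> = '<<FIELD>>' - 'prev_<<FIELD>>' ]
```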
Thanks
Tags: calculations, incremental
Posted: Tue, 23 May 2017 14:05:35 GMT by jgcsco

Calculate percentage of multiple values, different events
https://answers.splunk.com/answers/454877/calculate-percentage-of-multiple-values-different.html
I am trying to display the percentage of Total Modems against Total Modems on Card 0.
The XML I am given unfortunately breaks up data from essentially one event into three:
![alt text][1]
[1]: http://i.imgur.com/T84LY2R.png
source="C:\\splunk_files\\summary.xml" host="OSSTEST01" index="prtg_cmts" sourcetype="PRTG_API" | rex "(<sensor>)(?<sensor>.*)<" | rex "(<group>)(?<group>.*)<" | rex "(<lastvalue>)(?<value>\d+)\s" | search group="Bertha/Hewitt CMTS" | table _time, group, sensor, value
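One way to compare rows of the same group without a subsearch is eventstats: copy the total-modems value onto every row of its group, then divide. The sensor match strings below are guesses from the screenshot, so adjust them to the real sensor names:

```
| eventstats max(eval(if(match(sensor, "Total Modems"), tonumber(value), null()))) as total_modems by group
| where match(sensor, "Card 0")
| eval pct_on_card0 = round(tonumber(value) / total_modems * 100, 2)
```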
I have tried running a sub search to get *just* the total modem count, and then compare that to the count of the two other rows using **eventstats**, but that was not successful.
Tags: percentage, sum, calculations, calculating
Posted: Mon, 26 Sep 2016 16:25:09 GMT by evan_roggenkamp

Calculate time and doubling it from the user selection
https://answers.splunk.com/answers/446961/calculate-time-and-doubling-it-from-the-user-selec.html
Hi all.
I have the normal time selector in Splunk that I think everybody knows.
![alt text][1]
I noticed that in my dashboard it gets used in the following way:
<search>
<query>MY QUERY</query>
<earliest>$field1.earliest$</earliest>
<latest>$field1.latest$</latest>
</search>
Now, what I want to do is to double the time range selected by the user.
For instance, if the user selects 1 week, I want to pick 2 weeks.
The same for days, months, hours and any other time range.
If the user picks some odd period (e.g. from 1st January to 21st February), I want to keep the closest endpoint (21st February) and double the chosen time:
>1st January to 21 February = 51 days
>
>51 * 2 = 102
>
>21 February - 102 days = 11 November
How can I do this in my code?
I'm also willing to transform my dashboard into HTML (I think I'll do it anyway later).
Thanks a lot!
-------------------------------------------------------------------------------------------------------------------------
Thank you a lot Sundareshr!
I tried to implement your solution but for some reason it does not work.
the code is the following:
BASE QUERY [
| makeresults
| eval earliest=if(isnum($field1.earliest$), $field1.earliest$, relative_time(now(), "$field1.earliest$")
| eval latest=if(isnum($field1.latest$), $field1.latest$, relative_time(now(), "$field1.latest$")
| eval span=latest-earliest
| eval mid=earliest
| eval earliest=earliest-span
| table earliest latest mid]
| eval when=if(_time>relative_time(now(), mid), "Current_Period", "Prev_Period")
| stats count as events by source when
| chart sum(events) by source, when
| eval perc = (Current_Period-Prev_Period)/Prev_Period
| eval trend = case(perc < -0.3, "low", (perc >= -0.3 and perc <= 0.3 ), "madium", perc > 0.3, "high")
| table source, Current_Period, Prev_Period, perc, trend
It shows me the following error
> Error in 'eval' command: The expression is malformed. Expected ).
in your part of the code I changed this
eval span=latest=earliest
to this:
eval span=latest-earliest
Because I thought it was a typo
Thanks a lot again
----------------------------------------------------------------------------------------------------------
There is something wrong with the formatting that the selector gives to the variable.
If I choose "from the beginning of the week" I get this error:
> Error in 'eval' command: The expression is malformed. An unexpected character is reached at '@w1), @w1, relative_time(now(), "@w1"))'.
For the month it is like this:
> Error in 'eval' command: The expression is malformed. An unexpected character is reached at '@mon), @mon, relative_time(now(), "@mon"))'.
If I choose "always" it is like this:
> Error in 'eval' command: The expression is malformed. An unexpected character is reached at ', relative_time(now(), ""))'.
If I choose between 2 dates it is like this:
> Error in 'search' command: Unable to parse the search: 'AND' operator is missing a clause on the left hand side.
If I choose last 7 days it is like this:
> Error in 'eval' command: The expression is malformed. Expected ).
Thank you
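For reference, the malformed-expression errors all trace back to two things: each `if(...)` in the base query is missing a closing parenthesis, and tokens like `@w` reach eval unquoted. A sketch that quotes the tokens and falls back cleanly; tonumber returns null for relative-time strings, and the final constants cover the empty all-time token:

```
| makeresults
| eval e = coalesce(tonumber("$field1.earliest$"), relative_time(now(), "$field1.earliest$"), 0)
| eval l = coalesce(tonumber("$field1.latest$"), relative_time(now(), "$field1.latest$"), now())
| eval span = l - e
| eval new_earliest = e - span
| table new_earliest l
```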
[1]: https://docs.splunk.com/images/3/3f/6.2_timerange_presets.png
Tags: splunk-enterprise, time, calculations, selector
Posted: Tue, 30 Aug 2016 07:13:48 GMT by andreafebbo

Calculating Trends in tables
https://answers.splunk.com/answers/443854/calculating-trends-in-tables.html
Hi all.
I'd like to create a table like the following columns:
server - event count - event count last period
what do I mean by that?
I have to do a count by of the events, and this is trivial.
base query | stats count as events by source
now i have a selector that let me select the period (the classic splunk time selector).
what i need is the following:
If the selector is "last week", I need to count in the first column the events of the last week and in the second column the events of the week before that.
If the selector is "last month", I need to count in the first column the events of the last month and in the second column the events of the month before that.
Etc...
I'd like to do that without messing with HTML, XML or any other language. I'd like, if possible, a plain Splunk search.
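For a fixed period such as "last week", this can stay a plain search: bucket each event into the current or previous period and chart the two side by side (making it follow the time picker automatically is harder, since the picker tokens are not visible to a plain search):

```
base query earliest=-2w@w latest=@w
| eval period = if(_time >= relative_time(now(), "-1w@w"), "last_week", "week_before")
| chart count over source by period
```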
Thanks a lot.
Andrea
Tags: splunk-enterprise, time, calculations
Posted: Wed, 24 Aug 2016 14:51:54 GMT by andreafebbo