Example:
Return the count of users encountering the error / total users, by page and error.
So the rows would be each distinct error, the columns would be the pages, and the data in each intersecting cell would be (unique users encountering the error on that page) / (unique total users hitting that page).
New answer!
OK. You need rows of error, columns of pages, and each cell to say "X/Y", for example "9/2131", meaning that 9 users experienced that error on that particular page and 2131 total users visited that page.
Here you go:
| stats values(ipAddress) as addresses dc(ipAddress) as impact by error Page
| streamstats dc(addresses) as totalUsersImpacted by Page
| fields - addresses
| eval impact=impact + "/" + totalUsersImpacted
| xyseries error Page impact
You need to do dc once in the regular stats command and then a second time with streamstats ... by Page. That streamstats clause does the same dc math, but only per Page. To give streamstats the raw material it needs, we pass along the values(ipAddress), which we can safely discard after the streamstats.
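The two-step distinct count can be illustrated outside Splunk. Here is a minimal Python sketch of the same math on invented sample events; the field names mirror the query above, but the data and the "2/2" results are made up for the example:

```python
# Sketch of: stats dc(ipAddress) by error Page, then a per-Page total,
# then eval impact = impact + "/" + total. Sample data only.
from collections import defaultdict

events = [
    ("10.0.0.1", "Logon",  "404"),
    ("10.0.0.2", "Logon",  "404"),
    ("10.0.0.2", "Logon",  "500"),
    ("10.0.0.3", "Search", "404"),
    ("10.0.0.1", "Search", "404"),
]

# stats values(ipAddress) dc(ipAddress) by error Page
by_error_page = defaultdict(set)
for ip, page, err in events:
    by_error_page[(err, page)].add(ip)

# streamstats dc(addresses) by Page, collapsed here to a plain per-Page
# distinct count (streamstats computes this cumulatively in SPL). Note
# that, as in the SPL, only users who hit at least one error are counted
# in the total, because non-error rows never reach this step.
by_page = defaultdict(set)
for (err, page), ips in by_error_page.items():
    by_page[page] |= ips

# eval impact = impact + "/" + totalUsersImpacted
cells = {
    (err, page): f"{len(ips)}/{len(by_page[page])}"
    for (err, page), ips in by_error_page.items()
}
print(cells[("404", "Logon")])  # "2/2"
```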
If you have fields userId, page and error, it would look like this:
<your search terms> | chart dc(userId) over error by page limit=100
The limit=100 is there because by default the chart command will display only the top 10 split-by values and roll all the rest up into one big column called "OTHER". limit=100 raises that threshold from 10 to 100.
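The limit/OTHER rollup can be sketched as follows. This is a hypothetical Python illustration of the behavior described, not Splunk internals; the function name and cutoff logic are invented for the example:

```python
# Keep the top `limit` split-by values as their own columns and roll
# everything else into an "OTHER" bucket.
from collections import Counter

def split_by_with_limit(pairs, limit):
    """pairs: (row_value, split_value); returns {row: Counter of columns}."""
    col_totals = Counter(col for _, col in pairs)
    keep = {col for col, _ in col_totals.most_common(limit)}
    table = {}
    for row, col in pairs:
        col = col if col in keep else "OTHER"
        table.setdefault(row, Counter())[col] += 1
    return table

pairs = [("404", f"page{i}") for i in range(12)]  # 12 distinct pages
table = split_by_with_limit(pairs, limit=10)
print(sorted(table["404"]))  # 10 kept pages plus an "OTHER" bucket
```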
This only gives a unique count of the users getting the error per page; what I need is "errorUsers/totalUsers" per page.
In other words I have a query that currently looks like this and works
search here
| stats dc(ipAddress) AS impact by Page error
| xyseries error, Page, impact
| fillnull value=0
The pseudocode of what I want is
search here
| stats dc(ipAddressErrors)."/".dc(totalIpAddresses) AS impact by Page error
| xyseries error, Page, impact
| fillnull value=0
Understood. You need rows of error, columns of pages, and each cell to say "9/2131" meaning that 9 users experienced that error on that particular page, and 2131 total users experienced that error across all pages.
Then what you need will look like this.
| stats values(ipAddress) as addresses dc(ipAddress) as impact by error Page
| streamstats dc(addresses) as totalUsersImpacted by error
| fields - addresses
| eval impact=impact + "/" + totalUsersImpacted
| xyseries error Page impact
You need to do dc once in the regular stats command and then a second time with streamstats ... by error. That streamstats clause does the same dc math, but only per error. To give streamstats the raw material it needs, we pass along the values(ipAddress), which we can safely discard after the streamstats.
close, but this part
"Understood. You need rows of error, columns of pages, and each cell to say "9/2131" meaning that 9 users experienced that error on that particular page, and 2131 total users experienced that error across all pages."
should read
"Understood. You need rows of error, columns of pages, and each cell to say "9/2131" meaning that 9 users experienced that error on that particular page, and 2131 total users visited that page."
Ah, easy enough.
| stats values(ipAddress) as addresses dc(ipAddress) as impact by error Page
| streamstats dc(addresses) as totalUsersImpacted by Page
| fields - addresses
| eval impact=impact + "/" + totalUsersImpacted
| xyseries error Page impact
I'll post it as a separate answer since this one began pretty far away from what you needed.
Hmm, this still is not doing what I would like. When searching back 60 minutes, it reports data like this:
        Logon   Search
404     1/35    2/23
The problem is, the values 1 and 2 seem correct, but the second number in each cell is way off. If I do a
....page=logon
|top 0 ipAddress
I get, say, 157, so I would expect it to show 1/157 for that value.
That is, there were 157 unique ipAddress values for logon during the search time frame, whether or not they received the error, and 1 that did receive it.
Sounds like either Page or error is undefined for a lot of your requests. I assumed you didn't want to count such events.
My first stats command there will implicitly discard rows that don't have a value for either error or Page. If you want to include counts for events that lack error or Page values, then add this before the first stats: | fillnull value="NULL" error Page
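The implicit-discard behavior described above can be sketched in Python. This is an illustration with invented sample events, showing how rows missing a group-by field vanish from the distinct count unless the gap is filled with a placeholder first:

```python
# Events missing a group-by field are dropped from the aggregation
# unless filled with a placeholder (the fillnull step). Sample data only.
from collections import defaultdict

events = [
    {"ipAddress": "10.0.0.1", "Page": "logon", "error": "404"},
    {"ipAddress": "10.0.0.2", "Page": "logon"},   # no error field
    {"ipAddress": "10.0.0.3", "Page": "logon"},   # no error field
]

def dc_by_error_page(events, fill=None):
    groups = defaultdict(set)
    for e in events:
        err, page = e.get("error", fill), e.get("Page", fill)
        if err is None or page is None:
            continue  # like stats: rows lacking a group-by value vanish
        groups[(err, page)].add(e["ipAddress"])
    return {k: len(v) for k, v in groups.items()}

print(dc_by_error_page(events))               # only the error row survives
print(dc_by_error_page(events, fill="NULL"))  # a NULL bucket counts the other 2 users
```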
I see what my issue is, based on your good explanation. The field that I extract the error from with a regex doesn't exist at all in the non-error events, so filling nulls effectively does nothing. I need to figure out a way to include the total count. I do have a field that is common to both error and non-error events: LogLevel=INFO or LogLevel=ERROR. So now I need a way to get the total unique users across both error and non-error events.
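The shape of the "errorUsers/totalUsers" calculation being asked for here can be sketched in Python with invented sample data: count distinct users per Page over all events (INFO and ERROR alike), but count impact only from the ERROR events. This illustrates the math only, not a tested SPL query:

```python
# Per-page totals from ALL events; per-(error, page) impact from ERROR
# events only. Field names and data are invented for the example.
from collections import defaultdict

events = [
    {"ip": "10.0.0.1", "Page": "logon", "LogLevel": "ERROR", "error": "404"},
    {"ip": "10.0.0.2", "Page": "logon", "LogLevel": "INFO"},
    {"ip": "10.0.0.3", "Page": "logon", "LogLevel": "INFO"},
]

totals = defaultdict(set)      # all users per page, error or not
impact = defaultdict(set)      # users per (error, page), errors only
for e in events:
    totals[e["Page"]].add(e["ip"])
    if e["LogLevel"] == "ERROR":
        impact[(e["error"], e["Page"])].add(e["ip"])

cells = {k: f"{len(v)}/{len(totals[k[1]])}" for k, v in impact.items()}
print(cells[("404", "logon")])  # "1/3": 1 error user out of 3 total visitors
```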