Splunk Search

Need to count queries from mobile (Android, iOS) devices

sinha58
Explorer

Hello,

I am new to Splunk. I am looking at results coming from Android and iOS devices; I can see android and iOS queries in the logs, but I need to count how many queries are coming from such devices so I can easily build a dashboard for them.

If you could suggest such a query, it would be a great help.

Here is a log line below for reference, showing a result for an Android device.

"{"cluster_id":"sc-a2","log":"11.16.39.12 - - [10/Jan/2020:10:05:48 +0000] \"GET /so/search?cat_id=1255027787111_1255027789273&client=us_gr&hd=false&ht=false&offset=10&page=1&prg=android&ps=30&sort=best_match&stores=1197"

Thanks,
ss

0 Karma
1 Solution

to4kawa
Ultra Champion

sample:

| makeresults 
| eval _raw="{\"cluster_id\":\"sc-a2\",\"log\":\"11.16.39.12 - - [10/Jan/2020:10:05:48 +0000] \"GET /so/search?cat_id=1255027787111_1255027789273&client=us_gr&hd=false&ht=false&offset=10&page=1&prg=android&ps=30&sort=best_match&stores=1197\""
| rex "(?<mobile>(?<=g=).+?(?=&))"

recommend:

index=np_search-be1559690845 kubernetes.container_name=reso-og stream=stdout
| rex "(?<mobile>(?<=g=).+?(?=&))" 
| rex "\[(?<time>\d{2}/\w{3}/\d{4}:\d{2}:\d{2}:\d{2} \+0000)\]"
| eval time=strptime(time,"%d/%b/%Y:%T %z")
| spath 
| eval log = mvindex(split(log," "),0)
| fieldformat time=strftime(time,"%c")
| table time cluster_id log mobile
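
To get the per-device count the question asks for, a stats step can be appended to the search above. A minimal sketch, assuming the mobile field extracts values like android and ios as shown:

index=np_search-be1559690845 kubernetes.container_name=reso-og stream=stdout
| rex "(?<mobile>(?<=g=).+?(?=&))"
| stats count AS queries BY mobile

For a dashboard trend panel, | timechart span=1h count BY mobile works the same way on the extracted field.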

Hi, @sinha58
If you can identify the string, you can extract it this way.


Explanation:

  1. rex: extract the mobile field with a regex (cf. regex101.com for the pattern).
  2. spath: extract the cluster_id and log fields from the JSON.
  3. mvindex: extract the IP address (split the log field on spaces).
  4. fieldformat: display time (a UNIX epoch) in readable form. The reason I don't convert it with strftime in an eval is that keeping the UNIX time is just fine for future aggregations.
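
A quick sketch to see the fieldformat behaviour; the epoch value here is my own illustration, corresponding to the sample timestamp 10/Jan/2020 10:05:48 +0000:

| makeresults
| eval time=1578650748
| fieldformat time=strftime(time,"%c")

The displayed value is human-readable, but the underlying field stays a UNIX epoch, so later stats or timechart commands can still aggregate on it.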

Splunk Search Processing Language (SPL) is processed in order, so please try the search one line at a time and check the result of each command.

cf. SearchReference/Commands by category


to4kawa
Ultra Champion

Please provide a sample log and your result; I can't see the screenshot.

0 Karma


sinha58
Explorer

A good, brief explanation, @to4kawa. Thank you so much for your valuable response. Is there any good way to learn Splunk other than the Splunk docs?

Have a nice day man!!

0 Karma

to4kawa
Ultra Champion

In my case, I run the queries from Splunk Answers line by line and check the result of each command.
Some people write cool SPL.

Happy Splunking.

0 Karma