Splunk Search

I want to use the value of one field as a field name, and extract that field's value

toyo11
New Member

I have one-line log data like the following.

Field name : value

_time : 2017/11/15 00:00:00
row_no : test500

test1 ~ test1000 : numeric data

The field "row_no" holds the name of the field whose value should be extracted,
so in the end I need to extract the value of the field "test500".
(Of course, the value of row_no varies from row to row.)

What kind of search should I run to extract the numeric data I want?

Thank you in advance.


kamlesh_vaghela
SplunkTrust

Hi

I have used google translate to understand your question. I understand that you want to extract the numeric value from "test500".

So can you please try this?

YOUR_SEARCH | rex field=_raw "row_no :.*[^\d](?<row_no_count>.*\d)"
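For reference, this rex pulls the trailing digits out of the row_no value (here '500' from 'test500'). The pattern can be checked outside Splunk with Python's re module; the _raw string below is just an illustrative sample:

```python
import re

# The same pattern as the rex above; Python names groups with ?P<...>.
pattern = r"row_no :.*[^\d](?P<row_no_count>.*\d)"

# Illustrative _raw string shaped like the example in the question.
raw = "_time : 2017/11/15 00:00:00 row_no : test500"

m = re.search(pattern, raw)
print(m.group("row_no_count"))  # -> 500
```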

Thanks


toyo11
New Member

kamlesh_vaghela san
Thank you for your reply!

But I'm sorry, there were some problems in my question.

The raw data is in CSV format.

Example:

time, row_no, test1, test2, test3, test4, test5, test6, test7
2017/11/15 00:00:00, test6, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7

2017/11/16 00:00:00, test1, 11, 22, 33, 44, 55, 66, 77

The first line is a header line.

From the second line, I want to extract the following data:
2017/11/15 00:00:00, test6, 0.6
'0.6' is the value of the field 'test6'.
From the third line, I want to extract the following data:
2017/11/16 00:00:00, test1, 11
'11' is the value of the field 'test1'.

The value of 'row_no' varies from row to row.
What search should I use?
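For reference, the requirement can be sketched outside Splunk with Python's csv module, using the sample rows above:

```python
import csv
import io

# The CSV sample from the post above: a header row plus two data rows.
data = """time, row_no, test1, test2, test3, test4, test5, test6, test7
2017/11/15 00:00:00, test6, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7
2017/11/16 00:00:00, test1, 11, 22, 33, 44, 55, 66, 77
"""

results = []
for row in csv.DictReader(io.StringIO(data), skipinitialspace=True):
    # Use the value of 'row_no' as a field name and look that field up.
    key = row["row_no"]
    results.append((row["time"], key, row[key]))

print(results)
```

Each tuple is (time, row_no, value of the field named by row_no), exactly the output shape requested.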


kamlesh_vaghela
SplunkTrust

Hi

Can you please try this?

YOUR_SEARCH
| eval myvalue="" 
| foreach "test*" 
    [ eval myvalue="<<FIELD>>"."=".'<<FIELD>>'."|".myvalue ] 
| eval myvalue=split(myvalue,"|") 
| mvexpand myvalue 
| eval myval=mvindex(split(myvalue,"="),1),mykey=mvindex(split(myvalue,"="),0)
| where mykey=row_no | table time row_no myval mykey
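For intuition, here is what that pipeline does, sketched in plain Python with an illustrative event dict: foreach collects every test* column as a name/value pair, and mvexpand plus the where clause keep only the pair whose name equals row_no.

```python
# One wide event as Splunk sees it after CSV field extraction (illustrative values).
event = {"time": "2017/11/15 00:00:00", "row_no": "test6",
         "test1": "0.1", "test2": "0.2", "test6": "0.6"}

# foreach "test*": build (name, value) pairs for every test* column.
pairs = [(k, v) for k, v in event.items() if k.startswith("test")]

# mvexpand + where mykey=row_no: keep only the pair whose name matches row_no.
matches = [(k, v) for k, v in pairs if k == event["row_no"]]
print(matches)  # -> [('test6', '0.6')]
```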

Happy Splunking


toyo11
New Member

kamlesh_vaghela san
Thank you for your reply!

I'm trying the method.
I will update the result later.

I have one question.
The result of the 'foreach' command does not include fields beyond 'test100' (test100-test1000).
Is there any limit? (On the number of auto-extracted fields from the source CSV, or something else?)

I'm thinking that I may have to change how I input the data to Splunk...
The original data is matrix data:
* a combination of testcases and their results
* about 1000 testcases per test run
Fundamentally, should we increase the number of rows rather than the number of columns?
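As a sketch of that restructuring (column names borrowed from the earlier example), a wide matrix row can be reshaped into one event per testcase before indexing:

```python
import csv
import io

# A wide matrix row: one event with one column per testcase (illustrative).
wide = """time, row_no, test1, test2, test3
2017/11/15 00:00:00, test2, 0.1, 0.2, 0.3
"""

# Reshape to one event per testcase ("long" format) before indexing.
long_rows = []
for row in csv.DictReader(io.StringIO(wide), skipinitialspace=True):
    for name, value in row.items():
        if name.startswith("test"):
            long_rows.append({"time": row["time"],
                              "testcase": name,
                              "result": value})

print(long_rows)
```

In the long format the lookup becomes a simple `where testcase=row_no`, with no per-event field-count limits to worry about.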


kamlesh_vaghela
SplunkTrust

Hi @toyo11,

Yes, there may be some limit that prevents covering all the columns, but it is better to increase the number of rows rather than the number of columns.

Please let me know what you decide.

Thanks


toyo11
New Member

kamlesh_vaghela san
Thank you for your reply!

I'm trying the method.
I will update the result later.
There is no problem apart from the 100-column limitation.
I have learned a lot.

I will try analyzing the data by loading the matrix into Splunk as many rows.

Thank you for your great support!


kamlesh_vaghela
SplunkTrust

Hi @toyo11,

Glad to help you.

I worked around your issue in a different way: if we don't want to split the matrix into multiple events, we can use a custom command to achieve the requirement.

I'm just sharing my workaround.

Create a commands.conf file in SPLUNK_HOME/etc/apps/MY_APP/local/:

[mycustcommand]
filename = mycommand.py
type = python
supports_getinfo = true
supports_rawargs = true
outputheader = true
passauth = true

Create a mycommand.py file in SPLUNK_HOME/etc/apps/MY_APP/bin/:

import sys
from splunklib.searchcommands import \
    dispatch, StreamingCommand, Configuration, Option, validators


@Configuration()
class myCommand(StreamingCommand):

    def stream(self, events):

        for event in events:
            # 'row_no' names the column whose value we want to extract.
            row_no = event['row_no']
            row_no_value = ''

            # Find the column whose name matches the value of row_no.
            for key in event.keys():
                if key == row_no:
                    row_no_value = event[key]

            # Emit the looked-up value in a new field.
            event['row_no_value'] = row_no_value
            yield event


dispatch(myCommand, sys.argv, sys.stdin, sys.stdout, __name__)

Please see the link below for more info.

http://dev.splunk.com/view/python-sdk/SP-CAAAEU2

You can use this if you don't want to go with splitting the matrix.

Please upvote any of my comments that helped you. And if my answer could help other Splunkers, it would be good to accept it.

Happy Splunking


toyo11
New Member

kamlesh_vaghela san

Thank you for your information.
I will confirm it.
But it may be a little bit difficult for me...


kamlesh_vaghela
SplunkTrust

Hi @toyo11,

Any luck with your issue?


toyo11
New Member

kamlesh_vaghela san

I'm sorry for the delayed reply.
I have just tried the method.
* I deployed the two files in the app 'search'.
* I unified the file name and command name as 'mycustcommand'.

I ran the search command 'YOUR_SEARCH | mycustcommand'.
The results are as follows.
Error in 'script': Getinfo probe failed for external search command 'mycustcommand'

Sorry, to use this method I will need enough time to learn custom commands and Python...


kamlesh_vaghela
SplunkTrust

Hi @toyo11,

Yeah, sure.

Here are a few links that should help you. I think they will solve your issue. 🙂

https://www.splunk.com/blog/2014/04/14/building-custom-search-commands-in-python-part-i-a-simple-gen...

https://github.com/splunk/splunk-sdk-python/tree/develop/examples/searchcommands_app/package

Or share your commands.conf & Python file so I can take a quick look.

Thanks


toyo11
New Member

kamlesh_vaghela san

Sorry for the late reply.
I had no time this week because of other work.
I am going to try again next week.


kamlesh_vaghela
SplunkTrust

Hi @toyo11,
Sure..


toyo11
New Member

kamlesh_vaghela san

I apologize for the late reply.
I repeated trial and error, and finally managed to get it working somehow.
However, when the field 'fail_no' refers to a field beyond test70, the value of the field 'row_no_value' is null.
I guess this is caused by the limit on the number of fields.

/opt/splunk/etc/apps/search/local
[authorize.conf]

 [capability::run_script_mycustcommand]   
 [role_admin]
 run_script_mycustcommand= enabled

[commands.conf]

 [mycustcommand]
 filename = mycustcommand.py
 type = python
 supports_getinfo = true
 supports_rawargs = true
 outputheader = true
 passauth = true

/opt/splunk/etc/apps/search/bin
[mycustcommand.py]

 import sys
 import splunk.entity as en
 from splunklib.searchcommands import dispatch, StreamingCommand, Configuration, Option, validators
 import operator

 @Configuration()
 class mycustcommand(StreamingCommand):

      def stream(self, events):

          for event in events:
              row_no = event['fail_no']
              row_no_value = ''

              for key in event.keys():
                  if key == row_no:
                      row_no_value = event[key]

              event['row_no_value'] = row_no_value
              yield event

 dispatch(mycustcommand, sys.argv, sys.stdin, sys.stdout, __name__)

[splunklib]

Copied from splunk-sdk-python-1.6.2.zip:
cp -pR ./splunklib $SPLUNK_HOME/etc/apps/search/bin

Search command in Splunk:

 YOUR_SEARCH | mycustcommand

HiroshiSatoh
Champion

It looks like foreach only handles up to 100 fields. Renaming the fields would probably work, e.g.:
1-100: test(1-100)
101-200: test_A_(1-100)
201-300: test_B_(1-100)

That's tedious, though. This limit isn't even mentioned in the documentation, so I think increasing the number of rows is the better approach.


toyo11
New Member

HiroshiSatoh-san

Thank you for your reply.

Originally, I converted the matrix-shaped data into a single row, treated it as one event, and loaded it that way.
I will try loading it split into rows and analyzing it.


kamlesh_vaghela
SplunkTrust

Translated (my earlier reply, rendered into Japanese for the asker):

Hello,

I used Google Translate to understand your question. I understand that you want to extract the numeric value from "test500".

YOUR_SEARCH | rex field=_raw "row_no :.*[^\d](?<row_no_count>.*\d)"

Thanks
