Getting Data In

INDEXED_EXTRACTIONS = json: fields are extracted as strings, even fields that contain only numbers.

Rialf1959
Explorer

I have INDEXED_EXTRACTIONS = json in props.conf.
The JSON data is extracted OK, but all fields are extracted as the string data type, even fields that contain only numbers.
I cannot do any mathematical operations on them.
Is there a global option to extract them as a numeric data type?

Using Splunk version 6.6.0
Thanks


xavierashe
Contributor

I found the reference that confirms our observations here: http://docs.splunk.com/Documentation/Splunk/latest/SearchReference/eval

If the expression references a field name that contains non-alphanumeric characters, it needs to be surrounded by single quotation marks. For example, if the field name is server-1 you specify the field name like this new=count+'server-1'.

Because the JSON field you are referencing has a period, you have to use single quotes. This is only true for eval and where. Search seems forgiving if the non-alphanumeric character is a period.

Now as to the question about the arrays. Splunk correctly parses these as multivalue fields. The trick is adding curly braces at the end of the field name (and wrapping the whole name in double quotes).

| table  "cpu_stats.cpu_usage.percpu_usage{}"

You can now use the mvexpand command or other multivalue functions on it.
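Putting the two points together, a sketch that does arithmetic on a dotted field and lists the multivalue array might look like this (field names taken from the JSON posted later in this thread; cpu_pct is a made-up result field):

```spl
| eval cpu_pct = 'cpu_stats.cpu_usage.total_usage' / 'cpu_stats.system_cpu_usage' * 100
| table "cpu_stats.cpu_usage.percpu_usage{}", cpu_pct
```

Note the single quotes around the dotted names inside eval, versus double quotes in table.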


rjthibod
Champion

I know you said you got around the issue with single quotes.

Another question: does your JSON event wrap the integers in double quotes, or are they unquoted? For example,
"metric":"4.17" versus "metric":4.17


Rialf1959
Explorer

Without quotes...

{"read":"2017-10-25T14:21:57.271720193Z","preread":"2017-10-25T14:21:56.27179597Z","pids_stats":{"current":636},"blkio_stats":{"io_service_bytes_recursive":[{"major":7,"minor":0,"op":"Read","value":75264},{"major":7,"minor":0,"op":"Write","value":270336},{"major":7,"minor":0,"op":"Sync","value":200704},{"major":7,"minor":0,"op":"Async","value":144896},{"major":7,"minor":0,"op":"Total","value":345600},{"major":253,"minor":5,"op":"Read","value":75264},{"major":253,"minor":5,"op":"Write","value":270336},{"major":253,"minor":5,"op":"Sync","value":200704},{"major":253,"minor":5,"op":"Async","value":144896},{"major":253,"minor":5,"op":"Total","value":345600},{"major":253,"minor":22,"op":"Read","value":252715008},{"major":253,"minor":22,"op":"Write","value":442368},{"major":253,"minor":22,"op":"Sync","value":356352},{"major":253,"minor":22,"op":"Async","value":252801024},{"major":253,"minor":22,"op":"Total","value":253157376}],"io_serviced_recursive":[{"major":7,"minor":0,"op":"Read","value":10},{"major":7,"minor":0,"op":"Write","value":66},{"major":7,"minor":0,"op":"Sync","value":49},{"major":7,"minor":0,"op":"Async","value":27},{"major":7,"minor":0,"op":"Total","value":76},{"major":253,"minor":5,"op":"Read","value":10},{"major":253,"minor":5,"op":"Write","value":66},{"major":253,"minor":5,"op":"Sync","value":49},{"major":253,"minor":5,"op":"Async","value":27},{"major":253,"minor":5,"op":"Total","value":76},{"major":253,"minor":22,"op":"Read","value":6154},{"major":253,"minor":22,"op":"Write","value":133},{"major":253,"minor":22,"op":"Sync","value":112},{"major":253,"minor":22,"op":"Async","value":6175},{"major":253,"minor":22,"op":"Total","value":6287}],"io_queue_recursive":[],"io_service_time_recursive":[],"io_wait_time_recursive":[],"io_merged_recursive":[],"io_time_recursive":[],"sectors_recursive":[]},"num_procs":0,"storage_stats":{},"cpu_stats":{"cpu_usage":{"total_usage":188759585845,"percpu_usage":[26804947862,25200091098,22004237991,24984726915,23786726201,23685128621,21764745189,20528981968],"usage_in_kernelmode":19020000000,"usage_in_usermode":158840000000},"system_cpu_usage":154732990610000000,"throttling_data":{"periods":0,"throttled_periods":0,"throttled_time":0}},"precpu_stats":{"cpu_usage":{"total_usage":188758618041,"percpu_usage":[26804852383,25200048481,22004237991,24984541388,23786214859,23684995782,21764745189,20528981968],"usage_in_kernelmode":19020000000,"usage_in_usermode":158840000000},"system_cpu_usage":154732982630000000,"throttling_data":{"periods":0,"throttled_periods":0,"throttled_time":0}},"memory_stats":{"usage":944427008,"max_usage":1055326208,"stats":{"active_anon":553730048,"active_file":127234048,"cache":255229952,"hierarchical_memory_limit":9223372036854771712,"hierarchical_memsw_limit":9223372036854771712,"inactive_anon":135467008,"inactive_file":127995904,"mapped_file":25931776,"pgfault":3112345,"pgmajfault":356,"pgpgin":1235716,"pgpgout":1123695,"rss":689197056,"rss_huge":404750336,"swap":12288,"total_active_anon":553730048,"total_active_file":127234048,"total_cache":255229952,"total_inactive_anon":135467008,"total_inactive_file":127995904,"total_mapped_file":25931776,"total_pgfault":3112345,"total_pgmajfault":356,"total_pgpgin":1235716,"total_pgpgout":1123695,"total_rss":689197056,"total_rss_huge":404750336,"total_swap":12288,"total_unevictable":0,"unevictable":0},"limit":25112842240},"name":"/message-maker.1.ylis21scj6qta08x58bfobuud","id":"24d81325a31ef73446683f0ee65993290c1b96a781c838516820fcb52e6c674a","networks":{"eth0":{"rx_bytes":3994,"rx_packets":53,"rx_errors":0,"rx_dropped":0,"tx_bytes":648,"tx_packets":8,"tx_errors":0,"tx_dropped":0},"eth1":{"rx_bytes":2676,"rx_packets":34,"rx_errors":0,"rx_dropped":0,"tx_bytes":648,"tx_packets":8,"tx_errors":0,"tx_dropped":0},"eth2":{"rx_bytes":10262957,"rx_packets":19847,"rx_errors":0,"rx_dropped":0,"tx_bytes":3551982,"tx_packets":33360,"tx_errors":0,"tx_dropped":0}}}

I'm still not sure how Splunk handles arrays.
For example:

    "percpu_usage":[26804947862,25200091098,22004237991,24984726915,23786726201,23685128621,21764745189,20528981968]

niketn
Legend

@Rialf1959, to help us assist you better, could you mock up some sample data (one or more JSON events) and share it with us?

Most likely your numeric fields have leading or trailing spaces or special characters. Please verify. If it is spaces, you can try the following eval with trim() to get rid of them.

| eval YourFieldName=trim(YourFieldName)

Please try it out and confirm. For further assistance, please provide us with sample data (mock/anonymize any sensitive information before posting it here).
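To see the idea in isolation, a self-contained test with makeresults (no indexed data needed; the field names raw, padded, and clean are made up for this sketch) could be:

```spl
| makeresults
| eval raw=" 4711 "
| eval padded=raw, clean=tonumber(trim(raw))
| eval clean_plus_one=clean+1
| table padded, clean, clean_plus_one
```

Comparing padded and clean shows whether stray whitespace is what is blocking your arithmetic.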

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

Rialf1959
Explorer

Please see the raw data sample below. The values in the JSON are unquoted.

EDIT:

eval YourFieldName=trim(cpu_stats.cpu_usage.total_usage) - not working
eval YourFieldName=trim('cpu_stats.cpu_usage.total_usage') - works OK

EDIT2:
It seems that Splunk does not handle JSON arrays well.
For example, the expected values in percpu_usage{} should be 26804947862,25200091098,22004237991,24984726915,23786726201,23685128621,21764745189,20528981968, not percpu_usage{} = 26804947862

     "percpu_usage":[26804947862,25200091098,22004237991,24984726915,23786726201,23685128621,21764745189,20528981968]
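Once that array surfaces as a multivalue field, it can be expanded to one row per CPU and aggregated. A sketch, assuming the percpu_usage{} field name that Splunk generates for this JSON:

```spl
| rename "cpu_stats.cpu_usage.percpu_usage{}" AS percpu
| mvexpand percpu
| eval percpu = tonumber(percpu)
| stats avg(percpu) AS avg_percpu_usage, max(percpu) AS max_percpu_usage
```

The rename avoids the quoting headaches with braces and dots in later commands.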

xavierashe
Contributor

Also, have you tried setting the CHARSET in props.conf?


xavierashe
Contributor

Are the numbers integers, or do they have any punctuation?


Rialf1959
Explorer

Yes, all of them are integers.


xavierashe
Contributor

I know you asked how to fix this globally, but I figured I'd mention that you can fix it at search time:

search | convert num(fieldtoconvert)
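If you prefer eval, tonumber() does the same per-field conversion, and the dotted names from this thread still need single quotes. A sketch using the total_usage field from the sample event (total is a made-up result field):

```spl
| eval total = tonumber('cpu_stats.cpu_usage.total_usage')
| where total > 0
```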


Rialf1959
Explorer

Single quotes did the trick...

| where 'cpu_stats.cpu_usage.total_usage' < 0 = OK
| where cpu_stats.cpu_usage.total_usage < 0 = NOK
