
How to define the Datetime Format on Splunk DB Connect?

edrivera3
Builder

I'm creating inputs for multiple SQLite tables. I have already made 8 different inputs and all of them worked as expected, but this new one is not indexing anything.
I suspect that the problem is related to the Datetime Format, which is the only difference between them.

I have a very simple query:  SELECT * FROM stepHistory WHERE id > ? ORDER by id asc;
Input  Type:  Rising
Rising Column: id
Checkpoint Value:  0
Column: timestamp
Datetime Format: EEE MMM d HH:mm:ss yyyy
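
For context, a rising input re-runs this query on each cycle with the saved checkpoint bound to the ? placeholder, and then advances the checkpoint to the highest id it read. With the initial checkpoint of 0, the first run is effectively (illustrative only):

SELECT * FROM stepHistory WHERE id > 0 ORDER BY id ASC;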

Example of timestamp:
Thu Mar 8 02:05:00 2018
Wed Feb 28 20:16:04 2018

I'm able to do "Execute SQL" correctly and I see that it is retrieving the correct data.

Max Rows to Retrieve: 0
Fetch Size: 300
Execution Frequency: 17 * * * *

Is there something that I'm missing?

I looked at the index and it is empty. I did a search with just the index name and got the following:

2018-04-02 09:17:00.043 -0700  [QuartzScheduler_Worker-30] INFO  org.easybatch.core.job.BatchJob - Job 'atedb_stephistory' finished with status: FAILED

2018-04-02 09:17:00.043 -0700 INFO  c.s.dbx.server.task.listeners.JobMetricsListener - action=collect_job_metrics connection=atedb jdbc_url=null record_read_success_count=111 db_read_time=33 record_read_error_count=1 hec_record_process_time=8 format_hec_success_count=111 status=FAILED input_name=atedb_stephistory batch_size=1000 error_threshold=N/A is_jmx_monitoring=false start_time=2018-04-02_09:17:00 end_time=2018-04-02_09:17:00 duration=43 read_count=111 write_count=0 filtered_count=0 error_count=0

2018-04-02 09:17:00.006 -0700  [QuartzScheduler_Worker-30] INFO  org.easybatch.core.job.BatchJob - Job 'atedb_stephistory' started
2018-04-02 09:17:00.005 -0700  [QuartzScheduler_Worker-30] INFO  com.splunk.dbx.connector.logger.AuditLogger - operation=dbinput connection_name=atedb stanza_name=atedb_stephistory state=success sql='SELECT *
FROM stepHistory
WHERE id > ?
ORDER BY  id asc; '

2018-04-02 09:17:00.001 -0700  [QuartzScheduler_Worker-30] INFO  c.s.d.s.dbinput.recordreader.DbInputRecordReader - action=db_input_record_reader_is_opened input_task="atedb_stephistory" query=SELECT *
FROM stepHistory
WHERE id > ?
ORDER BY  id asc; 

2018-04-02 09:17:00.000 -0700  [QuartzScheduler_Worker-30] INFO  org.easybatch.core.job.BatchJob - Job 'atedb_stephistory' starting

Edit:
After looking more closely at the events, the error is on row 112 ("record_read_success_count=111").

As I expected, the issue is related to the timestamp. Note that there are two spaces between "Mar" and "7" in that row. I'm not sure how to handle this.

111 | Thu Feb 15 17:49:21 2018
112 | Wed Mar  7 20:11:45 2018
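
Since the Datetime Format configured here is a Java-style date pattern, the behavior can be reproduced outside Splunk. The sketch below uses plain java.time purely as an illustration (DB Connect's internal parser may differ): the single-space pattern rejects the space-padded day in row 112, while a pad modifier ("pp") accepts both row layouts.

import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;
import java.util.Locale;

public class TimestampCheck {
    public static void main(String[] args) {
        // Pattern as configured in the input: one literal space before the day.
        DateTimeFormatter singleSpace =
                DateTimeFormatter.ofPattern("EEE MMM d HH:mm:ss yyyy", Locale.ENGLISH);
        // "ppd" pads the day of month to two characters with a space,
        // so a space-padded " 7" parses as well as a two-digit "15".
        DateTimeFormatter padded =
                DateTimeFormatter.ofPattern("EEE MMM ppd HH:mm:ss yyyy", Locale.ENGLISH);

        String row111 = "Thu Feb 15 17:49:21 2018";
        String row112 = "Wed Mar  7 20:11:45 2018";  // two spaces before the 7

        System.out.println(LocalDateTime.parse(row111, singleSpace)); // ok
        System.out.println(LocalDateTime.parse(row111, padded));      // ok
        System.out.println(LocalDateTime.parse(row112, padded));      // ok
        try {
            LocalDateTime.parse(row112, singleSpace);
        } catch (DateTimeParseException e) {
            // The day-of-month parser hits the extra space and fails,
            // which mirrors the record that makes the DB Connect job fail.
            System.out.println("row 112 rejected: " + e.getMessage());
        }
    }
}

Whether DB Connect's Datetime Format field accepts the pad modifier is not something this proves; it only shows why the single-space pattern stops exactly at row 112.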

I'm changing the title of the question from "Unable to make a DB connect input" to "How to define the Datetime Format on Splunk DB Connect?"


mpipaliya_splun
Splunk Employee

There seems to be an issue with the date/time format. See if the Datetime Format %a %b %e %H:%M:%S %Y works for "Wed Mar  7 20:11:45 2018".

If the suggested format does not solve the issue, please share the contents of inputs.conf and props.conf so we can analyze further.
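
For reference, %e is the strptime-style day of the month padded with a space rather than a zero, which is exactly the shape of row 112. If the timestamp ends up being recognized at index time, that strptime format would live in props.conf; a minimal sketch, assuming a placeholder sourcetype name:

# Hypothetical stanza; adjust the sourcetype name to match your input.
[your_dbx_sourcetype]
TIME_FORMAT = %a %b %e %H:%M:%S %Y
# TIME_PREFIX (and MAX_TIMESTAMP_LOOKAHEAD) must be set to point at wherever
# the timestamp column appears in the raw event as DB Connect writes it.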

For more details on date/time formats in Splunk, you may refer to the Common time format variables page in the docs.

Mayur @ Splunk | GSS FDSE | Personal: https://mayur.pipaliya.com/

If this reply helps you, Karma would be appreciated.