
Splunk DB Connect App not putting data in Splunk index

raidercom
Communicator

Hi:
I'm trying to get the Splunk DB Connect app to pull data from an Oracle database into Splunk.

Working:

Database Connection

DB Info

DB Query with the SQL statement I'm using

Not Working:
When I go to set up the query to actually pull data into a Splunk index, I cannot seem to get anything from the query to go into the index.

What I've done:

Created a new index for the data

Specified the SQL query in a new database input

Scheduled query

New index in the Splunk app & new index in the dbx app

Every 1/2 hour & auto

dump (not tailed, wanted to keep it simple to start)

no source type & tried with a new sourcetype

Output formatting:

Key-Value format

Output timestamp with proper field and value & no output timestamp specified

After specifying the new index and before restarting Splunk, I've gotten this message, with different variations on the information contained within based on what I've chosen in the query (it has seemed to go away after restarting Splunk):

Search peer has the following message: received event for unconfigured/disabled/deleted index='dbxtest' with source='source::dbmon-dump://' host='host::' sourcetype='sourcetype::dbmon:kv' (3 missing total)

Things I have not tried:

Custom source type, don't see the point

Look-up table, again, don't see the point

Maybe I'm looking at this the wrong way, but I'm trying to test whether there is data in the index by searching for index=dbxtest with the time range set to All time from within the Splunk dbx app.
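
In other words, something like this from the search bar, with the time range set to All time (just a sketch of the kind of search I mean):

index=dbxtest | stats count by sourcetype, source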

Thank you for any information/assistance you could provide.


lukejadamec
Super Champion

Try configuring your Oracle input like this:

Input Type: Tail

Database: Select the database

SQL Query:

SELECT * FROM enterqueryname

{{WHERE $rising_column$ > ?}} Note: enter this statement exactly as written - no substitutes.

Rising Column: enter a timestamp column in the correct case

sourcetype: dbmon:kv

Index: enteryourindexname

Host field value: enterhostname

Output timestamp: check, and enter timestamp column name entered above

Timestamp format: yyyy-MMM-dd HH:mm:ss.SSS

Interval: 0/2 * * * * Note: that is cron for every two minutes, good for testing.

If you think you have everything right and it is still not working, try restarting Splunk.

Monitor the splunkd logs for problems.
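
For reference, that should end up in the dbx app's inputs.conf as a dbmon-tail stanza roughly like the sketch below (a sketch only - the database name, input name, rising/timestamp column, and index are placeholders for yours):

# sketch only - substitute your own database, input name, column, and index
[dbmon-tail://yourdatabase/yourinputname]
host = enterhostname
index = enteryourindexname
interval = 0/2 * * * *
output.format = kv
output.timestamp = 1
output.timestamp.column = yourtimestampcolumn
output.timestamp.format = yyyy-MMM-dd HH:mm:ss.SSS
query = SELECT * FROM enterqueryname {{WHERE $rising_column$ > ?}}
sourcetype = dbmon:kv
tail.rising.column = yourtimestampcolumn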


lukejadamec
Super Champion

For your setup you should probably match the output timestamp format and the output timestamp parse format to the format your to_char conversion produces.


raidercom
Communicator

Getting a different error related to the previous error: dbx5932:WARN:ResultSetOutputPayload - Error parsing timestamp '2013-08-19 15:22:35' with the given format 'java.text.SimpleDateFormat@ff7867f8': java.text.ParseException: Unparseable date: "2013-08-19 15:22:35"

Doing the conversion on the Splunk end because I didn't think that it would be able to handle:
2010-07-28 09:42:33 == 1280324553.432
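
If I'm reading the Java pattern letters right, MMM expects an abbreviated month name (like Aug) while these values have a numeric month, so my guess is the format that matches a value like 2013-08-19 15:22:35 would be more along the lines of:

output.timestamp.format = yyyy-MM-dd HH:mm:ss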


lukejadamec
Super Champion

If redeem_id is the rising column, then use that. I just assumed you were using the timestamp.
I just checked with my DBA; my mistake. We are not converting with to_char. We convert the timestamp at the Splunk end with output.timestamp.format=yyyy-MMM-dd HH:mm:ss.SSS. We do not use output.timestamp.parse.format. Sorry for the confusion.


raidercom
Communicator

Why use initial date in the rising column field?
The only field that increments will be the redeem_id. Everything else can be duplicated or fluctuate depending on when the data was input into the database.
I'm also not sure why the date should be specified in the format:
yyyy-MMM-dd HH:mm:ss.SSS
When the format in the SQL query is YYYY-MM-DD HH24:MI:SS

I'm running another test with:
output.timestamp.parse.format=yyyy-MMM-dd HH:mm:ss.SSS in the inputs.conf in the dbx app
and with the format for the date as:
yyyy-MMM-dd HH:mm:ss.SSS


lukejadamec
Super Champion

Use yyyy-MMM-dd HH:mm:ss.SSS as your timestamp format at the bottom of the input, and don't change your to_char format.
Use InitialDate in the rising column field, but don't change the variable name "$rising_column$".
You may need to use output.timestamp.parse.format=yyyy-MMM-dd HH:mm:ss.SSS in your inputs.conf (it has to be added by hand; there is no way to do it with the GUI).
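
In other words, the dbmon stanza would carry lines roughly like this (a sketch; the stanza name and column are whatever you configured):

[dbmon-dump:///DBXTest3]
output.timestamp.column = InitialDate
output.timestamp.format = yyyy-MMM-dd HH:mm:ss.SSS
output.timestamp.parse.format = yyyy-MMM-dd HH:mm:ss.SSS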


raidercom
Communicator

I've switched it to a much simpler query, and I'm getting a weird error on the timestamp:

Timestamp format:
yyyy-MM-dd HH:mm:ss

Query:
SELECT to_char(r.INITIAL_DATE,'YYYY-MM-DD HH24:MI:SS') AS "InitialDate", r.REDEEM_ID, r.PDC_USER_ID FROM PDC.REDEEM r {{WHERE $rising_column$ > ?}}

Error:
dbx6516:WARN:ResultSetOutputPayload - Unrecognized timestamp format: '2013-08-01 09:24:05'
host=tps-splunkweb1.points.com sourcetype=dbx_debug source=/opt/splunk/var/log/splunk/dbx.log


lukejadamec
Super Champion

Try a simple query.
I'd like to see this solved before it starts giving me nightmares 🙂


raidercom
Communicator

When I try a tail, I cannot get the query to work at all; it just says that the SQL is invalid. I've worked with our DBA, and it seems to be related to multiple WHERE clauses, since the select we are using joins multiple tables.
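
The only workaround I can think of is wrapping the join in an inline view so the tail template only has to append one outer WHERE, something like the sketch below (the second table and join condition are made up to stand in for our real ones, and the alias case may need adjusting so the rising column matches):

SELECT *
FROM (
    SELECT to_char(r.INITIAL_DATE,'YYYY-MM-DD HH24:MI:SS') AS "InitialDate",
           r.REDEEM_ID,
           r.PDC_USER_ID
    FROM PDC.REDEEM r
    JOIN PDC.SOME_OTHER_TABLE o ON o.REDEEM_ID = r.REDEEM_ID -- hypothetical join
) t
{{WHERE $rising_column$ > ?}}

I haven't proven that out yet, though.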


lukejadamec
Super Champion

I'm assuming this site screwed up your interval value.
The output.timestamp.column should be InitialDate, not Timestamp. I don't know why that is not throwing errors. Perhaps because it is a dump and not a tail.


raidercom
Communicator

The first part of the select query is SELECT to_char(r.INITIAL_DATE,'YYYY-MM-DD HH24:MI:SS') AS "InitialDate",


lukejadamec
Super Champion

Change the output.timestamp.format to yyyy-MMM-dd HH:mm:ss.SSS
And make sure the Timestamp column is listed first in your query.


raidercom
Communicator

Crud, was looking at the wrong file.

It contains (some information replaced with <>):

[script://$SPLUNK_HOME/etc/apps/dbx/bin/jbridge_server.py]
disabled = 0

[batch://$SPLUNK_HOME/var/spool/dbmon/*.dbmonevt]
crcSalt =
disabled = 0
move_policy = sinkhole
sourcetype = dbmon:spool

[dbmon-dump:///DBXTest3]
host =
index = dbxtest
interval = 20 * * * *
output.format = kv
output.timestamp = 1
output.timestamp.column = Timestamp
output.timestamp.format = yyyy-MMM-dd HH:mm:ss
query = SELECT
sourcetype = dbmon:kv
table = DBXTest3


lukejadamec
Super Champion

grep the apps directory for the db input. Make sure it only exists once, and post the settings if you can.
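
Something like this should show every place the input is defined (adjust the path to your install):

grep -r "DBXTest3" $SPLUNK_HOME/etc/apps/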


raidercom
Communicator

No.
The only thing in my dbx/local/inputs.conf is the lines specifying the index:

[dbxtest]
coldPath = $SPLUNK_DB/dbxtest/colddb
homePath = $SPLUNK_DB/dbxtest/db
thawedPath = $SPLUNK_DB/dbxtest/thaweddb


lukejadamec
Super Champion

Do you have a stanza like this in your dbx\local\inputs.conf?

[batch://$SPLUNK_HOME/var/spool/dbmon/*.dbmonevt]
crcSalt =
disabled = 0
move_policy = sinkhole
sourcetype = dbmon:spool


raidercom
Communicator

The app version is the same, the OS and Splunk version are different:

Actual:
CentOS 6.4
Splunk 5.0.2 (build 149561)
Splunk DB Connect dbx 1.0.11

Test:
Windows Server 2003 Standard
Splunk 5.0.1 (build 143156)
Splunk DB Connect dbx 1.0.11


lukejadamec
Super Champion

What version of Splunk and DB Connect are you running? Is it the same as your test box?


raidercom
Communicator

Hmm... Splunkd.log and the other splunkd.log* files don't mention anything like that.
Does that mean that the app wasn't installed correctly in the first place?
-Jeff


lukejadamec
Super Champion

In splunkd.log you should be seeing TailingProcessor messages for \var\spool\dbmon\*.dbmonevt.
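
One quick way to check is a search along these lines (a rough sketch):

index=_internal sourcetype=splunkd component=TailingProcessor dbmon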
