All Apps and Add-ons

Hunk Tutorial - No Results Found

fblau
Explorer

I am getting this error in Hunk:

03-11-2014 08:45:08.354 ERROR SearchOperator:stdin - Cannot consume data with unset stream_type
03-11-2014 08:45:08.355 ERROR ResultProvider - Error in 'SearchOperator:stdin': Cannot consume data with unset stream_type

And no results are returning.

This is using the Hunk Tutorial settings.

1 Solution

apatil_splunk
Splunk Employee

Can you please check which Hadoop version you selected while creating the provider in the UI?
In the Hadoop version drop-down, is Hadoop 2.x (MRv1) selected?


fblau
Explorer

BINGO! Changing it to 2.x worked!


fblau
Explorer

1.x is currently selected.



fblau
Explorer

The file is too large to post. Here is the error section:

03-11-2014 10:43:04.242 ERROR SearchOperator:stdin - Cannot consume data with unset stream_type
03-11-2014 10:43:04.242 ERROR ResultProvider - Error in 'SearchOperator:stdin': Cannot consume data with unset stream_type

nhaddadkaveh_sp
Splunk Employee

Please paste your search.log here.


nhaddadkaveh_sp
Splunk Employee

You need to change vix.fs.default.name = hdfs://localhost/8020 to vix.fs.default.name = hdfs://localhost:8020.
Also make sure your data is in HDFS under /data.
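The two URIs are easy to confuse at a glance. A standard URI parser shows why the slash form fails: "/8020" is read as a path component, so no port is set at all. (This is a generic illustration with Python's urllib, not Hunk's own parsing.)

```python
from urllib.parse import urlparse

# Broken form: "/8020" becomes a path, so the NameNode port is missing
broken = urlparse("hdfs://localhost/8020")
print(broken.hostname, broken.port, broken.path)   # localhost None /8020

# Corrected form: ":8020" is read as the port
fixed = urlparse("hdfs://localhost:8020")
print(fixed.hostname, fixed.port, fixed.path)      # localhost 8020 (empty path)
```

With no port in the URI, the client cannot reach the NameNode on 8020, which is consistent with the search returning no results.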


fblau
Explorer

Changed that. Still getting the same error.


fblau
Explorer

[cloudera@localhost local]$ cat indexes.conf
[provider:PonyProvider]
vix.env.HADOOP_HOME = /usr/lib/hadoop
vix.env.JAVA_HOME = /usr/java/jdk1.6.0_32
vix.family = hadoop
vix.fs.default.name = hdfs://localhost:8020
vix.mapred.job.tracker = localhost:8021
vix.splunk.home.hdfs = /user/root/splunkmr

[ponyindex]
vix.input.1.accept = \.gz$
vix.input.1.path = /data/...
vix.provider = PonyProvider
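Since these stanzas are plain INI, a short script can catch both issues discussed in this thread before restarting Splunk. This is a hypothetical sanity check written for this post, not a Hunk utility; the stanza and setting names are copied from the config above:

```python
import configparser
import re
from urllib.parse import urlparse

# Stanzas copied from the post (with the corrected NameNode URI)
SAMPLE = r"""
[provider:PonyProvider]
vix.family = hadoop
vix.fs.default.name = hdfs://localhost:8020

[ponyindex]
vix.input.1.accept = \.gz$
vix.provider = PonyProvider
"""

cp = configparser.ConfigParser()
cp.read_string(SAMPLE)

# 1. The NameNode URI must carry an explicit port (colon, not slash)
uri = cp["provider:PonyProvider"]["vix.fs.default.name"]
assert urlparse(uri).port is not None, "port missing -- '/' instead of ':'?"

# 2. The virtual index must point at a defined provider stanza,
#    and its accept pattern must be a valid regex
provider = cp["ponyindex"]["vix.provider"]
assert f"provider:{provider}" in cp, "vix.provider has no matching stanza"
re.compile(cp["ponyindex"]["vix.input.1.accept"])  # raises if malformed

print("indexes.conf checks passed")
```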


nhaddadkaveh_sp
Splunk Employee

You have a slash before 8020, but you need a colon (:8020).


fblau
Explorer

Can you clarify that? Those are the same 🙂


fblau
Explorer

PS: there is a backslash before the .gz that this webpage somehow doesn't show.


fblau
Explorer

Here you go:

[provider:PonyProvider]
vix.env.HADOOP_HOME = /usr/lib/hadoop
vix.env.JAVA_HOME = /usr/java/jdk1.6.0_32
vix.family = hadoop
vix.fs.default.name = hdfs://localhost/8020
vix.mapred.job.tracker = localhost:8021
vix.splunk.home.hdfs = /user/root/splunkmr

[ponyindex]
vix.input.1.accept = \.gz$
vix.input.1.path = /data/...
vix.provider = PonyProvider


nhaddadkaveh_sp
Splunk Employee

I need to see your provider and virtual index definitions. I am guessing they are in this one:
/home/cloudera/splunk/etc/apps/search/local/indexes.conf (if you haven't moved it under your app).


fblau
Explorer

I am using the Cloudera Quickstart VM.
There are several indexes.conf files... which one do you want?
/home/cloudera/splunk/etc/apps/sample_app/default/indexes.conf
/home/cloudera/splunk/etc/apps/SplunkLightForwarder/default/indexes.conf
/home/cloudera/splunk/etc/apps/search/local/indexes.conf
/home/cloudera/splunk/etc/system/default/indexes.conf
/home/cloudera/splunk/etc/master-apps/_cluster/default/indexes.conf


nhaddadkaveh_sp
Splunk Employee

Can you copy your indexes.conf in here? What version of Hadoop are you using?
