All Apps and Add-ons

Error while connecting Hunk to HA Hadoop (HA NameNode and HA ResourceManager)

rahulgaikwad198
New Member

Hi all,

I'm getting the error below while connecting Hunk to an HA Hadoop cluster:

BlockReaderFactory - I/O error constructing remote block reader.
java.net.ConnectException: Connection refused

Please let me know if you need more details, and please help me resolve this issue.

Log:

04-20-2016 04:09:30.605 INFO ERP.ha_poc - VirtualIndex$Splitter - generateSplits started, vix.name=emp_index ...
04-20-2016 04:09:31.320 INFO ERP.ha_poc - TimelineClientImpl - Timeline service address: http://hadoop1.poc.com:8188/ws/v1/timeline/
04-20-2016 04:09:31.351 INFO ERP.ha_poc - RMProxy - Connecting to ResourceManager at /XX.XX.XX.8:8050
04-20-2016 04:09:32.298 WARN ERP.ha_poc - SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.SplunkJournalRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.SplunkJournalRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/000000_0, regex=/journal.gz$.
04-20-2016 04:09:32.300 WARN ERP.ha_poc - SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.ValueAvroRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.ValueAvroRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/000000_0, regex=.avro$.
04-20-2016 04:09:32.301 WARN ERP.ha_poc - SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.SimpleCSVRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.SimpleCSVRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/000000_0, regex=.([tc]sv)(?:.(?:gz|bz2|snappy))?$.
04-20-2016 04:09:32.303 WARN ERP.ha_poc - SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.SequenceFileRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.SequenceFileRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/000000_0, regex=.seq$.
04-20-2016 04:09:32.313 INFO ERP.ha_poc - JobSubmitterInputFormat - using class=com.splunk.mr.input.SplunkLineRecordReader to process split=/apps/hive/warehouse/emp/000000_0:0+134217728
04-20-2016 04:09:32.473 WARN ERP.ha_poc - ResourceMgrDelegate - getBlacklistedTrackers - Not implemented yet
04-20-2016 04:09:32.475 INFO ERP.ha_poc - ClusterInfoLogger - Hadoop cluster spec: provider=ha_poc, tasktrackers=2, map_inuse=1, map_slots=20, reduce_inuse=1, reduce_slots=4
04-20-2016 04:09:32.598 INFO ERP.ha_poc - SplunkBaseMapper - using class=com.splunk.mr.input.SplunkLineRecordReader to process split=/apps/hive/warehouse/emp/000000_0:0+134217728
04-20-2016 04:09:32.878 WARN ERP.ha_poc - BlockReaderFactory - I/O error constructing remote block reader.
04-20-2016 04:09:32.878 WARN ERP.ha_poc - java.net.ConnectException: Connection refused
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.hdfs.DFSClient.newConnectedPeer(DFSClient.java:3454)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.hdfs.BlockReaderFactory.nextTcpPeer(BlockReaderFactory.java:777)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:694)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:355)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:618)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at java.io.DataInputStream.read(DataInputStream.java:149)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.mapreduce.lib.input.UncompressedSplitLineReader.fillBuffer(UncompressedSplitLineReader.java:59)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.util.LineReader.readDefaultLine(LineReader.java:216)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.util.LineReader.readLine(LineReader.java:174)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.mapreduce.lib.input.UncompressedSplitLineReader.readLine(UncompressedSplitLineReader.java:91)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.skipUtfByteOrderMark(LineRecordReader.java:144)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.nextKeyValue(LineRecordReader.java:184)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.input.SplunkLineRecordReader.nextKeyValue(SplunkLineRecordReader.java:39)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.SplunkBaseMapper.doStream(SplunkBaseMapper.java:410)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.SplunkBaseMapper.stream(SplunkBaseMapper.java:375)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.SplunkBaseMapper.stream(SplunkBaseMapper.java:331)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.SplunkMR$SearchHandler.streamData(SplunkMR.java:644)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.SplunkMR$SearchHandler$1.accept(SplunkMR.java:656)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.SplunkMR$SearchHandler$1.accept(SplunkMR.java:653)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.input.FileSplitGenerator.sendSplitToAcceptor(FileSplitGenerator.java:28)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.input.FileSplitGenerator.generateSplits(FileSplitGenerator.java:79)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.input.VirtualIndex$FileSplitter.accept(VirtualIndex.java:1418)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.input.VirtualIndex$FileSplitter.accept(VirtualIndex.java:1396)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.input.VirtualIndex$VIXPathSpecifier.addStatus(VirtualIndex.java:576)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.input.VirtualIndex$VIXPathSpecifier.listStatus(VirtualIndex.java:609)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.input.VirtualIndex$Splitter.generateSplits(VirtualIndex.java:1566)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.input.VirtualIndex.generateSplits(VirtualIndex.java:1485)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.input.VirtualIndex.generateSplits(VirtualIndex.java:1437)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.input.VixSplitGenerator.generateSplits(VixSplitGenerator.java:55)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.SplunkMR$SearchHandler.streamData(SplunkMR.java:674)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.SplunkMR$SearchHandler.executeImpl(SplunkMR.java:936)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.SplunkMR$SearchHandler.execute(SplunkMR.java:771)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.SplunkMR.runImpl(SplunkMR.java:1518)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.SplunkMR.run(SplunkMR.java:1300)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
04-20-2016 04:09:32.878 WARN ERP.ha_poc - at com.splunk.mr.SplunkMR.main(SplunkMR.java:1546)
04-20-2016 04:09:32.884 WARN ERP.ha_poc - DFSInputStream - Failed to connect to /XX.XX.XX.XX:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection refused
04-20-2016 04:09:32.884 WARN ERP.ha_poc - java.net.ConnectException: Connection refused
[... identical stack trace omitted ...]
04-20-2016 04:09:32.886 WARN ERP.ha_poc - BlockReaderFactory - I/O error constructing remote block reader.
04-20-2016 04:09:32.886 WARN ERP.ha_poc - java.net.ConnectException: Connection refused
[... identical stack trace omitted ...]
04-20-2016 04:09:32.887 WARN ERP.ha_poc - DFSInputStream - Failed to connect to /XX.XX.XX.XX:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection refused
04-20-2016 04:09:32.887 WARN ERP.ha_poc - java.net.ConnectException: Connection refused


rdagan_splunk
Splunk Employee

It looks as if one of your two DataNodes might be unreachable or down. The key line is: "DFSInputStream - Failed to connect to /XX.XX.XX.XX:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection refused". Port 50010 is the default DataNode data-transfer port, so the HDFS client on the Hunk search head could not open a TCP connection to that DataNode. Check that the DataNode process is running and that the port is open between the search head and the DataNode hosts.
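To confirm whether each DataNode's data-transfer port is reachable from the Hunk search head, a quick TCP check like the sketch below may help (a minimal sketch, assuming Python is available on the search head; the hostnames in the usage comment are placeholders for your actual DataNodes, not taken from your config):

```python
import socket

def datanode_reachable(host, port=50010, timeout=5):
    """Return True if a TCP connection to host:port succeeds within timeout seconds."""
    try:
        # create_connection completes the full TCP handshake, so a refused or
        # filtered port raises an OSError (e.g. ConnectionRefusedError).
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example usage -- replace with your real DataNode hostnames:
# for host in ["datanode1.poc.com", "datanode2.poc.com"]:
#     print(host, datanode_reachable(host))
```

A node that returns False here matches the "Connection refused" in the log; on that host, verify the DataNode process is up and that no firewall blocks port 50010.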
