I have configured the AMQP modular data input to consume data from a RabbitMQ queue. I have increased the maximum heap from 64M to 1024M and still hit the heap error below, after which streaming to Splunk stops until the data input is disabled and then re-enabled.
Are we hitting a memory leak? Is there a way to ensure the data keeps streaming in without hitting the heap issue?
ERROR ExecProcessor - message from "python /local/mnt/splunk/etc/apps/amqp_ta/bin/amqp.py" Exception in thread "Thread-3" java.lang.OutOfMemoryError: GC overhead limit exceeded
You also emailed me, so I am copy/pasting that reply here:
Excessive memory use is not the same as a memory leak.
Allocate more memory.
As per the docs :
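As a rough sketch of what "allocate more memory" can look like for a JVM-backed consumer: `-Xmx` is the standard HotSpot flag for the maximum heap, and `-XX:-UseGCOverheadLimit` disables the specific "GC overhead limit exceeded" check seen in the error above. Note that the variable name `JAVA_OPTS` and how the TA actually passes options to its JVM are assumptions here, not taken from the app's docs:

```shell
# Hypothetical example only: JAVA_OPTS and where the TA reads it are assumptions.
# -Xmx2048m raises the max heap to 2 GB; -XX:-UseGCOverheadLimit turns off the
# "GC overhead limit exceeded" abort (both are standard HotSpot JVM flags).
JAVA_OPTS="-Xmx2048m -XX:-UseGCOverheadLimit"
echo "$JAVA_OPTS"
```

Disabling the GC overhead limit only hides the symptom; if memory use keeps growing, raising the heap or bounding how much the consumer buffers is the real fix.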
Hello Damien,
Can you please check the picture? I cannot see it, and I also need a resolution for this.