Here is a link to the docs on license violations. Splunk does not stop indexing data, but it does not let you search the data once you have exceeded three violations on a free license. I believe this resets after 30 days with no violations, but check the manual. If you are still having trouble, go to splunk.com and file a support ticket.
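As a rough illustration of the rule described above (a rolling 30-day window, search disabled once you exceed three violations on a free license — the exact reset behavior may differ by version, so treat this as a sketch, not a spec):

```python
from datetime import datetime, timedelta

# Illustrative only: mirrors the free-license rule described above.
# Search is disabled once MORE than 3 violations fall inside a rolling
# 30-day window; violations older than 30 days no longer count.
MAX_FREE_VIOLATIONS = 3
WINDOW = timedelta(days=30)

def search_disabled(violation_dates, now):
    """Return True if search would be disabled as of `now`."""
    recent = [d for d in violation_dates if now - d <= WINDOW]
    return len(recent) > MAX_FREE_VIOLATIONS
```

So violations from a few months back should have aged out of the window, and search should be available again.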
Thanks for your response. I tried this, and I do have license violation issues, but that was a few months back. Could you please advise: if this is due to a license violation, how long will it take before I start getting events back?
If you are running an Enterprise trial license, you only get three violations. If you are running full Enterprise, you are allowed five violations in a 30-day period.
Run this search to see your violation status:
index=_internal source=*license_audit.log |
eval MB_indexed_today = round(todaysBytesIndexed / (1024 * 1024),1) |
table _time log_level quotaExceededCount lastExceededDate MB_indexed_today
Run this search to see the status of your forwarders:
index="_internal" source="*metrics.log" group=tcpin_connections |
addinfo |
eval sourceHost=if(isnull(hostname), sourceHost, hostname) |
eval connectionType=case(fwdType=="uf","Universal Forwarder", fwdType=="lwf","Lightweight Forwarder", fwdType=="full","Heavy Forwarder", connectionType=="cooked" or connectionType=="cookedSSL","Splunk Forwarder", connectionType=="raw" or connectionType=="rawSSL","Legacy Forwarder") |
eval build=if(isnull(build),"n/a",build) |
eval version=if(isnull(version),"pre 4.2",version) |
eval guid=if(isnull(guid),sourceHost,guid) |
eval os=if(isnull(os),"n/a",os) |
eval arch=if(isnull(arch),"n/a",arch) |
eval my_splunk_server = splunk_server |
fields connectionType sourceIp sourceHost sourcePort destPort kb tcp_eps tcp_Kprocessed tcp_KBps my_splunk_server build version os arch info_max_time |
eval lastReceived = if(kb>0, _time, null()) |
stats first(sourceIp) as sourceIp first(connectionType) as connectionType first(sourcePort) as sourcePort first(build) as build first(version) as version first(os) as os first(arch) as arch max(_time) as lastConnected max(lastReceived) as lastReceived sum(kb) as kb avg(tcp_eps) as avg_eps max(info_max_time) as info_max_time by sourceHost |
stats first(sourceIp) as sourceIp first(connectionType) as connectionType first(sourcePort) as sourcePort first(build) as build first(version) as version first(os) as os first(arch) as arch max(lastConnected) as lastConnected max(lastReceived) as lastReceived first(kb) as KB first(avg_eps) as eps max(info_max_time) as info_max_time by sourceHost |
eval status = if(isnull(KB) or lastConnected<(info_max_time-900),"missing",if(lastConnected>(lastReceived+300) or KB==0,"quiet","active")) |
sort sourceHost
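The status logic in that final eval boils down to two thresholds: a forwarder is "missing" if it sent no data at all or last connected more than 900 seconds before the end of the search window, "quiet" if it is connected but has not actually sent events for over 300 seconds (or sent 0 KB), and "active" otherwise. The same classification as a small Python sketch (field names mirror the search; this is just an illustration of the eval, not anything Splunk ships):

```python
def forwarder_status(kb, last_connected, last_received, info_max_time):
    """Mirror of the search's status eval: 'missing', 'quiet', or 'active'.

    All times are epoch seconds; kb is None when the forwarder sent no data.
    """
    if kb is None or last_connected < info_max_time - 900:
        return "missing"   # no data at all, or no connection in the last 15 min
    if (last_received is not None and last_connected > last_received + 300) or kb == 0:
        return "quiet"     # connected, but not sending events
    return "active"
```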
(And yes, I took the second search from the Splunk Deployment Monitor...)
Maybe this will help you see the problem. Run the searches over at least the last 7 days. Last 30 days might be even better.
Also, we have license violation issues going on with this Splunk server.
1. I have stopped a couple of Splunk forwarder clients on my network.
2. It happened a few weeks back too, but that time we just restarted Splunk and it started working again. After that, though, it went down again.
Please advise which log I should check to see whether I am getting events.
We have an Enterprise license with a license volume of 500 MB.
Version 4.0.4, build 67724.
I have around 100 hosts set up for forwarding.
Everything looks good, but running # netstat shows connections stuck in "FIN_WAIT_2" and "CLOSE_WAIT".
Ethereal (tshark) trace:
tshark -i bnx0 port 9997
Capturing on bnx0
0.000000 192.10.21.12 -> 192.10.21.121 TCP 40711 > palace-6 [ACK] Seq=1 Ack=1 Win=1460 Len=0 TSV=3020080923 TSER=789430684
0.000034 192.10.21.12 -> 192.10.21.121 TCP [TCP ZeroWindow] [TCP ACKed lost segment] palace-6 > 40711 [ACK] Seq=1 Ack=2 Win=0 Len=0 TSV=789442682 TSER=3019755295
0.267301 192.10.21.12 -> 192.10.21.121 TCP 34658 > palace-6 [ACK] Seq=1 Ack=1 Win=46 Len=0 TSV=2742660555 TSER=789430709
0.267310 192.10.21.12 -> 192.10.21.121 TCP [TCP ZeroWindow] [TCP ACKed lost segment] palace-6 > 34658 [ACK] Seq=1 Ack=2 Win=0 Len=0 TSV=789442709 TSER=2742548382
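The [TCP ZeroWindow] frames above mean the receiver has advertised a zero-byte window, i.e. its socket buffer is full because the process is not reading from it; combined with the CLOSE_WAIT and FIN_WAIT_2 sockets in netstat, this suggests splunkd on the indexer is not draining its receiving port. To watch whether such connections are piling up, here is an illustrative Python helper that counts TCP states on the splunktcp port in `netstat -an`-style output (column layout varies by OS, so treat the parsing as a sketch):

```python
from collections import Counter

def count_states(netstat_output, port=9997):
    """Count TCP connection states involving the given port
    in `netstat -an`-style output (Linux layout assumed)."""
    states = Counter()
    for line in netstat_output.splitlines():
        parts = line.split()
        # Expect: proto recv-q send-q local-addr remote-addr state
        if len(parts) >= 6 and parts[0].startswith("tcp"):
            local, remote, state = parts[3], parts[4], parts[5]
            if (local.endswith(f":{port}") or local.endswith(f".{port}")
                    or remote.endswith(f":{port}") or remote.endswith(f".{port}")):
                states[state] += 1
    return states
```

A growing CLOSE_WAIT count on port 9997 over time would back up the zero-window picture from the trace.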
Could you provide us more information? What version of Splunk are you running, and on what platform? How many forwarders are sending data to Splunk? What type of forwarders are they? Are you using the free version or the Enterprise version?
First, check whether your Splunk server's own logs are getting indexed in Splunk. If they are, check network connectivity from the forwarders to the server (especially if you have firewalls in between). If connectivity is fine, check the forwarder configuration and logs for any errors.
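For the connectivity step, a quick way to test from the forwarder host whether the indexer's receiving port is reachable (9997 is the usual splunktcp port; the hostname below is a placeholder):

```python
import socket

def can_reach(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder host): can_reach("indexer.example.com", 9997)
```

If this returns False from the forwarder but True from the indexer itself, look at firewalls in between; if it returns True everywhere, move on to the forwarder's outputs configuration and splunkd.log.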