Getting Data In

HTTP Event Collector in Monitoring Kubernetes - not posting data

HyderAli
New Member

Hi,
I have gone through this tutorial https://www.outcoldsolutions.com/docs/monitoring-kubernetes/ for monitoring Kubernetes via Splunk, and have enabled the HTTP Event Collector:

curl -k https://hyder-hp:8088/services/collector/event -H 'Authorization: Splunk cf597434-503b-4dca-be82-aa03e828e1dc' -d '{"sourcetype": "mysourcetype", "event":"http auth ftw!"}'

Output : {"text":"Success","code":0}

Here are the modifications made to collectorforkubernetes.yaml:

.......
.......
# Splunk output
[output.splunk]

# Splunk HTTP Event Collector url
url = https://hyder-hp:8088/services/collector/event/1.0

# Splunk HTTP Event Collector Token
token = cf597434-503b-4dca-be82-aa03e828e1dc

# Allow invalid SSL server certificate
insecure = true

........
........

After successfully applying the .yaml file, the output is:
serviceaccount "collectorforkubernetes" created
clusterrole "collectorforkubernetes" created
clusterrolebinding "collectorforkubernetes" created
configmap "collectorforkubernetes" created
daemonset "collectorforkubernetes" created

kubectl get pods gives:

NAME READY STATUS RESTARTS AGE
collectorforkubernetes-dsc86 1/1 Running 0 4m

There is nothing showing up in the dashboard.

And here are the logs of the pod, from kubectl logs collectorforkubernetes-dsc86:

INFO 2017/12/13 04:28:23.365705 table.go:57: Found uncommitted write ahead log data/file_ackdb.tbl.l.
INFO 2017/12/13 04:28:23.511141 watcher.go:95: watching /rootfs/var/lib/docker/containers//(glob = /-json.log*, match = )
INFO 2017/12/13 04:28:23.511190 watcher.go:95: watching /rootfs/var/log//(glob = , match = ^(syslog|messages)(.\d+)?$)
INFO 2017/12/13 04:28:23.511201 watcher.go:95: watching /rootfs/var/log//(glob = , match = ^[\w]+.log(.\d+)?$)
INFO 2017/12/13 04:28:23.511478 watcher.go:150: added file /rootfs/var/lib/docker/containers/01d3854e9de693bf425b9faeb137aeed85fa468776c8dc59c3966b05f1a883cc/01d3854e9de693bf425b9faeb137aeed85fa468776c8dc59c3966b05f1a883cc-json.log
INFO 2017/12/13 04:28:23.511645 watcher.go:150: added file /rootfs/var/log/alternatives.log
INFO 2017/12/13 04:28:23.589434 watcher.go:150: added file /rootfs/var/log/syslog
INFO 2017/12/13 04:28:23.678119 watcher.go:150: added file /rootfs/var/lib/docker/containers/1781c6234f92e099efa40a2531f918d9af6709bdf1a0a1202afaa7e63d60660d/1781c6234f92e099efa40a2531f918d9af6709bdf1a0a1202afaa7e63d60660d-json.log
INFO 2017/12/13 04:28:23.769603 watcher.go:150: added file /rootfs/var/log/alternatives.log.1
INFO 2017/12/13 04:28:23.944464 watcher.go:150: added file /rootfs/var/log/syslog.1
INFO 2017/12/13 04:28:24.047542 watcher.go:150: added file /rootfs/var/lib/docker/containers/20d138cc46d2c9d2685b59a2f1853b7e77486bc615a294392a02679178cd973d/20d138cc46d2c9d2685b59a2f1853b7e77486bc615a294392a02679178cd973d-json.log
INFO 2017/12/13 04:28:24.151832 watcher.go:150: added file /rootfs/var/lib/docker/containers/34426ab841154492d1a3cbcbc523f493796738144b19ccc6a0b014ce9b043eb8/34426ab841154492d1a3cbcbc523f493796738144b19ccc6a0b014ce9b043eb8-json.log
INFO 2017/12/13 04:28:25.055018 watcher.go:150: added file /rootfs/var/lib/docker/containers/54c07e9146a60fc0896c40e0ca7fca1fee23de682bcc121709c2d83ff4ac116a/54c07e9146a60fc0896c40e0ca7fca1fee23de682bcc121709c2d83ff4ac116a-json.log
INFO 2017/12/13 04:28:25.338286 watcher.go:150: added file /rootfs/var/log/apport.log
INFO 2017/12/13 04:28:25.616821 watcher.go:150: added file /rootfs/var/lib/docker/containers/67688d28382d7fb0a7eeabd18aa5927f1d7f0634a562e4cf02f2727bc5eda64f/67688d28382d7fb0a7eeabd18aa5927f1d7f0634a562e4cf02f2727bc5eda64f-json.log
INFO 2017/12/13 04:28:25.738237 watcher.go:150: added file /rootfs/var/log/apport.log.1
INFO 2017/12/13 04:28:26.000327 watcher.go:150: added file /rootfs/var/lib/docker/containers/7c9b5d3879b308bd7cac8abef5e5554e1832245821d997386c75681a2678d49f/7c9b5d3879b308bd7cac8abef5e5554e1832245821d997386c75681a2678d49f-json.log
INFO 2017/12/13 04:28:26.142068 watcher.go:150: added file /rootfs/var/log/auth.log
INFO 2017/12/13 04:28:26.212208 watcher.go:150: added file /rootfs/var/log/auth.log.1
INFO 2017/12/13 04:28:26.587798 watcher.go:150: added file /rootfs/var/lib/docker/containers/7ff9f66e310a375c7ef9cd9a6d99ff6b750af32ae11b98a3817c7fac25867adf/7ff9f66e310a375c7ef9cd9a6d99ff6b750af32ae11b98a3817c7fac25867adf-json.log
INFO 2017/12/13 04:28:26.624154 watcher.go:150: added file /rootfs/var/lib/docker/containers/801f2e6be69139837a56c938980a465eba9466a5c5159d1271981af45a818ce2/801f2e6be69139837a56c938980a465eba9466a5c5159d1271981af45a818ce2-json.log
INFO 2017/12/13 04:28:26.650901 watcher.go:150: added file /rootfs/var/log/bootstrap.log
INFO 2017/12/13 04:28:26.909172 watcher.go:150: added file /rootfs/var/log/dpkg.log
INFO 2017/12/13 04:28:27.270508 watcher.go:150: added file /rootfs/var/log/dpkg.log.1
INFO 2017/12/13 04:28:27.445424 watcher.go:150: added file /rootfs/var/lib/docker/containers/82976c864d2d111484417c122e0255a85f86a8d3ff0384e07ab0ac68df22ee61/82976c864d2d111484417c122e0255a85f86a8d3ff0384e07ab0ac68df22ee61-json.log
INFO 2017/12/13 04:28:27.470496 watcher.go:150: added file /rootfs/var/log/fontconfig.log
INFO 2017/12/13 04:28:27.587524 watcher.go:150: added file /rootfs/var/lib/docker/containers/82ec27a34ab1b14b014afb3df1e88a5c2ffb19e03cf1def362bc0083f61f8e29/82ec27a34ab1b14b014afb3df1e88a5c2ffb19e03cf1def362bc0083f61f8e29-json.log
INFO 2017/12/13 04:28:27.612015 watcher.go:150: added file /rootfs/var/log/kern.log
INFO 2017/12/13 04:28:27.727253 watcher.go:150: added file /rootfs/var/lib/docker/containers/8b6aa14e6fcbd3077443eb98fa561c0a8f963baaefddb84a8a2f4a742cea87f7/8b6aa14e6fcbd3077443eb98fa561c0a8f963baaefddb84a8a2f4a742cea87f7-json.log
INFO 2017/12/13 04:28:27.751950 watcher.go:150: added file /rootfs/var/log/kern.log.1
INFO 2017/12/13 04:28:27.885483 watcher.go:150: added file /rootfs/var/lib/docker/containers/90e2ca7a848efca6c1eabfca9752b3ec22c320c80934973efab342ea3d5ca2f9/90e2ca7a848efca6c1eabfca9752b3ec22c320c80934973efab342ea3d5ca2f9-json.log
INFO 2017/12/13 04:28:28.193462 watcher.go:150: added file /rootfs/var/lib/docker/containers/919dc311f3e02f05a5360dbf5dda91d79a7c853fd7fde4cad3e733d19a4d8852/919dc311f3e02f05a5360dbf5dda91d79a7c853fd7fde4cad3e733d19a4d8852-json.log
INFO 2017/12/13 04:28:28.230360 watcher.go:150: added file /rootfs/var/lib/docker/containers/99a6c4fb0f5f8425bc23c69df7f189a70eb0b74769f88a00be27bf58ee475e2b/99a6c4fb0f5f8425bc23c69df7f189a70eb0b74769f88a00be27bf58ee475e2b-json.log
INFO 2017/12/13 04:28:28.266881 watcher.go:150: added file /rootfs/var/lib/docker/containers/9f8589cb7c11dc255e0129f3652e65bd5e3f2133e91c1e48ae548064acdb9537/9f8589cb7c11dc255e0129f3652e65bd5e3f2133e91c1e48ae548064acdb9537-json.log
INFO 2017/12/13 04:28:28.291771 watcher.go:150: added file /rootfs/var/lib/docker/containers/aedc06cb535ce8aeb767ce74fba9b5a66a51ece8dc9744e7d8d1d73c5b74abef/aedc06cb535ce8aeb767ce74fba9b5a66a51ece8dc9744e7d8d1d73c5b74abef-json.log
INFO 2017/12/13 04:28:28.316743 watcher.go:150: added file /rootfs/var/lib/docker/containers/b11267bca6a318b1701524fc930cb9637a4986d63c0ee075218768f9834eda67/b11267bca6a318b1701524fc930cb9637a4986d63c0ee075218768f9834eda67-json.log
INFO 2017/12/13 04:28:28.341591 watcher.go:150: added file /rootfs/var/lib/docker/containers/bc56ddcd30f1e5e47b72ed2d757afac70b4f01d9fadc65dcb93c7aa3e41ec9a6/bc56ddcd30f1e5e47b72ed2d757afac70b4f01d9fadc65dcb93c7aa3e41ec9a6-json.log
INFO 2017/12/13 04:28:28.366598 watcher.go:150: added file /rootfs/var/lib/docker/containers/bfe95c93fd39cc124c654ee2a7b000501873cf459ee832512f621a453e01085b/bfe95c93fd39cc124c654ee2a7b000501873cf459ee832512f621a453e01085b-json.log
WARN 2017/12/13 04:28:33.110443 license_check_pipe.go:71: License check failed. Post https://license.outcold.solutions/license/: dial tcp: lookup license.outcold.solutions on 10.96.0.10:53: read udp 10.32.0.4:48956->10.96.0.10:53: i/o timeout
INFO 2017/12/13 04:28:33.187422 license_check_pipe.go:102: license-check kubernetes 1 1515664227 2JVO6B5PRMU9N48S1OOC073RLG 1513072227 1513139302 2.1.59 1512864000 true false 1 Post https://license.outcold.solutions/license/: dial tcp: lookup license.outcold.solutions on 10.96.0.10:53: read udp 10.32.0.4:48956->10.96.0.10:53: i/o timeout
WARN 2017/12/13 04:28:43.122391 splunk.go:147: Failed to post. Retrying in few seconds. Post https://hyder-hp:8088/services/collector/event/1.0: dial tcp: lookup hyder-hp on 10.96.0.10:53: read udp 10.32.0.4:51897->10.96.0.10:53: i/o timeout
WARN 2017/12/13 04:29:03.146592 splunk.go:147: Failed to post. Retrying in few seconds. Post https://hyder-hp:8088/services/collector/event/1.0: dial tcp: lookup hyder-hp on 10.96.0.10:53: read udp 10.32.0.4:34426->10.96.0.10:53: i/o timeout
WARN 2017/12/13 04:29:23.168204 splunk.go:147: Failed to post. Retrying in few seconds. Post https://hyder-hp:8088/services/collector/event/1.0: dial tcp: lookup hyder-hp on 10.96.0.10:53: read udp 10.32.0.4:45269->10.96.0.10:53: i/o timeout
[... the same "Failed to post" warning repeats roughly every 30 seconds, each time with a DNS lookup timeout for hyder-hp on 10.96.0.10:53 ...]
WARN 2017/12/13 04:46:59.591305 splunk.go:147: Failed to post. Retrying in few seconds. Post https://hyder-hp:8088/services/collector/event/1.0: dial tcp: lookup hyder-hp on 10.96.0.10:53: read udp 10.32.0.4:39333->10.96.0.10:53: read: connection refused
INFO 2017/12/13 04:47:03.514696 watcher.go:150: added file /rootfs/var/lib/docker/containers/d64050eb3b8a652a6984a4f4cbc4a515cd4418f65bc4b3bcf14723a82922092b/d64050eb3b8a652a6984a4f4cbc4a515cd4418f65bc4b3bcf14723a82922092b-json.log
WARN 2017/12/13 04:47:14.594003 splunk.go:147: Failed to post. Retrying in few seconds. Post https://hyder-hp:8088/services/collector/event/1.0: dial tcp: lookup hyder-hp on 10.96.0.10:53: read udp 10.32.0.4:51988->10.96.0.10:53: i/o timeout

Please suggest any modifications.

0 Karma

nickhills
Ultra Champion

Looks like you have a DNS issue:

10.96.0.10:53: read udp 10.32.0.4:51897->10.96.0.10:53: i/o timeout

That error occurs while trying to resolve your hostname: the pod is failing to connect to the DNS server.
It looks like 10.96.0.10 is your DNS server; check that it is accessible from your cluster.
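A quick way to confirm this (the pod name here is illustrative, and 10.96.0.10 is the kube-dns ClusterIP taken from the error messages above) is to run a lookup from a throwaway pod:

```shell
# Start a temporary busybox pod and query the cluster DNS server directly.
# If this lookup times out, cluster DNS itself is broken, not the collector.
kubectl run dns-test --rm -it --image=busybox --restart=Never -- \
    nslookup kubernetes.default 10.96.0.10

# Also check that the kube-dns/CoreDNS pods are healthy:
kubectl get pods -n kube-system -l k8s-app=kube-dns
```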

If my comment helps, please give it a thumbs up!
0 Karma

outcoldman
Communicator

Hi @HyderAli,

How did you set up your Kubernetes cluster?

Could you open a shell inside the pod and try the following?

kubectl exec -it collectorforkubernetes-xrn0k sh
$ ping google.com
$ ping 172.217.3.174
$ ping hyder-hp

Also, find the IP address of hyder-hp and try to ping it from the collector as well. If ping by DNS name does not work, you have a DNS issue; if ping by IP does not work, you have a network issue. If both work, it is something else.
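To narrow it down further, you can try posting a test event to HEC from inside the pod using the IP address, bypassing DNS entirely (the pod name is taken from the logs above; the IP below is a placeholder for whatever hyder-hp resolves to on your network, and this assumes wget in the collector image supports HTTPS):

```shell
# Look up the host's IP from a machine where DNS does work (e.g. the node):
getent hosts hyder-hp

# From inside the collector pod, post a test event to HEC by IP.
# If this succeeds, DNS resolution is the only problem.
kubectl exec -it collectorforkubernetes-dsc86 -- \
    wget --no-check-certificate -q -O - \
    --header 'Authorization: Splunk cf597434-503b-4dca-be82-aa03e828e1dc' \
    --post-data '{"event":"test from pod"}' \
    https://192.168.1.10:8088/services/collector/event
```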

It is quite possible that you have some DNS or network misconfiguration in your Kubernetes cluster.

0 Karma

outcoldman
Communicator

And by the way, if you have a DNS issue, please take a look at this article https://www.outcoldman.com/en/archive/2017/06/23/kubectl-setting-up-the-network/ (the "Pods could not resolve DNS" part); it is probably easy to fix.
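If the DNS problem can't be fixed right away, one workaround is to point the collector at the Splunk host's IP address instead of its hostname, since every retry in the logs fails on the DNS lookup of hyder-hp. A sketch of the config change (the IP below is a placeholder for your actual host IP):

```
# Splunk output
[output.splunk]

# Use the host's IP address instead of the hostname to bypass cluster DNS
url = https://192.168.1.10:8088/services/collector/event/1.0
```

Note this only masks the DNS issue; other workloads in the cluster will still be unable to resolve external names until DNS is fixed.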

0 Karma