Where To Install Qualys TA

vnguyen46
Contributor

Hi,

I installed both the Qualys VM App and the Technology Add-on on one of the search heads in a distributed Splunk deployment and followed the setup instructions, but no data is populated in the dashboard. I couldn't find where it went wrong, so I want to confirm which instances (search head, indexer, heavy forwarder) the TA needs to be installed on.

Thanks in advance.

0 Karma
1 Solution

vnguyen46
Contributor

Thanks for the comment; that helped populate the data in the Knowledgebase tab, but nowhere else. Do I need to specify values for "Cron entry or Interval", Host, and Index? Is there anything else I need to configure to get data to show up in the other parts of the dashboard?

Thanks,


0 Karma

anandhalagarasa
Path Finder

@lakshman239 I have installed the app (Qualys VM App for Splunk Enterprise) and the add-on (Qualys Technology Add-on for Splunk) on my search head, and the add-on (Qualys Technology Add-on for Splunk) on my heavy forwarder.

I have a few questions about how to set up the app:

1.) When I launch the add-on setup on the heavy forwarder, it asks for the Qualys URI, username, and password. Do I need to enter these only on the heavy forwarder, or on the search head as well?

2.) We only need VM detection, not WAS findings. As per the answer provided, I created the host_detection input on the heavy forwarder (Data Inputs --> TA-Qualys-Add-On --> Add New), set the interval to 24h and the start date to 2018-01-01T00:00:00Z, left the index at the default (main), and saved it. In the host column I left the heavy forwarder's server name as it is. I then changed the index name to qualys from the backend. I enabled the input and restarted the heavy forwarder service.

3.) Similarly, I created the knowledge_base input on the search head (Data Inputs --> TA-Qualys-Add-On --> Add New), set the interval to 24h and the start date to 2018-01-01T00:00:00Z, changed the host name to the heavy forwarder's server name, and kept the index name as qualys. I then enabled the input.

4.) Do I need to add both host_detection and knowledge_base on the heavy forwarder as well as on the search head, or should I leave it as it is?

5.) Also, is it mandatory to provide the Squid proxy information on both the search head and the heavy forwarder during setup, or will it work as it is?

When I search the data with index=qualys and the heavy forwarder's server name as the host, I don't get any results.
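One way to narrow this down (an illustrative search, assuming the TA tags its events with sourcetypes beginning with `qualys`, as the counts posted elsewhere in this thread suggest) is to look across all indexes over an 'All time' range and see where, if anywhere, the events landed:

```
| tstats count where index=* sourcetype=qualys* by index, sourcetype, host
```

If this returns rows under an unexpected index or host, the input is working and only the search (or the index setting) needs adjusting; if it returns nothing, the input itself is not collecting.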

0 Karma

anandhalagarasa
Path Finder

Could you please check and help with this?

0 Karma


lakshman239
SplunkTrust
SplunkTrust

You would also need to set up the Qualys TA on the HF, providing the Qualys API address, credentials, etc. Then you can configure the 'host_detection' and 'was_findings' inputs: go to Settings --> Data Inputs --> select Qualys and add them. Yes, you need to define a cron schedule for the interval. Please ensure you are using the latest version of the Qualys TA.
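For reference, inputs created through the UI end up as stanzas in the TA's local inputs.conf on the HF. A minimal sketch follows; the `qualys://` stanza scheme, the app directory name, and the attribute names are assumptions based on typical modular-input TAs, so check the TA's own default/inputs.conf for the exact names:

```ini
# $SPLUNK_HOME/etc/apps/TA-QualysCloudPlatform/local/inputs.conf  (path assumed)

[qualys://host_detection]
interval = 0 1,13 * * *            # cron: run at 01:00 and 13:00 daily
start_date = 2018-01-01T00:00:00Z  # pull scan data from this date forward
index = qualys                     # the index must already exist
disabled = 0

[qualys://was_findings]
interval = 0 1,13 * * *
start_date = 2018-01-01T00:00:00Z
index = qualys
disabled = 0
```

After editing inputs.conf directly, restart the HF (or reload the input) so the changes take effect.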

0 Karma

vnguyen46
Contributor

I added and enabled the objects as advised:
host_detection and was_findings on the HF
knowledge_base on the SH
Cron: 0 1,13 * * *
Start date: 2018-01-01T00:00:00Z
The Qualys account is verified, and there are no more error messages during setup or re-setup.

I also created a new index (qualys) and found records on the SH:
qualys_knowledgebase_api 60 86.956%
qualys_detection_api 9 13.043%

Qualys VM App and TA are up to date.
Data is still populated on the Knowledgebase tab, but nowhere else.
Is there any setup that needs to be done on the VM app itself?

I really appreciate your inputs.

0 Karma

lakshman239
SplunkTrust
SplunkTrust

Check which index the data goes to; the default is the 'main' index, I believe, so you may need to change inputs.conf to reflect your index. After configuration, you may need to restart the HF once, and when you search for the data, please run it over 'All time', as Qualys pulls scans from the past based on the start date.
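For example, if the inputs were pointed at a custom index named `qualys` (as elsewhere in this thread), a search like the following, run over 'All time', should show the collected events; the grouping by sourcetype is just to confirm which feeds are arriving:

```
index=qualys earliest=0 latest=now
| stats count by sourcetype, host
```

An empty result here usually means the input never wrote to that index, not that the search is wrong.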

0 Karma

manish_singh_77
Builder

@lakshman239

We have installed the Qualys TA on our HF instance and configured the credentials and data inputs for FIM events, but when we search for the events on our search head, nothing shows up.

We get this error message: [idx-123-abc.splunkcloud.com] Search process did not exit cleanly, exit_code=255, description="exited with code 255". Please look in search.log for this peer in the Job Inspector for more info.

0 Karma

vnguyen46
Contributor

I am happy to say it now works.
host_detection and was_findings on the HF
knowledge_base on the SH
Interval: 24h
Start date: 2018-01-01T00:00:00Z
Keep the index as it is (main)
It's important to make sure the Splunk account accessing Qualys has API privileges.

lakshman239 - many thanks for your help.

0 Karma


lakshman239
SplunkTrust
SplunkTrust

You only need host_detection, and it should be enabled on the heavy forwarder; there is no need to enable it on the SH. Search using 'All time'. The username you use to connect to the Qualys API should have API access to pull host_detection (vulnerability management) data. If you still have issues, please post a new question.

0 Karma

vnguyen46
Contributor

lakshman239 has it all covered, and I'd like to add how I set it up in my environment, in case you are not there yet:

  1. Verify that the Qualys account used in Splunk has rights to access the APIs.
  2. On the HF - I am in US Central, using: https://www.qualysapi.qualys.com:443
  3. Also on the HF - installed the TA only and added host_detection and was_findings.
  4. On the SH - installed both the TA and the VM App. Configured the TA as on the HF, but added knowledge_base only.
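On the SH side, the knowledge_base input described above would end up as a stanza roughly like the sketch below; the `qualys://` scheme and the app directory name are assumptions, so verify them against the TA's own default/inputs.conf:

```ini
# $SPLUNK_HOME/etc/apps/TA-QualysCloudPlatform/local/inputs.conf  (on the SH; path assumed)

[qualys://knowledge_base]
interval = 24h                     # the UI field accepts a cron entry or a simple interval
start_date = 2018-01-01T00:00:00Z
index = main                       # left at the default here
disabled = 0
```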

Hope that helps.

0 Karma

lakshman239
SplunkTrust
SplunkTrust

You need to install the TA on the SH and HF. On the HF, you also need to enable the inputs for Vulnerability/host detection to collect stats. On the SH, you can enable the knowledge base inputs.

0 Karma