All Apps and Add-ons

Rapid7 Nexpose Tech Add-on: how long should it take before you start seeing data?

packet_hunter
Contributor

So I got everything up and running with the R7 Nexpose TA installed, but I don't see any data yet and it has been a couple of hours.
In var\log\splunk\TA-rapid7_nexpose I see the following:

2017-06-22 12:37:58,076 INFO    nx_logger:38 - Executing nexpose_setup.py
2017-06-22 12:37:58,848 INFO    nx_logger:38 - Executing nexpose_setup.py
2017-06-22 12:37:58,851 INFO    nx_logger:38 - Listing the fields for the set up screen...
2017-06-22 12:38:00,880 INFO    nx_logger:38 - Executing nexpose_setup.py
2017-06-22 12:38:01,638 INFO    nx_logger:38 - Executing nexpose_setup.py
2017-06-22 12:41:34,385 INFO    nx_logger:38 - Executing nexpose_setup.py
2017-06-22 12:41:35,150 INFO    nx_logger:38 - Executing nexpose_setup.py
2017-06-22 12:41:35,151 INFO    nx_logger:38 - Listing the fields for the set up screen...

Does that look like it is still building, or is it actually running?
Thank you

0 Karma
1 Solution

packet_hunter
Contributor

Thank you for all the responses.

So the root cause was that I had not enabled/created a new data input for Rapid7 Nexpose on the HF.

With other TAs installed in a distributed architecture (i.e. on an HF), I would not have to enable the data inputs via the HF GUI; I would only configure inputs.conf and outputs.conf on the HF.

This TA threw me a curve ball because I did not want to accidentally configure the HF as an indexer. The TA was easy to install and test in my dev standalone/all-in-one environment, but it was not clear what needed to be added in a distributed environment.

To recap, all I needed to do to get it working was add a new data input (via the HF GUI) for R7 Nexpose.
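
For anyone hitting the same wall: that GUI step just writes a stanza into an inputs.conf under whatever app you happened to be in at the time (more on that further down in the thread). The scheme and attribute names below are a purely hypothetical sketch, since the TA's exact modular-input settings are not shown in this thread; check the TA's own README/inputs.conf.spec for the real ones.

# Hypothetical sketch only -- real scheme/attribute names come from the TA's docs
[nexpose://my_nexpose_input]
index = rapid7
disabled = 0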

I hope that makes sense.

Thank you

0 Karma

packet_hunter
Contributor

Also, FYI, the inputs.conf file was created in myvolume:\splunk_Forwarder\etc\apps\search\local\inputs.conf.

I was not expecting that.

Thank you

0 Karma

maciep
Champion

Glad it's working for you now!

Keep in mind that when you go to Settings -> Data Inputs, the input you create is stored under the app you launched it from. So, for example, in this case I'm guessing you were in the Search & Reporting app when you navigated to the inputs page to create the R7 input.
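
Concretely, the app context decides which file on disk gets the stanza:

# Input created while in the Search & Reporting app lands here:
#   $SPLUNK_HOME/etc/apps/search/local/inputs.conf
# The same input created from within the TA's context would land here instead:
#   $SPLUNK_HOME/etc/apps/TA-rapid7_nexpose/local/inputs.conf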

0 Karma

packet_hunter
Contributor

Should it be moved to the R7 TA's inputs.conf, or should I just leave it?

0 Karma

maciep
Champion

I mean, it will work where it is, as long as you know it's there, for example if you ever want to migrate this app to another HF or something. Just be aware that it's not in the Rapid7 TA. Or you could move it to the local folder in the TA instead. Really up to you and how you want to manage that HF.
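
If you do decide to relocate it later, the move is just cutting the stanza from one local/inputs.conf to the other and restarting the HF. A rough sketch (the stanza name is hypothetical):

# 1. Cut the [nexpose://...] stanza (name hypothetical) out of:
#      $SPLUNK_HOME/etc/apps/search/local/inputs.conf
# 2. Paste the same stanza, unchanged, into:
#      $SPLUNK_HOME/etc/apps/TA-rapid7_nexpose/local/inputs.conf
# 3. Restart Splunk on the HF so the change is picked up:
#      $SPLUNK_HOME/bin/splunk restart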

0 Karma

packet_hunter
Contributor

I understand; I will note the location. Hopefully this post will help others, or maybe a future R7 guide will explain the setup in more detail. Thx

0 Karma

maciep
Champion

Doesn't look like it. Once the input is actually running, I think you should see it querying your sites for assets and vulnerabilities in that log. Did you just set it up today? IIRC, the input runs at maybe 4am every day by default, so if you did just set it up and didn't change the schedule, maybe check on it tomorrow?
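
If the input follows the standard Splunk modular-input convention, that daily schedule would show up as an interval setting (seconds or a cron expression) on the input stanza. Purely as an illustration, with a hypothetical stanza name:

# Hypothetical example -- the real stanza/attribute names come from the TA
[nexpose://my_nexpose_input]
# cron-style interval: run once a day at 04:00
interval = 0 4 * * *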

0 Karma

packet_hunter
Contributor

Agreed, something is not right.

So I tested the R7 Nexpose TA and App in my dev instance, and it worked fine. Now I am trying to install it in a distributed environment.

I have an HF calling the R7 API and then sending the data to my indexers.

I installed the R7 Nexpose TA on the HF.
I used the correct creds, IP, and port in the setup.
I noticed on the heavy forwarder, under Settings -> Indexes, that the Rapid7 index was set up there. I am not sure whether an index on the HF is even needed.

On the HF, in .../etc/system/local, I have my inputs.conf file, with just:
[default]
host = MySPLNKFRName

On the HF, in .../etc/system/local, I also have my outputs.conf file pointing to my indexers (both files are sketched roughly below).

Someone before me created an indexes.conf file; do I need to add a [rapid7] stanza to that?

The only other conf files pertaining to Rapid7 are in .../etc/apps/TA-rapid7_nexpose:
in /local I have app.conf, nexpose_details.conf, and passwords.conf;
I also have all the default conf files under .../TA-rapid7_nexpose/default.
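
For reference, the HF-side pieces just described would look roughly like this (the output group name and indexer addresses are placeholders, not values from this thread):

# $SPLUNK_HOME/etc/system/local/inputs.conf on the HF
[default]
host = MySPLNKFRName

# $SPLUNK_HOME/etc/system/local/outputs.conf on the HF
[tcpout]
defaultGroup = my_indexers

[tcpout:my_indexers]
server = indexer1.example.com:9997, indexer2.example.com:9997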

Is there a way to troubleshoot this?

Thank you

0 Karma

maciep
Champion

I didn't work on it here and haven't looked at the instructions lately, but we were able to get data. Do you have any config for Rapid7 in any of your inputs.conf files?

Is that still all you see in the TA-rapid7_nexpose logs? You don't see any entries like this?

2017-04-26 09:28:01,819 INFO    nx_logger:38 - Setting sites
2017-04-26 09:28:01,819 INFO    nx_logger:38 - Connecting Nexpose client
2017-04-26 09:28:02,457 INFO    nx_logger:38 - Querying all sites.

Side note: if you want to use the index they define, you will need to create it on your indexers; having it just on the HF won't help. Or, if you want this data in another index, specify it in your inputs (if you can find them).
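
Two quick checks that follow from this: btool shows every inputs.conf stanza the HF has actually merged (and which file each setting came from), and the index has to exist on the indexers themselves, not just on the HF. The [rapid7] stanza below is a minimal example, not taken from the TA's documentation; adjust paths and retention to your environment.

# On the HF: show the merged inputs configuration and where each setting lives
#   $SPLUNK_HOME/bin/splunk btool inputs list --debug

# On the indexers: minimal indexes.conf example stanza
[rapid7]
homePath   = $SPLUNK_DB/rapid7/db
coldPath   = $SPLUNK_DB/rapid7/colddb
thawedPath = $SPLUNK_DB/rapid7/thaweddb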

0 Karma