Deployment Architecture

Demand-based script execution on a UF

santosh_sshanbh
Path Finder

My splunk architecture is as below:

UF -> HF -> IX -> SH

Here, UF and HF are in the same network, whereas IX and SH are in Splunk Cloud. I need to run a certain script on the UF only on user request; I cannot schedule it on a time basis. However, considering my SH is in the AWS cloud and outside the network boundary, how can I run the script on user demand? Is there any solution or workaround for this?

1 Solution

anmolpatel
Builder

If the use case must have the onclick() functionality, then given the concerns @nickhillscpl highlighted, you can explore the hybrid search approach: set up a SH on-prem and keep it in a secured zone.
https://answers.splunk.com/answers/432244/i-am-a-splunk-cloud-customer-what-is-hybrid-search.html




santosh_sshanbh
Path Finder

This also looks like a fairly simple alternative. Thanks for your valuable input.


nickhills
Ultra Champion

What determines when the script runs?
Are you trying to make a search (on the SH) fire a script action on the UF?
What does the script on the UF have to do?

If my comment helps, please give it a thumbs up!

santosh_sshanbh
Path Finder

I want to trigger the script when a user clicks the Submit button on a dashboard. Does the purpose of the script matter for how it can be done? In any case, the script needs to read the server's local time and a few other details.


nickhills
Ultra Champion

You will need to do quite a bit of development outside of Splunk to make this work.

You would need:
a.) Some JavaScript on your dashboard to catch the click event from the Submit button
b.) Some JavaScript to capture the value of the dropdown/text box, or the hostname of your target workstation
c.) An application server to which you fire the event from the Splunk UI; that server will need to orchestrate the remote data collection
d.) A client installed on your remote UF to communicate with your application server, or some mechanism for the application server to remotely invoke PowerShell or some other process
e.) The results of the remote script need to be sent to Splunk for indexing
f.) Some JavaScript on your dashboard to pause and wait for the recently triggered results to be indexed, before dispatching a search to reflect the results

All of the above is totally possible, but you would need to develop and build this yourself.
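For illustration, step (c) could start out as little more than a single HTTP endpoint. Here is a minimal sketch using Python's standard-library http.server; the /run path, the host query parameter, and the run_remote_collection stub are all hypothetical (nothing in this thread defines them), and a real deployment would need authentication and TLS at the very least:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse


def run_remote_collection(host: str) -> dict:
    """Stub for step (d): invoke the script on the target UF.

    A real implementation would call out to an agent on the UF
    (or use WinRM/SSH) and forward the results to Splunk (step e).
    """
    return {"host": host, "status": "dispatched"}


class OrchestratorHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The dashboard JavaScript (steps a/b) would call e.g.
        # GET /run?host=<target-uf> against this server.
        parsed = urlparse(self.path)
        if parsed.path != "/run":
            self.send_error(404)
            return
        host = parse_qs(parsed.query).get("host", [""])[0]
        body = json.dumps(run_remote_collection(host)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Silence per-request logging for this sketch.
        pass


if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8765), OrchestratorHandler).serve_forever()
```

The hard parts this sketch skips over — securing the endpoint, reaching the UF, and getting results back into Splunk — are exactly the development effort described above.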

You might be able to get some of the way using Splunk workflows and/or Phantom, but you are straying quite far from "conventional" Splunk usage.

Is there any reason you can't use a scripted input and schedule the script to run every 5 minutes? That would get you very close to the desired outcome with very little effort.
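A scripted input of this sort can be tiny. A sketch in Python (the app name, interval, sourcetype, index, and field names are all placeholders, not anything this thread specifies):

```python
#!/usr/bin/env python3
"""Minimal scripted input for the UF: emit the server's local time
and a few host details as one key=value event per run.

Deployed on the UF with an inputs.conf stanza along the lines of:

    [script://$SPLUNK_HOME/etc/apps/my_app/bin/local_time.py]
    interval = 300
    sourcetype = uf:localtime
    index = main
"""
import platform
import time


def collect() -> str:
    now = time.localtime()
    fields = {
        "local_time": time.strftime("%Y-%m-%dT%H:%M:%S", now),
        "tz": time.strftime("%Z", now),
        "host": platform.node(),
        "os": platform.system(),
    }
    # One compact key=value line per run keeps the indexed volume small.
    return " ".join(f'{k}="{v}"' for k, v in fields.items())


if __name__ == "__main__":
    print(collect())
```

Whatever "few other details" the script needs would just become extra entries in the fields dict.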

If my comment helps, please give it a thumbs up!

santosh_sshanbh
Path Finder

Thanks for such a detailed approach.

One quick question about (c): is this technically possible even if your SH and application server are not in the same domain? I mean, one in Splunk Cloud and the other in a private network?

The reason for exploring this option vs. running a scripted input is that the data I want to get from the UF is needed very rarely, at random points in time, so I don't see value in indexing it at regular intervals.

But I think I need to balance solution complexity vs. data availability here. Anyway, thanks once again for your comments.


nickhills
Ultra Champion

An application server capable of orchestrating this action needs to be able to communicate with your UFs and be reachable by the SH.
Again this is all totally feasible, but if you are considering exposing endpoints across the internet, be very certain that the application is sufficiently robust to resist misuse, especially if it acts as a gatekeeper to run scripts on your endpoints.

To be clear: nothing about my proposed solution sounds like a good idea to me! I certainly would not be happy with this approach, so I wasn't trying to say that's how you "should" do it. 🙂

The approach I would take is the scripted input and I would make sure that the output is as concise (small, in bytes) as possible. As you say there is a tradeoff between indexed data which may never be used, and functionality.
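On the "as concise as possible" point, the byte cost of the same fields in two encodings is easy to compare (the field names and values here are made up for illustration):

```python
import json

fields = {"local_time": "2024-05-01T10:15:00", "tz": "UTC", "host": "uf01"}

# Pretty-printed JSON, as a naive script might emit it.
verbose = json.dumps(fields, indent=4)

# A single key=value line, which Splunk also extracts automatically.
compact = " ".join(f'{k}="{v}"' for k, v in fields.items())

print(len(verbose), len(compact))
```

Over many hosts and many runs, that per-event difference is what decides whether the "index it anyway" tradeoff is acceptable.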

If my comment helps, please give it a thumbs up!