Getting Data In

With indexAndForward=true on a heavy forwarder, how do we route data to an index with a different name on the target indexer?

hemendralodhi
Contributor

Hi Team,

We are planning to migrate our existing indexed data to a new Enterprise server which is up and running, serving other departments. Our plan is to turn the existing server into a heavy forwarder that will send data to the Enterprise indexer. Before that, for testing and to minimize the outage, we want to enable indexAndForward=true so that the existing instance and the new instance receive the same data: users can keep working on the existing system while the new system is tested.
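
For context, this is roughly the shape of the outputs.conf we have in mind on the heavy forwarder (a minimal sketch; the output group name and indexer host:port are placeholders):

[tcpout]
defaultGroup = new_indexers
# keep indexing locally while also forwarding the parsed data
indexAndForward = true

[tcpout:new_indexers]
server = new-indexer.example.com:9997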

My question: we must change the index name on the new system to adhere to policy, so how will data routing work? For example, on the existing system data comes into index "abc", and on the target there will be another index "def" which should mimic its functionality. Of course we will change the knowledge objects to reflect the new index name, but how will data that arrives in index=abc get routed to index=def with indexAndForward=true?

Please help!!

Thanks
Hemendra


woodcock
Esteemed Legend

You could do this with CLONE_SOURCETYPE, although it doubles the parse load and probably also the license cost (not sure about license). This basically sticks the event back in at the top of the indexing pipeline, though you do have to change the sourcetype (so it would be useful to be able to do it at indexAndForward time). You could use the rename setting in props.conf to change the sourcetype back to the original value (see the sketch after the spec excerpt and links below):

CLONE_SOURCETYPE = <string>
* This name is wrong; a transform with this setting actually clones and
  modifies events, and assigns the new events the specified sourcetype.

* If CLONE_SOURCETYPE is used as part of a transform, the transform will
  create a modified duplicate event, for all events that the transform is
  applied to via normal props.conf rules.
* Use this feature if you need to store both the original and a modified
  form of the data in your system, or if you want to send the original and a
  modified form to different outbound systems.
  * A typical example would be to retain sensitive information according to
    one policy and a version with the sensitive information removed
    according to another policy.  For example, some events may have data
    that you must retain for 30 days (such as personally identifying
    information) and only 30 days with restricted access, but you need that
    event retained without the sensitive data for a longer time with wider
    access.
* Specifically, for each event handled by this transform, a near-exact copy
  is made of the original event, and the transformation is applied to the
  copy.  The original event will continue along normal data processing
  unchanged.
* The <string> used for CLONE_SOURCETYPE selects the sourcetype that will be
  used for the duplicated events.
* The new sourcetype MUST differ from the original sourcetype.  If the
  original sourcetype is the same as the target of the CLONE_SOURCETYPE,
  Splunk will make a best effort to log warnings to splunkd.log, but this
  setting will be silently ignored at runtime for such cases, causing the
  transform to be applied to the original event without cloning.
* The duplicated events will receive index-time transformations & sed
  commands for all transforms which match its new host/source/sourcetype.
  * This means that props matching on host or source will incorrectly be
    applied a second time. (SPL-99120)
* Can only be used as part of an otherwise-valid index-time transform.  For
  example, REGEX is required, there must be a valid target (DEST_KEY or
  WRITE_META), etc., as above.

http://docs.splunk.com/Documentation/Splunk/latest/Admin/transformsconf
http://docs.splunk.com/Documentation/Splunk/6.2.0/Forwarding/Routeandfilterdatad#Perform_selective_i...
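
To illustrate the rename trick: once the cloned events exist under their new sourcetype, a props.conf stanza like the following would make them show up under the original sourcetype again (a minimal sketch with hypothetical sourcetype names; rename takes effect at search time only, the indexed sourcetype is unchanged):

props.conf:

# hypothetical names: events cloned into "cloned_sourcetype" are
# displayed in searches as "original_sourcetype" again
[cloned_sourcetype]
rename = original_sourcetype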


hemendralodhi
Contributor

Thanks for the input, woodcock. I have come up with the config below; does this look good?

props.conf:

[orig_sc]
TRANSFORMS-orig_sc = clone_orig_sc

# route the cloned events (sourcetype new_sc) to the new index
[new_sc]
TRANSFORMS-new_sc = new_sc

transforms.conf:

# clone each orig_sc event unchanged; the copy gets sourcetype new_sc
[clone_orig_sc]
REGEX = (.*)
FORMAT = $1
DEST_KEY = _raw
CLONE_SOURCETYPE = new_sc

# rewrite the index of the cloned events to "def"
[new_sc]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = def
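
Once that is deployed and the forwarder restarted, a quick sanity check on the new system could be something like the following search (assuming the example index name "def" from the question), comparing the counts against index=abc on the existing system:

index=def sourcetype=new_sc | stats count by host, source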

Thanks for all your help.


woodcock
Esteemed Legend

Yes, that should work (but I have not done this myself, so I am going by the docs, same as you are).


hemendralodhi
Contributor

Thanks, I will test this and update with the results.
