Splunk Dev

Splunk Java - result JSON is huge- ~1.9MB - not being sent to javascript/client

1234testtest
Path Finder

Hi,
We are using the Splunk Java SDK, which returns results in JSON format. We need the _raw details as well, so our Splunk query includes the _raw data.
The problem we are facing is that the resulting output is huge and breaks beyond a certain point. We learned that the JSON data has a limit of 1,000,000 characters in length.
Are there any suggestions on how we can work around this?
We are using Struts 2 and jQuery.


Damien_Dallimor
Ultra Champion

Do you have an example of the actual error/exception stack trace your code is throwing?


fross_splunk
Splunk Employee

This isn't at the SDK level, but in the underlying JSON libraries that we don't control, and it isn't rare. Here's how you handle this in general so you don't have to deal with giant hunks of data being sent over the pipe. It will also let you resume more straightforwardly in case the connection gets dropped for some reason.

When you call getResults on the Job, you can pass two arguments: count and offset. offset is the number of records at the beginning to skip, and count is the number to return after skipping. So you call getResults with offset 0 and count 100, parse those, then call again with offset 100 and count 100, then offset 200 and count 100, and so on.

Here's some code, starting from your example where you've defined jss and waited for it to complete:

int nEventsPerRequest = 100;
JobResultsArgs oparg = new JobResultsArgs(); // This has convenience methods for setting result options
oparg.setOutputMode(JobResultsArgs.OutputMode.JSON); // so ResultsReaderJson can parse the stream
oparg.setCount(nEventsPerRequest);
for (int offset = 0; offset < jss.getResultCount(); offset += nEventsPerRequest) {
    oparg.setOffset(offset); // skip the batches we've already fetched
    InputStream res = jss.getResults(oparg);
    ResultsReaderJson resultsReader = new ResultsReaderJson(res);
    // ...process the results in this batch...
    resultsReader.close();
}

I haven't tested that, so it might have fencepost errors, but that will take care of having too much data in the stream.
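To fill in the processing step, each batch can be drained with the reader's getNextEvent(), which returns null when the batch is exhausted. A minimal sketch along the same lines, equally untested, reusing resultsReader from the loop above:

Event event; // com.splunk.Event, a HashMap<String, String> of field name -> value
while ((event = resultsReader.getNextEvent()) != null) {
    String raw = event.get("_raw"); // only one event's _raw is in memory at a time
    // ...forward this event (or an accumulated batch of them) to the client...
}

Because the reader yields one event at a time, no single response ever carries the whole 1.9MB payload.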

1234testtest
Path Finder

The Splunk query also returns the _raw details, which are sent to the client as JSON. The length of _raw is not constant; this is where I have the challenge.


davidfstr
Explorer

That code looks about right to me. Where did you see the 1,000,000 character limit that you mention?

1234testtest
Path Finder

I feel I'm not doing the streaming right. (I've trimmed the code because of the character restriction on comments.) Is streaming done correctly here on the Splunk side, or should I take care of it from the UI?

ServiceArgs loginArgs = new ServiceArgs();
// ...populate loginArgs...
Service service = Service.connect(loginArgs);
// Retrieve the saved search and dispatch it
SavedSearch savedSearch = service.getSavedSearches().get("mysearch");
Job jss = savedSearch.dispatch();
while (!jss.isDone()) {
    Thread.sleep(500);
}
Args oparg = new Args();
oparg.put("output_mode", "json");
InputStream res = jss.getResults(oparg);
ResultsReaderJson resultsReader = new ResultsReaderJson(res);


davidfstr
Explorer

The Java result readers (including JSON) stream back results, so they aren't usually constrained by data size limits.

"JSON data has a limit of 1,000,000 characters in length."

Did you find this limit with a single row, a single _raw value, the entire JSON stream, or something else?
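If it helps narrow that down, here's a quick (hypothetical, untested) diagnostic while reading the stream that would show whether a single _raw value or the stream as a whole crosses the limit. It assumes the res InputStream from your snippet above:

ResultsReaderJson reader = new ResultsReaderJson(res);
Event event;
long totalChars = 0;
while ((event = reader.getNextEvent()) != null) {
    String raw = event.get("_raw");
    int rawLen = (raw == null) ? 0 : raw.length();
    totalChars += rawLen;
    if (rawLen > 1000000) {
        System.out.println("Single _raw value exceeds the limit: " + rawLen + " chars");
    }
}
System.out.println("Total _raw characters in the stream: " + totalChars);
reader.close();

If only the total crosses the limit, the pagination approach above should be all you need; if a single _raw value crosses it, you'd have to chunk that field before handing it to the browser.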
