Reporting

Can all fields be outputted with outputcsv in double quotes?

burras
Communicator

We are currently using the outputcsv command to generate a report for one of our support teams. Overall it works great, but they did have one request: currently only fields that contain a special character get wrapped in double quotes; all other fields are simply comma-separated. To make life easier for that team we'd like to accomplish one of two goals: 1) have all fields wrapped in double quotes, or 2) be able to output in a structured format other than CSV, such as PSV or tilde-separated (their choice, not mine).

Our overall search is pretty standard - just a listing of output fields: | table field1 field2 field3 ... fieldX

Any help is greatly appreciated!

1 Solution

hasan300zx
Engager

I was stuck on this issue many years ago, after exhausting my efforts to get Splunk to understand that some legacy systems are rigid and only accept things one way.

The easiest, reliable solution I found was to use a Python script to wrap all fields in double quotes. Because my data set was huge and included all kinds of characters, it was particularly hard to get a regex or pattern-based solution to work - and I wouldn't trust one anyway. The script took ~30 minutes per run across millions of records and external lookups.
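For reference, a minimal sketch of that kind of post-processing, assuming you already have a CSV export on disk (the file names here are placeholders, not what I actually used):

import csv

# Re-write an existing CSV export so every field is wrapped in double quotes.
# Input/output file names are placeholders - adjust to your environment.
with open("splunk_export.csv", newline="") as src, \
        open("splunk_export_quoted.csv", "w", newline="") as dst:
    reader = csv.reader(src)
    # QUOTE_ALL forces double quotes around every field, not just the ones
    # that contain delimiters or other special characters.
    writer = csv.writer(dst, quoting=csv.QUOTE_ALL)
    for row in reader:
        writer.writerow(row)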

This was the only way I found to get a reliable and consistent CSV export. I scheduled it in cron to run daily and put the -24h time window in the query itself.

I don't get why this isn't a default option, but whatever - it forces you away from the legacy structure.

You have two methods to get it done:

The simpler one is to use the requests library to pull the results over the REST API and then parse through them. This is easier to set up, but will probably take a long time for big jobs. https://www.splunk.com/blog/2011/08/02/splunk-rest-api-is-easy-to-use.html
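As a rough illustration of that first approach (not production code - the host, credentials, and search string are placeholders, and you'd want proper certificate handling rather than verify=False):

import csv
import io
import requests

# Placeholders - point these at your own search head and search.
BASE_URL = "https://splunk.example.com:8089"
SEARCH = "search index=main earliest=-24h | table field1 field2 field3"

# The export endpoint streams results back as the search runs.
resp = requests.post(
    BASE_URL + "/services/search/jobs/export",
    auth=("admin", "changeme"),
    data={"search": SEARCH, "output_mode": "csv"},
    verify=False,  # quick test only; use real certificate verification in practice
)
resp.raise_for_status()

# Re-emit the results with every field wrapped in double quotes.
reader = csv.reader(io.StringIO(resp.text))
with open("report_quoted.csv", "w", newline="") as out:
    writer = csv.writer(out, quoting=csv.QUOTE_ALL)
    for row in reader:
        writer.writerow(row)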

The other one is to use Splunk's Python SDK to create a job and then stream the results while processing them. http://dev.splunk.com/python
https://www.splunk.com/blog/2013/09/15/exporting-large-results-sets-to-csv.html
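A minimal sketch of the SDK route, assuming the splunk-sdk package is installed (JSONResultsReader is in newer SDK versions; connection details and the search are again placeholders):

import csv
import splunklib.client as client
import splunklib.results as results

# Placeholders - adjust to your deployment.
service = client.connect(
    host="splunk.example.com", port=8089,
    username="admin", password="changeme",
)

# exec_mode="blocking" waits for the job to finish before returning.
job = service.jobs.create(
    "search index=main earliest=-24h | table field1 field2 field3",
    exec_mode="blocking",
)

with open("report_quoted.csv", "w", newline="") as out:
    writer = csv.writer(out, quoting=csv.QUOTE_ALL)
    wrote_header = False
    for item in results.JSONResultsReader(job.results(output_mode="json", count=0)):
        if isinstance(item, dict):  # skip diagnostic messages from the reader
            if not wrote_header:
                writer.writerow(item.keys())
                wrote_header = True
            writer.writerow(item.values())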

Another possible option for you is to get JSON/XML output and have a back-end script convert it to your liking. This might be a bit more hacky, but it's easier to mock up.
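For that last option, a tiny converter along these lines would do, assuming the results were saved as a JSON file with a top-level results array (the way a finished job's results endpoint returns them with output_mode=json); the tilde delimiter and file names are just for illustration:

import csv
import json

# Placeholder file names; the input holds a top-level "results" array.
with open("results.json") as src:
    rows = json.load(src)["results"]

with open("report_tilde.txt", "w", newline="") as out:
    # delimiter can be "|" for PSV or "~" for tilde-separated; QUOTE_ALL
    # still wraps every field in double quotes either way.
    writer = csv.writer(out, delimiter="~", quoting=csv.QUOTE_ALL)
    if rows:
        header = list(rows[0].keys())
        writer.writerow(header)
        for row in rows:
            writer.writerow(row.get(field, "") for field in header)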


burras
Communicator

Thanks - knowing that there weren't any options to do this directly in Splunk, I ended up cobbling together a gawk script that converts the output to tilde-separated format.
