I was stuck on this same issue years ago, after exhausting my efforts to get Splunk to accept that some legacy systems are rigid and only expect things one way.
The easiest reliable solution I found was a Python script that wraps every field in double quotes on output. My data set was huge and the aggregate included all kinds of characters, so it was particularly hard to get a regex or pattern-based solution to work, and I wouldn't trust one anyway. Each run took ~30 minutes across millions of records and external lookups.
This was the only way I found to get a reliable, consistent CSV export. I scheduled it in cron to run daily and put the -24h interval in the query itself.
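My original script is long gone, but the core of the quoting trick is just the standard library's csv module with QUOTE_ALL, which also doubles any embedded quotes so downstream parsers don't choke. A minimal sketch (the field names and rows here are made up for illustration):

```python
import csv
import io

def to_quoted_csv(rows, fieldnames):
    """Render a list of dicts as CSV with every field double-quoted.

    QUOTE_ALL wraps each value in quotes; embedded quotes are doubled
    automatically, so commas and quotes inside values stay safe.
    """
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf,
        fieldnames=fieldnames,
        quoting=csv.QUOTE_ALL,
        extrasaction="ignore",  # drop fields the legacy system doesn't want
        restval="",             # blank out fields missing from a row
        lineterminator="\n",
    )
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

For millions of records you would write straight to a file object instead of a StringIO buffer, but the writer configuration is the same.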
I don't get why it wasn't a default mode, but whatever. Forces you away from the legacy structure.
You have two methods to get it done:
Simply use the requests library to fetch the results as output and then parse through them. This is easier, but will probably take a long time for big jobs. https://www.splunk.com/blog/2011/08/02/splunk-rest-api-is-easy-to-use.html
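Something like the following is what I mean by the REST route — it hits Splunk's search/jobs/export endpoint and streams the CSV straight to disk. The base URL, credentials, and query below are placeholders for your environment, and requests is a third-party package (pip install requests):

```python
def build_export_params(query, earliest="-24h"):
    """Build the form parameters for the export endpoint.

    The endpoint expects the query to start with 'search' (unless it's
    a generating command starting with '|').
    """
    q = query.strip()
    if not q.startswith("search") and not q.startswith("|"):
        q = "search " + q
    return {"search": q, "earliest_time": earliest, "output_mode": "csv"}

def export_via_rest(base_url, user, password, query, outfile):
    # Imported here so build_export_params stays usable without requests.
    import requests

    resp = requests.post(
        base_url + "/services/search/jobs/export",
        data=build_export_params(query),
        auth=(user, password),
        stream=True,
        verify=False,  # common for a local instance with a self-signed cert
    )
    resp.raise_for_status()
    with open(outfile, "wb") as fh:
        for chunk in resp.iter_content(chunk_size=1 << 16):
            fh.write(chunk)

# export_via_rest("https://localhost:8089", "admin", "changeme",
#                 "index=web status=500", "out.csv")
```

Because the export endpoint streams results as they are produced, nothing has to be buffered server-side, which matters at the sizes I was dealing with.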
The other is to use Splunk's Python SDK to create jobs and then stream the results while processing them. http://dev.splunk.com/python
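With the SDK, the shape is roughly this — a sketch assuming the splunk-sdk package (splunklib) is installed and that the connection details are placeholders. The CSV-writing helper is separate so the streaming loop stays small:

```python
import csv

def write_quoted_rows(rows, fh):
    """Write an iterable of dicts as fully quoted CSV.

    The first row's keys become the header; later rows may omit fields
    (blanked) or carry extras (ignored).
    """
    writer = None
    for row in rows:
        if writer is None:
            writer = csv.DictWriter(
                fh, fieldnames=list(row.keys()),
                quoting=csv.QUOTE_ALL, extrasaction="ignore",
                restval="", lineterminator="\n")
            writer.writeheader()
        writer.writerow(row)

def export_with_sdk(query, outfile):
    # splunklib imports are local so the helper above works without it.
    import splunklib.client as client
    import splunklib.results as results

    service = client.connect(host="localhost", port=8089,
                             username="admin", password="changeme")
    # export() streams results as the search runs instead of waiting
    # for the whole job to finish -- good for very large result sets.
    stream = service.jobs.export(query, earliest_time="-24h",
                                 output_mode="json")
    with open(outfile, "w", newline="") as fh:
        write_quoted_rows(
            e for e in results.JSONResultsReader(stream)
            if isinstance(e, dict))  # skip informational messages
```

The dict check matters: the reader yields both result rows and diagnostic messages, and you only want the rows in your CSV.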
https://www.splunk.com/blog/2013/09/15/exporting-large-results-sets-to-csv.html
Another possible option is to get JSON/XML output and have a back-end script that converts it to your liking. This might be a bit more hacky, but it's easier to mock up.
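For the JSON route, the export endpoint emits one JSON object per line, each wrapping the actual event in a "result" key, so the conversion script is short. A sketch of that back-end step (field names in the test data are invented):

```python
import csv
import io
import json

def splunk_json_to_csv(json_text):
    """Convert line-delimited Splunk JSON export into fully quoted CSV.

    Each input line looks like {"preview": false, "result": {...}};
    lines without a "result" key (e.g. messages) are skipped.
    """
    rows = []
    for line in json_text.splitlines():
        line = line.strip()
        if not line:
            continue
        obj = json.loads(line)
        if "result" in obj:
            rows.append(obj["result"])
    if not rows:
        return ""
    # Union of all keys, sorted, so ragged rows share one header.
    fieldnames = sorted({key for row in rows for key in row})
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames,
                            quoting=csv.QUOTE_ALL, restval="",
                            lineterminator="\n")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

Taking the union of keys across all rows handles Splunk's ragged results, where different events carry different fields.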