My customer has some very large CSV files, but only about 200 events/rows from each file are being indexed into Splunk.
"As per checking MDMS-001_Meter Reads Requested file have 4,633, but in Splunk it only have 208 events. "
How can I get the entire CSV file to be ingested? I am using universal forwarders for the ingest.
As a quick way to check how the rows parse, try something like:

| makeresults
| eval _raw="your_csv_copy_and_paste
....."
| multikv forceheader=1
| table your_csv_header
That may work as a quick sanity check.
Why not use collect or outputcsv, as needed?
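For instance, assuming the rows are already searchable somewhere, collect can write them into a summary index (the index and sourcetype names below are hypothetical):

```
index=main sourcetype=meter_reads
| collect index=summary_meter_reads
```

That only helps once the data is in Splunk, though; it does not fix a broken ingest.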
Those are pretty small files. I regularly upload CSV files with several million records without any issue. You probably have a parsing error of some sort.
Get a sample file and try a manual ingestion. If that does not resolve it, there is probably a formatting error in the CSV file itself. Look for invalid data fields.
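One quick way to hunt for invalid rows is a small script that flags any row whose field count differs from the header's. Embedded newlines inside unbalanced quotes, or missing fields, are common reasons many CSV rows collapse into a few events. This is only an illustrative sketch; the sample data and function name are made up:

```python
import csv
import io

def find_bad_rows(text, expected_cols=None):
    """Return (line_number, field_count) for rows whose field count
    differs from the header row's."""
    reader = csv.reader(io.StringIO(text))
    header = next(reader)
    expected = expected_cols or len(header)
    # Data rows start on line 2 of the file.
    return [(i, len(row)) for i, row in enumerate(reader, start=2)
            if len(row) != expected]

# Hypothetical sample: the second data row is missing a field.
sample = ("meter_id,read_time,value\n"
          "A1,2024-01-01,10\n"
          "A2,2024-01-02\n"
          "A3,2024-01-03,12\n")
print(find_bad_rows(sample))  # → [(3, 2)]
```

Run it against the customer's sample file before re-ingesting; a clean file returns an empty list.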
4,633 rows is not too large for a UF. Does the forwarder log any errors? Is the customer filtering any data from the file/sourcetype?
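If many rows are being merged into a few multi-line events (which would explain 208 events out of 4,633 rows), check the props.conf for the sourcetype. A minimal sketch for a headered CSV, assuming a hypothetical sourcetype name — note that INDEXED_EXTRACTIONS must be set on the universal forwarder itself, since that is where structured parsing happens:

```
# props.conf on the universal forwarder (sourcetype name is hypothetical)
[meter_reads_csv]
INDEXED_EXTRACTIONS = csv
HEADER_FIELD_LINE_NUMBER = 1
SHOULD_LINEMERGE = false
```

Also check $SPLUNK_HOME/var/log/splunk/splunkd.log on the forwarder for parsing warnings around the time the file was read.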