
Is it possible to ignore some lines from a log file before indexing?

saibal6
Path Finder

I have a log file containing the lines below. I want to ignore everything that comes after line number 14.

  1. EVENT_SHURU;0;03/01/2018 01:00:04:4000;1;SHOP;;5040;1;0;0;;Imports for the MASTER STORE
  2. EVENT_SESH;0;03/01/2018 01:00:04:4000;1;SHOP;;1410;1;0;;;UNZIP SKIPPED DUE TO PRE-EXISTING FILES IN THE IMPORT DIRECTORY
  3. EVENT_SHURU;0;03/01/2018 01:00:04:4000;1;SHOP;;1360;1;0;0;;Start WSUPDATE execution
  4. EVENT_SESH;0;03/01/2018 01:00:04:5000;1;SHOP;;1360;1;0;;;End of WSUPDATE execution
  5. EVENT_SHURU;0;03/01/2018 01:00:06:0000;1;SHOP;;1030;1;0;0;;C:\SHOP\COMS\GARIA\IN\TAXES.TXT
  6. EVENT_GOL;0;03/01/2018 01:00:06:0000;1;SHOP;;1030;1;1;;;File not found : C:\SHOP\COMS\GARIA\IN\TAXES.TXT
  7. EVENT_SHURU;0;03/01/2018 01:05:03:0000;1;SHOP;;1940;IMPORT OF THE GIFTLIST;1;0;0;;C:\SHOP\COMS\GARIA\IN\GIFTLIST.TXT
  8. EVENT_GOL;0;03/01/2018 01:05:03:1000;1;SHOP;;1940;1;1;;;File not found : C:\SHOP\COMS\GARIA\IN\GIFTLIST.TXT
  9. EVENT_SESH;0;03/01/2018 01:05:03:1000;1;SHOP;;1940;1;0;;;Import canceled : import file not found
  10. EVENT_GOL;0;03/01/2018 01:05:04:6000;1;SHOP;;1410;1;1;;;Expected File #0001 not found
  11. EVENT_SESH;0;03/01/2018 01:05:04:6000;1;SHOP;;1410;1;0;;;END OF THE AUTOMATE UNPACKING PROCESS FOR THE STORE :
  12. EVENT_SHURU;0;03/01/2018 01:05:04:6000;1;SHOP;;1360;1;0;0;;Start WSUPDATE execution
  13. EVENT_SESH;0;03/01/2018 01:05:04:7000;1;SHOP;;1360;1;0;;;End of WSUPDATE execution
  14. EVENT_SESH;0;03/01/2018 01:05:04:7000;1;SHOP;;5040;1;0;;;END OF IMPORT PROCESS FOR THE MASTER STORE - PENDING_TXT(0)
  15. EVENT_SHURU;0;04/01/2018 06:30:37:1000;1;SHOP;;1970;1;0;0;;C:\SHOP\COMS\NEWTOWN\IN\0331\TILLLIST.TXT
  16. EVENT_GOL;0;04/01/2018 06:30:37:1000;1;SHOP;;1970;1;1;;;File not found : C:\SHOP\COMS\NEWTOWN\IN\0331\TILLLIST.TXT
  17. EVENT_SESH;0;04/01/2018 06:30:37:2000;1;SHOP;;1970;1;0;;;Import canceled : import file not found
  18. EVENT_SHURU;0;04/01/2018 06:30:37:2000;1;SHOP;;1010;1;0;0;;C:\SHOP\COMS\NEWTOWN\IN\0331\PROD.TXT
  19. EVENT_GOL;0;04/01/2018 06:30:37:2000;1;SHOP;;1010;1;1;;;File not found : C:\SHOP\COMS\NEWTOWN\IN\0331\PROD.TXT
  20. EVENT_SESH;0;04/01/2018 06:30:37:3000;1;SHOP;;1010;1;0;;;Import canceled : import file not found
  21. EVENT_SHURU;0;04/01/2018 06:30:37:3000;1;SHOP;;1080;1;0;0;;C:\SHOP\COMS\NEWTOWN\IN\0331\MTPRO.TXT
  22. EVENT_GOL;0;04/01/2018 06:30:37:4000;1;SHOP;;1080;1;1;;;File not found : C:\SHOP\COMS\NEWTOWN\IN\0331\MTPRO.TXT
  23. EVENT_SESH;0;04/01/2018 06:30:37:4000;1;SHOP;;1080;1;0;;;Import canceled : import file not found
  24. EVENT_SHURU;0;04/01/2018 06:30:37:5000;1;SHOP;;1180;1;0;0;;C:\SHOP\COMS\NEWTOWN\IN\0331\STAFF.TXT
  25. EVENT_GOL;0;04/01/2018 06:30:45:5000;1;SHOP;;1180;1;2;0;;0 salemen inserted, 6 salesmen updated.

I want this entire ignoring process to happen before indexing, for every log file, every day.

Please tell me the whole process, including the regular expression. (The line numbers are not present in the log files; I have added them here only to make it clear after which line I want the remaining lines ignored.)



DalJeanis
Legend

1) The keyword you are probably looking for is nullqueue. You can send data to the null queue by setting up a regex that matches the data you want to drop (see the sketch after these points).

2) We have no idea why line 14 is different from any other line. You need to explain exactly what is different about that line, or about the lines that follow it.

I notice that those dates are a few weeks off. If you are trying to ignore future records, then we need to know whether this is a one-shot cleanup or an ongoing situation in which you might keep receiving records you want to drop.
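For illustration only, here is a minimal props.conf/transforms.conf sketch of null-queue routing. The sourcetype name (shop_events) and the regex are assumptions, not taken from your setup; the regex here simply keys off the \NEWTOWN\ path that shows up in your sample lines after line 14, so substitute whatever actually distinguishes the lines you want to drop.

  # props.conf -- on the indexer or heavy forwarder that parses this sourcetype
  # [shop_events] is a hypothetical sourcetype name; use your own
  [shop_events]
  TRANSFORMS-null = drop_after_master_import

  # transforms.conf -- send any raw event matching REGEX to the null queue
  # (matched events are discarded and never indexed)
  [drop_after_master_import]
  REGEX = \\NEWTOWN\\
  DEST_KEY = queue
  FORMAT = nullQueue

Because this routing happens in the parsing pipeline, it runs before indexing on every file your monitor input picks up each day; it has to live on an indexer or heavy forwarder, not on a universal forwarder.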
