I have a data stream that contains several different log types. For example, I have a Windows log, a firewall log, and an antivirus log all in the same stream. Each event from the source is tagged with its type, for example:
type: windows
type: firewall
type: antivirus
What I want to do is split these logs by type. Essentially I would like to do:
If the type equals windows, drop everything else, run the event through a pipeline/route that transforms the data with a Windows parser, and ship it.
If the type equals firewall, drop everything else, run the event through a pipeline/route that transforms the data with a firewall parser, and ship it.
If the type equals antivirus, drop everything else, run the event through a pipeline/route that transforms the data with an antivirus parser, and ship it.
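To make my intent concrete, here is a sketch in plain JavaScript (not actual Cribl config; the parser names are just placeholders) of the split I'm after: each event should go to exactly one parser based on its `type` field, and unknown types should be dropped.

```javascript
// Hypothetical per-type parsers; in reality these would be the
// windows/firewall/antivirus pipelines, not JS functions.
const parsers = {
  windows: (e) => ({ ...e, parsed_by: "windows_parser" }),
  firewall: (e) => ({ ...e, parsed_by: "firewall_parser" }),
  antivirus: (e) => ({ ...e, parsed_by: "antivirus_parser" }),
};

// Route a single event to the parser matching its type.
function route(event) {
  const parser = parsers[event.type];
  return parser ? parser(event) : null; // null = drop unknown types
}
```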
I started thinking I would use the Eval function to filter on this field and group events together, but I'm unsure whether the order of functions would destroy my data. I also considered a pre-processing pipeline on the source, but a source only lets you select one pipeline, not multiple. Suppose my pipeline were configured as follows:
Group A
- Eval | filter type==='windows'
- Drop | filter type!=='windows'
- Chain | windows_pipeline
Group B
- Eval | filter type==='firewall'
- Drop | filter type!=='firewall'
- Chain | firewall_pipeline
Group C
- Eval | filter type==='antivirus'
- Drop | filter type!=='antivirus'
- Chain | antivirus_pipeline
However, if I do this, I think Groups B and C would fail because Group A already dropped their data. I also considered Event Breakers, but this doesn't seem to be what Event Breakers are for.
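Here's a small sketch of why I believe the serial Group A/B/C layout fails (again plain JavaScript, just modeling a pipeline as filters applied in order, which is my understanding of how functions run):

```javascript
// Model a single pipeline whose functions run top to bottom.
// Each Drop is modeled as a filter that keeps only one type.
function runPipeline(events) {
  // Group A: drop everything that is not windows
  let stream = events.filter((e) => e.type === "windows");
  // Group B: drop everything that is not firewall —
  // but only windows events remain, so nothing survives
  stream = stream.filter((e) => e.type === "firewall");
  // Group C: drop everything that is not antivirus —
  // the stream is already empty by this point
  stream = stream.filter((e) => e.type === "antivirus");
  return stream;
}
```

If this model is right, every mixed batch ends up empty, which is why I don't think stacking the groups in one pipeline can work.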
Any help would be appreciated. Thanks.