
I have tried the Event Breaker on this type of data, but it didn't work: it can't separate the record into individual events.

Note: this is live data streamed from Azure Event Hub.

848_f35fb2508dff4e58a260532345ae2edc.png

And this is the Event Breaker configuration:

848_b891ce656e154ca0af7a92ab9c3a3969.png

But if I use the same pipeline, or the same Event Breaker rule in the Knowledge library, with the data in that shape, it works fine and splits every record into an event.

848_4250b75165b049229665523a590f5231.png

I tried to use Unroll, but that also didn't work, so any suggestions on how I can solve this issue?

Thanks,

I'm not following the problem statement:

> I have tried the Event Breaker on this type of data, but it didn't work: it can't separate the record into individual events … but if I use the same pipeline or the Event Breaker rule in the Knowledge library with the data in that shape, it works fine and splits every record into an event

Can you clarify?


I have created a pipeline with the Event Breaker function. When I apply it to the live data, this is what happens to the data sample.

The input data:

848_252783e3c7db46438130a75260c8ba7d.png

The output:

848_3e27a7b74dc048569d4ab10b119c2e96.png

But if I copy-paste the live data sample into the Event Breaker rules with the same configuration, it works fine.

Input:

848_152edc29cae946208b9397171a78f58c.png

Output:

848_9d6e1edc6f4740c09dfa171d92558f89.png

Did you commit and deploy?


Yes.


The EB function only operates on data in _raw, and your records[] field is not in _raw. You can use the Unroll function with the Source and Destination field configs both set to records, or you can serialize records into _raw and then run the EB.

https://docs.cribl.io/stream/event-breaker-function
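For illustration, here's a minimal plain-JavaScript sketch (not Cribl pipeline code; the field names just mirror the Azure Event Hub shape shown above) of what the two options do to an event whose payload sits in a records[] array:

```javascript
// Plain-JS sketch of the two approaches; not actual Cribl code.
// An Azure Event Hub payload arrives as one event with a records[] array:
const event = {
  records: [
    { time: "2024-01-01T00:00:00Z", operationName: "op1" },
    { time: "2024-01-01T00:00:01Z", operationName: "op2" },
  ],
};

// Option 1 - Unroll with Source and Destination both set to "records":
// one output event per array element.
const unrolled = event.records.map((r) => ({ records: r }));

// Option 2 - serialize records into _raw, then let the Event Breaker
// (which only reads _raw) split it with a JSON-array breaker rule.
const serialized = { _raw: JSON.stringify(event.records) };

console.log(unrolled.length);                    // 2 events
console.log(JSON.parse(serialized._raw).length); // 2 records in _raw
```

Either way, the point is the same: the payload has to end up where the function actually looks, either as separate events (Unroll) or as a string in _raw (EB).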

Using Unroll:

848_049ac8da945f455a94af34fb4fbd1af9.png


With this starting data:

848_3c9eb23ea6b84dce83f70bf7a70fbb5f.png

Results in:

848_f720d85b724743b9a3636f5800b6189d.png

I did the same, but it didn't work on the event that has 659 records inside it.

848_60ac3287062c48948b18b51dfd2fd42c.png

The output:

848_dd8ad71e5344498ca2750ac8d2554c4c.png

With a destination field different from the original, you end up with 659 events, each still carrying the original 659-record array, meaning more than 400,000 nested records in all. That is likely too much data for the preview window, which runs in your browser's JS engine, to handle. Try using records as the destination for Unroll.
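The arithmetic behind that blow-up, as a quick sketch (the 659 count comes from the sample above):

```javascript
// Each of the 659 unrolled events still carries the original 659-element
// records array, so the preview must materialize ~659 x 659 nested records.
const recordCount = 659;
const nestedRecords = recordCount * recordCount;
console.log(nestedRecords); // 434281 - well over 400,000
```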


I get your point. Thank you so much, it worked; this was very helpful.


I don't see how the EB comes into play. I configured Unroll, and if I then add the EB after it, the output is no different from using Unroll alone.

I also don't see the cribl_breaker field showing up.


It turns out I can if I change the output to "_raw".

However, toggling "JSON Extract Fields" on and off makes no difference in the results.
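That matches the earlier point about _raw. As a hypothetical plain-JS sketch (not Cribl code): the EB function only looks at _raw, so an unrolled record written anywhere else is invisible to it.

```javascript
// After Unroll, each event holds one record. Where that record lands
// decides whether the Event Breaker (which reads only _raw) sees anything.
const record = { time: "2024-01-01T00:00:00Z", operationName: "op1" };

// Destination "records": _raw is absent, so the EB has nothing to break.
const eventA = { records: record };
console.log(eventA._raw === undefined); // true - EB is a no-op here

// Destination "_raw" (serialized): now the EB, and downstream options like
// JSON Extract Fields, have a string payload to work on.
const eventB = { _raw: JSON.stringify(record) };
console.log(typeof eventB._raw); // "string"
```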

