
Hi,

I've added some fields (lookups, etc.) from a syslog source and I'm trying to push them into the Log Insight API via a webhook.

But I'm struggling to understand how to get it into the format Log Insight wants.
I'll say up front, JSON and JavaScript are pretty new to me, so be gentle šŸ™‚

The JSON format Log Insight expects is…

{ā€œmessagesā€: a{
ā€œfieldsā€: €
{ā€œnameā€: ā€œField1ā€, ā€œcontentā€: ā€œField1_valueā€},
{ā€œnameā€: ā€œField2ā€, ā€œcontentā€: ā€œField2_valueā€},
{ā€œnameā€: ā€œField_xxxā€, ā€œcontentā€: ā€œField_Last_xxxā€}
],
ā€œtextā€: ā€œoriginal messageā€,
ā€œtimestampā€: timestamp
}
]
}

I've tried the Serialize to JSON function, but it doesn't seem to have a way to arrange the fields so it outputs them in a custom format.

I've seen a post about using Object.fromEntries, but I'm still not able to get it working. Not sure if it's even the right way to go.

Converting an array into a simplified JSON object:

Does anyone have a slick way of converting an array where each element is an array of length 2, containing the key at index 0 and the value at index 1, into a JSON object? This is what the original looks like: { "HeadersIn": [ [ "Host", "example.com" ], [ "Accept", "*/*" ], [ "Connection", "keep-alive" ], [ "Cookie", "" ], [ …
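For anyone following along, Object.fromEntries does take exactly this pair-array shape. A minimal sketch (the sample data below is just illustrative, modeled on the excerpt above):

```javascript
// Sample shaped like the HeadersIn array in the quoted post (values illustrative)
const headersIn = [
  ["Host", "example.com"],
  ["Accept", "*/*"],
  ["Connection", "keep-alive"]
];

// Object.fromEntries turns an array of [key, value] pairs into a plain object
const headers = Object.fromEntries(headersIn);
console.log(headers);
// → { Host: 'example.com', Accept: '*/*', Connection: 'keep-alive' }
```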

It's probably something simple. Thank you in advance!

Be sure your target object is actually JSON. If it's text (represented by an 'a' in the preview pane), you'll want to parse it into JSON before it hits the destination. An Eval function with _raw => JSON.parse(_raw) would work.

Note the curly braces next to _raw here, indicating a JSON object:

301_b028bcf5a4d04d8292275d3edfa8b953.png

Hi @BargiBargi,

Here's how I would build a pipeline to format the messages. I'll show an example with some fields already extracted. We want to move _raw to the text field, _time to timestamp, and all remaining fields to fields.

301_487bcac025544b74b9f1890b7c239039.png

Let's start with a basic Eval function to build the general structure of an individual message:

301_249ced7c651743898cbaeab45526f92b.png

(the expression is {"text": _raw, "timestamp": _time, "fields": []})

Now we can use the Code function to do some magic… We want to take all fields (using the special variable __e) that do not start with an underscore (internal, or otherwise not already used) and move their KV pairs to fields.

__e['_raw']['fields'] = Object.entries(__e).filter(([key, value]) => !key.startsWith('_')).map(([key, value]) => { return {"name": key, "content": value} })

We use the Object.entries function to create a KV array to work with in the filter and map functions. In the map function, the return value reformats the original KV pair into the expected name and content fields.
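The same filter/map logic can be tried outside Cribl in plain JavaScript. A sketch with an invented event object standing in for __e (field names are hypothetical):

```javascript
// Hypothetical event object standing in for Cribl's __e special variable
const __e = {
  _raw: "original message",
  _time: 1700000000,
  Field1: "Field1_value",
  Field2: "Field2_value"
};

// Drop internal fields (leading underscore) and reshape each [key, value]
// pair into the {name, content} objects Log Insight expects
const fields = Object.entries(__e)
  .filter(([key, value]) => !key.startsWith('_'))
  .map(([key, value]) => ({ name: key, content: value }));

console.log(fields);
// → [ { name: 'Field1', content: 'Field1_value' },
//     { name: 'Field2', content: 'Field2_value' } ]
```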

301_e795dbdb5c834d8cae1fe0ac8916a4ba.png

Finally, we can use the Aggregations function to combine events into a single array.

301_2ad24513ec9d4eb59e54d8ae900bede0.png

The list(_raw) aggregation will generate a new array of the individual messages aggregated together. The evaluate fields expression then moves that array into the expected messages object key.
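Conceptually, the aggregation step produces something like the following (a plain-JavaScript sketch of the resulting shape, not Cribl's internals; the sample messages are invented):

```javascript
// Two hypothetical per-event message objects built by the earlier Eval/Code steps
const messages = [
  { text: "first syslog line", timestamp: 1700000000, fields: [] },
  { text: "second syslog line", timestamp: 1700000001, fields: [] }
];

// list(_raw) collects the per-event objects into one array; the evaluate
// fields expression then nests that array under the messages key
const payload = { messages: messages };
console.log(JSON.stringify(payload, null, 2));
```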

Which gives an output that looks like the following:

301_431231ecf8ff4970bbf324a317ecd4da.png

Now in the Webhook destination, configure as follows to only emit the _raw field as the payload to the Log Insight collector. Note the URL is static for the destination, but it can be customized per-event by setting the __url field.

301_468f51d19ed64c4f82555d4151f352d8.png

Let me know if this solves your issue!


Amazing!
Thanks so much for the help.

I've worked through it and I can see it builds the JSON object up as expected.
But when it's sending to the LI API, it errors as below.

{"errorMessage":"Invalid request body.","errorCode":"JSON_FORMAT_ERROR","errorDetails":{"reason":"Unrecognized token 'object': was expecting (JSON String, Number, Array, Object or token 'null', 'true' or 'false')\n at [Source: (String)\"[object Object]\n[object Object]\n[object Obj…\"[truncated 3915 chars]; line: 1, column: 8]"}}

301_b732e50326f748c496730a08e50e0d5f.png

The pipeline looks fine (the only thing maybe a bit strange is that content is listed before name, but I assume that's just Cribl sorting alphabetically)

301_db00c39f895b4875a13a7a00b2286fe2.png

I changed the code to filter on anything starting with NSXT as all the fields I need start with that.

301_18632b72c3b84cd4aafac778109bf1da.png

I tried it both with the Aggregations function and as just an Eval, and got the same error.

301_7829f43dcc2a42218ca4f41078f208db.png

Live Data for the Webhook destination again looks fine

301_36a891a6e6c2451bbab8643fb34a50bb.png
301_04209c01592d4c7caa81cbf572861f48.png


@BargiBargi could you try adding an eval function to the end of your pipeline that turns _raw into JSON.stringify(_raw)?
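The reason this helps: if _raw still holds an object, default string coercion at the destination yields [object Object] rather than JSON text. A quick illustration (sample message invented):

```javascript
// A hypothetical message object like the one built earlier in the pipeline
const msg = { text: "original message", timestamp: 1700000000, fields: [] };

// Default string coercion of an object is not JSON
console.log(String(msg));           // → [object Object]

// JSON.stringify produces the serialized text the Log Insight API expects
console.log(JSON.stringify(msg));
// → {"text":"original message","timestamp":1700000000,"fields":[]}
```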


@bdalpe
Using JSON.stringify(_raw), I can see events come through when using the Aggregations step, which is good, but now I barely see one event a second coming through when there's much more than that coming in.

I thought it might have been to do with the Aggregation, so I replaced it with the following Eval and it works, but again only at a very low rate.

301_c325ceb0d1f94a6797cae11e576174cd.png

There are no dropped events in the Cribl Webhook, which is strange, so I'm not quite sure what's going on. Does JSON.stringify have a rate limit on what it can process?


Hi,

I've been trying to sort this on and off and can't figure out what the issue is.
Without JSON.stringify(_raw), Cribl outputs [object Object], which can be seen in last-failed-buffer.raw

301_d809fe269de84f2aacef3db282be8023.png

With JSON.stringify(_raw) it works, but it's literally 1 message every 2 seconds

301_5e1e554bfdbd446aa770145b3e187a87.png

Any advice would be much appreciated!

