Elasticsearch output question. How can I get the default `@timestamp` on the Elastic output? It seems like `_time` should automatically be renamed to `@timestamp`.
The test that Harry did above looks to be correctly using the create action with the Bulk API. What version of Cribl are you on?
4.1.0
Fairly new....
Maybe try capturing the actual output from your Cribl by sniffing, then run that request in the console to see what errors come up.
I think the problem is the creation of an _id
```
{
  "create": {
    "_index": ".ds-interfaces-sensors-ptx-2023.03.22-000001",
    "_id": "5bltTMx3nTh6pWRi",
    "status": 400,
    "error": {
      "type": "mapper_parsing_exception",
      "reason": "failed to parse",
      "caused_by": {
        "type": "illegal_argument_exception",
        "reason": "_id must be unset or set to [0QfN01UBNYDT2wsqAAABhwrKl7A] but was [5bltTMx3nTh6pWRi] because [.ds-interfaces-sensors-ptx-2023.03.22-000001] is in time_series mode"
      }
    }
  }
}
```
I'm not sure why the worker didn't get those error messages
Based on the docs:
> For TSDS documents, the document `_id` is a hash of the document's dimensions and `@timestamp`. A TSDS doesn't support custom document `_id` values.

The `_id` currently being set is `random(16)`. However, it is possible to override this by adding `__id` to the event. The logic looks like this:
```
const _id = event.__id ?? random(16);
```
That was from a capture on the wire
It might need to be unset though. Is that possible?
Unfortunately this happens in formatting of data before it hits the wire.
`_id` should not be set. It is automatically generated by Elasticsearch on ingest.
It would only be useful for "updating" documents, and is never needed for metrics.
<@U0410L186KS> In the interim, can you try stripping `_id` out using an Elasticsearch ingest pipeline?
I could do that.
_id is required in other situations, which is why it's provided by default. Looks like we need an option to omit _id. I'll open a ticket.
When writing to a data stream, disable it.
You will run into issues with read-only indices, etc.
It should be a user option whether to create `_id` in the pipeline.
Removed the _id using ingest pipeline. That works.
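For anyone finding this later, such a pipeline can be as small as a single `remove` processor. Something like this (the pipeline name is just an example) should work from the Dev Tools console:

```
PUT _ingest/pipeline/strip-id
{
  "description": "Drop client-supplied _id so the TSDS can compute its own",
  "processors": [
    { "remove": { "field": "_id", "ignore_missing": true } }
  ]
}
```

It can then be attached to the data stream via the index template, e.g. with the `index.default_pipeline` setting, so it runs on every document.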
From Logstash documentation:
Cool, glad there is a work around! Created ticket: `CRIBL-16300` to address this in an upcoming release.
Argg.. now I got data in the future!
Only 1 document made it. Something isn't working.