Indexing with hot_swap or build always fails with Malformed content, found extra data after parsing: START_OBJECT
#113
Comments
@danielberkompas if you have the chance, please help point me in the right direction. I've not found much help online and have had no luck for the past week or so. |
@cdvx did you find a solution to this? @danielberkompas can you please help? |
No, I didn't. I just found a workaround: adding the index to the JSON index file so it's created when loaded. |
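For anyone wondering what that JSON index file might contain: the actual file from this project isn't shown in the thread, but a minimal Elasticsearch index definition (settings plus mappings) generally follows the shape below. The field names here are purely illustrative, not taken from the original project.

```json
{
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0
  },
  "mappings": {
    "properties": {
      "title": { "type": "text" },
      "posted_at": { "type": "date" }
    }
  }
}
```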
Hi @cdvx @danielberkompas, I am also facing the same issue on Elasticsearch 8.x. Can you please help in solving this? |
@cdvx can you share what you added to the JSON index file, or a sample format of that file? |
Anyone have any luck on this? @krezicoder did you find a fix? |
Hello everyone - this is because of Elasticsearch 8.x: https://stackoverflow.com/questions/33340153/elasticsearch-bulk-index-json-data Based on that post, 7.x wants this, and this is what the library sends:
However, in 8.x:
My tests via Kibana confirm this. This code needs to change the prefix based on some config. |
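The exact snippets referred to above aren't preserved in this thread, but one well-documented difference between the versions is that Elasticsearch removed mapping types in 8.x: a bulk action/metadata line that still carries a `_type` key is accepted (though deprecated) by 7.x and rejected by 8.x. As a hedged illustration (index and field names made up), a 7.x-style action line looks like:

```json
{ "index": { "_index": "posts", "_type": "_doc", "_id": "1" } }
{ "title": "Hello" }
```

while 8.x requires the `_type` key to be dropped entirely:

```json
{ "index": { "_index": "posts", "_id": "1" } }
{ "title": "Hello" }
```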
Not sure when @danielberkompas will be updating this repo... so if anyone is looking, my workaround was to:
Code: lib/h1bjobs/elasticsearch/bulk.ex
lib/h1bjobs/elasticsearch/models/job_listing.ex
And then I run it like this:
|
When creating a new index and loading data, the index is created fine, but when uploading the data to Elasticsearch, it returns:
This suggests the NDJSON payload created for the bulk API is faulty, from what I can tell. Would appreciate any guidance if I'm missing something.
Example payload created for the bulk API by the library:
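The payload itself is not preserved above, but for context: `Malformed content, found extra data after parsing: START_OBJECT` is the error Elasticsearch typically raises when the `_bulk` body is not strict NDJSON, e.g. pretty-printed JSON, a JSON array, or a missing newline between the action line and the document. A minimal sketch of building a well-formed body in Python (the index name, IDs, and fields are invented for illustration):

```python
import json


def build_bulk_body(index, docs):
    """Build a newline-delimited JSON (NDJSON) body for the Elasticsearch _bulk API.

    Each document source line is preceded by an action/metadata line, and the
    whole body must end with a trailing newline or Elasticsearch rejects it.
    The "_type" key is deliberately omitted, since Elasticsearch 8.x rejects it.
    """
    lines = []
    for doc_id, source in docs:
        # Action/metadata line for this document.
        lines.append(json.dumps({"index": {"_index": index, "_id": doc_id}}))
        # The document source on its own line, compact (no embedded newlines).
        lines.append(json.dumps(source))
    return "\n".join(lines) + "\n"


body = build_bulk_body("posts", [("1", {"title": "Hello"}), ("2", {"title": "World"})])
print(body)
```

This is only a sketch of the wire format, not the library's actual encoder; the point is that each logical line must be a complete, compact JSON object separated by single newlines.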