Elasticsearch index going over field limit #165

Open
ashabbir opened this issue Jun 4, 2018 · 3 comments
ashabbir commented Jun 4, 2018

It might be caused by the headers: converting the headers into a single string field before indexing would resolve this issue.

{"error":{"root_cause":[{"type":"remote_transport_exception","reason":"[node-1][10.132.22.96:9300][indices:data/write/update[s]]"}],"type":"illegal_argument_exception","reason":"Limit of total fields [1000] in index [ache-data] has been exceeded"}}

But please make sure it doesn't change the Kafka format.
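For illustration, a minimal sketch of the idea (the header names below are hypothetical, not ACHE's actual schema): with Elasticsearch dynamic mapping, a document such as

{"headers": {"Content-Type": "text/html", "Set-Cookie": "...", "X-Custom-1": "..."}}

gets one mapped field per distinct header name, so crawling many sites keeps adding fields until the limit of 1000 is exceeded. Serializing the whole headers object into one string field,

{"headers": "{\"Content-Type\":\"text/html\",\"Set-Cookie\":\"...\"}"}

keeps the mapping at a single field no matter how many distinct header names appear.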

aecio added the bug label Jun 6, 2018

aecio commented Jun 6, 2018

I think you mean not to change the Elasticsearch format in Kafka, and if so, that is not possible: I can't change one without changing the other, since both are serialized by the same code.


ashabbir commented Jun 6, 2018

Oops, yeah. Do not change the Elasticsearch format, please.


aecio commented Jun 6, 2018

We will need to fix this and change the format before the next release. For now, increasing Elasticsearch's index.mapping.total_fields.limit setting might work as a temporary workaround (Elasticsearch 6.x requires an explicit Content-Type header on requests with a body):

curl -XPUT 'http://localhost:9200/ache-data/_settings' \
  -H 'Content-Type: application/json' -d '
{
  "index.mapping.total_fields.limit": 5000
}'
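If the update succeeds, the new limit can be confirmed via the settings API (a standard Elasticsearch endpoint):

curl -XGET 'http://localhost:9200/ache-data/_settings?pretty'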
