What is the bug?
* When creating a materialized view (MV) / covering index (CI), I encountered the following error:
* A potential workaround is discussed here (see also the mapping sketch after this list): https://discuss.elastic.co/t/error-document-contains-at-least-one-immense-term-in-field/66486
* {"Message":"Fail to run query. Cause: failure in bulk execution:\n[316]: index [flint_securitylake_default_last_7day_ct_yue_mv], id [8no9yJEBJ8hCpANQgrau], message [OpenSearchException[OpenSearch exception [type=illegal_argument_exception, reason=Document contains at least one immense term in field=\"api.response.data\" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped. Please correct the analyzer to not produce such terms. The prefix of the first immense term is: '[123, 34, 82, 101, 99, 111, 114, 100, 115, 34, 58, 91, 123, 34, 70, 105, 110, 100, 105, 110, 103, 73, 100, 101, 110, 116, 105, 102, 105, 101]...', original message: bytes can be at most 32766 in length; got 39932]]; nested: OpenSearchException[OpenSearch exception [type=max_bytes_length_exceeded_exception, reason=max_bytes_length_exceeded_exception: bytes can be at most 32766 in length; got 39932]];]"}
How can one reproduce the bug?
Steps to reproduce the behavior:
Create an MV against a table that has a field whose value can exceed the 32766-byte (32 KB) keyword term limit. A minimal direct reproduction of the underlying OpenSearch error is sketched below.
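Independently of Flint, the underlying OpenSearch failure can be reproduced by indexing an over-long keyword value directly. A minimal sketch, assuming a local cluster; the index name `immense_term_repro` and the field name `data` are hypothetical:

```python
import requests

OPENSEARCH = "https://localhost:9200"
AUTH = ("admin", "admin")  # placeholder credentials
INDEX = "immense_term_repro"  # hypothetical index name

# Keyword mapping without ignore_above, mirroring how the MV index maps the field.
requests.put(
    f"{OPENSEARCH}/{INDEX}",
    json={"mappings": {"properties": {"data": {"type": "keyword"}}}},
    auth=AUTH,
    verify=False,
).raise_for_status()

# A document whose UTF-8 encoding exceeds the 32766-byte Lucene term limit.
doc = {"data": "x" * 40000}
resp = requests.post(f"{OPENSEARCH}/{INDEX}/_doc", json=doc, auth=AUTH, verify=False)
print(resp.status_code)  # 400
print(resp.json())       # illegal_argument_exception: "Document contains at least one immense term ..."
```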
What is the expected behavior?
The MV/CI should be created and refreshed successfully; a row containing a field value larger than the 32766-byte keyword term limit should not fail the entire bulk request (for example, the oversized value could be skipped or the field mapped so it is not indexed as a single keyword term).
What is your host/environment?
Version: 2.13