Commit

Merge branch 'main' into carlosdelest/search-serverless-metrics-listener
carlosdelest committed Nov 11, 2024
2 parents (7c49cda + 64c362b), commit 028c2f2
Showing 1,402 changed files with 43,509 additions and 29,479 deletions.
4 changes: 3 additions & 1 deletion .buildkite/packer_cache.sh
@@ -29,6 +29,8 @@ for branch in "${branches[@]}"; do
fi

export JAVA_HOME="$HOME/.java/$ES_BUILD_JAVA"
"checkout/${branch}/gradlew" --project-dir "$CHECKOUT_DIR" --parallel -s resolveAllDependencies -Dorg.gradle.warning.mode=none -DisCI
"checkout/${branch}/gradlew" --project-dir "$CHECKOUT_DIR" --parallel -s resolveAllDependencies -Dorg.gradle.warning.mode=none -DisCI --max-workers=4
"checkout/${branch}/gradlew" --stop
pkill -f '.*GradleDaemon.*'
rm -rf "checkout/${branch}"
done
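
For readers skimming this hunk: the change caps Gradle at four workers and explicitly shuts the Gradle daemon down before the per-branch checkout is deleted. A commented sketch of the resulting sequence, assuming `$CHECKOUT_DIR` and `$branch` are set as in the surrounding script (the `|| true` guard is added here for illustration and is not part of the script):

----
# Resolve all dependencies with at most four parallel Gradle workers.
"checkout/${branch}/gradlew" --project-dir "$CHECKOUT_DIR" --parallel -s \
  resolveAllDependencies -Dorg.gradle.warning.mode=none -DisCI --max-workers=4

# Ask the Gradle daemon to exit cleanly, then kill anything that lingers.
"checkout/${branch}/gradlew" --stop
pkill -f '.*GradleDaemon.*' || true

# Finally remove the per-branch checkout.
rm -rf "checkout/${branch}"
----
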
2 changes: 0 additions & 2 deletions .buildkite/pipelines/periodic-packaging.template.yml
@@ -40,8 +40,6 @@ steps:
matrix:
setup:
image:
- windows-2016
- windows-2019
- windows-2022
agents:
provider: gcp
2 changes: 0 additions & 2 deletions .buildkite/pipelines/periodic-packaging.yml
@@ -345,8 +345,6 @@ steps:
matrix:
setup:
image:
- windows-2016
- windows-2019
- windows-2022
agents:
provider: gcp
2 changes: 0 additions & 2 deletions .buildkite/pipelines/periodic-platform-support.yml
@@ -38,8 +38,6 @@ steps:
matrix:
setup:
image:
- windows-2016
- windows-2019
- windows-2022
GRADLE_TASK:
- checkPart1
@@ -12,7 +12,7 @@ steps:
matrix:
setup:
image:
- windows-2019
- windows-2022
PACKAGING_TASK:
- default-windows-archive
agents:
2 changes: 0 additions & 2 deletions .buildkite/pipelines/pull-request/packaging-tests-windows.yml
@@ -10,8 +10,6 @@ steps:
matrix:
setup:
image:
- windows-2016
- windows-2019
- windows-2022
PACKAGING_TASK:
- default-windows-archive
6 changes: 3 additions & 3 deletions .buildkite/scripts/cloud-deploy.sh
@@ -2,11 +2,11 @@

set -euo pipefail

.ci/scripts/run-gradle.sh buildCloudDockerImage
.ci/scripts/run-gradle.sh buildCloudEssDockerImage

ES_VERSION=$(grep 'elasticsearch' build-tools-internal/version.properties | awk '{print $3}')
DOCKER_TAG="docker.elastic.co/elasticsearch-ci/elasticsearch-cloud:${ES_VERSION}-${BUILDKITE_COMMIT:0:7}"
docker tag elasticsearch-cloud:test "$DOCKER_TAG"
DOCKER_TAG="docker.elastic.co/elasticsearch-ci/elasticsearch-cloud-ess:${ES_VERSION}-${BUILDKITE_COMMIT:0:7}"
docker tag elasticsearch-cloud-ess:test "$DOCKER_TAG"

echo "$DOCKER_REGISTRY_PASSWORD" | docker login -u "$DOCKER_REGISTRY_USERNAME" --password-stdin docker.elastic.co
unset DOCKER_REGISTRY_USERNAME DOCKER_REGISTRY_PASSWORD
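
For context on the image the renamed build task produces: the tag combines the Elasticsearch version parsed from `build-tools-internal/version.properties` with the first seven characters of the Buildkite commit SHA. A small sketch with hypothetical values (the version and SHA below are made up for illustration):

----
# Hypothetical inputs, for illustration only.
ES_VERSION="9.0.0"                          # normally parsed from version.properties
BUILDKITE_COMMIT="028c2f2aaaabbbbccccdddd"  # full SHA supplied by Buildkite

# ${BUILDKITE_COMMIT:0:7} keeps the first seven characters of the SHA.
echo "docker.elastic.co/elasticsearch-ci/elasticsearch-cloud-ess:${ES_VERSION}-${BUILDKITE_COMMIT:0:7}"
# -> docker.elastic.co/elasticsearch-ci/elasticsearch-cloud-ess:9.0.0-028c2f2
----
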
1 change: 1 addition & 0 deletions .ci/dockerOnLinuxExclusions
@@ -15,6 +15,7 @@ sles-15.2
sles-15.3
sles-15.4
sles-15.5
sles-15.6

# These OSes are deprecated and filtered starting with 8.0.0, but need to be excluded
# for PR checks
24 changes: 24 additions & 0 deletions .github/CODEOWNERS
@@ -65,6 +65,30 @@ server/src/main/java/org/elasticsearch/bootstrap @elastic/es-core-infra
server/src/main/java/org/elasticsearch/node @elastic/es-core-infra
server/src/main/java/org/elasticsearch/plugins @elastic/es-core-infra
server/src/main/java/org/elasticsearch/threadpool @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/breaker @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/bytes @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/cli @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/collect @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/component @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/compress @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/document @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/file @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/hash @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/io @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/logging @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/metrics @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/network @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/path @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/recycler @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/regex @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/scheduler @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/settings @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/text @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/time @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/transport @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/unit @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/util @elastic/es-core-infra
server/src/main/java/org/elasticsearch/common/xcontent @elastic/es-core-infra

# Security
x-pack/plugin/core/src/main/java/org/elasticsearch/xpack/core/security/authz/privilege @elastic/es-security
21 changes: 0 additions & 21 deletions .github/workflows/sync-main-to-jdk-branch.yml

This file was deleted.

59 changes: 27 additions & 32 deletions README.asciidoc
@@ -4,7 +4,7 @@ Elasticsearch is a distributed search and analytics engine, scalable data store

Use cases enabled by Elasticsearch include:

* https://www.elastic.co/search-labs/blog/articles/retrieval-augmented-generation-rag[Retrieval Augmented Generation (RAG)]
* https://www.elastic.co/search-labs/blog/articles/retrieval-augmented-generation-rag[Retrieval Augmented Generation (RAG)]
* https://www.elastic.co/search-labs/blog/categories/vector-search[Vector search]
* Full-text search
* Logs
@@ -17,7 +17,7 @@ Use cases enabled by Elasticsearch include:
To learn more about Elasticsearch's features and capabilities, see our
https://www.elastic.co/products/elasticsearch[product page].

To access information on https://www.elastic.co/search-labs/blog/categories/ml-research[machine learning innovations] and the latest https://www.elastic.co/search-labs/blog/categories/lucene[Lucene contributions from Elastic], more information can be found in https://www.elastic.co/search-labs[Search Labs].
To access information on https://www.elastic.co/search-labs/blog/categories/ml-research[machine learning innovations] and the latest https://www.elastic.co/search-labs/blog/categories/lucene[Lucene contributions from Elastic], more information can be found in https://www.elastic.co/search-labs[Search Labs].

[[get-started]]
== Get started
@@ -27,20 +27,20 @@ https://www.elastic.co/cloud/as-a-service[Elasticsearch Service on Elastic
Cloud].

If you prefer to install and manage Elasticsearch yourself, you can download
the latest version from
the latest version from
https://www.elastic.co/downloads/elasticsearch[elastic.co/downloads/elasticsearch].

=== Run Elasticsearch locally

////
////
IMPORTANT: This content is replicated in the Elasticsearch repo. See `run-elasticsearch-locally.asciidoc`.
Ensure both files are in sync.
https://github.com/elastic/start-local is the source of truth.
////
////

[WARNING]
====
====
DO NOT USE THESE INSTRUCTIONS FOR PRODUCTION DEPLOYMENTS.
This setup is intended for local development and testing only.
@@ -93,20 +93,20 @@ Use this key to connect to Elasticsearch with a https://www.elastic.co/guide/en/
From the `elastic-start-local` folder, check the connection to Elasticsearch using `curl`:

[source,sh]
----
----
source .env
curl $ES_LOCAL_URL -H "Authorization: ApiKey ${ES_LOCAL_API_KEY}"
----
// NOTCONSOLE

=== Send requests to Elasticsearch

You send data and other requests to Elasticsearch through REST APIs.
You can interact with Elasticsearch using any client that sends HTTP requests,
You send data and other requests to Elasticsearch through REST APIs.
You can interact with Elasticsearch using any client that sends HTTP requests,
such as the https://www.elastic.co/guide/en/elasticsearch/client/index.html[Elasticsearch
language clients] and https://curl.se[curl].
language clients] and https://curl.se[curl].

==== Using curl
==== Using curl

Here's an example curl command to create a new Elasticsearch index, using basic auth:
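
The curl example itself is collapsed in this view. A representative request of the kind described, assuming a local instance with basic auth, the password in `$ELASTIC_PASSWORD`, and an illustrative index name (`my-index` is not from the original example):

----
curl -u "elastic:$ELASTIC_PASSWORD" \
  -X PUT "http://localhost:9200/my-index"
----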

@@ -149,19 +149,19 @@ print(client.info())

==== Using the Dev Tools Console

Kibana's developer console provides an easy way to experiment and test requests.
Kibana's developer console provides an easy way to experiment and test requests.
To access the console, open Kibana, then go to **Management** > **Dev Tools**.

**Add data**

You index data into Elasticsearch by sending JSON objects (documents) through the REST APIs.
Whether you have structured or unstructured text, numerical data, or geospatial data,
Elasticsearch efficiently stores and indexes it in a way that supports fast searches.
You index data into Elasticsearch by sending JSON objects (documents) through the REST APIs.
Whether you have structured or unstructured text, numerical data, or geospatial data,
Elasticsearch efficiently stores and indexes it in a way that supports fast searches.

For timestamped data such as logs and metrics, you typically add documents to a
data stream made up of multiple auto-generated backing indices.
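
As an aside (not part of this diff): writing to a data stream looks like indexing into a regular index, except the target name must match a data stream template and each document needs an `@timestamp` field. A hedged curl sketch, assuming the stack's built-in `logs-*-*` template, a local instance with basic auth, and a hypothetical data stream name and document:

----
curl -u "elastic:$ELASTIC_PASSWORD" \
  -X POST "http://localhost:9200/logs-myapp-default/_doc" \
  -H "Content-Type: application/json" \
  -d '{"@timestamp": "2024-11-11T12:00:00Z", "message": "user logged in"}'
----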

To add a single document to an index, submit an HTTP post request that targets the index.
To add a single document to an index, submit an HTTP post request that targets the index.

----
POST /customer/_doc/1
@@ -171,19 +171,19 @@
}
----

This request automatically creates the `customer` index if it doesn't exist,
adds a new document that has an ID of 1, and
This request automatically creates the `customer` index if it doesn't exist,
adds a new document that has an ID of 1, and
stores and indexes the `firstname` and `lastname` fields.

The new document is available immediately from any node in the cluster.
The new document is available immediately from any node in the cluster.
You can retrieve it with a GET request that specifies its document ID:

----
GET /customer/_doc/1
----

To add multiple documents in one request, use the `_bulk` API.
Bulk data must be newline-delimited JSON (NDJSON).
Bulk data must be newline-delimited JSON (NDJSON).
Each line must end in a newline character (`\n`), including the last line.
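
The bulk example's body is collapsed in this view (the hunk below shows only `PUT customer/_bulk`). A representative NDJSON payload of the shape described, sent with curl and using hypothetical documents rather than the ones from the original example:

----
curl -u "elastic:$ELASTIC_PASSWORD" \
  -X PUT "http://localhost:9200/customer/_bulk" \
  -H "Content-Type: application/x-ndjson" \
  --data-binary @- <<'NDJSON'
{ "create": {} }
{ "firstname": "Ada", "lastname": "Lovelace" }
{ "create": {} }
{ "firstname": "Alan", "lastname": "Turing" }
NDJSON
----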

----
@@ -200,15 +200,15 @@ PUT customer/_bulk

**Search**

Indexed documents are available for search in near real-time.
The following search matches all customers with a first name of _Jennifer_
Indexed documents are available for search in near real-time.
The following search matches all customers with a first name of _Jennifer_
in the `customer` index.

----
GET customer/_search
{
"query" : {
"match" : { "firstname": "Jennifer" }
"match" : { "firstname": "Jennifer" }
}
}
----
@@ -223,9 +223,9 @@ data streams, or index aliases.

. Go to **Management > Stack Management > Kibana > Data Views**.
. Select **Create data view**.
. Enter a name for the data view and a pattern that matches one or more indices,
such as _customer_.
. Select **Save data view to Kibana**.
. Enter a name for the data view and a pattern that matches one or more indices,
such as _customer_.
. Select **Save data view to Kibana**.

To start exploring, go to **Analytics > Discover**.

@@ -254,11 +254,6 @@ To build a distribution for another platform, run the related command:
./gradlew :distribution:archives:windows-zip:assemble
----
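
The collapsed part of this listing presumably shows the other platform targets; by the same naming pattern (an assumption, not visible in this capture), the Linux tarball would be built with:

----
./gradlew :distribution:archives:linux-tar:assemble
----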

To build distributions for all supported platforms, run:
----
./gradlew assemble
----

Distributions are output to `distribution/archives`.

To run the test suite, see xref:TESTING.asciidoc[TESTING].
@@ -281,7 +276,7 @@ The https://github.com/elastic/elasticsearch-labs[`elasticsearch-labs`] repo con
[[contribute]]
== Contribute

For contribution guidelines, see xref:CONTRIBUTING.md[CONTRIBUTING].
For contribution guidelines, see xref:CONTRIBUTING.md[CONTRIBUTING].

[[questions]]
== Questions? Problems? Suggestions?
6 changes: 3 additions & 3 deletions benchmarks/build.gradle
@@ -40,15 +40,15 @@ dependencies {
// us to invoke the JMH uberjar as usual.
exclude group: 'net.sf.jopt-simple', module: 'jopt-simple'
}
api(project(':libs:elasticsearch-h3'))
api(project(':libs:h3'))
api(project(':modules:aggregations'))
api(project(':x-pack:plugin:esql-core'))
api(project(':x-pack:plugin:esql'))
api(project(':x-pack:plugin:esql:compute'))
implementation project(path: ':libs:elasticsearch-simdvec')
implementation project(path: ':libs:simdvec')
expression(project(path: ':modules:lang-expression', configuration: 'zip'))
painless(project(path: ':modules:lang-painless', configuration: 'zip'))
nativeLib(project(':libs:elasticsearch-native'))
nativeLib(project(':libs:native'))
api "org.openjdk.jmh:jmh-core:$versions.jmh"
annotationProcessor "org.openjdk.jmh:jmh-generator-annprocess:$versions.jmh"
// Dependencies of JMH

0 comments on commit 028c2f2
