Commit

Merge branch 'main' into serverless_intro_update
georgewallace authored Nov 22, 2024
2 parents 1d70639 + f9876c2 commit cb9a4dc
Showing 14 changed files with 298 additions and 572 deletions.
Binary file modified serverless/images/discover-find-data-view.png
62 changes: 62 additions & 0 deletions serverless/images/semantic-options.svg
7 changes: 1 addition & 6 deletions serverless/index-serverless-elasticsearch.asciidoc
@@ -5,6 +5,7 @@
++++

include::./pages/what-is-elasticsearch-serverless.asciidoc[leveloffset=+2]
include::./pages/serverless-differences.asciidoc[leveloffset=+2]

include::./pages/get-started.asciidoc[leveloffset=+2]

@@ -37,13 +38,7 @@ include::./pages/search-your-data-the-search-api.asciidoc[leveloffset=+3]
include::./pages/search-with-synonyms.asciidoc[leveloffset=+3]
include::./pages/knn-search.asciidoc[leveloffset=+3]
include::./pages/search-your-data-semantic-search.asciidoc[leveloffset=+3]
include::./pages/search-your-data-semantic-search-elser.asciidoc[leveloffset=+4]

include::./pages/explore-your-data.asciidoc[leveloffset=+2]

include::./pages/search-playground.asciidoc[leveloffset=+2]

include::./pages/serverless-differences.asciidoc[leveloffset=+2]

include::./pages/pricing.asciidoc[leveloffset=+2]
include::./pages/technical-preview-limitations.asciidoc[leveloffset=+2]
33 changes: 30 additions & 3 deletions serverless/pages/apis-http-apis.asciidoc
@@ -4,6 +4,33 @@
// :description: {es} and {kib} expose REST APIs that can be called directly to configure and access {stack} features.
// :keywords: serverless, elasticsearch, http, rest, overview

* <<elasticsearch-api-conventions>>: The {es-serverless} REST APIs have conventions for headers and request bodies.
* <<elasticsearch-kibana-api-conventions>>: The Management APIs for {serverless-short} have request header conventions.
* https://www.elastic.co/docs/api/[API Reference]: Explore the reference information for Elastic Serverless REST APIs.
[discrete]
[[elasticsearch-api-references-links]]
== API references

The following APIs are available for {es-serverless} users.
These links will take you to the autogenerated API reference documentation.

https://www.elastic.co/docs/api/doc/elasticsearch-serverless[Elasticsearch Serverless APIs →]::
Use these APIs to index, manage, search, and analyze your data in {es-serverless}.
+
[TIP]
====
Learn how to <<elasticsearch-connecting-to-es-serverless-endpoint,connect to your {es-serverless} endpoint>>.
====

https://www.elastic.co/docs/api/doc/serverless[Kibana Serverless APIs →]::
Use these APIs to manage resources such as connectors, data views, and saved objects for your {serverless-full} project.

https://www.elastic.co/docs/api/doc/elastic-cloud-serverless[{serverless-full} APIs →]::
Use these APIs to manage your {serverless-full} projects.
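
The serverless REST APIs above are all called over HTTPS with an `ApiKey` authorization header. As an illustrative sketch only (the endpoint URL and API key below are placeholders, not real values), here is how a `_search` request to an {es-serverless} endpoint can be assembled with Python's standard library:

```python
import json
import urllib.request

# Placeholder values -- substitute your own project endpoint and API key.
ENDPOINT = "https://my-project.es.us-east-1.aws.elastic.cloud"
API_KEY = "your-api-key"

def build_search_request(index, query):
    """Build an authenticated POST request to the _search endpoint."""
    body = json.dumps({"query": query}).encode("utf-8")
    return urllib.request.Request(
        url=f"{ENDPOINT}/{index}/_search",
        data=body,
        method="POST",
        headers={
            "Authorization": f"ApiKey {API_KEY}",
            "Content-Type": "application/json",
        },
    )

req = build_search_request("my-index", {"match_all": {}})
print(req.full_url)
```

Sending the request (for example with `urllib.request.urlopen(req)`) returns the search response as JSON; the same header convention applies to the other serverless APIs.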

[discrete]
[[additional-api-details]]
== Additional API information

<<elasticsearch-api-conventions>>::
Reference information about headers and request body conventions for {es-serverless} REST APIs.

<<elasticsearch-kibana-api-conventions>>::
Reference information about request header conventions for {serverless-full} REST APIs.
5 changes: 5 additions & 0 deletions serverless/pages/clients.asciidoc
@@ -20,3 +20,8 @@ Currently, the following language clients are supported:
* <<elasticsearch-php-client-getting-started,PHP>> | https://github.com/elastic/elasticsearch-serverless-php[Repository]
* <<elasticsearch-python-client-getting-started,Python>> | https://github.com/elastic/elasticsearch-serverless-python[Repository]
* <<elasticsearch-ruby-client-getting-started,Ruby>> | https://github.com/elastic/elasticsearch-serverless-ruby[Repository]

[TIP]
====
Learn how to <<elasticsearch-connecting-to-es-serverless-endpoint,connect to your {es-serverless} endpoint>>.
====
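
The language clients above share a common usage pattern: instantiate a client with your endpoint URL and an API key, then call methods that map to REST endpoints. The stand-in class below is a hypothetical sketch of that shape, not a real client; see each repository's README for the actual constructor and method names:

```python
# Hypothetical stand-in illustrating the common shape of the serverless
# language clients. Real clients perform the HTTPS call; this one just
# returns the request it would make.
class ServerlessClient:
    def __init__(self, endpoint, api_key):
        self.endpoint = endpoint.rstrip("/")
        self.api_key = api_key

    def request(self, method, path):
        """Describe the authenticated request a real client would send."""
        return {
            "method": method,
            "url": f"{self.endpoint}/{path.lstrip('/')}",
            "headers": {"Authorization": f"ApiKey {self.api_key}"},
        }

client = ServerlessClient(
    "https://my-project.es.us-east-1.aws.elastic.cloud/",  # placeholder endpoint
    "my-key",                                              # placeholder API key
)
plan = client.request("GET", "/_cat/indices")
print(plan["url"])
```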
5 changes: 4 additions & 1 deletion serverless/pages/connecting-to-es-endpoint.asciidoc
@@ -1,5 +1,8 @@
[[elasticsearch-connecting-to-es-serverless-endpoint]]
= Connecting to your Elasticsearch Serverless endpoint
= Connect to your Elasticsearch Serverless endpoint
++++
<titleabbrev>Connect to your endpoint</titleabbrev>
++++

[TIP]
====
3 changes: 1 addition & 2 deletions serverless/pages/data-views.asciidoc
@@ -35,12 +35,11 @@ After you've loaded your data, follow these steps to create a {data-source}:

// <DocImage size="m" url="../images/discover-find-data-view.png" alt="How to set the {data-source} in Discover" />

. Open **Discover** then open the data view menu.
. Go to **{project-settings} → {manage-app} → {data-views-app}**. Alternatively, go to **Discover** and open the data view menu.
+
[role="screenshot"]
image:images/discover-find-data-view.png[How to set the {data-source} in Discover]
+
Alternatively, go to **{project-settings} → {manage-app} → {data-views-app}**.
. Click **Create a {data-source}**.
. Give your {data-source} a name.
. Start typing in the **Index pattern** field, and Elastic looks for the names of
39 changes: 24 additions & 15 deletions serverless/pages/ingest-your-data.asciidoc
@@ -4,27 +4,36 @@
// :description: Add data to your {es-serverless} project.
// :keywords: serverless, elasticsearch, ingest, overview

You have many options for ingesting, or indexing, data into {es}:
The best ingest option for your use case depends on whether you are indexing general content or time series (timestamped) data.

* <<elasticsearch-ingest-data-through-api,{es} API>>
* <<elasticsearch-ingest-data-through-integrations-connector-client,Connector clients>>
* <<elasticsearch-ingest-data-file-upload,File Uploader>>
* <<elasticsearch-ingest-data-through-beats,{beats}>>
* <<elasticsearch-ingest-data-through-logstash,{ls}>>
* https://github.com/elastic/crawler[Elastic Open Web Crawler]
[discrete]
[[es-ingestion-overview-apis]]
== Ingest data using APIs

The best ingest option for your use case depends on whether you are indexing general content or time series (timestamped) data.
You can use the <<elasticsearch-http-apis,{es} REST APIs>> to add data to your {es} indices, using any HTTP client, including the <<elasticsearch-clients,{es} client libraries>>.

**General content**
While the {es} APIs can be used for any data type, Elastic provides specialized tools that optimize ingestion for specific use cases.
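
Documents are commonly added through the `_bulk` API, which accepts newline-delimited JSON (NDJSON): each document is preceded by an action line naming the target index. A minimal sketch of building such a payload (the `products` index name and documents are placeholders):

```python
import json

# Placeholder documents for a hypothetical "products" index.
docs = [
    {"name": "Wireless headphones", "price": 59.99},
    {"name": "USB-C cable", "price": 9.99},
]

def to_bulk_ndjson(index, docs):
    """Serialize documents into the action/source line pairs _bulk expects."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    # A _bulk body must end with a trailing newline.
    return "\n".join(lines) + "\n"

payload = to_bulk_ndjson("products", docs)
print(payload)
```

POSTing this payload to the `_bulk` endpoint with `Content-Type: application/x-ndjson` indexes both documents in one request.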

[discrete]
[[es-ingestion-overview-general-content]]
== Ingest general content

General content includes HTML pages, catalogs, files, and other content that does not update continuously.
This data can be updated, but the value of the content remains relatively constant over time.
Use connector clients to sync data from a range of popular data sources to {es}.
You can also send data directly to {es} from your application using the API.
General content is typically text-heavy data that does not have a timestamp.
This could be data like knowledge bases, website content, product catalogs, and more.

You can use these specialized tools to add general content to {es} indices:

* <<elasticsearch-ingest-data-through-integrations-connector-client,Connector clients>>
* https://github.com/elastic/crawler[Elastic Open Web Crawler]
* <<elasticsearch-ingest-data-file-upload,File Uploader>>

[discrete]
[[elasticsearch-ingest-time-series-data]]
**Time series (timestamped) data**
== Ingest time series data

Time series, or timestamped data, describes data that changes frequently and "flows" over time, such as stock quotes, system metrics, and network traffic data.
Use {beats} or {ls} to collect time series data.

You can use these specialized tools to add timestamped data to {es} data streams:

* <<elasticsearch-ingest-data-through-beats,{beats}>>
* <<elasticsearch-ingest-data-through-logstash,{ls}>>
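
As a rough illustration of the {ls} route, a minimal pipeline that ships log files to a project might look like the sketch below. The paths, endpoint, and API key are placeholders, and available options vary by {ls} version, so treat this as a starting point rather than a working configuration:

```text
# Hypothetical Logstash pipeline sketch -- adjust paths and credentials.
input {
  file {
    path => "/var/log/myapp/*.log"
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => ["https://my-project.es.us-east-1.aws.elastic.cloud:443"]
    api_key => "id:api_key_value"
    data_stream => true
  }
}
```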
