Commit

More fixes and cleanup
karenzone committed Oct 29, 2024
1 parent 066b327 commit 77ab142
Showing 3 changed files with 37 additions and 26 deletions.
14 changes: 8 additions & 6 deletions docs/en/ingest-guide/ingest-intro.asciidoc
@@ -9,30 +9,30 @@ the data into {es} before you can search it, visualize it, and use it for insigh
Our ingest tools are flexible, and support a wide range of scenarios.
We can help you with everything from popular and straightforward use cases, all
the way to advanced use cases that require additional processing in order to modify or
-reshape your data before sending it to {es}.
+reshape your data before it goes to {es}.

You can ingest:

* **General content** (data without timestamps), such as HTML pages, catalogs, and files
-* **Timestamped (time series) data**, such as logs, metrics and traces for Search, Security, Observability, or your own solution
+* **Timestamped (time series) data**, such as logs, metrics, and traces for Search, Security, Observability, or your own solution

[ingest-best-approach]
.What's the best approach for ingesting data?
****
The best choice for ingesting data is the _simplest option_ that _meets your needs_ and _satisfies your use case_.
-**General content**. Choose the ingest tool that aligns with your data source.
+**Best practice for general content**. Choose the ingest tool that aligns with your data source.
* To index **documents** directly into {es}, use the {es} link:{ref}/docs.html[document APIs].
* To send **application data** directly to {es}, use an link:https://www.elastic.co/guide/en/elasticsearch/client/index.html[{es}
-language clients].
+language client].
* To index **web page content**, use the Elastic link:https://www.elastic.co/web-crawler[web crawler].
* To sync **data from third-party sources**, use link:{ref}/es-connectors.html[connectors].
* To index **single files** for testing, use the {kib} link:{kibana-ref}/connect-to-elasticsearch.html#upload-data-kibana[file uploader].
If you would like to try things out before you add your own data, try using our {kibana-ref}/connect-to-elasticsearch.html#_add_sample_data[sample data].
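The document-API bullet above can be sketched as a bulk request body. This is a minimal illustration, not part of the commit: the `catalog` index and the documents are hypothetical, and a real call would send the payload through the {es} document APIs or a language client.

```python
import json

# Hypothetical catalog documents -- general content, no timestamps required.
docs = [
    {"_id": "1", "title": "Classic Oxford Shirt", "category": "apparel"},
    {"_id": "2", "title": "Trail Running Shoes", "category": "footwear"},
]

# The _bulk endpoint expects newline-delimited JSON: an action line
# followed by the document source, one pair per document.
lines = []
for doc in docs:
    source = {k: v for k, v in doc.items() if k != "_id"}
    lines.append(json.dumps({"index": {"_index": "catalog", "_id": doc["_id"]}}))
    lines.append(json.dumps(source))
payload = "\n".join(lines) + "\n"  # _bulk requires a trailing newline

print(payload)
```

The same documents could equally be indexed one at a time with the single-document index API; `_bulk` is simply the idiomatic choice once there is more than a handful of documents.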
-**Timestamped data**. Start with {fleet-guide}[Elastic Agent] and one of the
+**Best practice for timestamped data**. Start with {fleet-guide}[Elastic Agent] and one of the
hundreds of {integrations-docs}[Elastic integrations] that are available.
Integrations are available for many popular platforms and services, and are a
good place to start for ingesting data into Elastic solutions--Observability,
@@ -41,5 +41,7 @@ Security, and Search--or your own search application.
Check out the {integrations-docs}/all_integrations[Integration quick reference]
to search for available integrations.
If you don't find an integration for your data source or if you need
-<<ingest-addl-proc,additional processing>> to extend the integration, we still have you covered.
+additional processing to extend the integration, we still have you covered.
+Check out <<ingest-addl-proc,additional processing>> for a sneak peek.
****
17 changes: 11 additions & 6 deletions docs/en/ingest-guide/ingest-solutions.asciidoc
@@ -3,7 +3,7 @@

Elastic solutions--Security, Observability, and Search--are loaded with features
and functionality to help you get value and insights from your data.
-{fleet-guide}[Elastic Agent] and {integrations-docs}[Elastic integrations] can help.
+{fleet-guide}[Elastic Agent] and {integrations-docs}[Elastic integrations] can help, and are the best place to start.

When you use integrations with solutions, you have an integrated experience that offers
easier implementation and decreases the time it takes to get insights and value from your data.
@@ -24,7 +24,6 @@ NOTE: {serverless-docs}[Elastic serverless] makes using solutions even easier.
Sign up for a link:{serverless-docs}/general/sign-up-trial[free trial], and check it out.



[discrete]
[[ingest-for-search]]
=== Ingesting data for Search
@@ -47,7 +46,10 @@ The solution gives you more pre-built components to get you up and running quick
[[ingest-for-obs]]
=== Ingesting data for Observability

-With link:https://www.elastic.co/observability[Elastic Observability], you can monitor and gain insights into logs, metrics, and application traces.
+With link:https://www.elastic.co/observability[Elastic Observability], you can
+monitor and gain insights into logs, metrics, and application traces.
+The resources and guides in this section illustrate how to ingest data and use
+it with the Observability solution.

**Resources**

@@ -70,7 +72,10 @@ With link:https://www.elastic.co/observability[Elastic Observability], you can m
[[ingest-for-security]]
=== Ingesting data for Security

-You can detect and respond to threats when you use link:https://www.elastic.co/security[Elastic Security] to analyze and take action on your data.
+You can detect and respond to threats when you use
+link:https://www.elastic.co/security[Elastic Security] to analyze and take
+action on your data.
+The resources and guides in this section illustrate how to ingest data and use it with the Security solution.

**Resources**

@@ -88,8 +93,8 @@ You can detect and respond to threats when you use link:https://www.elastic.co/s
[[ingest-for-custom]]
=== Ingesting data for your own custom search solution

-Elastic solutions can give you a head start for common use cases, but you are not limited.
-You can still do your own thing with a custom solution designed by you.
+Elastic solutions can give you a head start for common use cases, but you are not at all limited.
+You can still do your own thing with a custom solution designed by _you_.

Bring your ideas and use {es} and the {stack} to store, search, and visualize your data.

32 changes: 18 additions & 14 deletions docs/en/ingest-guide/ingest-tools.asciidoc
@@ -14,11 +14,17 @@ A single link:{fleet-guide}[{agent}] can collect multiple types of data when it
You can use standalone {agent}s and manage them locally on the systems where they are installed, or you can manage all of your agents and policies with the link:{fleet-guide}/manage-agents-in-fleet.html[Fleet UI in {kib}].
+
Use {agent} with one of hundreds of link:{integrations-docs}[Elastic integrations] to simplify collecting, transforming, and visualizing data.
-Integrations include default ingestion rules, dashboards, and visualizations to start analyzing your data right away.
+Integrations include default ingestion rules, dashboards, and visualizations to help you start analyzing your data right away.
Check out the {integrations-docs}/all_integrations[Integration quick reference] to search for available integrations that can reduce your time to value.
+
-{agent} is the best option for collecting timestamped data for most data sources and use cases.
-If your data requires additional processing before going to {es}, you can use {agent} with link:{fleet-guide}/elastic-agent-processor-configuration.html[{agent} processors] or link:{logstash-ref}[{ls}]
+{agent} is the best option for collecting timestamped data for most data sources
+and use cases.
+If your data requires additional processing before going to {es}, you can use
+link:{fleet-guide}/elastic-agent-processor-configuration.html[{agent}
+processors], link:{logstash-ref}[{ls}], or additional processing features in
+{es}.
+Check out <<ingest-addl-proc,additional processing>> to see options.
+
Ready to try link:{fleet-guide}[{agent}]? Check out the link:{fleet-guide}/elastic-agent-installation.html[installation instructions].
+
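The {agent} processor option mentioned above can be sketched in a standalone agent policy. This fragment is editor commentary rather than part of the commit; the input id, paths, and field names are hypothetical.

```yaml
# Fragment of a standalone elastic-agent.yml (hypothetical input and values)
inputs:
  - type: filestream
    id: my-app-logs              # hypothetical input id
    paths:
      - /var/log/my-app/*.log
    processors:
      - add_fields:              # enrich events with host-side context
          target: labels
          fields:
            env: staging
      - drop_fields:             # keep noisy fields off the wire
          fields: ["agent.ephemeral_id"]
```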
@@ -49,14 +55,14 @@ Most users never need to use {ls}, but it's available if you need it for:
+
* **Data collection** (if an Elastic integration isn't available).
{agent} and Elastic {integrations-docs}/all_integrations[integrations] provide many features out-of-the-box, so be sure to search or browse integrations for your data source.
-If you don't find an Elastic integration for your data source, check {ls} for an {logstash-ref}/input-plugins.html[input plugin].
+If you don't find an Elastic integration for your data source, check {ls} for an {logstash-ref}/input-plugins.html[input plugin] for your data source.
* **Additional processing.** One of the most common {ls} use cases is link:{logstash-ref}/ea-integrations.html[extending Elastic integrations].
You can take advantage of the extensive, built-in capabilities of Elastic Agent and Elastic Integrations, and
then use {ls} for additional data processing before sending the data on to {es}.
* **Advanced use cases.** {ls} can help with advanced use cases, such as when you need
link:{ingest-guide}/lspq.html[persistence or buffering],
additional link:{ingest-guide}/ls-enrich.html[data enrichment],
-link:{ingest-guide}/ls-networkbridge.html[proxying] to bridge network connections, or the ability to route data to
+link:{ingest-guide}/ls-networkbridge.html[proxying] as a way to bridge network connections, or the ability to route data to
link:{ingest-guide}/ls-multi.html[multiple destinations].
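A minimal sketch of the multiple-destinations case above, assuming a hypothetical dataset field and hypothetical hosts; this is illustrative commentary, not the commit's own content.

```
# Hypothetical {ls} pipeline: receive from {agent}/Beats, route by field value
input {
  beats { port => 5044 }
}

output {
  if [event][dataset] == "audit" {
    elasticsearch {
      hosts => ["https://es-secure.example.com:9200"]
      index => "audit-events"
    }
  } else {
    elasticsearch {
      hosts => ["https://es.example.com:9200"]
    }
  }
}
```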

Language clients::
@@ -86,22 +92,20 @@ The link:https://www.elastic.co/guide/en/esf/current/aws-elastic-serverless-forw
[[ingest-addl-proc]]
== Tools and features for additional processing
You can start with {agent} and Elastic {integrations-docs}[integrations], and still
-take advantage of additional processing options if you need them:
+take advantage of additional processing options if you need them.
+You can use:

-* link:{fleet-guide}/elastic-agent-processor-configuration.html[{agent} processors] for sanitizing or enriching raw data at the source.
-Use {agent} processors if you need to control what data is sent across the wire, or need to enrich the raw data with information available on the host.
-* {es} link:{ref}/[ingest pipelines] for enriching incoming data or normalizing field data before the data is indexed.
+* link:{fleet-guide}/elastic-agent-processor-configuration.html[{agent} processors] to sanitize or enrich raw data at the source.
+Use {agent} processors if you need to control what data is sent across the wire, or if you need to enrich the raw data with information available on the host.
+* {es} link:{ref}/[ingest pipelines] to enrich incoming data or normalize field data before the data is indexed.
{es} ingest pipelines enable you to manipulate the data as it comes in.
This approach helps you avoid adding processing overhead to the hosts from which you're collecting data.

-* {es} link:{ref}/runtime.html[runtime fields] for defining or altering the schema at query time.
+* {es} link:{ref}/runtime.html[runtime fields] to define or alter the schema at query time.
You can use runtime fields at query time to start working with your data without needing to understand how it is structured,
add fields to existing documents without reindexing your data,
override the value returned from an indexed field, and/or
define fields for a specific use without modifying the underlying schema.

-* {ls} `elastic_integration filter` for link:{logstash-ref}/ea-integrations.html[extending Elastic integrations], and other {ls} link:[filter plugins] for transforming data before it goes to {es}.
+* {ls} `elastic_integration filter` to link:{logstash-ref}/ea-integrations.html[extend Elastic integrations], and other {ls} link:{logstash-ref}/filter-plugins.html[filter plugins] to transform data before it goes to {es}.
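The {es} options in the list above can be sketched as plain request bodies. This is illustrative commentary on the docs being changed, not part of the commit: the pipeline name, field names, and script are hypothetical.

```python
import json

# Hypothetical ingest pipeline: normalize a field before it is indexed.
# It would be created with: PUT _ingest/pipeline/my-app-logs
pipeline = {
    "description": "Lowercase the service name on ingest",
    "processors": [
        {"lowercase": {"field": "service.name"}},
    ],
}

# Hypothetical runtime field: derive a value at query time, no reindexing.
# It would be sent as part of a search request body.
runtime_search = {
    "runtime_mappings": {
        "duration_s": {
            "type": "double",
            "script": {"source": "emit(doc['duration_ms'].value / 1000.0)"},
        }
    },
    "query": {"match_all": {}},
}

print(json.dumps(pipeline))
print(json.dumps(runtime_search))
```

The trade-off the docs describe shows up directly here: the pipeline does its work once at ingest, while the runtime field defers the computation to every query that uses it.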



