diff --git a/docs/en/ingest-arch/index.asciidoc b/docs/en/ingest-arch/index.asciidoc index 97ee6acb8..118f5dfc0 100644 --- a/docs/en/ingest-arch/index.asciidoc +++ b/docs/en/ingest-arch/index.asciidoc @@ -18,6 +18,8 @@ include::8-ls-input.asciidoc[] include::99-airgapped.asciidoc[] +include::../ingest-guide/index.asciidoc[] + // === Next set of architectures // include::3-schemamod.asciidoc[] // include::6b-filebeat-es.asciidoc[] diff --git a/docs/en/ingest-guide/index.asciidoc b/docs/en/ingest-guide/index.asciidoc new file mode 100644 index 000000000..a5404cc15 --- /dev/null +++ b/docs/en/ingest-guide/index.asciidoc @@ -0,0 +1,19 @@ +include::{docs-root}/shared/versions/stack/{source_branch}.asciidoc[] +include::{docs-root}/shared/attributes.asciidoc[] + +:doctype: book + +[[ingest-guide]] += Elastic Ingest Overview + +include::ingest-intro.asciidoc[] +include::ingest-tools.asciidoc[] +include::ingest-additional-proc.asciidoc[] +//include::ingest-static.asciidoc[] +//include::ingest-timestamped.asciidoc[] +include::ingest-solutions.asciidoc[] +//include::ingest-faq.asciidoc[] + +//include:: Prereqs (for using data after ingest) +//include:: Migration for ingest +//include:: Troubleshooting diff --git a/docs/en/ingest-guide/ingest-additional-proc.asciidoc b/docs/en/ingest-guide/ingest-additional-proc.asciidoc new file mode 100644 index 000000000..23d8bf54b --- /dev/null +++ b/docs/en/ingest-guide/ingest-additional-proc.asciidoc @@ -0,0 +1,27 @@ +[[ingest-addl-proc]] +== Additional ingest processing + +You can start with {agent} and Elastic {integrations-docs}[integrations], and still +take advantage of additional processing options if you need them. + +{agent} processors:: +You can use link:{fleet-guide}/elastic-agent-processor-configuration.html[{agent} processors] to sanitize or enrich raw data at the source. 
+Use {agent} processors if you need to control what data is sent across the wire, or if you need to enrich the raw data with information available on the host.
+
+{es} ingest pipelines::
+You can use {es} link:{ref}/ingest.html[ingest pipelines] to enrich incoming data or normalize field data before the data is indexed.
+{es} ingest pipelines enable you to manipulate the data as it comes in.
+This approach helps you avoid adding processing overhead to the hosts from which you're collecting data.
+
+{es} runtime fields::
+You can use {es} link:{ref}/runtime.html[runtime fields] to define or alter the schema at query time.
+You can start working with your data without needing to understand how it is
+structured, add fields to existing documents without reindexing your data,
+override the value returned from an indexed field, or define fields for a
+specific use without modifying the underlying schema.
+
+{ls} `elastic_integration` filter::
+You can use the {ls} link:{logstash-ref}/plugins-filters-elastic_integration.html[`elastic_integration` filter] and
+other link:{logstash-ref}/filter-plugins.html[{ls} filters] to
+link:{logstash-ref}/ea-integrations.html[extend Elastic integrations] by
+transforming data before it goes to {es}.
diff --git a/docs/en/ingest-guide/ingest-faq.asciidoc b/docs/en/ingest-guide/ingest-faq.asciidoc
new file mode 100644
index 000000000..dea6534b2
--- /dev/null
+++ b/docs/en/ingest-guide/ingest-faq.asciidoc
@@ -0,0 +1,77 @@
+[[ingest-faq]]
+== Frequently Asked Questions
+
+Q: What Elastic products and tools are available for ingesting data into Elasticsearch?
+
+Q: What's the best option for ingesting data?
+
+Q: What's the role of the Logstash `elastic_integration` filter?
+
+
+
+.WORK IN PROGRESS
+****
+Temporary parking lot to capture outstanding questions and notes.
+****
+
+
+
+Also cover (here or in general outline):
+
+- https://www.elastic.co/guide/en/kibana/master/connect-to-elasticsearch.html#_add_sample_data[Sample data]
+- OTel
+- Beats
+- Use case: GeoIP
+- Airgapped
+- Place for table, also adding use case + products (for example, Logstash for multi-tenant)
+- Role of LS in general content use cases
+
+
+
+[discrete]
+=== Questions to answer
+
+* Messaging for data sources that don't have an integration
+** We're deemphasizing Beats in preparation for deprecation
+** We're not quite there with OTel yet
+** How should we handle this in the near term?
+Probably doesn't make sense to either ignore or jump them straight to Logstash
+
+* Should we mention Fleet and standalone agent?
+** If so, when, where, and how?
+* How does this relate to Ingest Architectures?
+* Enrichment for general content
+
+* How to message current vs. desired state.
+Especially Beats and OTel.
+* How to message OTel: current state and future state.
+* Consistent use of terminology vs. matching users' vocabulary (keywords)
+
+[discrete]
+==== Random
+
+* DocsV3 - need for a sheltered space to develop new content
+** Related: https://github.com/elastic/docsmobile/issues/708
+** Need a place to incubate a new doc (previews, links, etc.)
+** Refine messaging in private
+
+
+[discrete]
+=== Other resources to use, reference, reconcile
+
+* Timeseries decision tree (needs updates)
+* PM's video
+** Needs an update. (We might relocate content before updating.)
+* PM's product table
+** Needs an update. (We might relocate content before updating.)
+** Focuses on Agent over integrations.
+** Same link text resolves to different locations.
+** Proposal: Harvest the good and possibly repurpose the table format.
+* Ingest Reference architectures
+* Linkable content such as Beats? Solutions ingest resources?
+
+* https://www.elastic.co/guide/en/starting-with-the-elasticsearch-platform-and-its-solutions/current/getting-started-guides.html[Starting with the Elastic Platform and Solutions]
+* https://www.elastic.co/guide/en/observability/current/observability-get-started.html[Get started with Elastic Observability]
+* https://www.elastic.co/guide/en/security/current/ingest-data.html[Ingest data into Elastic Security]
+*
+
diff --git a/docs/en/ingest-guide/ingest-intro.asciidoc b/docs/en/ingest-guide/ingest-intro.asciidoc
new file mode 100644
index 000000000..b3509fa1a
--- /dev/null
+++ b/docs/en/ingest-guide/ingest-intro.asciidoc
@@ -0,0 +1,58 @@
+[discrete]
+[[ingest-intro]]
+== Ingesting data into {es}
+
+Bring your data!
+Whether you call it _adding_, _indexing_, or _ingesting_ data, you have to get
+the data into {es} before you can search it, visualize it, and use it for insights.
+
+Our ingest tools are flexible, and support a wide range of scenarios.
+We can help you with everything from popular and straightforward use cases, all
+the way to advanced use cases that require additional processing to modify or
+reshape your data before it goes to {es}.
+
+You can ingest:
+
+* **General content** (data without timestamps), such as HTML pages, catalogs, and files
+* **Timestamped (time series) data**, such as logs, metrics, and traces for Elastic Security, Observability, Search solutions, or for your own custom solutions
+
+[discrete]
+[[ingest-general]]
+=== Ingesting general content
+
+Elastic offers tools designed to ingest specific types of general content.
+The content type determines the best ingest option.
+
+* To index **documents** directly into {es}, use the {es} link:{ref}/docs.html[document APIs].
+* To send **application data** directly to {es}, use an link:https://www.elastic.co/guide/en/elasticsearch/client/index.html[{es} language client].
+* To index **web page content**, use the Elastic link:https://www.elastic.co/web-crawler[web crawler].
+* To sync **data from third-party sources**, use link:{ref}/es-connectors.html[connectors].
+* To index **single files** for testing, use the {kib} link:{kibana-ref}/connect-to-elasticsearch.html#upload-data-kibana[file uploader].
+
+If you would like to try things out before you add your own data, try using our {kibana-ref}/connect-to-elasticsearch.html#_add_sample_data[sample data].
+
+[discrete]
+[[ingest-timestamped]]
+=== Ingesting time-stamped data
+
+[[ingest-best-timestamped]]
+.What's the best approach for ingesting time-stamped data?
+****
+The best approach for ingesting data is the _simplest option_ that _meets your needs_ and _satisfies your use case_.
+
+In most cases, the _simplest option_ for ingesting time-stamped data is using {agent} paired with an Elastic integration.
+
+* Install {fleet-guide}[Elastic Agent] on the computer(s) from which you want to collect data.
+* Add the {integrations-docs}[Elastic integration] for the data source to your deployment.
+
+Integrations are available for many popular platforms and services, and are a
+good place to start for ingesting data into Elastic solutions--Observability,
+Security, and Search--or your own search application.
+
+Check out the {integrations-docs}/all_integrations[Integration quick reference]
+to search for available integrations.
+If you don't find an integration for your data source or if you need
+additional processing to extend the integration, we still have you covered.
+Check out <<ingest-addl-proc>> for a sneak peek.
+****
diff --git a/docs/en/ingest-guide/ingest-solutions.asciidoc b/docs/en/ingest-guide/ingest-solutions.asciidoc
new file mode 100644
index 000000000..b76f3dd5c
--- /dev/null
+++ b/docs/en/ingest-guide/ingest-solutions.asciidoc
@@ -0,0 +1,110 @@
+[[ingest-for-solutions]]
+== Ingesting data for Elastic solutions
+
+Elastic solutions--Security, Observability, and Search--are loaded with features
+and functionality to help you get value and insights from your data.
+{fleet-guide}[Elastic Agent] and {integrations-docs}[Elastic integrations] can help, and are the best place to start.
+
+When you use integrations with solutions, you have an integrated experience that offers
+easier implementation and decreases the time it takes to get insights and value from your data.
+
+[[ingest-process-overview]]
+.High-level overview
+****
+To use {fleet-guide}[Elastic Agent] and {integrations-docs}[Elastic integrations]
+with Elastic solutions:
+
+1. Create an link:https://www.elastic.co/cloud[{ecloud}] deployment for your solution.
+If you don't have an {ecloud} account, you can sign up for a link:https://cloud.elastic.co/registration[free trial] to get started.
+2. Add the {integrations-docs}[Elastic integration] for your data source to the deployment.
+3. link:{fleet-guide}/elastic-agent-installation.html[Install {agent}] on the systems whose data you want to collect.
+****
+
+NOTE: {serverless-docs}[Elastic serverless] makes using solutions even easier.
+Sign up for a link:{serverless-docs}/general/sign-up-trial[free trial], and check it out.
+
+
+[discrete]
+[[ingest-for-search]]
+=== Ingesting data for Search
+
+{es} is the magic behind Search and our other solutions.
+The Search solution gives you pre-built components to get you up and running quickly for common use cases.
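For Search use cases you often load general content directly through the {es} document APIs. As a rough sketch (the index name and documents below are invented for illustration), this is how a newline-delimited `_bulk` request body can be assembled before sending it with any HTTP or language client:

```python
import json

def bulk_payload(index, docs):
    """Assemble an NDJSON body for the Elasticsearch _bulk API:
    each document is preceded by an action line naming the target index."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"  # the _bulk API requires a trailing newline

# Hypothetical general-content documents (no timestamps needed)
docs = [
    {"title": "Product catalog", "body": "All products and categories"},
    {"title": "Returns FAQ", "body": "How to return an item"},
]
payload = bulk_payload("search-catalog", docs)
```

The same payload shape works whether you POST it yourself or hand the documents to a language client's bulk helper.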
+
+**Resources**
+
+* link:{fleet-guide}/elastic-agent-installation.html[Install {agent}]
+* link:https://www.elastic.co/integrations/data-integrations?solution=search[Elastic Search integrations]
+* link:{ref}[{es} Guide]
+** link:{ref}/docs.html[{es} document APIs]
+** link:https://www.elastic.co/guide/en/elasticsearch/client/index.html[{es} language clients]
+** link:https://www.elastic.co/web-crawler[Elastic web crawler]
+** link:{ref}/es-connectors.html[Elastic connectors]
+
+
+[discrete]
+[[ingest-for-obs]]
+=== Ingesting data for Observability
+
+With link:https://www.elastic.co/observability[Elastic Observability], you can
+monitor and gain insights into logs, metrics, and application traces.
+The guides and resources in this section illustrate how to ingest data and use
+it with the Observability solution.
+
+
+**Guides for popular Observability use cases**
+
+* link:{estc-welcome}/getting-started-observability.html[Monitor applications and systems with Elastic Observability]
+* link:https://www.elastic.co/guide/en/observability/current/logs-metrics-get-started.html[Get started with logs and metrics]
+** link:https://www.elastic.co/guide/en/observability/current/logs-metrics-get-started.html#add-system-integration[Step 1: Add the {agent} System integration]
+** link:https://www.elastic.co/guide/en/observability/current/logs-metrics-get-started.html#add-agent-to-fleet[Step 2: Install and run {agent}]
+
+* link:{serverless-docs}/observability/what-is-observability-serverless[Observability] on link:{serverless-docs}[{serverless-full}]:
+** link:{serverless-docs}/observability/quickstarts/monitor-hosts-with-elastic-agent[Monitor hosts with {agent} ({serverless-short})]
+** link:{serverless-docs}/observability/quickstarts/k8s-logs-metrics[Monitor your K8s cluster with {agent} ({serverless-short})]
+
+**Resources**
+
+* link:{fleet-guide}/elastic-agent-installation.html[Install {agent}]
+* link:https://www.elastic.co/integrations/data-integrations?solution=observability[Elastic Observability integrations]
+
+[discrete]
+[[ingest-for-security]]
+=== Ingesting data for Security
+
+You can detect and respond to threats when you use
+link:https://www.elastic.co/security[Elastic Security] to analyze and take
+action on your data.
+The guides and resources in this section illustrate how to ingest data and use it with the Security solution.
+
+**Guides for popular Security use cases**
+
+* link:https://www.elastic.co/guide/en/starting-with-the-elasticsearch-platform-and-its-solutions/current/getting-started-siem-security.html[Use Elastic Security for SIEM]
+* link:https://www.elastic.co/guide/en/starting-with-the-elasticsearch-platform-and-its-solutions/current/getting-started-endpoint-security.html[Protect hosts with endpoint threat intelligence from Elastic Security]
+
+**Resources**
+
+* link:{fleet-guide}/elastic-agent-installation.html[Install {agent}]
+* link:https://www.elastic.co/integrations/data-integrations?solution=security[Elastic Security integrations]
+* link:{security-guide}/es-overview.html[Elastic Security documentation]
+
+
+[discrete]
+[[ingest-for-custom]]
+=== Ingesting data for your own custom search solution
+
+Elastic solutions can give you a head start for common use cases, but you are not limited to them.
+You can still do your own thing with a custom solution designed by _you_.
+
+Bring your ideas and use {es} and the {stack} to store, search, and visualize your data.
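Once your data is indexed, a custom solution typically queries it with the {es} Query DSL. As a minimal sketch (the field name, query text, and size are invented for illustration), a search request body can be built like this and sent with any HTTP or language client:

```python
import json

def build_search_body(field, text, size=10):
    """Sketch of an Elasticsearch Query DSL request body:
    a full-text match query with a result size limit."""
    return {
        "size": size,
        "query": {"match": {field: {"query": text}}},
    }

body = build_search_body("body", "return policy", size=5)
request_json = json.dumps(body)  # serialized form, ready to send
```

From here you can layer on filters, aggregations, or vector search clauses as your solution grows.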
+
+**Resources**
+
+* link:{fleet-guide}/elastic-agent-installation.html[Install {agent}]
+* link:{ref}[{es} Guide]
+** link:{ref}/docs.html[{es} document APIs]
+** link:https://www.elastic.co/guide/en/elasticsearch/client/index.html[{es} language clients]
+** link:https://www.elastic.co/web-crawler[Elastic web crawler]
+** link:{ref}/es-connectors.html[Elastic connectors]
+* link:{estc-welcome}/getting-started-general-purpose.html[Tutorial: Get started with vector search and generative AI]
+
diff --git a/docs/en/ingest-guide/ingest-static.asciidoc b/docs/en/ingest-guide/ingest-static.asciidoc
new file mode 100644
index 000000000..162bd243c
--- /dev/null
+++ b/docs/en/ingest-guide/ingest-static.asciidoc
@@ -0,0 +1,39 @@
+[[intro-general]]
+== Ingesting general content
+
+Describe general content (non-timestamped) and give examples.
+
+.WORK IN PROGRESS
+****
+Progressive disclosure: Start with basic use cases and work up to advanced processing
+
+Possibly repurpose and use ingest decision tree with Beats removed?
+****
+
+[discrete]
+=== Basic use cases
+
+* {es} document APIs for documents.
+* Elastic language clients for application data.
+* Elastic web crawler for web page content.
+* Connectors for data from third-party sources, such as Slack.
+* Kibana file uploader for individual files.
+* Logstash???
+** TODO: Check out the Logstash enterprisesearch integration
+
+* To index **documents** directly into {es}, use the {es} document APIs.
+* To send **application data** directly to {es}, use an Elastic language client.
+* To index **web page content**, use the Elastic web crawler.
+* To sync **data from third-party sources**, use connectors.
+* To index **single files** for testing, use the Kibana file uploader.
+
+[discrete]
+=== Advanced use cases: Data enrichment and transformation
+
+Tools for enriching ingested data:
+
+* Logstash - GeoIP enrichment. Other examples?
+** Use enterprisesearch input -> filter(s) -> {es} or enterprisesearch output
+* What else?
+
+
diff --git a/docs/en/ingest-guide/ingest-timestamped.asciidoc b/docs/en/ingest-guide/ingest-timestamped.asciidoc
new file mode 100644
index 000000000..a73fe30c9
--- /dev/null
+++ b/docs/en/ingest-guide/ingest-timestamped.asciidoc
@@ -0,0 +1,104 @@
+[[intro-timeseries]]
+== Ingesting timeseries data
+
+.WORK IN PROGRESS
+****
+Progressive disclosure: Start with basic use cases and work up to advanced processing
+
+Possibly repurpose and use ingest decision tree with Beats removed?
+****
+
+Timestamped data:
+The preferred way to index timestamped data is to use Elastic Agent. Elastic Agent is a single, unified way to add monitoring for logs, metrics, and other types of data to a host. It can also protect hosts from security threats, query data from operating systems, and forward data from remote services or hardware. Each Elastic Agent-based integration includes default ingestion rules, dashboards, and visualizations to start analyzing your data right away. Fleet Management enables you to centrally manage all of your deployed Elastic Agents from Kibana.
+
+If no Elastic Agent integration is available for your data source, use Beats to collect your data. Beats are data shippers designed to collect and ship a particular type of data from a server. You install a separate Beat for each type of data to collect. Modules that provide default configurations, Elasticsearch ingest pipeline definitions, and Kibana dashboards are available for some Beats, such as Filebeat and Metricbeat. No Fleet management capabilities are provided for Beats.
+
+If neither Elastic Agent nor Beats supports your data source, use Logstash. Logstash is an open source data collection engine with real-time pipelining capabilities that supports a wide variety of data sources. You might also use Logstash to persist incoming data to ensure data is not lost if there's an ingestion spike, or if you need to send the data to multiple destinations.
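As an illustration of those last two points, a Logstash pipeline that buffers incoming events on disk and fans them out to multiple destinations might look roughly like this. This is a sketch, not a tested configuration: hosts, ports, and the Kafka topic are placeholders.

```text
# logstash.yml (enables disk-based buffering for ingestion spikes):
#   queue.type: persisted

# Pipeline configuration (illustrative):
input {
  elastic_agent {
    port => 5044                      # events arriving from Elastic Agent
  }
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    data_stream => true               # primary destination
  }
  kafka {
    topic_id => "raw-events"          # second destination, e.g. for archiving
  }
}
```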
+
+---> Basic diagram
+
+[discrete]
+=== Basic use case: Integrations to ES
+
+Reiterate integrations as the basic ingest use case
+
+TODO: evaluate terminology (basic???)
+
+
+[discrete]
+=== Advanced use case: Integration to Logstash to ES
+
+Highlight logstash-filter-elastic_integration capabilities
+
+
+[discrete]
+=== Other advanced use cases (from decision tree)
+
+* Agent + Agent processors???
+* Agent + Runtime fields???
+
+
+
+// CONTENT LIFTED FROM former `TOOLS` topic
+
+
+[discrete]
+=== Elastic Agent and Elastic integrations
+The best choice for ingesting data is the _simplest option_ that _meets your needs_ and _satisfies your use case_.
+For many popular ingest scenarios, the best option is Elastic Agent and Elastic integrations.
+
+* Elastic Agent installed on the endpoints where you want to collect data.
+Elastic Agent collects the data from one or more endpoints, and forwards the data to the service or location where it is used.
+* An Elastic integration to receive that data from agents
+
+TIP: Start here!
+Elastic Agent for data collection paired with Elastic integrations is the best ingest option for most use cases.
+
+
+[discrete]
+=== OTel
+Coming on strong. Where are we now, and cautiously explain where we're going in the near term.
+
+OpenTelemetry is a leader for collecting Observability data.
+
+Elastic is a supporting member.
+We're contributing to the OTel project, and are using elastic/opentelemetry for specialized development not applicable to upstream.
+
+* https://www.elastic.co/guide/en/observability/current/apm-open-telemetry.html
+
+We contribute upstream and maintain our own repository for work specific to Elastic:
+* https://github.com/open-telemetry/opentelemetry-collector-contrib
+* https://github.com/elastic/opentelemetry
+
+[discrete]
+=== Logstash
+
+{ls} is an open source data collection engine with real-time pipelining capabilities.
+It supports a wide variety of data sources, and can dynamically unify data from disparate sources and normalize the data into destinations of your choice.
+
+{ls} can collect data using a variety of {ls} input plugins, enrich and transform the data with {ls} filter plugins, and output the data to {es} and other destinations using the {ls} output plugins.
+
+You can use Logstash to extend Beats for advanced use cases, such as routing data to multiple destinations or when you need to make your data persistent.
+
+* {ls} input for when no integration is available
+* {ls} integrations filter for advanced processing
+
+TIP: If an integration is available for your data source, start with Elastic Agent + integration.
+
+Use {ls} when:
+
+* there's no integration for your data source (use a {ls} input)
+* an Elastic integration exists, but you need advanced processing between the Elastic integration and {es}
+
+Advanced use cases solved by {ls}:
+
+* {ls} for https://www.elastic.co/guide/en/ingest/current/ls-enrich.html[data enrichment] before sending data to {es}
+* https://www.elastic.co/guide/en/ingest/current/lspq.html[{ls} Persistent Queue (PQ) for buffering]
+* https://www.elastic.co/guide/en/ingest/current/ls-networkbridge.html[{ls} as a proxy] when there are network restrictions that prevent connections between Elastic Agent and {es}
+* https://www.elastic.co/guide/en/ingest/current/ls-multi.html[{ls} for routing data to multiple {es} clusters and additional destinations]
+* https://www.elastic.co/guide/en/ingest/current/agent-proxy.html[{ls} as a proxy]
diff --git a/docs/en/ingest-guide/ingest-tools.asciidoc b/docs/en/ingest-guide/ingest-tools.asciidoc
new file mode 100644
index 000000000..28a876338
--- /dev/null
+++ b/docs/en/ingest-guide/ingest-tools.asciidoc
@@ -0,0 +1,89 @@
+[[ingest-tools]]
+== Tools for ingesting time-series data
+
+
+Elastic and others offer tools to help you get your data
from the original data source into {es}.
+Some tools are designed for particular data sources, and others are multi-purpose.
+
+// Iterative messaging as our recommended strategy morphs.
+// This section is the summary. "Here's the story _now_."
+// Hint at upcoming changes, but do it cautiously and responsibly.
+// Modular and co-located to make additions/updates/deprecations easier as our story matures.
+
+
+In this section, we'll help you determine which option is best for you.
+
+* <<ingest-ea>>
+* <<ingest-beats>>
+* <<ingest-otel>>
+* <<ingest-logstash>>
+
+[discrete]
+[[ingest-ea]]
+=== {agent} and Elastic integrations
+
+A single link:{fleet-guide}[{agent}] can collect multiple types of data when it is link:{fleet-guide}/elastic-agent-installation.html[installed] on a host computer.
+You can use standalone {agent}s and manage them locally on the systems where they are installed, or you can manage all of your agents and policies with the link:{fleet-guide}/manage-agents-in-fleet.html[Fleet UI in {kib}].
+
+Use {agent} with one of hundreds of link:{integrations-docs}[Elastic integrations] to simplify collecting, transforming, and visualizing data.
+Integrations include default ingestion rules, dashboards, and visualizations to help you start analyzing your data right away.
+Check out the {integrations-docs}/all_integrations[Integration quick reference] to search for available integrations that can reduce your time to value.
+
+{agent} is the best option for collecting timestamped data for most data sources
+and use cases.
+If your data requires additional processing before going to {es}, you can use
+link:{fleet-guide}/elastic-agent-processor-configuration.html[{agent}
+processors], link:{logstash-ref}[{ls}], or additional processing features in
+{es}.
+Check out <<ingest-addl-proc>> to see options.
+
+Ready to try link:{fleet-guide}[{agent}]? Check out the link:{fleet-guide}/elastic-agent-installation.html[installation instructions].
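For the additional-processing case mentioned above, {agent} processors are configured alongside the input that produces the events. A rough sketch of a standalone agent policy fragment follows; the paths, input id, and field values are invented, and exact keys can differ by version, so treat this as illustrative only:

```yaml
# Illustrative fragment of a standalone Elastic Agent policy (elastic-agent.yml)
inputs:
  - type: filestream
    id: my-app-logs                  # hypothetical input id
    paths:
      - /var/log/my-app/*.log
    processors:
      - add_fields:                  # enrich events at the source
          target: labels
          fields:
            env: production
      - drop_event:                  # keep noisy events off the wire
          when:
            contains:
              message: "DEBUG"
```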
+
+[discrete]
+[[ingest-beats]]
+=== {beats}
+
+link:{beats-ref}/beats-reference.html[Beats] are the original Elastic lightweight data shippers, and their capabilities live on in Elastic Agent.
+When you use Elastic Agent, you're getting core Beats functionality plus additional features.
+
+
+Beats require that you install a separate Beat for each type of data you want to collect.
+A single Elastic Agent installed on a host can collect and transport multiple types of data.
+
+**Best practice:** Use link:{fleet-guide}[{agent}] whenever possible.
+If your data source is not yet supported by {agent}, use {beats}.
+Check out the {beats} and {agent} link:{fleet-guide}/beats-agent-comparison.html#additional-capabilities-beats-and-agent[comparison] for more info.
+When you are ready to upgrade, check out link:{fleet-guide}/migrate-beats-to-agent.html[Migrate from {beats} to {agent}].
+
+[discrete]
+[[ingest-otel]]
+=== OpenTelemetry (OTel) collectors
+
+link:https://opentelemetry.io/docs[OpenTelemetry] is a vendor-neutral observability framework for collecting, processing, and exporting telemetry data.
+Elastic is a member of the Cloud Native Computing Foundation (CNCF) and an active contributor to the OpenTelemetry project.
+
+In addition to supporting upstream OTel development, Elastic provides link:https://github.com/elastic/opentelemetry[Elastic Distributions of OpenTelemetry], specifically designed to work with Elastic Observability.
+We're also expanding link:{fleet-guide}[{agent}] to use OTel collection.
+
+[discrete]
+[[ingest-logstash]]
+=== Logstash
+
+link:{logstash-ref}[{ls}] is a versatile open source data ETL (extract, transform, load) engine that can expand your ingest capabilities.
+{ls} can _collect data_ from a wide variety of data sources with {ls} link:{logstash-ref}/input-plugins.html[input +plugins], _enrich and transform_ the data with {ls} link:{logstash-ref}/filter-plugins.html[filter plugins], and _output_ the +data to {es} and other destinations with the {ls} link:{logstash-ref}/output-plugins.html[output plugins]. + +Many users never need to use {ls}, but it's available if you need it for: + +* **Data collection** (if an Elastic integration isn't available). +{agent} and Elastic {integrations-docs}/all_integrations[integrations] provide many features out-of-the-box, so be sure to search or browse integrations for your data source. +If you don't find an Elastic integration for your data source, check {ls} for an {logstash-ref}/input-plugins.html[input plugin] for your data source. +* **Additional processing.** One of the most common {ls} use cases is link:{logstash-ref}/ea-integrations.html[extending Elastic integrations]. +You can take advantage of the extensive, built-in capabilities of Elastic Agent and Elastic Integrations, and +then use {ls} for additional data processing before sending the data on to {es}. +* **Advanced use cases.** {ls} can help with advanced use cases, such as when you need +link:{ingest-guide}/lspq.html[persistence or buffering], +additional link:{ingest-guide}/ls-enrich.html[data enrichment], +link:{ingest-guide}/ls-networkbridge.html[proxying] as a way to bridge network connections, or the ability to route data to +link:{ingest-guide}/ls-multi.html[multiple destinations].
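A sketch of the "extending Elastic integrations" pattern: the `elastic_integration` filter applies the integration's ingest pipeline inside {ls}, so later filters can act on the fully parsed event. The hosts and credential below are placeholders; check the plugin documentation for the exact options your version supports.

```text
input {
  elastic_agent {
    port => 5044
  }
}

filter {
  elastic_integration {
    hosts => ["https://localhost:9200"]
    api_key => "${INTEGRATION_API_KEY}"     # placeholder credential
  }
  mutate {
    add_field => { "[labels][pipeline]" => "extended" }   # example extra processing
  }
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    data_stream => true
  }
}
```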