Merge branch 'main' into testimonial_uaw
hcourdent authored Sep 23, 2024
2 parents c47966a + d544a34 commit 80747e4
Showing 87 changed files with 1,390 additions and 533 deletions.
2 changes: 1 addition & 1 deletion blog/2023-08-10-supabase-partnership/index.md
@@ -20,7 +20,7 @@ This week, Supabase is celebrating their [8th Launch Week](https://supabase.com/

## Windmill for internal tools

Windmill is an [open-source](https://github.com/windmill-labs/windmill), blazing fast and scalable alternative to Retool, Airplane, Superblocks, n8n, Airflow, Prefect, Temporal to build all your internal tools (endpoints, workflows, UIs) through the combination of code (in TypeScript, Python, Go, PHP, Bash, SQL or any docker image) and low code builders. It embeds all-in-one:
Windmill is an [open-source](https://github.com/windmill-labs/windmill), blazing fast and scalable alternative to Retool, Airplane, Superblocks, n8n, Airflow, Prefect, Temporal to build all your internal tools (endpoints, workflows, UIs) through the combination of code (in TypeScript, Python, Go, PHP, Bash, SQL and Rust or any docker image) and low code builders. It embeds all-in-one:

- an **execution runtime** to execute functions at scale with low-latency and no overhead on a fleet of workers
- an **orchestrator** to compose functions into powerful flows at low-latency built with a low-code builder (or yaml if that's your thing)
6 changes: 3 additions & 3 deletions blog/2023-11-15-launch-week-1/index.mdx
@@ -297,11 +297,11 @@ If that's not sufficient you can even build your own app in React.

## Day 5

For the last day of our launch week today we focused on features that will help you in your ETLs, with restartable flows and S3 integration for data pipelines.
For the last day of our launch week today we focused on features that will help you in your ETLs, with restartable flows and Workspace object storage for data pipelines.

### Windmill for data pipelines - S3 Integration
### Windmill for data pipelines - Workspace object storage

![Windmill for data pipelines - S3 Integration](../2023-11-24-data-pipeline-orchestrator/data_pipelines.png.webp 'Windmill for data pipelines - S3 Integration')
![Windmill for data pipelines - Workspace object storage](../2023-11-24-data-pipeline-orchestrator/data_pipelines.png.webp 'Windmill for data pipelines - Workspace object storage')

_Run your ETLs on-prem up to 5x faster using Windmill compared to Spark while simplifying your infra._

2 changes: 1 addition & 1 deletion blog/2023-11-20-ai-flow-builder/index.mdx
@@ -103,7 +103,7 @@ You can see below an example of a simple workflow with a [for-loop](/docs/flows/
![Windmill DAG](./media/windmill-dag.png.webp)

:::info Workflow engine vs Analytics engine
All the examples above focus on small api integrations but data pipeline that would usually be run on dedicated analytics engine are a great fit for Windmill when combined with s3, and dataframe/olap libraries such as polars or duckdb.
All the examples above focus on small api integrations but data pipeline that would usually be run on dedicated analytics engine are a great fit for Windmill when combined with S3, and dataframe/olap libraries such as polars or duckdb.
Indeed, thanks to these integrations and Windmill's lack of boilerplate, Windmill offers state-of-the-art performances for data processing at scale while keeping complexity low.
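
A hedged Python sketch of that pattern — bucket, key and column names are illustrative, and it assumes Polars and DuckDB are available to the worker with S3 credentials resolvable from the environment or a configured S3 resource:

```python
# Hedged sketch: an analytics step that reads a Parquet dataset from S3 with
# Polars, aggregates it with DuckDB, and returns only a small summary as the
# step result. All names below are illustrative.
import duckdb
import polars as pl

def main() -> list[dict]:
    # Read the dataset straight from object storage.
    df = pl.read_parquet("s3://my-bucket/events/2023-11-20.parquet")

    # DuckDB can query the in-scope Polars DataFrame by name (replacement scan).
    summary = duckdb.sql(
        "SELECT country, count(*) AS events FROM df GROUP BY country ORDER BY events DESC"
    ).pl()

    # Return a small aggregate rather than the raw dataset.
    return summary.to_dicts()
```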

<br />
@@ -106,7 +106,7 @@ import ScatterChart from '@site/src/components/ScatterChart';
/>
</div>

[Benchmarking data and dedicated methodology documentation](https://www.Windmill.dev/docs/misc/benchmarks/competitors).
[Benchmarking data and dedicated methodology documentation](/docs/misc/benchmarks/competitors).

You've known Windmill to be a productive environment to monitor, write and iterate on workflows, but we wanted to prove it's also the best system to deploy at scale in production.

@@ -149,7 +149,7 @@ That being said, Temporal is amazing at what it does and if there are overlaps b

We leave analytics/ETL engines such as Spark or Dagster out of it for today as they are not workflow engines _per se_ even if they are built on top of ones.

ETL and analytics workflows will be covered later this week, and you will find that Windmill offers best-in-class performance for analytics workloads leveraging s3, duckdb and polars
ETL and analytics workflows will be covered later this week, and you will find that Windmill offers best-in-class performance for analytics workloads leveraging S3, duckdb and polars

:::

@@ -277,7 +277,7 @@ In Windmill, there are 3 main ways of doing data passing:

- every input of a step can be a javascript expression that can refer to any step outputs

Every script in typescript, python, go, bash has their main signature parsed (by a WASM program in the frontend) which allows to pre-compute the different inputs needed for a given step.
Every script in TypeScript, Python, Go, Bash, SQL, Rust has their main signature parsed (by a WASM program in the frontend) which allows to pre-compute the different inputs needed for a given step.
For each of those inputs, one can define either a static input or a javascript expression that can refer to the result of any step. e.g: `results.d.foo` where `d` is the id of the step.
That javascript expression when complex is evaluated by an embedded v8 using deno runtime. It takes ~8ms by expression.
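
As a small, hedged illustration of the parsing described above (parameter names and body are made up): each typed parameter of `main` becomes a step input, and in a flow each input can be bound either to a static value or to a JavaScript expression such as `results.d.foo`.

```python
# Hedged illustration: the typed main() signature is what gets parsed (by the
# WASM parser in the frontend) to build the step's input form. In a flow, each
# of these inputs can then be bound to a static value or to a JavaScript
# expression like `results.d.foo`. Names and body are hypothetical.
def main(user_id: int, email: str = "fallback@example.com") -> dict:
    return {"user_id": user_id, "email": email}
```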

@@ -299,7 +299,7 @@ json_path.map(|x| x.split(".").map(|x| x.to_string()).collect::<Vec<_>>())

- share data in a temporary folder
Flows can be configured to be wholly executed on the same worker. When that is the case, a folder is shared and symlinked inside every job's ephemeral folder (jobs are started in an ephemeral folder that is removed at the end of their execution)
- pass data in S3 using the S3 integration (updates specific to that part to be presented on day 5)
- pass data in S3 using the [workspace object storage](/docs/core_concepts/object_storage_in_windmill#workspace-object-storage) (updates specific to that part to be presented on day 5)
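
A hedged sketch of the shared-folder option (the `./shared` path follows the convention described in the docs; file names are illustrative):

```python
# Hedged sketch: when a flow is pinned to a single worker, every step's
# ephemeral job directory gets the same ./shared folder symlinked in, so large
# intermediate files can be handed over on disk instead of being serialized as
# step results. Path and file names are illustrative.
from pathlib import Path

def main() -> str:
    out = Path("./shared/intermediate.csv")
    out.write_text("id,name\n1,alice\n2,bob\n")
    # Return only the path; the next step on the same worker reads the file directly.
    return str(out)
```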

## Workers efficiency

4 changes: 2 additions & 2 deletions blog/2023-11-24-data-pipeline-orchestrator/index.mdx
@@ -57,7 +57,7 @@ And for storage, you can now link a Windmill workspace to an S3 bucket and use i
The very large majority of ETLs can be processed step-wise on single nodes and Windmill provides (one of) the best models for orchestrating non-sharded compute. Using this model, your ETLs will see a massive performance improvement, your infrastructure
will be easier to manage and your pipeline will be easier to write, maintain, and monitor.

## Windmill integration with an external S3 storage
## Windmill integration with an external object storage

In Windmill, a data pipeline is implemented using a [flow](/docs/flows/flow_editor), and each step of the pipeline is a script. One of the key features of Windmill flows is to easily [pass a step result to its dependent steps](/docs/flows/architecture). But
because those results are serialized to Windmill database and kept as long as the job is stored, this obviously won't work when the result is a dataset of millions of rows. The solution is to save the datasets to an external storage at the end of each script.
@@ -69,7 +69,7 @@ The first step is to define an [S3 resource](/docs/integrations/s3) in Windmill
![S3 workspace settings](./workspace_s3_settings.png 'S3 workspace settings')

From now on, Windmill will be connected to this bucket and you'll have easy access to it from the code editor and the job run details. If a script takes as input a `s3object`, you will see in the input form on the right a button helping you choose the file directly from the bucket.
Same for the result of the script. If you return an `s3object` containing a key `s3` pointing to a file inside your bucket, in the result panel there will be a button to open the bucket explorer to visualize the file.
Same for the result of the script. If you return an `s3object` containing a [key](/docs/core_concepts/rich_display_rendering#s3) `s3` pointing to a file inside your bucket, in the result panel there will be a button to open the bucket explorer to visualize the file.
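
As a hedged Python sketch of such a script (the `S3Object` helper is the thin `{"s3": ...}` wrapper from the `wmill` SDK; keys and the processing body are illustrative):

```python
# Hedged sketch: a step that takes an S3 object reference as input and returns
# a new S3Object, so both the input form and the result panel can show the
# bucket-explorer buttons described above. Keys and logic are illustrative.
from wmill import S3Object

def main(input_file: S3Object) -> S3Object:
    # input_file is essentially {"s3": "path/inside/the/bucket"}.
    source_key = input_file["s3"]

    # ... download, transform and re-upload the object here ...

    output_key = "processed/" + source_key.split("/")[-1]
    return S3Object(s3=output_key)
```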

![Windmill code editor](./s3_object_code_editor.png 'Windmill code editor')

2 changes: 1 addition & 1 deletion blog/2023-12-27-bash-script-arguments/index.md
@@ -179,6 +179,6 @@ For [Bash scripts in Windmill](/docs/getting_started/scripts_quickstart/bash#cod

<br/>

Windmill is an open-source developer platform and workflow engine to build internal tools. It helps you transform scripts into auto-generated UIs, APIs and cron jobs. Windmill also supports coding in TypeScript, Python, Go, PHP, Bash, SQL, or any Docker image.
Windmill is an open-source developer platform and workflow engine to build internal tools. It helps you transform scripts into auto-generated UIs, APIs and cron jobs. Windmill also supports coding in TypeScript, Python, Go, PHP, Bash, SQL and Rust, or any Docker image.

To explore Windmill and make your shell scripts user-friendly for your team, even those unfamiliar with the command line, refer to the [documentation](https://www.windmill.dev/docs/).
@@ -80,7 +80,7 @@ This command gets the YAML statement of currently running pods and pipes the out

In this guide, we've introduced you to Kubernetes pods and why you might need to restart them. Generally, the most recommended way to ensure no application downtime is to use `kubectl rollout restart deployment <deployment_name> -n <namespace>`.

Windmill is an [open-source](https://github.com/windmill-labs/windmill) developer platform and workflow engine to build internal tools. It can turn scripts into auto-generated UIs, APIs, and cron jobs. It supports coding in TypeScript, Python, Go, PHP, Bash, SQL, or any Docker image. You can [self-host](/docs/advanced/self_host) Windmill within your own environment.
Windmill is an [open-source](https://github.com/windmill-labs/windmill) developer platform and workflow engine to build internal tools. It can turn scripts into auto-generated UIs, APIs, and cron jobs. It supports coding in TypeScript, Python, Go, PHP, Bash, SQL and Rust, or any Docker image. You can [self-host](/docs/advanced/self_host) Windmill within your own environment.

Windmill runs on Kubernetes and [workers](/docs/core_concepts/worker_groups) can be easily scaled up and down to meet performance needs.

2 changes: 1 addition & 1 deletion blog/2024-04-18-useful-python-scripts/index.md
@@ -19,7 +19,7 @@ In this blog post, we will explore ten Python scripts that stand out due to thei

## Use Windmill to create, run and monitor Python Scripts

Windmill is an open-source developer platform and workflow engine designed to build comprehensive internal tools (endpoints, workflows, UIs). It supports coding in TypeScript, Python, Go, PHP, Bash, SQL, or any Docker image, alongside intuitive low-code builders, including:
Windmill is an open-source developer platform and workflow engine designed to build comprehensive internal tools (endpoints, workflows, UIs). It supports coding in TypeScript, Python, Go, PHP, Bash, SQL and Rust, or any Docker image, alongside intuitive low-code builders, including:

- An [execution runtime](/docs/script_editor) for scalable, low-latency function execution across a worker fleet.
- An [orchestrator](/docs/flows/flow_editor) for assembling these functions into efficient, low-latency flows, using either a low-code builder or YAML.
2 changes: 1 addition & 1 deletion blog/2024-06-28-edit-crontab/index.md
@@ -166,7 +166,7 @@ In addition to scripts and flows, Windmill also allows [scheduling of app report

## What is Windmill?

Windmill is a fast, [open-source](https://github.com/windmill-labs/windmill) workflow engine and developer platform. It's an alternative to the likes of Retool, Superblocks, n8n, Airflow, Prefect, and Temporal, designed to **build comprehensive internal tools** (endpoints, workflows, UIs). It supports coding in TypeScript, Python, Go, PHP, Bash, SQL, or any Docker image, alongside intuitive low-code builders, featuring:
Windmill is a fast, [open-source](https://github.com/windmill-labs/windmill) workflow engine and developer platform. It's an alternative to the likes of Retool, Superblocks, n8n, Airflow, Prefect, and Temporal, designed to **build comprehensive internal tools** (endpoints, workflows, UIs). It supports coding in TypeScript, Python, Go, PHP, Bash, SQL and Rust, or any Docker image, alongside intuitive low-code builders, featuring:

- An [execution runtime](/docs/script_editor) for scalable, low-latency function execution across a worker fleet.
- An [orchestrator](/docs/flows/flow_editor) for assembling these functions into efficient, low-latency flows, using either a low-code builder or YAML.
6 changes: 3 additions & 3 deletions blog/2024-07-12-airflow-alternatives/index.mdx
@@ -134,14 +134,14 @@ Asked if it's an orchestrator, Kedro replies:
### Windmill

Windmill is an [open-source](https://github.com/windmill-labs/windmill) workflow engine and developer platform designed to build internal tools, including endpoints, workflows, and UIs. It supports coding in [multiple languages](/docs/getting_started/scripts_quickstart) such as TypeScript, Python, Go, Bash, SQL, or any Docker image, alongside low-code builders.
Windmill is an [open-source](https://github.com/windmill-labs/windmill) workflow engine and developer platform designed to build internal tools, including endpoints, workflows, and UIs. It supports coding in [multiple languages](/docs/getting_started/scripts_quickstart) such as TypeScript, Python, Go, Bash, SQL, Rust or any Docker image, alongside low-code builders.

Windmill was designed by developers for developers, ranging from semi-technical (low code builders) to senior/staff software engineers with high standards for production-grade yet flexible and customizable with code. Windmill was built to address the challenge of turning high-value code containing business logic, data transformation, and internal API calls into scalable microservices and tools without the usual heavy lifting.

On the other hand, the support of [Python](/docs/getting_started/scripts_quickstart/python) as a primary language and the integration of a workspace with [object storage](/docs/core_concepts/persistent_storage/large_data_files) (in particular, S3) make Windmill an excellent fit for data engineers, particularly for building [data pipelines](/docs/core_concepts/data_pipelines).
On the other hand, the support of [Python](/docs/getting_started/scripts_quickstart/python) as a primary language and the integration of a workspace with [object storage](/docs/core_concepts/object_storage_in_windmill) (in particular, S3) make Windmill an excellent fit for data engineers, particularly for building [data pipelines](/docs/core_concepts/data_pipelines).

Windmill has three editors (or products), all compatible, each independently functioning:
1. The [Script Editor](/docs/script_editor) is an integrated development environment that allows you to write code in various languages like TypeScript, Python, Go, Bash, SQL, or even run any Docker container through Windmill's Bash support.
1. The [Script Editor](/docs/script_editor) is an integrated development environment that allows you to write code in various languages like TypeScript, Python, Go, Bash, SQL, Rust or even run any Docker container through Windmill's Bash support.
2. The [Flow Editor](/docs/flows/flow_editor) is a low-code builder that enables you to create workflows represented as directed acyclic graphs (DAGs), orchestrating the execution of steps across different workers while respecting dependency constraints.
3. The [App Editor](/docs/apps/app_editor) is a tool for creating customized, user interfaces using a drag-and-drop editor, allowing you to build data-centric dashboards.

4 changes: 2 additions & 2 deletions changelog/2024-05-31-secondary-storage/index.md
@@ -4,11 +4,11 @@ version: v1.340.0
title: Secondary Storage
tags: ['Persistent Storage']
image: ./secondary_storage.png
description: With all Windmill S3 Integration features, read and write from a storage that is not your main storage by specifying it in the s3 object as "secondary_storage" with the name of it.
description: Read and write from a storage that is not your main storage by specifying it in the S3 object as "secondary_storage" with the name of it.
features:
[
'Add additional storages from S3, Azure Blob, AWS OIDC or Azure Workload Identity.',
'From script, specify the secondary storage with an object with properties `s3` (path to the file) and `storage` (name of the secondary storage).'
]
docs: /docs/core_concepts/persistent_storage/large_data_files#secondary-s3-storage
docs: /docs/core_concepts/object_storage_in_windmill#secondary-s3-storage
---
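
In practice, the object shape this changelog describes looks roughly like the following hedged Python sketch (storage and key names are made up):

```python
# Hedged sketch: target a secondary storage instead of the main workspace
# storage by adding a `storage` property next to the `s3` key.
def main() -> dict:
    return {
        "s3": "exports/report.parquet",      # path to the file in the bucket
        "storage": "my_secondary_storage",   # name of the secondary storage
    }
```
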
14 changes: 14 additions & 0 deletions changelog/2024-08-30-rust-support/index.md
@@ -0,0 +1,14 @@
---
slug: rust-support
version: v1.388.0
title: Rust Support
tags: ['Rust', 'Code Editor']
description: Windmill now supports Rust scripts.
features:
[
'Write your Windmill script in Rust.',
'Run your Rust scripts locally or in the cloud.',
]
image: ./editor_rust.png
docs: /docs/getting_started/scripts_quickstart/rust
---
4 changes: 2 additions & 2 deletions docs/advanced/14_dependencies_in_typescript/index.mdx
@@ -188,7 +188,7 @@ Windmill CLI, it is done automatically on `wmill sync push` for any script that

<div className="grid grid-cols-2 gap-6 mb-4">
<DocCard
title="Codebases & Bundles"
title="Codebases & bundles"
description="Deploy scripts with any local relative imports as bundles."
href="/docs/core_concepts/codebases_and_bundles"
/>
@@ -254,7 +254,7 @@ Note that path in Windmill can have as many depth as needed, so you can have pat

You can use private npm registries and private npm packages in your TypeScript scripts.

This applies to all methods above. Only, if using Codebases & Bundles locally, there is nothing to configure in Windmill, because the bundle is built locally using your locally-installed modules (which support traditional npm packages and private npm packages).
This applies to all methods above. Only, if using Codebases & bundles locally, there is nothing to configure in Windmill, because the bundle is built locally using your locally-installed modules (which support traditional npm packages and private npm packages).

![Private NPM registry](../6_imports/private_registry.png 'Private NPM registry')

6 changes: 4 additions & 2 deletions docs/advanced/18_instance_settings/index.mdx
@@ -92,12 +92,14 @@ This setting is only available on [Enterprise Edition](/pricing).

### S3/Azure for Python/Go cache & large logs

Bucket to [store large logs](../../core_concepts/20_jobs/index.mdx#s3azure-for-python-cache--large-logs) and global cache for Python and Go.
[Connect your instance](../../core_concepts/38_object_storage_in_windmill/index.mdx#instance-object-storage) to a S3 bucket to [store large logs](../../core_concepts/20_jobs/index.mdx#large-logs-management-with-s3) and [global cache for Python and Go](../../misc/13_s3_cache/index.mdx).

This feature has no overlap with the [Workspace S3 integration](../../core_concepts/11_persistent_storage/large_data_files.mdx).
This feature has no overlap with the [Workspace object storage](../../core_concepts/38_object_storage_in_windmill/index.mdx#workspace-object-storage).

You can choose to use either S3 or Azure Blob Storage. For each you will find a button to test settings from a server or from a worker.

![S3/Azure for Python/Go cache & large logs](../../core_concepts/20_jobs/s3_azure_cache.png "S3/Azure for Python/Go cache & large logs")

This setting is only available on [Enterprise Edition](/pricing).

### Critical alert channels