
Commit ae51e89

Merge pull request #119 from aai-institute/feature/renaming-to-continuiti

Rename continuity to continuiti.
samuelburbulla authored Apr 11, 2024
2 parents b6a2cd9 + 97f1bda commit ae51e89
Showing 106 changed files with 342 additions and 342 deletions.
4 changes: 2 additions & 2 deletions CONTRIBUTING.md
@@ -1,6 +1,6 @@
# Contributing

-Continuity aims to be a repository of architectures and benchmarks for
+**continuiti** aims to be a repository of architectures and benchmarks for
operator learning with neural networks and its applications.

Contributions are welcome from anyone in the form of pull requests,
@@ -86,7 +86,7 @@ during the test run.
Because we want documentation to include the full dataset, we commit notebooks
with their outputs running with full datasets to the repo. The notebooks are
then added by CI to the section
-[Examples](https://aai-institute.github.io/continuity/examples.html) of the
+[Examples](https://aai-institute.github.io/continuiti/examples.html) of the
documentation.

### Hiding cells in notebooks
22 changes: 11 additions & 11 deletions README.md
@@ -1,40 +1,40 @@
<div align="center">
<img alt="Continuity" src="https://aai-institute.github.io/continuity/img/icon.png" width="100">
<img alt="continuiti" src="https://aai-institute.github.io/continuiti/img/icon.png" width="100">

-<h1>Continuity</h1>
+<h1>continuiti</h1>

Learning function operators with neural networks.

[![PyTorch](https://img.shields.io/badge/PyTorch-ee4c2c?logo=pytorch&logoColor=white)](https://pytorch.org/get-started/locally/)
-[![Documentation](https://img.shields.io/badge/Documentation-blue)](https://aai-institute.github.io/continuity/)
-[![Test](https://github.com/aai-institute/continuity/actions/workflows/test.yml/badge.svg)](https://github.com/aai-institute/continuity/actions/workflows/test.yml)
+[![Documentation](https://img.shields.io/badge/Documentation-blue)](https://aai-institute.github.io/continuiti/)
+[![Test](https://github.com/aai-institute/continuiti/actions/workflows/test.yml/badge.svg)](https://github.com/aai-institute/continuiti/actions/workflows/test.yml)
</div>

-**Continuity** is a Python package for machine learning on function operators.
+**continuiti** is a Python package for machine learning on function operators.
It implements various neural operator architectures (e.g., DeepONets),
physics-informed loss functions to train based on PDEs, and a collection of
examples and benchmarks.

## Installation
Clone the repository and install the package using pip.
```
-git clone https://github.com/aai-institute/continuity.git
-cd continuity
+git clone https://github.com/aai-institute/continuiti.git
+cd continuiti
pip install -e .
```

## Usage
-Our [Documentation](https://aai-institute.github.io/continuity/) contains a
+Our [Documentation](https://aai-institute.github.io/continuiti/) contains a
verbose introduction to operator learning, a collection of examples using
-Continuity, and a class documentation.
+continuiti, and a class documentation.

-In general, the operator syntax in Continuity is
+In general, the operator syntax in **continuiti** is
```python
v = operator(x, u(x), y)
```
mapping a function `u` (evaluated at `x`) to function `v` (evaluated in `y`).
For more details, see
-[Learning Operators](https://aai-institute.github.io/continuity/operators/index.html).
+[Learning Operators](https://aai-institute.github.io/continuiti/operators/index.html).

## Contributing
Contributions are welcome from anyone in the form of pull requests, bug reports
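The README hunk above documents the call convention `v = operator(x, u(x), y)`. As a rough illustration of that signature only — the `ToyOperator` module, its layer sizes, and the `(batch, points, dim)` tensor layout below are assumptions made for this sketch, not continuiti's actual implementation:

```python
import torch
import torch.nn as nn


class ToyOperator(nn.Module):
    """Toy stand-in with the call signature v = operator(x, u, y).

    x are sensor positions, u the input-function values at x, and y the
    points where the output function v is evaluated. This only mimics the
    interface described in the README; it is not a continuiti operator.
    """

    def __init__(self, hidden: int = 32):
        super().__init__()
        # Encode each (x_i, u(x_i)) pair, pool over sensors, decode at each y_j.
        self.encoder = nn.Sequential(nn.Linear(2, hidden), nn.Tanh())
        self.decoder = nn.Sequential(
            nn.Linear(hidden + 1, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )

    def forward(self, x, u, y):
        # x, u: (batch, num_sensors, 1); y: (batch, num_evaluations, 1)
        z = self.encoder(torch.cat([x, u], dim=-1)).mean(dim=1, keepdim=True)
        z = z.expand(-1, y.shape[1], -1)
        return self.decoder(torch.cat([z, y], dim=-1))  # v evaluated at y


operator = ToyOperator()
x = torch.linspace(0, 1, 32).reshape(1, 32, 1)  # sensor positions
u = torch.sin(2 * torch.pi * x)                 # u evaluated at x
y = torch.rand(1, 64, 1)                        # output evaluation points
v = operator(x, u, y)                           # shape (1, 64, 1)
```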
6 changes: 3 additions & 3 deletions benchmarks/flame/run_flame.py
@@ -1,7 +1,7 @@
from functools import partial
-from continuity.benchmarks.run import BenchmarkRunner, RunConfig
-from continuity.benchmarks import Flame
-from continuity.operators import ConvolutionalNeuralNetwork
+from continuiti.benchmarks.run import BenchmarkRunner, RunConfig
+from continuiti.benchmarks import Flame
+from continuiti.operators import ConvolutionalNeuralNetwork


def run():
20 changes: 10 additions & 10 deletions benchmarks/html/table.html
@@ -1,25 +1,25 @@
<link rel="stylesheet" href="style.css">
<h2><a href="../api/continuity/benchmarks/#continuity.benchmarks.SineRegular">SineRegular</a></h2>
<h2><a href="../api/continuiti/benchmarks/#continuiti.benchmarks.SineRegular">SineRegular</a></h2>
<table class="benchmark-table">
<thead>
<tr><th>Operator</th><th>Params</th><th>Learning Curve</th><th>loss/train</th><th>loss/test</th></tr>
</thead>
<tbody>
<tr><th><a href="../api/continuity/operators/#continuity.operators.FourierNeuralOperator" >FourierNeuralOperator</a><div class="div-params">(depth=1, width=4)</div></th><td>305</td><td width="150px"><img height="60px" src="img/SineRegular_FourierNeuralOperator.svg"></td><td>3.43e-15</td><td><b>3.61e-15</b></td></tr>
<tr><th><a href="../api/continuity/operators/#continuity.operators.DeepNeuralOperator" >DeepNeuralOperator</a><div class="div-params">(depth=32, width=64)</div></th><td>133249</td><td width="150px"><img height="60px" src="img/SineRegular_DeepNeuralOperator.svg"></td><td>6.95e-07</td><td>7.36e-07</td></tr>
<tr><th><a href="../api/continuity/operators/#continuity.operators.BelNet" >BelNet</a><div class="div-params">(D_1=4, D_2=8, K=8, N_1=16, N_2=16)</div></th><td>7768</td><td width="150px"><img height="60px" src="img/SineRegular_BelNet.svg"></td><td>1.77e-05</td><td>1.81e-05</td></tr>
<tr><th><a href="../api/continuity/operators/#continuity.operators.DeepONet" >DeepONet</a><div class="div-params">(basis_functions=32, branch_depth=8, branch_width=32, trunk_depth=8, trunk_width=32)</div></th><td>18016</td><td width="150px"><img height="60px" src="img/SineRegular_DeepONet.svg"></td><td>2.5e-05</td><td>2.57e-05</td></tr>
<tr><th><a href="../api/continuiti/operators/#continuiti.operators.FourierNeuralOperator" >FourierNeuralOperator</a><div class="div-params">(depth=1, width=4)</div></th><td>305</td><td width="150px"><img height="60px" src="img/SineRegular_FourierNeuralOperator.svg"></td><td>3.43e-15</td><td><b>3.61e-15</b></td></tr>
<tr><th><a href="../api/continuiti/operators/#continuiti.operators.DeepNeuralOperator" >DeepNeuralOperator</a><div class="div-params">(depth=32, width=64)</div></th><td>133249</td><td width="150px"><img height="60px" src="img/SineRegular_DeepNeuralOperator.svg"></td><td>6.95e-07</td><td>7.36e-07</td></tr>
<tr><th><a href="../api/continuiti/operators/#continuiti.operators.BelNet" >BelNet</a><div class="div-params">(D_1=4, D_2=8, K=8, N_1=16, N_2=16)</div></th><td>7768</td><td width="150px"><img height="60px" src="img/SineRegular_BelNet.svg"></td><td>1.77e-05</td><td>1.81e-05</td></tr>
<tr><th><a href="../api/continuiti/operators/#continuiti.operators.DeepONet" >DeepONet</a><div class="div-params">(basis_functions=32, branch_depth=8, branch_width=32, trunk_depth=8, trunk_width=32)</div></th><td>18016</td><td width="150px"><img height="60px" src="img/SineRegular_DeepONet.svg"></td><td>2.5e-05</td><td>2.57e-05</td></tr>
</tbody>
</table>
<h2><a href="../api/continuity/benchmarks/#continuity.benchmarks.SineUniform">SineUniform</a></h2>
<h2><a href="../api/continuiti/benchmarks/#continuiti.benchmarks.SineUniform">SineUniform</a></h2>
<table class="benchmark-table">
<thead>
<tr><th>Operator</th><th>Params</th><th>Learning Curve</th><th>loss/train</th><th>loss/test</th></tr>
</thead>
<tbody>
<tr><th><a href="../api/continuity/operators/#continuity.operators.DeepNeuralOperator" >DeepNeuralOperator</a><div class="div-params">(depth=8, width=64)</div></th><td>33409</td><td width="150px"><img height="60px" src="img/SineUniform_DeepNeuralOperator.svg"></td><td>0.000239</td><td><b>0.000397</b></td></tr>
<tr><th><a href="../api/continuity/operators/#continuity.operators.BelNet" >BelNet</a><div class="div-params">(D_1=8, D_2=4, K=16, N_1=16, N_2=8)</div></th><td>11512</td><td width="150px"><img height="60px" src="img/SineUniform_BelNet.svg"></td><td>0.000317</td><td>0.000572</td></tr>
<tr><th><a href="../api/continuity/operators/#continuity.operators.DeepONet" >DeepONet</a><div class="div-params">(basis_functions=32, branch_depth=8, branch_width=32, trunk_depth=8, trunk_width=32)</div></th><td>18016</td><td width="150px"><img height="60px" src="img/SineUniform_DeepONet.svg"></td><td>0.00313</td><td>0.00533</td></tr>
<tr><th><a href="../api/continuity/operators/#continuity.operators.FourierNeuralOperator" >FourierNeuralOperator</a><div class="div-params">(depth=3, width=4)</div></th><td>889</td><td width="150px"><img height="60px" src="img/SineUniform_FourierNeuralOperator.svg"></td><td>0.199</td><td>0.207</td></tr>
<tr><th><a href="../api/continuiti/operators/#continuiti.operators.DeepNeuralOperator" >DeepNeuralOperator</a><div class="div-params">(depth=8, width=64)</div></th><td>33409</td><td width="150px"><img height="60px" src="img/SineUniform_DeepNeuralOperator.svg"></td><td>0.000239</td><td><b>0.000397</b></td></tr>
<tr><th><a href="../api/continuiti/operators/#continuiti.operators.BelNet" >BelNet</a><div class="div-params">(D_1=8, D_2=4, K=16, N_1=16, N_2=8)</div></th><td>11512</td><td width="150px"><img height="60px" src="img/SineUniform_BelNet.svg"></td><td>0.000317</td><td>0.000572</td></tr>
<tr><th><a href="../api/continuiti/operators/#continuiti.operators.DeepONet" >DeepONet</a><div class="div-params">(basis_functions=32, branch_depth=8, branch_width=32, trunk_depth=8, trunk_width=32)</div></th><td>18016</td><td width="150px"><img height="60px" src="img/SineUniform_DeepONet.svg"></td><td>0.00313</td><td>0.00533</td></tr>
<tr><th><a href="../api/continuiti/operators/#continuiti.operators.FourierNeuralOperator" >FourierNeuralOperator</a><div class="div-params">(depth=3, width=4)</div></th><td>889</td><td width="150px"><img height="60px" src="img/SineUniform_FourierNeuralOperator.svg"></td><td>0.199</td><td>0.207</td></tr>
</tbody>
</table>
4 changes: 2 additions & 2 deletions benchmarks/navierstokes/plot_navierstokes.py
@@ -1,7 +1,7 @@
import torch
import matplotlib.pyplot as plt
-from continuity.benchmarks import NavierStokes
-from continuity.operators import FourierNeuralOperator
+from continuiti.benchmarks import NavierStokes
+from continuiti.operators import FourierNeuralOperator

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

6 changes: 3 additions & 3 deletions benchmarks/navierstokes/run_navierstokes.py
@@ -1,7 +1,7 @@
from functools import partial
-from continuity.benchmarks.run import BenchmarkRunner, RunConfig
-from continuity.benchmarks import NavierStokes
-from continuity.operators import FourierNeuralOperator
+from continuiti.benchmarks.run import BenchmarkRunner, RunConfig
+from continuiti.benchmarks import NavierStokes
+from continuiti.operators import FourierNeuralOperator

config = RunConfig(
benchmark_factory=NavierStokes,
2 changes: 1 addition & 1 deletion benchmarks/process.py
@@ -1,4 +1,4 @@
-from continuity.benchmarks.run.table import BenchmarkTable
+from continuiti.benchmarks.run.table import BenchmarkTable


if __name__ == "__main__":
6 changes: 3 additions & 3 deletions benchmarks/run_all.py
@@ -2,9 +2,9 @@
import torch
from functools import partial
from multiprocessing import Pool
-from continuity.benchmarks.run import BenchmarkRunner, RunConfig
-from continuity.benchmarks import SineRegular, SineUniform
-from continuity.operators import (
+from continuiti.benchmarks.run import BenchmarkRunner, RunConfig
+from continuiti.benchmarks import SineRegular, SineUniform
+from continuiti.operators import (
DeepONet,
BelNet,
FourierNeuralOperator,
6 changes: 3 additions & 3 deletions benchmarks/run_optuna.py
@@ -1,8 +1,8 @@
import optuna
from functools import partial
-from continuity.benchmarks.run import BenchmarkRunner, RunConfig
-from continuity.benchmarks import SineRegular
-from continuity.operators import (
+from continuiti.benchmarks.run import BenchmarkRunner, RunConfig
+from continuiti.benchmarks import SineRegular
+from continuiti.operators import (
FourierNeuralOperator,
)

6 changes: 3 additions & 3 deletions benchmarks/run_single.py
@@ -1,6 +1,6 @@
-from continuity.benchmarks.run import BenchmarkRunner, RunConfig
-from continuity.benchmarks import SineRegular
-from continuity.operators import DeepNeuralOperator
+from continuiti.benchmarks.run import BenchmarkRunner, RunConfig
+from continuiti.benchmarks import SineRegular
+from continuiti.operators import DeepNeuralOperator

config = RunConfig(
benchmark_factory=SineRegular,
4 changes: 2 additions & 2 deletions docs/benchmarks/index.md
@@ -4,14 +4,14 @@ of different operator architectures on various problems.
The benchmarks are implemented in the `benchmarks` directory and we refer to
this directory for detailed information on how the benchmarks are run.

-## [NavierStokes](../api/continuity/benchmarks/#continuity.benchmarks.NavierStokes)
+## [NavierStokes](../api/continuiti/benchmarks/#continuiti.benchmarks.NavierStokes)

Reference: _Li, Zongyi, et al. "Fourier neural operator for parametric partial
differential equations." arXiv preprint arXiv:2010.08895 (2020)_ _Table 1 ($\nu$ = 1e−5 T=20 N=1000)_

_reported for_ FNO-3D: __0.1893__ (rel. test error)

-[FourierNeuralOperator](../api/continuity/operators/#continuity.operators.FourierNeuralOperator):
+[FourierNeuralOperator](../api/continuiti/operators/#continuiti.operators.FourierNeuralOperator):
0.0185 (rel. train error) __0.1841__ (rel. test error)

<table>
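A note on the metric in the hunk above: "rel. error" in the FNO literature conventionally denotes the relative L2 error between the predicted and reference solutions. Whether continuiti's benchmark runner uses exactly this reduction is not visible in this diff, so the following is the customary definition rather than a confirmed implementation detail:

$$\text{rel. error} = \frac{\lVert v_\text{pred} - v_\text{true} \rVert_2}{\lVert v_\text{true} \rVert_2}$$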
2 changes: 1 addition & 1 deletion docs/examples/index.md
@@ -3,7 +3,7 @@ title: Examples
---

This is a collection of notebooks that showcase various applications of
-Continuity.
+continuiti.

::cards:: cols=2

4 changes: 2 additions & 2 deletions docs/getting-started/first-steps.md
@@ -7,7 +7,7 @@ alias:

# First Steps

-**Continuity** aims to implement recent advances in learning function operators,
+**continuiti** aims to implement recent advances in learning function operators,
i.e., mappings of (continuous) functions. If you are not familiar with the
concepts of operator learning, the page [[operators]] should introduce the
key concepts.
@@ -16,7 +16,7 @@ key concepts.

If you are familiar with the idea of operator learning (or just want to dive
right into code), you can start by browsing our examples illustrating
-Continuity's capabilities, either:
+continuiti's capabilities, either:

- On the documentation page under <a href="../../examples">Examples</a>.
- Locally, by starting a jupyter server and navigating to the `examples` directory.
12 changes: 6 additions & 6 deletions docs/getting-started/installation.md
@@ -1,20 +1,20 @@
---
-title: Installing Continuity
+title: Installing continuiti
alias:
name: installation
-text: Installing Continuity
+text: Installing continuiti
---

-# Installing Continuity
+# Installing continuiti

To install the latest development version use:
```
-git clone https://github.com/aai-institute/continuity.git
-cd continuity
+git clone https://github.com/aai-institute/continuiti.git
+cd continuiti
pip install -e .
```

## Dependencies

-Continuity requires Python>=3.9 and is built on top of
+**continuiti** requires Python>=3.9 and is built on top of
[PyTorch](https://pytorch.org/).
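After an editable install as in the hunk above, a quick sanity check is to import the package under its new name. This is a trivial sketch; it assumes only that `import continuiti` succeeds once this commit's rename is in place:

```python
# Run inside the environment where `pip install -e .` was executed.
import continuiti  # package name introduced by this commit

print(continuiti.__name__)  # prints "continuiti"
```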
10 changes: 5 additions & 5 deletions docs/index.md
@@ -4,16 +4,16 @@ title: Home

<div align="center">

<img alt="Continuity" src="https://aai-institute.github.io/continuity/img/icon.png" width="100">
<img alt="continuiti" src="https://aai-institute.github.io/continuiti/img/icon.png" width="100">

-<h1>Continuity</h1>
+<h1>continuiti</h1>

<i>Learning function operators with neural networks.</i>

</div>


-**Continuity** is a Python package for machine learning on function operators.
+**continuiti** is a Python package for machine learning on function operators.
It implements various neural operator architectures (e.g., DeepONets),
physics-informed loss functions to train based on PDEs, and a collection of
examples and benchmarks.
@@ -31,11 +31,11 @@ examples and benchmarks.

- title: Examples
content: >
-Some notebooks using Continuity
+Some notebooks using continuiti
url: examples/index.md

- title: Browse the API
content: Full class documentation
-url: api/continuity/index.md
+url: api/continuiti/index.md

::/cards::
14 changes: 7 additions & 7 deletions docs/operators/architectures.md
@@ -7,14 +7,14 @@ alias:

# Architectures

-Continuity implements the following neural operator architectures:
+**continuiti** implements the following neural operator architectures:

-- [DeepONet](../../api/continuity/operators/deeponet/)
-- [Fourier Neural Operator (FNO)](../../api/continuity/operators/fno/)
-- [BelNet](../../api/continuity/operators/belnet/)
+- [DeepONet](../../api/continuiti/operators/deeponet/)
+- [Fourier Neural Operator (FNO)](../../api/continuiti/operators/fno/)
+- [BelNet](../../api/continuiti/operators/belnet/)

-- [Deep Neural Operator (DNO)](../../api/continuity/operators/dno/)
-- [Convolutional Neural Network (CNN)](../../api/continuity/operators/cnn/)
-- a generic [NeuralOperator](../../api/continuity/operators/neuraloperator/) class
+- [Deep Neural Operator (DNO)](../../api/continuiti/operators/dno/)
+- [Convolutional Neural Network (CNN)](../../api/continuiti/operators/cnn/)
+- a generic [NeuralOperator](../../api/continuiti/operators/neuraloperator/) class

and more to come...
6 changes: 3 additions & 3 deletions docs/operators/index.md
@@ -55,7 +55,7 @@ We generally refer to such a neural network $G_\theta$ as a *neural operator*.

## Discretization

-In Continuity, we use the general approach of mapping function
+In continuiti, we use the general approach of mapping function
evaluations to represent both input and output functions $u$ and $v$ in
a discretized form.

@@ -100,6 +100,6 @@ applications to physics-informed training, super-resolution, and more.
See our <a href="../examples">Examples</a> section for more on this.

## Further Reading
-Follow our introduction to <a href="../examples/functions">Functions</a> in Continuity
+Follow our introduction to <a href="../examples/functions">Functions</a> in continuiti
and proceed with the <a href="../examples/training">Training</a> example to learn
-more about operator learning in Continuity.
+more about operator learning in continuiti.
12 changes: 6 additions & 6 deletions examples/fno.ipynb
@@ -7,7 +7,7 @@
"# Fourier Neural Operator (FNO)\n",
"\n",
"This example demonstrates the use of `FourierLayer` and and\n",
"`FourierNeuralOperator` in Continuity.\n",
"`FourierNeuralOperator` in continuiti.\n",
"\n",
"The FNO architecture was proposed in\n",
"[Z. Li et al., 2020](https://arxiv.org/abs/2010.08895) and gained a lot of\n",
@@ -45,11 +45,11 @@
"outputs": [],
"source": [
"import torch\n",
"from continuity.data import OperatorDataset\n",
"from continuity.discrete import RegularGridSampler\n",
"from continuity.operators import FourierNeuralOperator\n",
"from continuity.operators.fourierlayer import FourierLayer\n",
"from continuity.trainer import Trainer"
"from continuiti.data import OperatorDataset\n",
"from continuiti.discrete import RegularGridSampler\n",
"from continuiti.operators import FourierNeuralOperator\n",
"from continuiti.operators.fourierlayer import FourierLayer\n",
"from continuiti.trainer import Trainer"
]
},
{
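The notebook above imports `FourierLayer` and `FourierNeuralOperator`. For readers skimming this diff, the core mechanism of an FNO layer (Li et al., 2020) can be sketched in plain PyTorch — a generic illustration, not continuiti's `FourierLayer`; the class name, shapes, and mode count below are invented for the example:

```python
import torch
import torch.nn as nn


class SpectralConv1d(nn.Module):
    """FFT -> mix a few low-frequency modes with learned complex weights -> inverse FFT."""

    def __init__(self, channels: int, modes: int):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        self.weight = nn.Parameter(
            scale * torch.randn(channels, channels, modes, dtype=torch.cfloat)
        )

    def forward(self, u):
        # u: (batch, channels, num_points) -- function values on a regular grid
        u_hat = torch.fft.rfft(u)
        out_hat = torch.zeros_like(u_hat)
        # Mix channels for the lowest `modes` frequencies only.
        out_hat[..., : self.modes] = torch.einsum(
            "bim,iom->bom", u_hat[..., : self.modes], self.weight
        )
        return torch.fft.irfft(out_hat, n=u.shape[-1])  # back to physical space


u = torch.rand(8, 3, 64)                     # 8 samples, 3 channels, 64 grid points
layer = SpectralConv1d(channels=3, modes=8)
v = layer(u)                                 # same shape as u
```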