continuiti

Learning function operators with neural networks.


continuiti is a Python package for deep learning on function operators with a focus on elegance and generality. It provides a unified interface for neural operators (such as DeepONet or FNO) so they can be used in a plug-and-play fashion. As operator learning is particularly useful in scientific machine learning, continuiti also includes physics-informed loss functions and a collection of relevant benchmarks.

Installation

Install the package using pip:

pip install continuiti

Or install the latest development version from the repository:

git clone https://github.com/aai-institute/continuiti.git
cd continuiti
pip install -e ".[dev]"

Usage

Our Documentation contains tutorials on learning operators with continuiti, how-to guides for solving specific problems, additional background material, and the class documentation.

In general, the operator syntax in continuiti is

v = operator(x, u(x), y)

mapping a function u (evaluated at x) to a function v (evaluated at y).
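
The minimal sketch below only illustrates this (x, u, y) → v calling convention with plain PyTorch tensors. The ToyOperator module, tensor shapes, and dimensions are illustrative assumptions for this sketch and are not continuiti's actual classes or shape conventions; see the tutorials in the documentation for real usage.

```python
import torch

class ToyOperator(torch.nn.Module):
    """Hypothetical operator illustrating the (x, u, y) -> v calling convention."""

    def __init__(self, width: int = 32):
        super().__init__()
        # Encode per-sensor pairs (x_i, u(x_i)) into a latent code,
        # then decode that code at the evaluation points y_j.
        self.encoder = torch.nn.Linear(2, width)
        self.decoder = torch.nn.Linear(width + 1, 1)

    def forward(self, x, u, y):
        # Assumed shapes: x, u -> (batch, num_sensors, 1); y -> (batch, num_evaluations, 1)
        code = self.encoder(torch.cat([x, u], dim=-1)).mean(dim=1)   # (batch, width)
        code = code.unsqueeze(1).expand(-1, y.shape[1], -1)          # broadcast to query points
        return self.decoder(torch.cat([code, y], dim=-1))            # v: (batch, num_evaluations, 1)

operator = ToyOperator()
x = torch.rand(8, 16, 1)   # sensor positions
u = torch.sin(x)           # input function values u(x) at the sensors
y = torch.rand(8, 32, 1)   # points where the output function is evaluated
v = operator(x, u, y)      # v(y), shape (8, 32, 1)
```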

Examples

Contributing

Contributions are welcome from anyone in the form of pull requests, bug reports, and feature requests. If you find a bug or have a feature request, please open an issue on GitHub. If you want to contribute code, please fork the repository and submit a pull request. See CONTRIBUTING.md for details on local development.

License

This project is licensed under the GNU LGPLv3 - see the LICENSE file for details.