Inducing Point Operator Transformer: A Flexible and Scalable Architecture for Solving PDEs (AAAI 2024)
Implementation of Inducing Point Operator Transformer: A Flexible and Scalable Architecture for Solving PDEs, accepted at AAAI 2024. If you have any questions, please contact [email protected] or [email protected].

Abstract

Solving partial differential equations (PDEs) by learning the solution operators has emerged as an attractive alternative to traditional numerical methods. However, implementing such architectures presents two main challenges: flexibility in handling irregular and arbitrary input and output formats, and scalability to large discretizations. Most existing architectures are limited to a prescribed structure or are infeasible to scale to large inputs and outputs. To address these issues, we introduce an attention-based model called the inducing point operator transformer (IPOT). Inspired by inducing-point methods, IPOT is designed to handle any input function and output query while capturing global interactions in a computationally efficient way. By detaching the input/output discretizations from the processor with a smaller latent bottleneck, IPOT offers flexibility in processing arbitrary discretizations and scales linearly with the size of inputs/outputs. Our experimental results demonstrate that IPOT achieves strong performance with manageable computational complexity on an extensive range of PDE benchmarks and real-world weather forecasting scenarios, compared to state-of-the-art methods.
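To make the latent-bottleneck idea concrete, here is a minimal NumPy sketch of the encode-process-decode pattern the abstract describes: a small set of learned inducing tokens cross-attends to an arbitrary input discretization, is processed internally, and is then read out at arbitrary query points. All names, sizes, and the single-head, unprojected attention are illustrative assumptions, not the actual IPOT implementation (which uses learned projections and multiple layers).

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # q: (M, d), k/v: (N, d) -> (M, d); cost is O(M * N)
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

rng = np.random.default_rng(0)
d = 16                    # latent width (illustrative)
M = 8                     # number of inducing points: the latent bottleneck
N_in, N_out = 200, 300    # arbitrary, possibly irregular input/output sizes

Z = rng.standard_normal((M, d))      # learned inducing tokens (random here)
X = rng.standard_normal((N_in, d))   # embedded input samples at N_in locations
Q = rng.standard_normal((N_out, d))  # embedded output query coordinates

# Encoder: compress arbitrary inputs into the bottleneck, O(M * N_in)
Z = attention(Z, X, X)
# Processor: self-attention among inducing tokens only, O(M^2),
# independent of both input and output discretizations
Z = attention(Z, Z, Z)
# Decoder: read out at arbitrary query points, O(M * N_out)
U = attention(Q, Z, Z)

print(U.shape)  # (300, 16)
```

Because every attention step involves the M inducing tokens on one side, the total cost grows linearly in N_in and N_out rather than quadratically, which is the scalability property the abstract claims.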

Citation

@inproceedings{lee2024ipot,
  title={Inducing Point Operator Transformer: A Flexible and Scalable Architecture for Solving PDEs},
  author={Seungjun Lee and Taeil Oh},
  booktitle={The 38th AAAI Conference on Artificial Intelligence},
  year={2024}
}
@inproceedings{lee2022meshindependent,
  title={Mesh-Independent Operator Learning for Partial Differential Equations},
  author={Seungjun Lee},
  booktitle={ICML 2022 2nd AI for Science Workshop},
  year={2022},
  url={https://openreview.net/forum?id=JUtZG8-2vGp}
}
