A curated list of awesome Scientific Machine Learning (SciML) papers, resources and software
- Neural Ordinary Differential Equations
pub
arxiv
Chen, Ricky TQ, Yulia Rubanova, Jesse Bettencourt, and David K. Duvenaud
Advances in Neural Information Processing Systems 31 (2018).
- Universal Differential Equations for Scientific Machine Learning
arxiv
code
Rackauckas, Christopher, Yingbo Ma, Julius Martensen, Collin Warner, Kirill Zubov, Rohit Supekar, Dominic Skinner, Ali Ramadhan, and Alan Edelman
arXiv preprint arXiv:2001.04385 (2020).
- Stiff neural ordinary differential equations
pub
arxiv
code
Kim, Suyong, Weiqi Ji, Sili Deng, Yingbo Ma, and Christopher Rackauckas
Chaos: An Interdisciplinary Journal of Nonlinear Science 31, no. 9 (2021): 093122.
- Hamiltonian neural networks
pub
arxiv
code
Greydanus, Samuel, Misko Dzamba, and Jason Yosinski
Advances in Neural Information Processing Systems 32 (2019).
- Augmented neural ODEs
pub
arxiv
code
Dupont, Emilien, Arnaud Doucet, and Yee Whye Teh
Advances in Neural Information Processing Systems 32 (2019).
- Fourier neural operator for parametric partial differential equations
arxiv
Li, Zongyi, Nikola Kovachki, Kamyar Azizzadenesheli, Burigede Liu, Kaushik Bhattacharya, Andrew Stuart, and Anima Anandkumar
arXiv preprint arXiv:2010.08895 (2020).
- GRU-ODE-Bayes: Continuous modeling of sporadically-observed time series
pub
arxiv
code
De Brouwer, Edward, Jaak Simm, Adam Arany, and Yves Moreau.
Advances in Neural Information Processing Systems 32 (2019).
- Learning long-term dependencies in irregularly-sampled time series
arxiv
code
Lechner, Mathias, and Ramin Hasani
arXiv preprint arXiv:2006.04418 (2020).
- Neural controlled differential equations for irregular time series
pub
arxiv
code
Kidger, Patrick, James Morrill, James Foster, and Terry Lyons
Advances in Neural Information Processing Systems 33 (2020): 6696-6707.
- Neural stochastic differential equations: Deep latent Gaussian models in the diffusion limit
arxiv
Tzen, Belinda, and Maxim Raginsky
arXiv preprint arXiv:1905.09883 (2019).
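The continuous-depth idea shared by the neural differential equation papers above can be sketched in a few lines: instead of stacking discrete layers, a small network parameterizes the vector field dz/dt = f_θ(z), and a numerical ODE solver maps the input to the output. A minimal untrained sketch with NumPy and a fixed-step RK4 integrator (the weights, sizes, and integrator choice here are illustrative assumptions, not taken from any of the papers):

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny MLP vector field f_theta(z): R^2 -> R^2 (random, untrained weights).
W1, b1 = rng.normal(size=(16, 2)), rng.normal(size=16)
W2, b2 = rng.normal(size=(2, 16)) / 16.0, np.zeros(2)

def f(z):
    return W2 @ np.tanh(W1 @ z + b1) + b2

def odeint_rk4(f, z0, t0=0.0, t1=1.0, steps=100):
    """Fixed-step RK4: integrates dz/dt = f(z) from t0 to t1."""
    z, h = z0, (t1 - t0) / steps
    for _ in range(steps):
        k1 = f(z)
        k2 = f(z + 0.5 * h * k1)
        k3 = f(z + 0.5 * h * k2)
        k4 = f(z + h * k3)
        z = z + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return z

z0 = np.array([1.0, -1.0])   # "input" to the continuous-depth model
z1 = odeint_rk4(f, z0)       # "output" = the state at t = 1
```

In the actual papers the solver is differentiated through (or the adjoint ODE is solved backwards) so that the weights of `f` can be trained; libraries such as torchdiffeq and DiffEqFlux.jl provide that machinery.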
- Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
pub
code
Raissi, Maziar, Paris Perdikaris, and George E. Karniadakis
Journal of Computational Physics 378 (2019): 686-707.
- Artificial neural networks for solving ordinary and partial differential equations
pub
arxiv
Lagaris, Isaac E., Aristidis Likas, and Dimitrios I. Fotiadis
IEEE Transactions on Neural Networks 9, no. 5 (1998): 987-1000.
- fPINNs: Fractional physics-informed neural networks
pub
arxiv
Pang, Guofei, Lu Lu, and George Em Karniadakis
SIAM Journal on Scientific Computing 41, no. 4 (2019): A2603-A2626.
- Quantifying total uncertainty in physics-informed neural networks for solving forward and inverse stochastic problems
pub
arxiv
Zhang, Dongkun, Lu Lu, Ling Guo, and George Em Karniadakis
Journal of Computational Physics 397 (2019): 108850.
- Physics-informed neural networks with hard constraints for inverse design
pub
arxiv
code
Lu, Lu, Raphael Pestourie, Wenjie Yao, Zhicheng Wang, Francesc Verdugo, and Steven G. Johnson
SIAM Journal on Scientific Computing 43, no. 6 (2021): B1105-B1132.
- On the eigenvector bias of Fourier feature networks: From regression to solving multi-scale PDEs with physics-informed neural networks
pub
arxiv
code
Wang, Sifan, Hanwen Wang, and Paris Perdikaris
Computer Methods in Applied Mechanics and Engineering 384 (2021): 113938.
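The physics-informed approach in the papers above (and the Lagaris trial-function method it builds on) can be illustrated without any deep learning library: write the unknown solution as a trial function that satisfies the boundary condition by construction, then fit the free parameters so the differential-equation residual vanishes at collocation points. The sketch below solves u' + u = 0, u(0) = 1 using random tanh features whose output weights are obtained by linear least squares; this linear-in-parameters shortcut, and all sizes and distributions, are simplifying assumptions (the papers train full networks by gradient descent):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random tanh features: phi_j(x) = tanh(w_j * x + b_j).
n_feat = 30
w = rng.normal(scale=2.0, size=n_feat)
b = rng.normal(scale=2.0, size=n_feat)

x = np.linspace(0.0, 1.0, 100)[:, None]   # collocation points on [0, 1]
phi = np.tanh(w * x + b)                  # feature matrix, shape (100, n_feat)
dphi = w * (1.0 - phi**2)                 # d/dx tanh(wx+b) = w * sech^2(wx+b)

# Trial solution u(x) = 1 + x * sum_j a_j phi_j(x) satisfies u(0) = 1 exactly.
# The residual of u' + u = 0 is linear in the coefficients a:
#   r(x) = 1 + sum_j a_j * (phi_j + x*dphi_j + x*phi_j)
M = phi + x * dphi + x * phi
a = np.linalg.lstsq(M, -np.ones(len(x)), rcond=None)[0]

def u(xq):
    xq = np.atleast_1d(xq)[:, None]
    return 1.0 + (xq * np.tanh(w * xq + b)) @ a

# The exact solution is exp(-x), so u(1.0) should be close to exp(-1).
```

The PINN papers generalize exactly this residual-minimization loss to nonlinear PDEs, soft boundary penalties, and inverse problems, with automatic differentiation supplying the derivatives of the network.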
- DeepONet: Learning nonlinear operators for identifying differential equations based on the universal approximation theorem of operators
arxiv
Lu, Lu, Pengzhan Jin, and George Em Karniadakis
arXiv preprint arXiv:1910.03193 (2019).
- Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators
pub
code
Lu, Lu, Pengzhan Jin, Guofei Pang, Zhongqiang Zhang, and George Em Karniadakis
Nature Machine Intelligence 3, no. 3 (2021): 218-229.
- Learning the solution operator of parametric partial differential equations with physics-informed DeepONets
pub
code
Wang, Sifan, Hanwen Wang, and Paris Perdikaris
Science Advances 7, no. 40 (2021): eabi8605.
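The core building block of the Fourier neural operator listed above is easy to sketch: transform the input function to the frequency domain, apply a learned linear map to the lowest modes, truncate the rest, and transform back. A minimal untrained 1-D version in NumPy (the grid size, mode cutoff, single channel, and random weights are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

n_grid, n_modes = 64, 8   # grid resolution and number of retained Fourier modes

# Learned (here: random, untrained) complex weights, one per retained mode.
weights = rng.normal(size=n_modes) + 1j * rng.normal(size=n_modes)

def spectral_conv_1d(u):
    """One FNO-style spectral convolution on a real signal sampled on a grid."""
    u_hat = np.fft.rfft(u)                          # to the frequency domain
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = weights * u_hat[:n_modes]   # mix low modes, drop the rest
    return np.fft.irfft(out_hat, n=len(u))          # back to physical space

x = np.linspace(0, 2 * np.pi, n_grid, endpoint=False)
v = spectral_conv_1d(np.sin(3 * x))   # apply the layer to a sample input function
```

Because the layer acts on Fourier coefficients rather than grid values, the same weights can be applied at any resolution, which is what makes the FNO discretization-invariant; the full model interleaves such layers with pointwise linear maps and nonlinearities.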
- Discovering governing equations from data by sparse identification of nonlinear dynamical systems
pub
arxiv
Brunton, Steven L., Joshua L. Proctor, and J. Nathan Kutz.
Proceedings of the National Academy of Sciences 113, no. 15 (2016): 3932-3937.
- Ensemble-SINDy: Robust sparse model discovery in the low-data, high-noise limit, with active learning and control
pub
arxiv
Fasel, Urban, J. Nathan Kutz, Bingni W. Brunton, and Steven L. Brunton.
Proceedings of the Royal Society A 478, no. 2260 (2022): 20210904.
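The SINDy idea from the two papers above can be sketched directly: build a library of candidate terms evaluated on the trajectory data, then run sequentially thresholded least squares (STLSQ) so that only a few terms survive. The toy below recovers dx/dt = -2x from clean data (the library choice, threshold, and use of exact derivatives are simplifying assumptions; the papers work with numerically differentiated, noisy measurements):

```python
import numpy as np

# Trajectory data for dx/dt = -2x: x(t) = exp(-2t), with exact derivatives.
t = np.linspace(0.0, 2.0, 200)
x = np.exp(-2.0 * t)
dxdt = -2.0 * x

# Candidate library Theta(x) = [1, x, x^2].
theta = np.column_stack([np.ones_like(x), x, x**2])

def stlsq(theta, dxdt, threshold=0.1, n_iter=10):
    """Sequentially thresholded least squares: the core SINDy regression."""
    xi = np.linalg.lstsq(theta, dxdt, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(xi) < threshold     # prune small coefficients...
        xi[small] = 0.0
        big = ~small
        if big.any():                      # ...and refit only the survivors
            xi[big] = np.linalg.lstsq(theta[:, big], dxdt, rcond=None)[0]
    return xi

xi = stlsq(theta, dxdt)
# The sparse model comes out as dx/dt ≈ -2*x, i.e. coefficients ≈ [0, -2, 0].
```

pysindy (listed in the software section) packages this regression together with numerical differentiation, richer libraries, and the ensemble variants of the second paper.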
Julia has an entire organization dedicated to Scientific Machine Learning (SciML), which provides a broad suite of SciML software. The following list of its packages is non-comprehensive; for a full description please check out their website.
- DifferentialEquations.jl
code
docs
Differential equation solvers.
- DiffEqFlux.jl
code
docs
Interface to build all varieties of neural differential equations.
- NeuralPDE.jl
code
docs
Interface to build Physics Informed Neural Networks.
PyTorch based
- torchdiffeq
code
Differential equation solvers.
- torchdyn
code
docs
Library for neural differential equations and implicit models.
- NeuroMANCER
code
docs
Neural Modules with Adaptive Nonlinear Constraints and Efficient Regularizations.
- IDRLnet
code
docs
Machine learning library that solves both forward and inverse differential equations via physics-informed neural networks (PINNs).
JAX based
TensorFlow based
- SciANN
code
docs
Neural Networks for Scientific Computations.
- NeuralUQ
code
A comprehensive library for uncertainty quantification in neural differential equations and operators.
- pysindy
code
docs
Sparse Identification of Nonlinear Dynamical Systems (SINDy).
- DeepXDE
code
docs
Multi-platform (PyTorch, JAX, TensorFlow) library for scientific machine learning and physics-informed learning.
- Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control. 2nd ed.
pub
Brunton, Steven L., and J. Nathan Kutz.
Cambridge: Cambridge University Press, 2022.
- Parallel Computing and Scientific Machine Learning (SciML): Methods and Applications
site
Chris Rackauckas
- Physics Informed Machine Learning
profile
From the channel description: This channel hosts videos from workshops at UW on Data-Driven Science and Engineering, and Physics Informed Machine Learning.
- Parallel Computing and Scientific Machine Learning
profile
Lecture videos for MIT's 18.337J/6.338J: Parallel Computing and Scientific Machine Learning course of Fall 2020 and Spring 2021.
- Steve Brunton
profile
Lectures ranging from Linear Algebra basics to SciML theory and applications.
- Nathan Kutz
profile
From the channel description: A YouTube channel for Applied and Computational Mathematics techniques, from full graduate level courses to tutorials on emerging methods.
- Crunch Group
profile
From the channel description: This channel puts all the seminars that are weekly held at the CRUNCH Group, Division of Applied Mathematics, Brown University, USA. This group is the home of PINNs, DeepONet and much more!!!
- SciMLCon 2022
list
- Machine Learning for Physics and the Physics of Learning 2019
list
- Data-driven Physical Simulations (DDPS) Seminar Series
list
- ETH Zürich | Deep Learning in Scientific Computing 2023
list
- The Symbiosis of Deep Learning and Differential Equations - NeurIPS 2022 Workshop
page
Online
2022/12/14
- SciMLCon 2022
page
Online
2022/03/23
- The Symbiosis of Deep Learning and Differential Equations - NeurIPS 2021 Workshop
page
Online
2021/12/14
- Scientific Machine Learning - Workshop
page
ICERM, Brown University
2019/01/28-2019/01/30
Contributions are very welcome and encouraged! Please open a pull request indicating the type of contribution ([PAPER] for paper entries, [SOFTWARE] for software entries, and so on), the entry itself, and a few lines on why the contribution belongs in Awesome Scientific Machine Learning. A few guidelines for each entry type are listed below; please make sure to follow them to ensure a quick merge and an easier experience altogether. Be aware that the guidelines themselves are subject to change: if you have ideas on how to improve the repo through them, make sure to open a PR and tag it [GUIDELINES].
All of the guidelines given here are based on elements and resources already indexed, so if there is any confusion about them please check the raw version of the README.md before opening a PR.
Papers should have title, author list and publication venue separated by break elements. The authors' names are to be fully spelled out (following the Chicago entry on Google Scholar) and the publication venue is to be written in italics. Please provide the link to the source of the publication in [pub], and the arXiv abstract as [arxiv] when applicable. Additionally, the code used for the paper can be linked in [code] (this only applies when the code is linked in the paper itself or written by the author(s); for third-party implementations look at Papers with Code).
Software should have the full name of the library and has to be indexed under the appropriate language (Julia, Python, R...). If the language index for the software you want to add to the list is not present, please make sure to add it. Please provide the source of the code in [code] (it can be a GitHub link, GitLab link or your favourite version control host) and the link to the documentation in [docs]. In the next line, please provide a couple of sentences describing what the software does. Please refrain from linking general deep learning libraries; in most cases they should already be indexed under their language.
Videos should have the title, speaker and host separated by break elements. After the title, please provide the link to the video in [video] and the slides in [slides] if applicable. Entire channels dedicated to SciML can be linked by providing the source in [profile]. Playlists of SciML videos should be linked as [list]. Please try to minimize the overlap between playlists and channels.
Events should be indexed in chronological order, with the latest on top. Link to the main page in [page]. Please be mindful of the distinction between past and future events. Name of the event, location and date(s) should be separated by break elements. Dates follow the format YYYY/MM/DD. If the event spans multiple days, please indicate the starting and ending days separated by the symbol -. Example: YYYY/MM/DD-YYYY/MM/DD.
An event can be added to the future section without a date or location as long as there is a web page describing the event. The missing date or location line should be filled in as TBD. When the date is known, please make sure to update the entry.