This is my final project for the Applied Stochastic Analysis (APMA 4990) course at Columbia University.
- Project Report (PDF)
- Code:
  - To run:
  - To read: View in nbviewer.
Inside optimization.py there are PyTorch implementations of both the stochastic gradient Langevin dynamics (SGLD) optimizer and the preconditioned SGLD (pSGLD) optimizer, based on the following papers:
- Li, Chen, Carlson, and Carin, 2016. Preconditioned Stochastic Gradient Langevin Dynamics for Deep Neural Networks. [Paper link]
- Welling and Teh, 2011. Bayesian Learning via Stochastic Gradient Langevin Dynamics. [Paper link]
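For readers unfamiliar with SGLD, the core update is a plain gradient step plus Gaussian noise whose variance matches the step size, so the iterates sample from the posterior rather than converge to a point estimate. Below is a minimal NumPy sketch of the Welling & Teh update rule (not the repository's PyTorch code; the function name `sgld_step` is illustrative):

```python
import numpy as np

def sgld_step(theta, grad, lr, rng):
    """One SGLD update (Welling & Teh, 2011):
        theta <- theta - (lr / 2) * grad U(theta) + N(0, lr)
    where U is the negative log posterior and grad is its
    (possibly stochastic, minibatch) gradient at theta.
    """
    noise = rng.normal(scale=np.sqrt(lr), size=theta.shape)
    return theta - 0.5 * lr * grad + noise

# Toy example: sample from a standard 1-D Gaussian target,
# for which grad U(theta) = theta.
rng = np.random.default_rng(0)
theta = np.array([5.0])
samples = []
for _ in range(20000):
    theta = sgld_step(theta, grad=theta, lr=0.1, rng=rng)
    samples.append(theta[0])

# Discard burn-in; the remaining iterates approximate N(0, 1).
chain = np.array(samples[5000:])
mean_est, var_est = chain.mean(), chain.var()
```

The preconditioned variant (Li et al., 2016) rescales both the gradient step and the injected noise by an adaptive diagonal preconditioner, RMSProp-style, which is what the pSGLD implementation in optimization.py adds on top of this rule.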
This repository is structured as a Python package. The results in the project report were produced with the model implemented in lit_modules.py.