Machine Learning for the Earth Sciences

Gradient Boosting with XGBoost


In this tutorial notebook, we will work through the process of fitting XGBoost models to meteorological data from the SHEBA campaign in order to predict surface turbulent fluxes of sensible heat over Arctic sea ice. This application to polar turbulent fluxes, inspired by the work of Cummins et al. (2023) and Cummins et al. (2024), is an example of a parametrization problem that is hard to solve with traditional physics, and it illustrates how modern boosting methods allow us to obtain performant models with relatively little effort.
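As a flavour of the workflow covered in the notebook, the sketch below fits an XGBoost regressor to a small synthetic dataset. The predictor names, model settings, and data are illustrative placeholders only; they are not taken from the notebook or the SHEBA dataset.

```python
# Minimal sketch (not code from the notebook): fit an XGBoost regressor to a
# synthetic regression problem standing in for the SHEBA flux data.
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)
n_samples = 1000

# Hypothetical predictors (e.g. wind speed, air-surface temperature
# difference, humidity); these stand in for the real meteorological inputs.
X = rng.normal(size=(n_samples, 3))
# Synthetic target with an interaction term plus noise.
y = 2.0 * X[:, 0] * X[:, 1] + 0.1 * rng.normal(size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit a gradient-boosted tree ensemble with typical starting hyperparameters.
model = xgb.XGBRegressor(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X_train, y_train)

# Evaluate on held-out data.
rmse = mean_squared_error(y_test, model.predict(X_test)) ** 0.5
print(f"Test RMSE: {rmse:.3f}")
```

In the notebook itself, the same pattern is applied to real observations, with hyperparameters tuned rather than fixed by hand.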

Notebook Prerequisites

Readers are assumed to have basic familiarity with the programming language Python, and to have already completed the tutorial notebook on Random Forests from the Leeds Institute for Fluid Dynamics (LIFD).

Running Locally

If you're already familiar with Git, Anaconda and virtual environments, the environment you need is defined in XGB.yml, and the commands below will create and activate it, then launch the notebook. The XGB.yml file has been tested on the Windows 11 operating system.

git clone git@github.com:cemac/XGB-notebook.git
cd XGB-notebook
conda env create -f XGB.yml
conda activate XGB
jupyter-notebook

Licence information


XGBoost-notebook by CEMAC is licensed under a Creative Commons Attribution 4.0 International License.