diff --git a/08_intro_to_Numpyro.ipynb b/08_intro_to_Numpyro.ipynb
index c6d9f0c..f901480 100644
--- a/08_intro_to_Numpyro.ipynb
+++ b/08_intro_to_Numpyro.ipynb
@@ -8,28 +8,7 @@
"source": [
"# Introduction to NumPyro\n",
"\n",
- "Probabilistic programming is a powerful approach to modeling and inference in machine learning and statistics. It allows us to build models that incorporate uncertainty and make probabilistic predictions. NumPyro is a probabilistic programming library that combines the flexibility of NumPy with the probabilistic modeling capabilities of Pyro, making it an excellent choice for researchers and data scientists. In this introductory tutorial, we'll explore the basics of NumPyro and how to get started with probabilistic programming.\n",
- "\n",
- "## Prerequisites"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 51,
- "metadata": {
- "id": "mhhWCsnWNO7Q"
- },
- "outputs": [],
- "source": [
- "#import matplotlib.pyplot as plt\n",
- "\n",
- "#import numpyro\n",
- "#import numpyro.distributions as dist\n",
- "#from numpyro.infer import MCMC, NUTS, P\n",
- "#import jax\n",
- "#import jax.numpy as jnp\n",
- "\n",
- "#import arviz as az"
+ "[NumPyro](https://num.pyro.ai/en/latest/index.html#) is a probabilistic programming library that combines the flexibility of NumPy with the probabilistic modeling capabilities of Pyro, making it an excellent choice for researchers and data scientists. In this introductory tutorial, we'll explore the basics of NumPyro and how to get started with probabilistic programming in a hands-on manner."
]
},
{
@@ -38,16 +17,18 @@
"id": "Vb4ILFuduHfK"
},
"source": [
- "In a NumPyro program, you define a probabilistic model that consists of various elements. Let's break down the key elements of a typical NumPyro program:\n",
+ "## Components of a Numpyro programme\n",
"\n",
- "1. Importing Libraries:\n",
+ "In a NumPyro program, you define a probabilistic model that consists of various elements. Let's break down the key elements of a typical program unsing NumPyro:\n",
+ "\n",
+ "1. Importing Libraries:\n",
"\n",
"At the beginning of your NumPyro program, you import the necessary libraries, including NumPyro and other required dependencies like JAX and Pyro if applicable. For example:"
]
},
{
"cell_type": "code",
- "execution_count": 52,
+ "execution_count": 7,
"metadata": {
"id": "NZ7-DnBcuM_7"
},
@@ -73,61 +54,68 @@
"id": "25oL3ubXuWZg"
},
"source": [
- "2. Defining the Model Function:\n",
+ "2. Defining the Model Function:\n",
"\n",
"In NumPyro, you define your probabilistic model as a Python function. This function encapsulates the entire model, including both the prior distributions and the likelihood. Typically, the model function takes one or more arguments, such as data or model parameters, and returns a set of latent variables and observations."
]
},
{
"cell_type": "code",
- "execution_count": 53,
+ "execution_count": 8,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def model():\n",
+ " pass"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "3. Prior Distributions:\n",
+ "\n",
+ "- Inside the model function, you define prior distributions for the model parameters. These prior distributions represent your beliefs about the parameters before observing any data. You use the `numpyro.sample` function to specify these priors. In the example above, `mean` and `scale` are defined as random variables sampled from specific prior distributions.\n",
+ "\n",
+ "4. Likelihood:\n",
+ "\n",
+ "- After specifying the prior distributions, you define the likelihood of your observed data by adding `obs=data` to the sampling statement of a variable. The likelihood represents the probability distribution of your observed data given the model parameters. It describes how likely it is to observe the data under different parameter values. In the example, the numpyro.sample function is used to define the likelihood of the data points given the mean and scale parameters."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 23,
"metadata": {
"id": "z_AsEI8SuSAu"
},
"outputs": [],
"source": [
"def model(data):\n",
+ " \n",
" # Define prior distributions for model parameters\n",
" mean = numpyro.sample(\"mean\", dist.Normal(0, 1))\n",
" scale = numpyro.sample(\"scale\", dist.Exponential(1))\n",
"\n",
" # Define likelihood\n",
- " with numpyro.plate(\"data_plate\", len(data)):\n",
- " numpyro.sample(\"obs\", dist.Normal(mean, scale), obs=data)\n"
+ " numpyro.sample(\"obs\", dist.Normal(mean, scale), obs=data)\n"
]
},
{
"cell_type": "markdown",
- "metadata": {
- "id": "-LqThfWVudNN"
- },
+ "metadata": {},
"source": [
- "3. Prior Distributions:\n",
+ "5. Inference Algorithm:\n",
"\n",
- "- Inside the model function, you define prior distributions for the model parameters. These prior distributions represent your beliefs about the parameters before observing any data. You use the `numpyro.sample` function to specify these priors. In the example above, `mean` and `scale` are defined as random variables sampled from specific prior distributions.\n",
- "\n",
- "\n",
- "4. Likelihood:\n",
- "\n",
- "- After specifying the prior distributions, you define the likelihood of your observed data. The likelihood represents the probability distribution of your observed data given the model parameters. It describes how likely it is to observe the data under different parameter values. In the example, the numpyro.sample function is used to define the likelihood of the data points given the mean and scale parameters.\n",
- "\n",
- "5. Plate for Repetition:\n",
- "- In Bayesian modeling, you often work with multiple data points that share the same statistical structure. The `numpyro.plate` context manager allows you to create a plate, which represents a repeated structure for data. It's used to efficiently handle repeated observations. In the example, `numpyro.plate` is used to specify that the likelihood applies to multiple data points.\n",
- "\n",
- "6. Inference Algorithm:\n",
- "\n",
- "- After defining your model, you need to choose an inference algorithm to estimate the posterior distribution of model parameters. NumPyro supports various inference algorithms, including NUTS (No-U-Turn Sampler) and SVI (Stochastic Variational Inference). You initialize and configure the chosen inference algorithm according to your requirements.\n"
+ "- After defining your model, you need to choose an inference algorithm to estimate the posterior distribution of model parameters. NumPyro supports various inference algorithms, including NUTS (No-U-Turn Sampler) and SVI (Stochastic Variational Inference). You initialize and configure the chosen inference algorithm according to your requirements."
]
},
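+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "In this tutorial we will use NUTS, which is configured in the next cell. For reference, a variational (SVI) setup might look roughly like the following sketch, which assumes NumPyro's `AutoNormal` guide and `Adam` optimizer; it is not used in the rest of this tutorial.\n",
+    "\n",
+    "```python\n",
+    "from numpyro.infer import SVI, Trace_ELBO\n",
+    "from numpyro.infer.autoguide import AutoNormal\n",
+    "import numpyro.optim as optim\n",
+    "\n",
+    "guide = AutoNormal(model)                                 # automatic Gaussian approximation to the posterior\n",
+    "svi = SVI(model, guide, optim.Adam(0.01), Trace_ELBO())   # stochastic variational inference\n",
+    "svi_result = svi.run(jax.random.PRNGKey(1), 2000, data)   # 2000 optimization steps; returns fitted parameters and losses\n",
+    "```"
+   ]
+  },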
{
"cell_type": "code",
- "execution_count": 54,
- "metadata": {
- "id": "Qz6nkjdGua0c"
- },
+ "execution_count": 25,
+ "metadata": {},
"outputs": [],
"source": [
- "nuts_kernel = NUTS(model)\n"
+ "nuts_kernel = NUTS(model)"
]
},
{
@@ -136,14 +124,14 @@
"id": "8C2V-_BdvEwc"
},
"source": [
- "8. Performing Inference:\n",
+ "6. Performing Inference:\n",
"\n",
"- You use the configured inference algorithm to perform Bayesian inference. In the example, MCMC (Markov Chain Monte Carlo) inference is performed using the `MCMC` class. The `run` method of the `MCMC` object is called to run the inference process."
]
},
{
"cell_type": "code",
- "execution_count": 55,
+ "execution_count": 26,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
@@ -156,25 +144,24 @@
"name": "stderr",
"output_type": "stream",
"text": [
- "/var/folders/q3/n2z18__9281b8xfhctcpxfsr0000gn/T/ipykernel_88949/317822218.py:5: UserWarning: There are not enough devices to run parallel chains: expected 2 but got 1. Chains will be drawn sequentially. If you are running MCMC in CPU, consider using `numpyro.set_host_device_count(2)` at the beginning of your program. You can double-check how many devices are available in your system using `jax.local_device_count()`.\n",
- " mcmc = MCMC(nuts_kernel, num_samples=1000, num_warmup=1000, num_chains=2)\n"
+ "/var/folders/q3/n2z18__9281b8xfhctcpxfsr0000gn/T/ipykernel_75200/1258007880.py:4: UserWarning: There are not enough devices to run parallel chains: expected 2 but got 1. Chains will be drawn sequentially. If you are running MCMC in CPU, consider using `numpyro.set_host_device_count(2)` at the beginning of your program. You can double-check how many devices are available in your system using `jax.local_device_count()`.\n",
+ " mcmc = MCMC(nuts_kernel, num_samples=1000, num_warmup=1000, num_chains=2, chain_method='parallel')\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
- "sample: 100%|██████████| 2000/2000 [00:03<00:00, 542.65it/s, 1 steps of size 5.86e-01. acc. prob=0.95] \n",
- "sample: 100%|██████████| 2000/2000 [00:01<00:00, 1831.55it/s, 3 steps of size 7.44e-01. acc. prob=0.89]\n"
+ "sample: 100%|██████████| 2000/2000 [00:02<00:00, 877.72it/s, 1 steps of size 5.86e-01. acc. prob=0.95] \n",
+ "sample: 100%|██████████| 2000/2000 [00:00<00:00, 3228.09it/s, 3 steps of size 7.44e-01. acc. prob=0.89]\n"
]
}
],
"source": [
- "# Simulated data\n",
+ "# data\n",
"data = jnp.array([2.3, 3.9, 1.7, -0.8, 2.5])\n",
"\n",
- "\n",
- "mcmc = MCMC(nuts_kernel, num_samples=1000, num_warmup=1000, num_chains=2)\n",
+ "mcmc = MCMC(nuts_kernel, num_samples=1000, num_warmup=1000, num_chains=2, chain_method='parallel')\n",
"mcmc.run(jax.random.PRNGKey(0), data)"
]
},
@@ -184,14 +171,14 @@
"id": "2DXdboTTvSNu"
},
"source": [
- "9. Posterior Analysis:\n",
+ "9. Posterior Analysis:\n",
"\n",
"- After running the inference, you can retrieve posterior samples of the model parameters. These samples represent the estimated posterior distribution of the parameters given the observed data. You can then analyze these samples to make inferences about your model."
]
},
{
"cell_type": "code",
- "execution_count": 56,
+ "execution_count": 27,
"metadata": {
"id": "wL3YlwkovP6C"
},
@@ -223,14 +210,14 @@
"id": "-n-z22VnvbRV"
},
"source": [
- "10. Visualization and Inference:\n",
+ "10. Visualization inference results\n",
"\n",
"- Finally, you can perform various tasks such as visualizing the posterior distributions, computing summary statistics, and making predictions or inferences based on the posterior samples."
]
},
{
"cell_type": "code",
- "execution_count": 57,
+ "execution_count": 28,
"metadata": {},
"outputs": [
{
@@ -242,7 +229,7 @@
" ]], dtype=object)"
]
},
- "execution_count": 57,
+ "execution_count": 28,
"metadata": {},
"output_type": "execute_result"
},
@@ -258,7 +245,7 @@
}
],
"source": [
- "# Plot posterior distributions\n",
+ "# Viosualise posterior distributions and trace plots\n",
"az.plot_trace(mcmc)"
]
},
@@ -275,309 +262,100 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "## Summary\n",
- "\n",
- "The typical elements that we will need to write are model in Numpyro are as follows:\n",
- "\n",
- "- Parameters can be sampled using `numpyro.sample`\n",
- "- Parameters can be sampled from any of the available distributsions using, e.g. `dist.Beta(alpha, beta)` \n",
- "- Likelihood is constructed by adding `obs=...` to the sampling statement: `numpyro.sample('obs', dist.Binomial(n, p), obs=h)`\n",
- "- Once the model has been formulated, we need to specify\n",
- " - the sampling algorithm which we would like to use. NUTS is a good default oprtion: `kernel = NUTS(model)` ,\n",
- " - number of warm-up steps, number of iterations, number of chains, e.g. `MCMC(kernel, num_warmup=1000, num_samples=2000, num_chains=4)`,\n",
- " - using `Predictive` class we can generate predictions.\n",
- "\n",
- "\n",
- "In the example avobe\n",
- "\n",
- "- We define a simple probabilistic model with two parameters: `mean` and `scale`.\n",
- "- We specify priors for these parameters.\n",
- "- The data is assumed to be normally distributed with parameters `mean` and `scale`.\n",
- "- In this example, the likelihood is specified within the `numpyro.sample` statement inside the model function. NumPyro automatically evaluates the likelihood for the observed data points `(obs=data)` when performing MCMC inference.\n",
- "- We use the No-U-Turn Sampler (NUTS) to perform Markov Chain Monte Carlo (MCMC) inference.\n",
- "- Finally, we visualize the posterior distributions of the parameters.\n",
- "\n",
- "\n",
- "Let's build a model step by step."
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "## Coin tossing "
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "### Data"
+ "5. Plate for Repetition:\n",
+ "- In Bayesian modeling, you often work with multiple data points that share the same statistical structure. In case cases you might want to, or need to use plates. The `numpyro.plate` context manager allows you to create a plate, which represents a repeated structure for data. It's used to efficiently handle repeated observations. In the example, `numpyro.plate` is used to specify that the likelihood applies to multiple data points.\n"
]
},
{
"cell_type": "code",
- "execution_count": 58,
- "metadata": {},
- "outputs": [],
- "source": [
- "n = 100 # number of trials\n",
- "h = 61 # number of successes\n",
- "alpha = 2 # hyperparameters\n",
- "beta = 2\n",
- "\n",
- "niter = 1000"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "### Model"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 59,
- "metadata": {},
- "outputs": [],
- "source": [
- "def model(n, alpha=2, beta=2, h=None):\n",
- "\n",
- " # prior on the probability of success p\n",
- " p = numpyro.sample('p', dist.Beta(alpha, beta))\n",
- "\n",
- " # likelihood - notice the `obs=h` part\n",
- " # p is the probabiity of success,\n",
- " # n is the total number of experiments\n",
- " # h is the number of successes\n",
- " numpyro.sample('obs', dist.Binomial(n, p), obs=h)"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "### Prior Predictive check\n",
- "\n",
- "A prior predictive check is a method used in Bayesian statistics to assess the compatibility of a chosen prior distribution with the observed data by simulating data from the prior and comparing it to the actual data - an overlooked but golden tool!"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 60,
- "metadata": {},
- "outputs": [],
- "source": [
- "from jax import random\n",
- "\n",
- "rng_key = random.PRNGKey(0)\n",
- "rng_key, rng_key_ = random.split(rng_key)\n",
- "\n",
- "# use the Predictive class to generate predictions.\n",
- "# Notice that we are not passing observation `h` as data.\n",
- "# Since we have set `h=None`, this allows the model to make predictions of `h`\n",
- "# when data for it is not provided.\n",
- "prior_predictive = Predictive(model, num_samples=1000)\n",
- "prior_predictions = prior_predictive(rng_key_, n)"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 61,
+ "execution_count": 32,
"metadata": {},
"outputs": [
{
- "data": {
- "text/plain": [
- "dict_keys(['obs', 'p'])"
- ]
- },
- "execution_count": 61,
- "metadata": {},
- "output_type": "execute_result"
- }
- ],
- "source": [
- "# we have generated samples for two variables\n",
- "prior_predictions.keys()"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 62,
- "metadata": {},
- "outputs": [],
- "source": [
- "# extract samples for variable 'p'\n",
- "pred_obs = prior_predictions['p']\n",
- "\n",
- "# compute its summary statistics for the samples of `p`\n",
- "mean_prior_pred = jnp.mean(pred_obs, axis=0)\n",
- "hpdi_prior_pred = hpdi(pred_obs, 0.89)"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 63,
- "metadata": {},
- "outputs": [
+ "name": "stderr",
+ "output_type": "stream",
+ "text": [
+ "/var/folders/q3/n2z18__9281b8xfhctcpxfsr0000gn/T/ipykernel_75200/2363012817.py:17: UserWarning: There are not enough devices to run parallel chains: expected 2 but got 1. Chains will be drawn sequentially. If you are running MCMC in CPU, consider using `numpyro.set_host_device_count(2)` at the beginning of your program. You can double-check how many devices are available in your system using `jax.local_device_count()`.\n",
+ " mcmc = MCMC(nuts_kernel, num_samples=1000, num_warmup=1000, num_chains=2, chain_method='parallel')\n",
+ "sample: 100%|██████████| 2000/2000 [00:02<00:00, 899.15it/s, 1 steps of size 5.86e-01. acc. prob=0.95] \n",
+ "sample: 100%|██████████| 2000/2000 [00:00<00:00, 3431.52it/s, 3 steps of size 7.44e-01. acc. prob=0.89]"
+ ]
+ },
{
- "ename": "NameError",
- "evalue": "name 'gaussian_kde' is not defined",
- "output_type": "error",
- "traceback": [
- "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
- "\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)",
- "Cell \u001b[0;32mIn[63], line 5\u001b[0m\n\u001b[1;32m 3\u001b[0m ax\u001b[38;5;241m.\u001b[39mhist(pred_obs, bins\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m15\u001b[39m, density\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mTrue\u001b[39;00m, alpha\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m0.5\u001b[39m)\n\u001b[1;32m 4\u001b[0m x \u001b[38;5;241m=\u001b[39m jnp\u001b[38;5;241m.\u001b[39mlinspace(\u001b[38;5;241m0\u001b[39m, \u001b[38;5;241m1\u001b[39m, \u001b[38;5;241m3000\u001b[39m)\n\u001b[0;32m----> 5\u001b[0m kde \u001b[38;5;241m=\u001b[39m \u001b[43mgaussian_kde\u001b[49m(pred_obs)\n\u001b[1;32m 6\u001b[0m ax\u001b[38;5;241m.\u001b[39mplot(x, kde(x), color\u001b[38;5;241m=\u001b[39m\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mC0\u001b[39m\u001b[38;5;124m'\u001b[39m, lw\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m3\u001b[39m, alpha\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m0.5\u001b[39m)\n\u001b[1;32m 7\u001b[0m ax\u001b[38;5;241m.\u001b[39mset_xlabel(\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mp\u001b[39m\u001b[38;5;124m'\u001b[39m)\n",
- "\u001b[0;31mNameError\u001b[0m: name 'gaussian_kde' is not defined"
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "\n",
+ " mean std median 5.0% 95.0% n_eff r_hat\n",
+ " mean 1.17 0.69 1.19 0.04 2.27 885.10 1.01\n",
+ " scale 1.87 0.62 1.74 0.97 2.72 783.04 1.00\n",
+ "\n",
+ "Number of divergences: 0\n"
]
},
{
- "data": {
- "image/png": "iVBORw0KGgoAAAANSUhEUgAAAcAAAAESCAYAAABq0wVXAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/NK7nSAAAACXBIWXMAAA9hAAAPYQGoP6dpAAAeC0lEQVR4nO3df3BU1f3/8ddCIEHHXUQkCRIQ+EAgMMYYJD+Y4FAgGBSl05bMtEZwoDZTO4AZW4moiPZjSlXKbxQHzTBKiBgQHEMhzlcISEqHNGE6hSoobWLclAElG7AkAuf7h1/265If5C7sBnOej5n7xz059+R9j3FfnN1797qMMUYAAFimW2cXAABAZyAAAQBWIgABAFYiAAEAViIAAQBWIgABAFYiAAEAVoro7AKulYsXL+rLL7/UTTfdJJfL1dnlAAA6gTFGjY2N6t+/v7p1a3+N12UC8Msvv1RcXFxnlwEAuA7U1tZqwIAB7fbpMgF40003SfrupN1udydXAwDoDD6fT3Fxcf5MaE+XCcBLb3u63W4CEAAs15GPwrgIBgBgJQIQAGAlAhAAYCUCEABgJQIQAGAlAhAAYCUCEABgJQIQAGAlxzfCl5eX66WXXlJlZaW8Xq+2bt2q6dOnt9l/9+7dmjBhQov2I0eOaMSIEf79kpISPfPMM/rss880dOhQ/e///q9+/OMfOy0PwHXkT2WfhmzsxycPD9nYsIPjFeDZs2eVmJioVatWOTruk08+kdfr9W/Dhg3z/6yiokLZ2dnKycnRoUOHlJOToxkzZujAgQNOywMAoEMcrwCzsrKUlZXl+Bf169dPvXv3bvVny5Yt0+TJk5Wfny9Jys/P1549e7Rs2TIVFRU5/l0AAFxJ2D4DTEpKUmxsrCZOnKiPPvoo4GcVFRXKzMwMaJsyZYr279/f5nhNTU3y+XwBGwAAHRXyAIyNjdW6detUUlKiLVu2KD4+XhMnTlR5ebm/T319vaKjowOOi46OVn19fZvjFhQUyOPx+DcehQQAcCLkT4OIj49XfHy8fz8tLU21tbV6+eWXNX78eH/75d/cbYxp99u88/PzlZeX59+/9AgMAAA6olNug0hNTdXRo0f9+zExMS1WeydOnGixKvy+yMhI/6OPeAQSAMCpTgnAqqoqxcbG+vfT0tJUVlYW0GfXrl1KT08Pd2kAAEs4fgv0zJkzOnbsmH//+PHjqq6uVp8+fTRw4EDl5+errq5OGzZskPTdFZ633367Ro0apebmZr311lsqKSlRSUmJf4x58+Zp/PjxWrJkiR588EFt27ZNH374ofbt23cNThEAgJYcB+DBgwcDbmy/9DnczJkzVVhYKK/Xq5qaGv/Pm5ub9cQTT6iurk69evXSqFGj9MEHH2jq1Kn+Punp6dq0aZOefvppPfPMMxo6dKiKi4uVkpJyNecGAECbXMYY09lFXAs+n08ej0cNDQ18HghcJ/gmGISbkyzgu0ABAFYiAAEAViIAAQBWCvmN8ACuDp+jAaHBChAAYCUCEABgJQIQAGAlPgOEFfgcrXWhnBfgescKEABgJVaAAH6QQr16/SGv7NExrAABAFYiAAEAViIAAQBWIgABAFYiAAEAViIAAQBWIgABAFYiAAEAVuJGeOAq8XViwA8TK0AAgJUIQACAlQhAAICVCEAAgJUIQACAlQhAAICVHAdgeXm5pk2bpv79+8vlcum9995rt/+WLVs0efJk3XrrrXK73UpLS9POnTsD+hQWFsrlcrXYzp0757Q8AAA6xHEAnj17VomJiVq1alWH+peXl2vy5MkqLS1VZWWlJkyYoGnTpqmqqiqgn9vtltfrDdiioqKclgcAQIc4vhE+KytLWVlZHe6/bNmygP0XX3xR27Zt0/vvv6+kpCR/u8vlUkxMjNNyAAAIStg/A7x48aIaGxvVp0+fgPYzZ85o0KBBGjBggO6///4WK8TLNTU1yefzBWwAAHRU2APwlVde0dmzZzVjxgx/24gRI1RYWKjt27erqKhIUVFRGjdunI4ePdrmOAUFBfJ4PP4tLi4uHOUDALqIsAZgUVGRnnvuORUXF6tfv37+9tTUVD300ENKTExURkaG3nnnHQ0fPlwrV65sc6z8/Hw1NDT4t9ra2nCcAgCgiwjbl2EXFxdr9uzZ2rx5syZNmtRu327duunuu+9udwUYGRmpyMjIa10mAMASYVkBFhUVadasWdq4caPuu+++K/Y3xqi6ulqxsbFhqA4AYCPHK8AzZ87o2LFj/v3jx4+rurpaffr00cCBA5Wfn6+6ujpt2LBB0nfh9/DDD2v58uVKTU1VfX29JKlXr17yeDySpMWLFys1NVXDhg2Tz+fTihUrVF1drdWrV1+LcwQAoAXHK8CDBw8qKSnJfwtDXl6ekpKS9Oyzz0qSvF6vampq/P1fe+01nT9/Xo899phiY2P927x58/x9Tp8+rUcffVQjR45UZmam6urqVF5errFjx17t+QEA0CqXMcZ0dhHXgs/nk8fjUUNDg9xud2eXg+sMD62FU49PHt7ZJSAITrKA7wIFAFiJAAQAWClst0Hghy+UbyPydhOAcGMFCACwEitAXBe4SAVAuLECBABYiQAEAFiJAAQAWIkABABYiQAEAFiJAAQAWInbIACgFXzxQ9fHChAAYCUCEABgJQIQAGAlAhAAYCUCEABgJQIQAGAlAhAAYCUCEABgJQIQAGAlAhAAYCUCEABgJQIQAGAlAhAAYCXHAVheXq5p06apf//+crlceu+99654zJ49e5ScnKyoqCgNGTJEr776aos+JSUlSkhIUGRkpBISErR161anpQEA0GGOA/Ds2bNKTEzUqlWrOtT/+PHjmjp1qjIyMlRVVaWnnnpKc+fOVUlJib9PRUWFsrOzlZOTo0OHDiknJ0czZszQgQMHnJYHAECHuIwxJuiDXS5t3bpV06dPb7PPk08+qe3bt+vIkSP+ttzcXB06dEgVFRWSpOzsbPl8Pu3YscPf595779XNN9+soqKiDtXi8/nk8XjU0NAgt9sd3AmhXaF8PhpgE54HGDpOsiDknwFWVFQoMzMzoG3KlCk6ePCgvv3223b77N+/v81xm5qa5PP5AjYAADoq5AFYX1+v6OjogLbo6GidP39eJ0+ebLdPfX19m+MWFBTI4/H4t7i4uGtfPACgywrLVaAulytg/9K7rt9vb63P5W3fl5+fr4aGBv9WW1t7DSsGAHR1EaH+BTExMS1WcidOnFBERIRuueWWdvtcvir8vsjISEVGRl77ggEAVgj5CjAtLU1lZWUBbbt27dKYMWPUo0ePdvukp6eHujwAgKUcrwDPnDmjY8eO+fePHz+u6upq9enTRwMHDlR+fr7q6uq0YcMGSd9d8blq1Srl5eXpl7/8pSoqKrR+/fqAqzvnzZun8ePHa8mSJXrwwQe1bds2ffjhh9q3b981OEUAAFpyvAI8ePCgkpKSlJSUJEnKy8tTUlKSnn32WUmS1+tVTU2Nv//gwYNVWlqq3bt3684779QLL7ygFStW6Cc/+Ym/T3p6ujZ
t2qQ333xTd9xxhwoLC1VcXKyUlJSrPT8AAFp1VfcBXk+4DzD0uA8QuDa4DzB0rqv7AAEAuB4RgAAAKxGAAAArEYAAACsRgAAAKxGAAAArEYAAACsRgAAAKxGAAAArEYAAACsRgAAAKxGAAAArEYAAACsRgAAAKxGAAAArEYAAACsRgAAAKxGAAAArEYAAACsRgAAAKxGAAAArEYAAACtFdHYBuHb+VPZpZ5cAAD8YrAABAFYiAAEAVgoqANesWaPBgwcrKipKycnJ2rt3b5t9Z82aJZfL1WIbNWqUv09hYWGrfc6dOxdMeQAAXJHjACwuLtb8+fO1cOFCVVVVKSMjQ1lZWaqpqWm1//Lly+X1ev1bbW2t+vTpo5/97GcB/dxud0A/r9erqKio4M4KAIArcByAS5cu1ezZszVnzhyNHDlSy5YtU1xcnNauXdtqf4/Ho5iYGP928OBBff3113rkkUcC+rlcroB+MTExwZ0RAAAd4Ogq0ObmZlVWVmrBggUB7ZmZmdq/f3+Hxli/fr0mTZqkQYMGBbSfOXNGgwYN0oULF3TnnXfqhRdeUFJSUpvjNDU1qampyb/v8/kcnAkAdJ5QX7H9+OThIR2/q3C0Ajx58qQuXLig6OjogPbo6GjV19df8Xiv16sdO3Zozpw5Ae0jRoxQYWGhtm/frqKiIkVFRWncuHE6evRom2MVFBTI4/H4t7i4OCenAgCwXFAXwbhcroB9Y0yLttYUFhaqd+/emj59ekB7amqqHnroISUmJiojI0PvvPOOhg8frpUrV7Y5Vn5+vhoaGvxbbW1tMKcCALCUo7dA+/btq+7du7dY7Z04caLFqvByxhi98cYbysnJUc+ePdvt261bN919993trgAjIyMVGRnZ8eIBAPgeRyvAnj17Kjk5WWVlZQHtZWVlSk9Pb/fYPXv26NixY5o9e/YVf48xRtXV1YqNjXVSHgAAHeb4q9Dy8vKUk5OjMWPGKC0tTevWrVNNTY1yc3MlfffWZF1dnTZs2BBw3Pr165WSkqLRo0e3GHPx4sVKTU3VsGHD5PP5tGLFClVXV2v16tVBnhYAAO1zHIDZ2dk6deqUnn/+eXm9Xo0ePVqlpaX+qzq9Xm+LewIbGhpUUlKi5cuXtzrm6dOn9eijj6q+vl4ej0dJSUkqLy/X2LFjgzglAACuzGWMMZ1dxLXg8/nk8XjU0NAgt9vd2eV0Cr4MG4Bk920QTrKA7wIFAFiJAAQAWIkABABYiQAEAFiJAAQAWIkABABYiQAEAFiJAAQAWMnxN8Hg6nCzOgBcH1gBAgCsRAACAKxEAAIArMRngADQxYTyWoOu9EXbrAABAFYiAAEAViIAAQBWIgABAFYiAAEAViIAAQBWIgABAFYiAAEAViIAAQBWIgABAFYiAAEAViIAAQBWCioA16xZo8GDBysqKkrJycnau3dvm313794tl8vVYvvnP/8Z0K+kpEQJCQmKjIxUQkKCtm7dGkxpAAB0iOMALC4u1vz587Vw4UJVVVUpIyNDWVlZqqmpafe4Tz75RF6v178NGzbM/7OKigplZ2crJydHhw4dUk5OjmbMmKEDBw44PyMAADrAZYwxTg5ISUnRXXfdpbVr1/rbRo4cqenTp6ugoKBF/927d2vChAn6+uuv1bt371bHzM7Ols/n044dO/xt9957r26++WYVFRV1qC6fzyePx6OGhga53W4npxRWoXxMCQCE2vX+OCQnWeBoBdjc3KzKykplZmYGtGdmZmr//v3tHpuUlKTY2FhNnDhRH330UcDPKioqWow5ZcqUdsdsamqSz+cL2AAA6ChHAXjy5ElduHBB0dHRAe3R0dGqr69v9ZjY2FitW7dOJSUl2rJli+Lj4zVx4kSVl5f7+9TX1zsaU5IKCgrk8Xj8W1xcnJNTAQBYLqgnwrtcroB9Y0yLtkvi4+MVHx/v309LS1Ntba1efvlljR8/PqgxJSk/P195eXn+fZ/PRwgCADrM0Qqwb9++6t69e4uV2YkTJ1qs4NqTmpqqo0eP+vdjYmIcjxkZGSm32x2wAQDQUY4CsGfPnkpOTlZZWVlAe1lZmdLT0zs8TlVVlWJjY/37aWlpLcbctWuXozEBAHDC8VugeXl5ysnJ0ZgxY5SWlqZ169appqZGubm5kr57a7Kurk4bNmyQJC1btky33367Ro0apebmZr311lsqKSlRSUmJf8x58+Zp/PjxWrJkiR588EFt27ZNH374ofbt23eNThMAgECOAzA7O1unTp3S888/L6/Xq9GjR6u0tFSDBg2SJHm93oB7Apubm/XEE0+orq5OvXr10qhRo/TBBx9o6tSp/j7p6enatGmTnn76aT3zzDMaOnSoiouLlZKScg1OEQCAlhzfB3i94j5AAAg9a+8DBACgqyAAAQBWIgABAFYiAAEAViIAAQBWCuqr0LoyrtIEADuwAgQAWIkABABYiQAEAFiJAAQAWIkABABYiQAEAFiJAAQAWIkABABYiQAEAFiJAAQAWIkABABYiQAEAFiJAAQAWIkABABYiQAEAFiJAAQAWIkABABYiQAEAFiJAAQAWCmoAFyzZo0GDx6sqKgoJScna+/evW323bJliyZPnqxbb71VbrdbaWlp2rlzZ0CfwsJCuVyuFtu5c+eCKQ8AgCtyHIDFxcWaP3++Fi5cqKqqKmVkZCgrK0s1NTWt9i8vL9fkyZNVWlqqyspKTZgwQdOmTVNVVVVAP7fbLa/XG7BFRUUFd1YAAFxBhNMDli5dqtmzZ2vOnDmSpGXLlmnnzp1au3atCgoKWvRftmxZwP6LL76obdu26f3331dSUpK/3eVyKSYmxmk5AAAExdEKsLm5WZWVlcrMzAxoz8zM1P79+zs0xsWLF9XY2Kg+ffoEtJ85c0aDBg3SgAEDdP/997dYIV6uqalJPp8vYAMAoKMcBeDJkyd14cIFRUdHB7RHR0ervr6+Q2O88sorOnv2rGbMmOFvGzFihAoLC7V9+3YVFRUpKipK48aN09GjR9scp6CgQB6Px7/FxcU5ORUAgOWCugjG5XIF7BtjWrS1pqioSM8995yKi4vVr18/f3tqaqoeeughJSYmKiMjQ++8846GDx+ulStXtjlWfn6+Ghoa/FttbW0wpwIAsJSjzwD79u2r7t27t1jtnThxosWq8HLFxcWaPXu2Nm/erEmTJrXbt1u3brr77rvbXQFGRkYqMjKy48UDAPA9jlaAPXv2VHJyssrKygLay8rKlJ6e3uZxRUVFmjVrljZu3Kj77rvvir/HGKPq6mrFxsY6KQ8AgA5zfBVoXl6ecnJyNGbMGKWlpWndunWqqalRbm6upO/emqyrq9OGDRskfRd+Dz/8sJYvX67U1FT/6rFXr17yeDySpMWLFys1NVXDhg2Tz+fTihUrVF1drdWrV1+r8wQAIIDjAMzOztapU6f0/PPPy+v1avTo0SotLdWgQYMkSV6vN+CewNdee03nz5/XY489pscee8zfPnPmTBUWFkqSTp8+rUcffVT19fXyeDxKSkpSeXm5xo4de5WnBwC4lv5U9mlIx3
988vCQjv99LmOMCdtvCyGfzyePx6OGhga53e6gxwn1f1wAQNuuNgCdZAHfBQoAsBIBCACwEgEIALASAQgAsBIBCACwEgEIALASAQgAsBIBCACwEgEIALASAQgAsBIBCACwEgEIALASAQgAsBIBCACwEgEIALASAQgAsBIBCACwEgEIALASAQgAsBIBCACwEgEIALASAQgAsBIBCACwEgEIALASAQgAsBIBCACwUlABuGbNGg0ePFhRUVFKTk7W3r172+2/Z88eJScnKyoqSkOGDNGrr77aok9JSYkSEhIUGRmphIQEbd26NZjSAADoEMcBWFxcrPnz52vhwoWqqqpSRkaGsrKyVFNT02r/48ePa+rUqcrIyFBVVZWeeuopzZ07VyUlJf4+FRUVys7OVk5Ojg4dOqScnBzNmDFDBw4cCP7MAABoh8sYY5wckJKSorvuuktr1671t40cOVLTp09XQUFBi/5PPvmktm/friNHjvjbcnNzdejQIVVUVEiSsrOz5fP5tGPHDn+fe++9VzfffLOKioparaOpqUlNTU3+/YaGBg0cOFC1tbVyu91OTinA6v9zLOhjAQBX57Ef/c9VHe/z+RQXF6fTp0/L4/G039k40NTUZLp37262bNkS0D537lwzfvz4Vo/JyMgwc+fODWjbsmWLiYiIMM3NzcYYY+Li4szSpUsD+ixdutQMHDiwzVoWLVpkJLGxsbGxsbXYamtrr5hpEXLg5MmTunDhgqKjowPao6OjVV9f3+ox9fX1rfY/f/68Tp48qdjY2Db7tDWmJOXn5ysvL8+/f/HiRX311Ve65ZZb5HK52j2PS/9CuNrVYlfF/FwZc9Q+5qd9zM+VBTtHxhg1Njaqf//+V+zrKAAvuTxgjDHthk5r/S9vdzpmZGSkIiMjA9p69+7dbt2Xc7vd/PG1g/m5MuaofcxP+5ifKwtmjq741uf/4+gimL59+6p79+4tVmYnTpxosYK7JCYmptX+ERERuuWWW9rt09aYAABcLUcB2LNnTyUnJ6usrCygvaysTOnp6a0ek5aW1qL/rl27NGbMGPXo0aPdPm2NCQDAVbvip4SX2bRpk+nRo4dZv369OXz4sJk/f7658cYbzb/+9S9jjDELFiwwOTk5/v6ff/65ueGGG8zjjz9uDh8+bNavX2969Ohh3n33XX+fjz/+2HTv3t384Q9/MEeOHDF/+MMfTEREhPnLX/7itLwOOXfunFm0aJE5d+5cSMb/oWN+row5ah/z0z7m58rCMUeOA9AYY1avXm0GDRpkevbsae666y6zZ88e/89mzpxp7rnnnoD+u3fvNklJSaZnz57m9ttvN2vXrm0x5ubNm018fLzp0aOHGTFihCkpKQmmNAAAOsTxfYAAAHQFfBcoAMBKBCAAwEoEIADASgQgAMBKXTYAQ/HIpq7Eyfxs2bJFkydP1q233iq32620tDTt3LkzjNWGn9O/n0s+/vhjRURE6M477wxtgdcBp3PU1NSkhQsXatCgQYqMjNTQoUP1xhtvhKna8HM6P2+//bYSExN1ww03KDY2Vo888ohOnToVpmrDq7y8XNOmTVP//v3lcrn03nvvXfGYkLxGd/ZlqKFw6V7F119/3Rw+fNjMmzfP3Hjjjebf//53q/0v3as4b948c/jwYfP666+3uFexK3E6P/PmzTNLliwxf/3rX82nn35q8vPzTY8ePczf/va3MFceHk7n55LTp0+bIUOGmMzMTJOYmBieYjtJMHP0wAMPmJSUFFNWVmaOHz9uDhw4YD7++OMwVh0+Tudn7969plu3bmb58uXm888/N3v37jWjRo0y06dPD3Pl4VFaWmoWLlxoSkpKjCSzdevWdvuH6jW6Swbg2LFjTW5ubkDbiBEjzIIFC1rt/7vf/c6MGDEioO1Xv/qVSU1NDVmNncnp/LQmISHBLF68+FqXdl0Idn6ys7PN008/bRYtWtTlA9DpHO3YscN4PB5z6tSpcJTX6ZzOz0svvWSGDBkS0LZixQozYMCAkNV4vehIAIbqNbrLvQXa3NysyspKZWZmBrRnZmZq//79rR5TUVHRov+UKVN08OBBffvttyGrtTMEMz+Xu3jxohobG9WnT59QlNipgp2fN998U5999pkWLVoU6hI7XTBztH37do0ZM0Z//OMfddttt2n48OF64okn9N///jccJYdVMPOTnp6uL774QqWlpTLG6D//+Y/effdd3XfffeEo+boXqtfooJ4GcT0L1SObuopg5udyr7zyis6ePasZM2aEosROFcz8HD16VAsWLNDevXsVEdHl/pdqIZg5+vzzz7Vv3z5FRUVp69atOnnypH7961/rq6++6nKfAwYzP+np6Xr77beVnZ2tc+fO6fz583rggQe0cuXKcJR83QvVa3SXWwFeEopHNnUlTufnkqKiIj333HMqLi5Wv379QlVep+vo/Fy4cEE///nPtXjxYg0fPjxc5V0XnPwNXbx4US6XS2+//bbGjh2rqVOnaunSpSosLOySq0DJ2fwcPnxYc+fO1bPPPqvKykr9+c9/1vHjx5WbmxuOUn8QQvEa3eX+uRqqRzZ1FcHMzyXFxcWaPXu2Nm/erEmTJoWyzE7jdH4aGxt18OBBVVVV6Te/+Y2k717sjTGKiIjQrl279KMf/SgstYdLMH9DsbGxuu222wKe0zZy5EgZY/TFF19o2LBhIa05nIKZn4KCAo0bN06//e1vJUl33HGHbrzxRmVkZOj3v/99l3oXKhiheo3ucivAUD2yqasIZn6k71Z+s2bN0saNG7v05xJO58ftduvvf/+7qqur/Vtubq7i4+NVXV2tlJSUcJUeNsH8DY0bN05ffvmlzpw542/79NNP1a1bNw0YMCCk9YZbMPPzzTffqFu3wJfj7t27S/r/Kx2bhew1+qouoblOheKRTV2J0/nZuHGjiYiIMKtXrzZer9e/nT59urNOIaSczs/lbLgK1OkcNTY2mgEDBpif/vSn5h//+IfZs2ePGTZsmJkzZ05nnUJIOZ2fN99800RERJg1a9aYzz77zOzbt8+MGTPGjB07trNOIaQaGxtNVVWVqaqqMpLM0qVLTVVVlf82kXC9RnfJADQmNI9s6kqczM8999xjJLXYZs6cGf7Cw8Tp38/32RCAxjifoyNHjphJkyaZXr16mQEDBpi8vDzzzTffhLnq8HE6PytWrDAJCQmmV69eJjY21vziF78wX3zxRZirDo+PPvqo3deUcL1G8zgkAICVutxngAAAdAQBCACwEgEIALASAQgAsBIBCACwEgEIALASAQgAsBIBCACwEgEIALASAQgAsBIBCACw0v8FxgMkZnQg5pwAAAAASUVORK5CYII=",
- "text/plain": [
- "