diff --git a/02_exercises.qmd b/02_exercises.qmd new file mode 100644 index 0000000..ad32d21 --- /dev/null +++ b/02_exercises.qmd @@ -0,0 +1,380 @@ +# Applied Exercises {-} + +```{r} +library(reticulate) +os <- import("os") +os$listdir(".") +``` + + +```{python} + +import numpy as np +import matplotlib.pyplot as plt +#%matplotlib inline +from matplotlib.pyplot import subplots +import pandas as pd +from ISLP import load_data + +``` + +```{python} +pd.set_option('display.max_columns', None) +#pd.set_option('display.max_rows', None) +``` + + +## 8. This exercise relates to the `College` data set, which can be found in the file `College.csv` on the book website. It contains a number of variables for 777 different universities and colleges in the US. The variables are + +- `Private` : Public/private indicator +- `Apps` : Number of applications received +- `Accept` : Number of applicants accepted +- `Enroll` : Number of new students enrolled +- `Top10perc` : New students from top 10 % of high school class +- `Top25perc` : New students from top 25 % of high school class +- `F.Undergrad` : Number of full-time undergraduates +- `P.Undergrad` : Number of part-time undergraduates +- `Outstate` : Out-of-state tuition +- `Room.Board` : Room and board costs +- `Books` : Estimated book costs +- `Personal` : Estimated personal spending +- `PhD` : Percent of faculty with Ph.D.s +- `Terminal` : Percent of faculty with terminal degree +- `S.F.Ratio` : Student/faculty ratio +- `perc.alumni` : Percent of alumni who donate +- `Expend` : Instructional expenditure per student +- `Grad.Rate` : Graduation rate + +Before reading the data into `Python`, it can be viewed in Excel or a +text editor. + +### (a) Use the `pd.read_csv()` function to read the data into `Python`. Call the loaded data `college`. Make sure that you have the directory set to the correct location for the data. + +```{python} +college = pd.read_csv('ISLP_data/College.csv') +college +``` + + +### (b) Look at the data used in the notebook by creating and running a new cell with just the code `college` in it. You should notice that the first column is just the name of each university in a column named something like `Unnamed: 0`. We don’t really want `pandas` to treat this as data. However, it may be handy to have these names for later. Try the following commands and similarly look at the resulting data frames: + + + +```{python} + +college2 = pd.read_csv('ISLP_data/College.csv', index_col=0) +#college2 + +college3 = college.rename({'Unnamed: 0': 'college'}, + axis=1) +#college3 + +college3 = college3.set_index('college') +#college3 +``` + + +This has used the first column in the file as an `index` for the data frame. This means that `pandas` has given each row a name corresponding to the appropriate university. Now you should see that the first data column is `Private`. Note that the names of the colleges appear on the left of the table. We also introduced a new python object above: a *dictionary*, which is specified by `(key, value)` pairs. Keep your modified version of the data with the following: + +```{python} +college = college3 +college +``` + +### (c) Use the `describe()` method to produce a numerical summary of the variables in the data set. + +```{python} + +college.describe() + +``` + + + +### (d) Use the `pd.plotting.scatter_matrix()` function to produce a scatterplot matrix of the first columns `[Top10perc, Apps, Enroll]`. Recall that you can reference a list C of columns of a data frame `A` using `A[C]`. 
+
+
+```{python}
+#fig, ax = subplots(figsize=(8, 8))
+pd.plotting.scatter_matrix(college[['Top10perc','Apps','Enroll']])
+#plt.show()
+```
+
+
+
+### (e) Use the `boxplot()` method of `college` to produce side-by-side boxplots of `Outstate` versus `Private`.
+
+One way to draw these (a sketch using the `boxplot()` method built into `pandas`):
+
+```{python}
+# Grouped boxplot: out-of-state tuition by public/private status
+college.boxplot(column='Outstate', by='Private', figsize=(6, 6))
+plt.show()
+```
+
+
+### (f) Create a new qualitative variable, called `Elite`, by binning the `Top10perc` variable into two groups based on whether or not the proportion of students coming from the top 10% of their high school classes exceeds 50%.
+
+```{python}
+# Top10perc is recorded in percent (0-100), so the cut point for
+# "exceeds 50%" is 50, not 0.5
+college['Elite'] = pd.cut(college['Top10perc'],
+                          [0, 50, 100],
+                          labels=['No', 'Yes'])
+```
+
+
+
+Use the `value_counts()` method of `college['Elite']` to see how many elite universities there are. Finally, use the `boxplot()` method again to produce side-by-side boxplots of `Outstate` versus `Elite`.
+
+
+```{python}
+college['Elite'].value_counts()
+```
+
+```{python}
+# Side-by-side boxplots of Outstate for elite vs. non-elite schools
+college.boxplot(column='Outstate', by='Elite', figsize=(6, 6))
+plt.show()
+```
+
+
+### (g) Use the `plot.hist()` method of `college` to produce some histograms with differing numbers of bins for a few of the quantitative variables. The command `plt.subplots(2, 2)` may be useful: it will divide the plot window into four regions so that four plots can be made simultaneously. By changing the arguments you can divide the screen up in other combinations.
+
+A sketch with four variables of our choosing and differing bin counts:
+
+```{python}
+fig, axes = plt.subplots(2, 2, figsize=(8, 8))
+college['Apps'].plot.hist(bins=20, ax=axes[0, 0], title='Apps')
+college['Outstate'].plot.hist(bins=10, ax=axes[0, 1], title='Outstate')
+college['PhD'].plot.hist(bins=15, ax=axes[1, 0], title='PhD')
+college['Grad.Rate'].plot.hist(bins=25, ax=axes[1, 1], title='Grad.Rate')
+fig.tight_layout()
+plt.show()
+```
+
+
+### (h) Continue exploring the data, and provide a brief summary of what you discover.
+
+One starting point: `Grad.Rate` is a percentage, so any value above 100 points to a data-entry issue.
+
+```{python}
+college['Grad.Rate'].max()
+```
+
+
+
+## 9. This exercise involves the `Auto` data set studied in the lab. Make sure that the missing values have been removed from the data.
+
+```{python}
+
+Auto = pd.read_csv('ISLP_data/Auto.csv',
+                   na_values=['?'])
+# drop the rows with missing values, as instructed
+Auto = Auto.dropna()
+Auto
+
+```
+
+
+### (a) Which of the predictors are quantitative, and which are qualitative?
+
+`mpg`, `displacement`, `horsepower`, `weight`, and `acceleration` are quantitative. `cylinders`, `year`, `origin`, and `name` are qualitative.
+
+```{python}
+Auto.info()
+```
+
+```{python}
+Auto['cylinders'] = Auto['cylinders'].astype('object')
+Auto['cylinders']
+```
+
+
+### (b) What is the range of each quantitative predictor? You can answer this using the `min()` and `max()` methods in `numpy`.
+
+```{python}
+mpg_min = Auto['mpg'].min()
+mpg_max = Auto['mpg'].max()
+
+print('The min and max miles per gallon are', (mpg_min, mpg_max))
+```
+
+
+```{python}
+dsp_min = Auto['displacement'].min()
+dsp_max = Auto['displacement'].max()
+
+print('The min and max displacement are', (dsp_min, dsp_max))
+```
+
+
+```{python}
+hpwr_min = Auto['horsepower'].min()
+hpwr_max = Auto['horsepower'].max()
+
+print('The min and max horsepower are', (hpwr_min, hpwr_max))
+```
+
+```{python}
+wt_min = Auto['weight'].min()
+wt_max = Auto['weight'].max()
+
+print('The min and max weights are', (wt_min, wt_max))
+```
+
+```{python}
+acc_min = Auto['acceleration'].min()
+acc_max = Auto['acceleration'].max()
+
+print('The min and max accelerations are', (acc_min, acc_max))
+```
+
+### (c) What is the mean and standard deviation of each quantitative predictor?
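+
+A compact cross-check for (b) and (c): `agg()` computes several statistics for all quantitative columns at once (a sketch, using the columns identified in (a)):
+
+```{python}
+num_cols = ['mpg', 'displacement', 'horsepower', 'weight', 'acceleration']
+# One row per statistic, one column per predictor
+Auto[num_cols].agg(['min', 'max', 'mean', 'std'])
+```
+
+The per-variable calculations below report the same quantities one column at a time.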
+
+
+```{python}
+
+mpg_mean = Auto['mpg'].mean()
+mpg_sd = Auto['mpg'].std()
+
+print('The mean and standard deviation of miles per gallon are', mpg_mean, 'and', mpg_sd)
+```
+
+
+```{python}
+dsp_mean = Auto['displacement'].mean()
+dsp_sd = Auto['displacement'].std()
+
+print('The mean and standard deviation of displacement are', dsp_mean, 'and', dsp_sd)
+```
+
+
+```{python}
+hpwr_mean = Auto['horsepower'].mean()
+hpwr_sd = Auto['horsepower'].std()
+
+print('The mean and standard deviation of horsepower are', hpwr_mean, 'and', hpwr_sd)
+```
+
+```{python}
+wt_mean = Auto['weight'].mean()
+wt_sd = Auto['weight'].std()
+
+print('The mean and standard deviation of weight are', wt_mean, 'and', wt_sd)
+```
+
+```{python}
+acc_mean = Auto['acceleration'].mean()
+acc_sd = Auto['acceleration'].std()
+
+print('The mean and standard deviation of acceleration are', acc_mean, 'and', acc_sd)
+```
+
+
+### (d) Now remove the 10th through 85th observations. What is the range, mean, and standard deviation of each predictor in the subset of the data that remains?
+
+```{python}
+# Zero-based positions 9 through 84 hold the 10th through 85th observations
+Auto_new = Auto.drop(Auto.index[9:85])
+Auto_new
+```
+
+
+```{python}
+mpg_min = Auto_new['mpg'].min()
+mpg_max = Auto_new['mpg'].max()
+
+print('The min and max miles per gallon of the subsetted data are', (mpg_min, mpg_max))
+
+mpg_mean = Auto_new['mpg'].mean()
+mpg_sd = Auto_new['mpg'].std()
+
+print('The mean and standard deviation of miles per gallon of the subsetted data are', mpg_mean, 'and', mpg_sd)
+```
+
+
+```{python}
+dsp_min = Auto_new['displacement'].min()
+dsp_max = Auto_new['displacement'].max()
+
+print('The min and max displacement of the subsetted data are', (dsp_min, dsp_max))
+
+dsp_mean = Auto_new['displacement'].mean()
+dsp_sd = Auto_new['displacement'].std()
+
+print('The mean and standard deviation of displacement of the subsetted data are', dsp_mean, 'and', dsp_sd)
+```
+
+
+```{python}
+# These must use Auto_new, not the full Auto data set
+hpwr_min = Auto_new['horsepower'].min()
+hpwr_max = Auto_new['horsepower'].max()
+
+print('The min and max horsepower of the subsetted data are', (hpwr_min, hpwr_max))
+
+hpwr_mean = Auto_new['horsepower'].mean()
+hpwr_sd = Auto_new['horsepower'].std()
+
+print('The mean and standard deviation of horsepower of the subsetted data are', hpwr_mean, 'and', hpwr_sd)
+```
+
+```{python}
+wt_min = Auto_new['weight'].min()
+wt_max = Auto_new['weight'].max()
+
+print('The min and max weights of the subsetted data are', (wt_min, wt_max))
+
+wt_mean = Auto_new['weight'].mean()
+wt_sd = Auto_new['weight'].std()
+
+print('The mean and standard deviation of weight of the subsetted data are', wt_mean, 'and', wt_sd)
+```
+
+```{python}
+acc_min = Auto_new['acceleration'].min()
+acc_max = Auto_new['acceleration'].max()
+
+print('The min and max accelerations of the subsetted data are', (acc_min, acc_max))
+
+acc_mean = Auto_new['acceleration'].mean()
+acc_sd = Auto_new['acceleration'].std()
+
+print('The mean and standard deviation of acceleration of the subsetted data are', acc_mean, 'and', acc_sd)
+```
+
+
+### (e) Using the full data set, investigate the predictors graphically, using scatterplots or other tools of your choice. Create some plots highlighting the relationships among the predictors. Comment on your findings.
+
+
+
+
+### (f) Suppose that we wish to predict gas mileage (`mpg`) on the basis of the other variables. Do your plots suggest that any of the other variables might be useful in predicting `mpg`? Justify your answer.
+
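+
+A sketch for (e) and (f), using the full `Auto` data (the choice of variables to plot is ours):
+
+```{python}
+# Pairwise scatterplots among mpg and a few candidate predictors
+pd.plotting.scatter_matrix(Auto[['mpg', 'displacement', 'horsepower',
+                                 'weight', 'acceleration']],
+                           figsize=(10, 10))
+plt.show()
+```
+
+```{python}
+# mpg falls steadily as weight rises
+Auto.plot.scatter(x='weight', y='mpg')
+plt.show()
+```
+
+The clear negative trends of `mpg` against `weight`, `displacement`, and `horsepower` suggest that those variables would be useful predictors of `mpg`.
+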
+
+## 10. This exercise involves the `Boston` housing data set.
+
+### (a) To begin, load in the `Boston` data set, which is part of the `ISLP` library.
+
+
+
+```{python}
+Boston = load_data("Boston")
+Boston
+```
+
+
+### (b) How many rows are in this data set? How many columns? What do the rows and columns represent?
+
+
+
+### (c) Make some pairwise scatterplots of the predictors (columns) in this data set. Describe your findings.
+
+
+
+### (d) Are any of the predictors associated with per capita crime rate? If so, explain the relationship.
+
+
+### (e) Do any of the suburbs of Boston appear to have particularly high crime rates? Tax rates? Pupil-teacher ratios? Comment on the range of each predictor.
+
+
+
+### (f) How many of the suburbs in this data set bound the Charles river?
+
+
+### (g) What is the median pupil-teacher ratio among the towns in this data set?
+
+
+
+
+### (h) Which suburb of Boston has the lowest median value of owner-occupied homes? What are the values of the other predictors for that suburb, and how do those values compare to the overall ranges for those predictors? Comment on your findings.
+
+
+
+
+### (i) In this data set, how many of the suburbs average more than seven rooms per dwelling? More than eight rooms per dwelling? Comment on the suburbs that average more than eight rooms per dwelling.
+
+
+
+
+
diff --git a/03_exercises.qmd b/03_exercises.qmd
new file mode 100644
index 0000000..465ff4d
--- /dev/null
+++ b/03_exercises.qmd
@@ -0,0 +1 @@
+# Exercises {-}
diff --git a/03_main.qmd b/03_main.qmd
index 1de84ff..c8dd874 100644
--- a/03_main.qmd
+++ b/03_main.qmd
@@ -1,8 +1,7 @@
-# 3. Built-in Data Structures, Functions, and Files
+# 3. Linear Regression
 
 ## Learning Objectives
 
-* The data structures tuples, lists, dictionaries, and sets
-* Functions
-* Errors and Exception Handling
-* Files and the Operating System
\ No newline at end of file
+- item 1
+- item 2
+- item 3
diff --git a/03_notes.ipynb b/03_notes.ipynb
deleted file mode 100644
index 42fb7f6..0000000
--- a/03_notes.ipynb
+++ /dev/null
@@ -1,2130 +0,0 @@
-{
- "cells": [
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "# Data Structures and Sequences\n",
-    "\n",
-    "## Tuples\n",
-    "\n",
-    "![](https://pynative.com/wp-content/uploads/2021/02/python-tuple.jpg)\n",
-    "\n",
-    "A tuple is a fixed-length, immutable sequence of Python objects which, once assigned, cannot be changed.
The easiest way to create one is with a comma-separated sequence of values wrapped in parentheses:\n" - ], - "id": "3a1cd725" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "tup = (4, 5, 6)\n", - "tup" - ], - "id": "1a4127fb", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "In many contexts, the parentheses can be omitted\n" - ], - "id": "7fc1781c" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "tup = 4, 5, 6\n", - "tup" - ], - "id": "0f1a6f84", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "You can convert any sequence or iterator to a tuple by invoking\n" - ], - "id": "c2fda83c" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "tuple([4,0,2])\n", - "\n", - "tup = tuple('string')\n", - "\n", - "tup" - ], - "id": "95488ca7", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Elements can be accessed with square brackets [] \n", - "\n", - "Note the zero indexing\n" - ], - "id": "78eba52c" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "tup[0]" - ], - "id": "c4fee8cb", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Tuples of tuples\n" - ], - "id": "21889737" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "nested_tup = (4,5,6),(7,8)\n", - "\n", - "nested_tup" - ], - "id": "97645c8c", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "nested_tup[0]" - ], - "id": "c85ab770", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "nested_tup[1]" - ], - "id": "127f5db3", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "While the objects stored in a tuple may be mutable themselves, once the tuple is created it’s not possible to modify which object is stored in each slot:\n" - ], - "id": "d4d95305" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "tup = tuple(['foo', [1, 2], True])\n", - "\n", - "tup[2]" - ], - "id": "57fb7160", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "```{{python}}\n", - "\n", - "tup[2] = False\n", - "\n", - "```\n", - "\n", - "````\n", - "TypeError Traceback (most recent call last)\n", - "Input In [9], in ()\n", - "----> 1 tup[2] = False\n", - "\n", - "TypeError: 'tuple' object does not support item assignment\n", - "TypeError: 'tuple' object does not support item assignment\n", - "````\n", - "\n", - "If an object inside a tuple is mutable, such as a list, you can modify it in place\n" - ], - "id": "4c5947e2" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "tup[1].append(3)\n", - "\n", - "tup" - ], - "id": "3b55b0ca", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "You can concatenate tuples using the + operator to produce longer tuples:\n" - ], - "id": "67812910" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "(4, None, 'foo') + (6, 0) + ('bar',)" - ], - "id": "e4893a4b", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Unpacking tuples\n", - "\n", - "If you try to assign to a tuple-like expression 
of variables, Python will attempt to unpack the value on the righthand side of the equals sign:\n" - ], - "id": "15ea0473" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "tup = (4, 5, 6)\n", - "tup" - ], - "id": "9c245a45", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "a, b, c = tup\n", - "\n", - "c" - ], - "id": "f5614d81", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Even sequences with nested tuples can be unpacked:\n" - ], - "id": "0fe7113c" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "tup = 4, 5, (6,7)\n", - "\n", - "a, b, (c, d) = tup\n", - "\n", - "d" - ], - "id": "5121e8d1", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "To easily swap variable names\n" - ], - "id": "d1f1eb7c" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "a, b = 1, 4\n", - "\n", - "a" - ], - "id": "e7862391", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "b" - ], - "id": "561bc446", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "b, a = a, b\n", - "\n", - "a" - ], - "id": "2ca4bbe9", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "b" - ], - "id": "942829e5", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "A common use of variable unpacking is iterating over sequences of tuples or lists\n" - ], - "id": "d152ab53" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "seq = [(1, 2, 3), (4, 5, 6), (7, 8, 9)]\n", - "\n", - "seq" - ], - "id": "796edb9b", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "for a, b, c in seq:\n", - " print(f'a={a}, b={b}, c={c}')" - ], - "id": "d7d0cce1", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "`*rest` syntax for plucking elements\n" - ], - "id": "2893705e" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "values = 1,2,3,4,5\n", - "\n", - "a, b, *rest = values\n", - "\n", - "rest" - ], - "id": "19bfdc95", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - " As a matter of convention, many Python programmers will use the underscore (_) for unwanted variables:\n" - ], - "id": "b3d21d63" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "a, b, *_ = values" - ], - "id": "0a7aeed0", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Tuple methods\n", - "\n", - "Since the size and contents of a tuple cannot be modified, it is very light on instance methods. 
A particularly useful one (also available on lists) is `count`\n" - ], - "id": "5c9410b2" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "a = (1,2,2,2,2,3,4,5,7,8,9)\n", - "\n", - "a.count(2)" - ], - "id": "9c1cfa8b", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## List\n", - "\n", - "![](https://pynative.com/wp-content/uploads/2021/03/python-list.jpg)\n", - "\n", - "In contrast with tuples, lists are variable length and their contents can be modified in place.\n", - "\n", - "Lists are mutable. \n", - "\n", - "Lists use `[]` square brackts or the `list` function\n" - ], - "id": "31fa597f" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "a_list = [2, 3, 7, None]\n", - "\n", - "tup = (\"foo\", \"bar\", \"baz\")\n", - "\n", - "b_list = list(tup)\n", - "\n", - "b_list" - ], - "id": "ad9b1cba", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "b_list[1] = \"peekaboo\"\n", - "\n", - "b_list" - ], - "id": "8fe59228", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Lists and tuples are semantically similar (though tuples cannot be modified) and can be used interchangeably in many functions.\n" - ], - "id": "ddb1939c" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "gen = range(10)\n", - "\n", - "gen" - ], - "id": "01cfddb6", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "list(gen)" - ], - "id": "c9f3d6c6", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Adding and removing list elements\n", - "\n", - "the `append` method\n" - ], - "id": "fe563a3f" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "b_list.append(\"dwarf\")\n", - "\n", - "b_list" - ], - "id": "41c681f0", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "the `insert` method\n" - ], - "id": "6fc8d0d8" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "b_list.insert(1, \"red\")\n", - "\n", - "b_list" - ], - "id": "b0263126", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "`insert` is computationally more expensive than `append`\n", - "\n", - "the `pop` method, the inverse of `insert`\n" - ], - "id": "cb008d9e" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "b_list.pop(2)" - ], - "id": "f4f1b899", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "b_list" - ], - "id": "e49ad587", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "the `remove` method\n" - ], - "id": "43f3cf43" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "b_list.append(\"foo\")\n", - "\n", - "b_list" - ], - "id": "a2a16d87", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "b_list.remove(\"foo\")\n", - "\n", - "b_list" - ], - "id": "6e6f6c89", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Check if a list contains a value using the `in` keyword:\n" - ], - "id": "b35edf6c" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "\"dwarf\" in 
b_list" - ], - "id": "d0e319fe", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The keyword `not` can be used to negate an `in`\n" - ], - "id": "7d2db93f" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "\"dwarf\" not in b_list" - ], - "id": "5e8dc06f", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Concatenating and combining lists\n", - "\n", - "similar with tuples, use `+` to concatenate\n" - ], - "id": "2c7cdae3" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "[4, None, \"foo\"] + [7, 8, (2, 3)]" - ], - "id": "1f40071b", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "the `extend` method\n" - ], - "id": "94e06b08" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "x = [4, None, \"foo\"]\n", - "\n", - "x.extend([7,8,(2,3)])\n", - "\n", - "x" - ], - "id": "f932920f", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "list concatenation by addition is an expensive operation\n", - "\n", - "using `extend` is preferable\n", - "\n", - "```{{python}}\n", - "everything = []\n", - "for chunk in list_of_lists:\n", - " everything.extend(chunk)\n", - "\n", - "```\n", - "\n", - "is generally faster than\n", - "\n", - "```{{python}}\n", - "\n", - "everything = []\n", - "for chunk in list_of_lists:\n", - " everything = everything + chunk\n", - "\n", - "```\n", - "\n", - "### Sorting\n", - "\n", - "the `sort` method\n" - ], - "id": "bd8208a9" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "a = [7, 2, 5, 1, 3]\n", - "\n", - "a.sort()\n", - "\n", - "a" - ], - "id": "bfa975b2", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "`sort` options\n" - ], - "id": "a9c1ddab" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "b = [\"saw\", \"small\", \"He\", \"foxes\", \"six\"]\n", - "\n", - "b.sort(key = len)\n", - "\n", - "b" - ], - "id": "4d535f46", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Slicing\n", - "\n", - "Slicing semantics takes a bit of getting used to, especially if you’re coming from R or MATLAB.\n", - "\n", - "using the indexing operator `[]`\n" - ], - "id": "bdb45eae" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "seq = [7, 2, 3, 7, 5, 6, 0, 1]\n", - "\n", - "seq[3:5]" - ], - "id": "5701d6d6", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "also assigned with a sequence\n" - ], - "id": "4099c569" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "seq[3:5] = [6,3]\n", - "\n", - "seq" - ], - "id": "55b637cf", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Either the `start` or `stop` can be omitted\n" - ], - "id": "f9053599" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "seq[:5]" - ], - "id": "c8e1b8c1", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "seq[3:]" - ], - "id": "819cfcc7", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Negative indices slice the sequence relative to the end:\n" - ], - "id": 
"96b07bb0" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "seq[-4:]" - ], - "id": "77933d0f", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "A step can also be used after a second colon to, say, take every other element:\n" - ], - "id": "cd3a69ff" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "seq[::2]" - ], - "id": "7e056d7a", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "A clever use of this is to pass -1, which has the useful effect of reversing a list or tuple:\n" - ], - "id": "f7eec93a" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "seq[::-1]" - ], - "id": "ebdf6ab2", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Dictionary\n", - "\n", - "![](https://pynative.com/wp-content/uploads/2021/02/dictionaries-in-python.jpg)\n", - "\n", - "The dictionary or dict may be the most important built-in Python data structure. \n", - "\n", - "One approach for creating a dictionary is to use curly braces {} and colons to separate keys and values:\n" - ], - "id": "220ffa92" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "empty_dict = {}\n", - "\n", - "d1 = {\"a\": \"some value\", \"b\": [1, 2, 3, 4]}\n", - "\n", - "d1" - ], - "id": "c34db4ac", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "access, insert, or set elements \n" - ], - "id": "33105b9a" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "d1[7] = \"an integer\"\n", - "\n", - "d1" - ], - "id": "a4b37752", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "and as before\n" - ], - "id": "6498f75e" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "\"b\" in d1" - ], - "id": "c39e6a68", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "the `del` and `pop` methods\n" - ], - "id": "930cb78c" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "del d1[7]\n", - "\n", - "d1" - ], - "id": "cfb76431", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "ret = d1.pop(\"a\")\n", - "\n", - "ret" - ], - "id": "e5d88427", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The `keys` and `values` methods\n" - ], - "id": "4b1e03fb" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "list(d1.keys())" - ], - "id": "8b5f63e2", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "list(d1.values())" - ], - "id": "775fe0de", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "the `items` method\n" - ], - "id": "25b142c2" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "list(d1.items())" - ], - "id": "9f6f9bed", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - " the update method to merge one dictionary into another\n" - ], - "id": "38f1c365" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "d1.update({\"b\": \"foo\", \"c\": 12})\n", - "\n", - "d1" - ], - "id": "70f2b3cf", - "execution_count": null, - 
"outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - " ### Creating dictionaries from sequences\n" - ], - "id": "8317076b" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "list(range(5))" - ], - "id": "d7515c7a", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "tuples = zip(range(5), reversed(range(5)))\n", - "\n", - "tuples\n", - "\n", - "mapping = dict(tuples)\n", - "\n", - "mapping" - ], - "id": "3a31e0b1", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Default values\n", - "\n", - "imagine categorizing a list of words by their first letters as a dictionary of lists\n" - ], - "id": "435e294f" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "words = [\"apple\", \"bat\", \"bar\", \"atom\", \"book\"]\n", - "\n", - "by_letter = {}\n", - "\n", - "for word in words:\n", - " letter = word[0]\n", - " if letter not in by_letter:\n", - " by_letter[letter] = [word]\n", - " else:\n", - " by_letter[letter].append(word)\n", - "\n", - "by_letter" - ], - "id": "27c323c7", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The `setdefault` dictionary method can be used to simplify this workflow. The preceding for loop can be rewritten as:\n" - ], - "id": "d9a79573" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "by_letter = {}\n", - "\n", - "for word in words:\n", - " letter = word[0]\n", - " by_letter.setdefault(letter, []).append(word)\n", - "\n", - "by_letter" - ], - "id": "549ce24b", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The built-in `collections`module has a useful class, `defaultdict`, which makes this even easier.\n" - ], - "id": "a2e69d92" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "from collections import defaultdict\n", - "\n", - "by_letter = defaultdict(list)\n", - "\n", - "for word in words:\n", - " by_letter[word[0]].append(word)\n", - "\n", - "by_letter" - ], - "id": "3f76ba12", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Valid dictionary key types\n", - "\n", - "keys generally have to be immutable objects like scalars or tuples for *hashability*\n", - "\n", - "To use a list as a key, one option is to convert it to a tuple, which can be hashed as long as its elements also can be:\n" - ], - "id": "2be697d3" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "d = {}\n", - "\n", - "d[tuple([1,2,3])] = 5\n", - "\n", - "d" - ], - "id": "8db8ac11", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Set\n", - "\n", - "![](https://pynative.com/wp-content/uploads/2021/03/python-sets.jpg)\n", - "\n", - "can be created in two ways: via the `set` function or via a `set literal` with curly braces:\n" - ], - "id": "c94a57a7" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "set([2, 2, 2, 1, 3, 3])\n", - "\n", - "{2,2,1,3,3}" - ], - "id": "66cec67e", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Sets support mathematical set operations like union, intersection, difference, and symmetric difference.\n", - "\n", - "The `union` of these two sets:\n" - ], - "id": "15b1d09a" - }, - { - "cell_type": 
"code", - "metadata": {}, - "source": [ - "a = {1, 2, 3, 4, 5}\n", - "\n", - "b = {3, 4, 5, 6, 7, 8}\n", - "\n", - "a.union(b)\n", - "\n", - "a | b" - ], - "id": "94ff1da4", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The `&`operator or the `intersection` method\n" - ], - "id": "89c317e3" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "a.intersection(b)\n", - "\n", - "a & b" - ], - "id": "42c1d2d2", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "[A table of commonly used `set` methods](https://wesmckinney.com/book/python-builtin.html#tbl-table_set_operations)\n", - "\n", - "All of the logical set operations have in-place counterparts, which enable you to replace the contents of the set on the left side of the operation with the result. For very large sets, this may be more efficient\n" - ], - "id": "28fe8d44" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "c = a.copy()\n", - "\n", - "c |= b\n", - "\n", - "c" - ], - "id": "0c2cabfc", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "d = a.copy()\n", - "\n", - "d &= b\n", - "\n", - "d" - ], - "id": "1abae511", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "set elements generally must be immutable, and they must be hashable\n", - "\n", - "you can convert them to tuples\n", - "\n", - "You can also check if a set is a subset of (is contained in) or a superset of (contains all elements of) another set\n" - ], - "id": "1392b9bb" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "a_set = {1, 2, 3, 4, 5}\n", - "\n", - "{1, 2, 3}.issubset(a_set)" - ], - "id": "a2b0255a", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "a_set.issuperset({1, 2, 3})" - ], - "id": "be21a41e", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Built-In Sequence Functions\n", - "\n", - "### enumerate\n", - "\n", - "`enumerate` returns a sequence of (i, value) tuples\n", - "\n", - "### sorted\n", - "\n", - "`sorted` returns a new sorted list \n" - ], - "id": "b529d0fa" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "sorted([7,1,2,9,3,6,5,0,22])" - ], - "id": "6cf904db", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### zip\n", - "\n", - "`zip` “pairs” up the elements of a number of lists, tuples, or other sequences to create a list of tuples\n" - ], - "id": "8e1ec3d4" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "seq1 = [\"foo\", \"bar\", \"baz\"]\n", - "\n", - "seq2 = [\"one\", \"two\", \"three\"]\n", - "\n", - "zipped = zip(seq1, seq2)\n", - "\n", - "list(zipped)" - ], - "id": "1a835a28", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "`zip` can take an arbitrary number of sequences, and the number of elements it produces is determined by the shortest sequence\n" - ], - "id": "3c5671a3" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "seq3 = [False, True]\n", - "\n", - "list(zip(seq1, seq2, seq3))" - ], - "id": "e1c6cdc5", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "A common 
use of `zip` is simultaneously iterating over multiple sequences, possibly also combined with `enumerate`\n" - ], - "id": "2bd5377e" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "for index, (a, b) in enumerate(zip(seq1, seq2)):\n", - " print(f\"{index}: {a}, {b}\")" - ], - "id": "67f1d759", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "`reversed` iterates over the elements of a sequence in reverse order\n" - ], - "id": "898bcddb" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "list(reversed(range(10)))" - ], - "id": "3acad655", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## List, Set, and Dictionary Comprehensions\n", - "\n", - "```\n", - "[expr for value in collection if condition]\n", - "```\n", - "\n", - "For example, given a list of strings, we could filter out strings with length 2 or less and convert them to uppercase like this\n" - ], - "id": "570d0ca5" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "strings = [\"a\", \"as\", \"bat\", \"car\", \"dove\", \"python\"]\n", - "\n", - "[x.upper() for x in strings if len(x) > 2]" - ], - "id": "a85069bb", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "A dictionary comprehension looks like this\n", - "\n", - "```\n", - "dict_comp = {key-expr: value-expr for value in collection\n", - " if condition}\n", - "```\n", - "\n", - "Suppose we wanted a set containing just the lengths of the strings contained in the collection\n" - ], - "id": "bec26986" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "unique_lengths = {len(x) for x in strings}\n", - "\n", - "unique_lengths" - ], - "id": "934c70c9", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "we could create a lookup map of these strings for their locations in the list\n" - ], - "id": "99afc7b2" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "loc_mapping = {value: index for index, value in enumerate(strings)}\n", - "\n", - "loc_mapping" - ], - "id": "3c7831b9", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Nested list comprehensions\n", - "\n", - "Suppose we have a list of lists containing some English and Spanish names. 
We want to get a single list containing all names with two or more a’s in them\n" - ], - "id": "7a1641d0" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "all_data = [[\"John\", \"Emily\", \"Michael\", \"Mary\", \"Steven\"],\n", - " [\"Maria\", \"Juan\", \"Javier\", \"Natalia\", \"Pilar\"]]\n", - "\n", - "result = [name for names in all_data for name in names\n", - " if name.count(\"a\") >= 2]\n", - "\n", - "result" - ], - "id": "a1907e85", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Here is another example where we “flatten” a list of tuples of integers into a simple list of integers\n" - ], - "id": "f3443856" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "some_tuples = [(1, 2, 3), (4, 5, 6), (7, 8, 9)]\n", - "\n", - "flattened = [x for tup in some_tuples for x in tup]\n", - "\n", - "flattened" - ], - "id": "5364c62c", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Functions\n", - "\n", - "![](https://miro.medium.com/max/1200/1*ZegxhR33NdeVRpBPYXnYYQ.gif)\n", - "\n", - "`Functions` are the primary and most important method of code organization and reuse in Python.\n", - "\n", - "they use the `def` keyword\n", - "\n", - "Each function can have positional arguments and keyword arguments. Keyword arguments are most commonly used to specify default values or optional arguments. Here we will define a function with an optional z argument with the default value 1.5\n" - ], - "id": "3f9b68fd" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "def my_function(x, y, z=1.5):\n", - " return (x + y) * z \n", - "\n", - "my_function(4,25)" - ], - "id": "3a12510c", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The main restriction on function arguments is that the keyword arguments must follow the positional arguments\n", - "\n", - "## Namespaces, Scope, and Local Functions\n", - "\n", - "A more descriptive name describing a variable scope in Python is a namespace.\n", - "\n", - "Consider the following function\n" - ], - "id": "32f01707" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "a = []\n", - "\n", - "def func():\n", - " for i in range(5):\n", - " a.append(i)" - ], - "id": "08d2d758", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "When `func()` is called, the empty list a is created, five elements are appended, and then a is destroyed when the function exits. 
\n" - ], - "id": "1ae900a5" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "func()\n", - "\n", - "func()\n", - "\n", - "a" - ], - "id": "64984775", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Returing Multiple Values\n", - "\n", - "What’s happening here is that the function is actually just returning one object, a tuple, which is then being unpacked into the result variables.\n" - ], - "id": "2d3ec95b" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "def f():\n", - " a = 5\n", - " b = 6\n", - " c = 7\n", - " return a, b, c\n", - "\n", - "a, b, c = f()\n", - "\n", - "a" - ], - "id": "c71b2067", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Functions are Objects\n", - "\n", - " Suppose we were doing some data cleaning and needed to apply a bunch of transformations to the following list of strings:\n" - ], - "id": "33bd0d19" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "states = [\" Alabama \", \"Georgia!\", \"Georgia\", \"georgia\", \"FlOrIda\",\n", - " \"south carolina##\", \"West virginia?\"]\n", - "\n", - "import re\n", - "\n", - "def clean_strings(strings):\n", - " result = []\n", - " for value in strings:\n", - " value = value.strip()\n", - " value = re.sub(\"[!#?]\", \"\", value)\n", - " value = value.title()\n", - " result.append(value)\n", - " return result\n", - "\n", - "clean_strings(states)" - ], - "id": "4e52305f", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Another approach\n" - ], - "id": "32f8957b" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "def remove_punctuation(value):\n", - " return re.sub(\"[!#?]\", \"\", value)\n", - "\n", - "clean_ops = [str.strip, remove_punctuation, str.title]\n", - "\n", - "def clean_strings(strings, ops):\n", - " result = []\n", - " for value in strings:\n", - " for func in ops:\n", - " value = func(value)\n", - " result.append(value)\n", - " return result\n", - "\n", - "clean_strings(states, clean_ops)" - ], - "id": "c1387f42", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "You can use functions as arguments to other functions like the built-in `map` function\n" - ], - "id": "8e2c195c" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "for x in map(remove_punctuation, states):\n", - " print(x)" - ], - "id": "0b50d1c4", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Anonymous Lambda Functions\n", - "\n", - " a way of writing functions consisting of a single statement\n", - "\n", - "suppose you wanted to sort a collection of strings by the number of distinct letters in each string\n" - ], - "id": "fa443fcc" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "strings = [\"foo\", \"card\", \"bar\", \"aaaaaaa\", \"ababdo\"]\n", - "\n", - "strings.sort(key=lambda x: len(set(x)))\n", - "\n", - "strings" - ], - "id": "112bd95d", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Generators\n", - "\n", - "Many objects in Python support iteration, such as over objects in a list or lines in a file. 
\n" - ], - "id": "9d37edd1" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "some_dict = {\"a\": 1, \"b\": 2, \"c\": 3}\n", - "\n", - "for key in some_dict:\n", - " print(key)" - ], - "id": "825ad051", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Most methods expecting a list or list-like object will also accept any iterable object. This includes built-in methods such as `min`, `max`, and `sum`, and type constructors like `list` and `tuple`\n", - "\n", - "A `generator` is a convenient way, similar to writing a normal function, to construct a new iterable object. Whereas normal functions execute and return a single result at a time, generators can return a sequence of multiple values by pausing and resuming execution each time the generator is used. To create a generator, use the yield keyword instead of return in a function\n" - ], - "id": "540ac627" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "def squares(n=10):\n", - " print(f\"Generating squares from 1 to {n ** 2}\")\n", - " for i in range(1, n + 1):\n", - " yield i ** 2\n", - "\n", - "gen = squares()\n", - "\n", - "for x in gen:\n", - " print(x, end=\" \")" - ], - "id": "5a8cad28", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "> Since generators produce output one element at a time versus an entire list all at once, it can help your program use less memory.\n", - "\n", - "## Generator expressions\n", - "\n", - " This is a generator analogue to list, dictionary, and set comprehensions. To create one, enclose what would otherwise be a list comprehension within parentheses instead of brackets:\n" - ], - "id": "995e7b33" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "gen = (x ** 2 for x in range(100))\n", - "\n", - "gen" - ], - "id": "5500c6d0", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Generator expressions can be used instead of list comprehensions as function arguments in some cases:\n" - ], - "id": "bd9d48c2" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "sum(x ** 2 for x in range(100))" - ], - "id": "e825686f", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "dict((i, i ** 2) for i in range(5))" - ], - "id": "fa743b19", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## itertools module\n", - "\n", - "`itertools` module has a collection of generators for many common data algorithms.\n", - "\n", - "`groupby` takes any sequence and a function, grouping consecutive elements in the sequence by return value of the function\n" - ], - "id": "97a6a95e" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "import itertools\n", - "\n", - "def first_letter(x):\n", - " return x[0]\n", - "\n", - "names = [\"Alan\", \"Adam\", \"Jackie\", \"Lily\", \"Katie\", \"Molly\"]\n", - "\n", - "for letter, names in itertools.groupby(names, first_letter):\n", - " print(letter, list(names))" - ], - "id": "b79a21f8", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "[Table of other itertools functions](https://wesmckinney.com/book/python-builtin.html#tbl-table_itertools)\n", - "\n", - "# Errors and Exception Handling\n", - "\n", - "Handling errors or exceptions gracefully is an 
important part of building robust programs\n" - ], - "id": "93a7f023" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "def attempt_float(x):\n", - " try:\n", - " return float(x)\n", - " except:\n", - " return x\n", - "\n", - "attempt_float(\"1.2345\")" - ], - "id": "2734a8ab", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "attempt_float(\"something\")" - ], - "id": "eefd78d6", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "You might want to suppress only ValueError, since a TypeError (the input was not a string or numeric value) might indicate a legitimate bug in your program. To do that, write the exception type after except:\n" - ], - "id": "fd55caa9" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "def attempt_float(x):\n", - " try:\n", - " return float(x)\n", - " except ValueError:\n", - " return x" - ], - "id": "57e7a361", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "#| eval: false\n", - "attempt_float((1, 2))" - ], - "id": "f00f53d1", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "```\n", - "---------------------------------------------------------------------------\n", - "TypeError Traceback (most recent call last)\n", - "d:\\packages\\bookclub-py4da\\03_notes.qmd in ()\n", - "----> 1001 attempt_float((1, 2))\n", - "\n", - "Input In [114], in attempt_float(x)\n", - " 1 def attempt_float(x):\n", - " 2 try:\n", - "----> 3 return float(x)\n", - " 4 except ValueError:\n", - " 5 return x\n", - "\n", - "TypeError: float() argument must be a string or a real number, not 'tuple'\n", - "\n", - "```\n", - "\n", - "You can catch multiple exception types by writing a tuple of exception types instead (the parentheses are required):\n" - ], - "id": "b4e72e5b" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "def attempt_float(x):\n", - " try:\n", - " return float(x)\n", - " except (TypeError, ValueError):\n", - " return x\n", - "\n", - "attempt_float((1, 2))" - ], - "id": "aa6a72b0", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "In some cases, you may not want to suppress an exception, but you want some code to be executed regardless of whether or not the code in the try block succeeds. To do this, use `finally`:\n" - ], - "id": "eab1031f" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "#| eval: false\n", - "f = open(path, mode=\"w\")\n", - "\n", - "try:\n", - " write_to_file(f)\n", - "finally:\n", - " f.close()" - ], - "id": "29ba6b1d", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Here, the file object f will always get closed. 
\n", - "\n", - "you can have code that executes only if the try: block succeeds using else:\n" - ], - "id": "33eca894" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "#| eval: false\n", - "f = open(path, mode=\"w\")\n", - "\n", - "try:\n", - " write_to_file(f)\n", - "except:\n", - " print(\"Failed\")\n", - "else:\n", - " print(\"Succeeded\")\n", - "finally:\n", - " f.close()" - ], - "id": "593dd0bc", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Exceptions in IPython\n", - "\n", - "If an exception is raised while you are %run-ing a script or executing any statement, IPython will by default print a full call stack trace. Having additional context by itself is a big advantage over the standard Python interpreter\n", - "\n", - "# Files and the Operating System\n", - "\n", - "To open a file for reading or writing, use the built-in open function with either a relative or absolute file path and an optional file encoding.\n", - "\n", - "We can then treat the file object f like a list and iterate over the lines\n" - ], - "id": "7f87df7f" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "#| eval: false\n", - "\n", - "path = \"examples/segismundo.txt\"\n", - "\n", - "f = open(path, encoding=\"utf-8\")\n", - "\n", - "lines = [x.rstrip() for x in open(path, encoding=\"utf-8\")]\n", - "\n", - "lines" - ], - "id": "cf0a4c7a", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "When you use open to create file objects, it is recommended to close the file\n" - ], - "id": "39af0d41" - }, - { - "cell_type": "code", - "metadata": {}, - "source": [ - "#| eval: false\n", - "f.close()" - ], - "id": "61abfc07", - "execution_count": null, - "outputs": [] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "some of the most commonly used methods are `read`, `seek`, and `tell`.\n", - "\n", - "`read(10)` returns 10 characters from the file\n", - "\n", - "the `read` method advances the file object position by the number of bytes read\n", - "\n", - "`tell()` gives you the current position in the file\n", - "\n", - "To get consistent behavior across platforms, it is best to pass an encoding (such as `encoding=\"utf-8\"`)\n", - "\n", - "`seek(3)` changes the file position to the indicated byte \n", - "\n", - "To write text to a file, you can use the file’s `write` or `writelines` methods\n", - "\n", - "## Byte and Unicode with Files\n", - "\n", - "The default behavior for Python files (whether readable or writable) is text mode, which means that you intend to work with Python strings (i.e., Unicode). \n" - ], - "id": "fb339453" - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - } - }, - "nbformat": 4, - "nbformat_minor": 5 -} \ No newline at end of file diff --git a/03_notes.qmd b/03_notes.qmd index e5b986b..b65567e 100644 --- a/03_notes.qmd +++ b/03_notes.qmd @@ -1,1115 +1 @@ -# Data Structures and Sequences - -## Tuples - -![](https://pynative.com/wp-content/uploads/2021/02/python-tuple.jpg) - -A tuple is a fixed-length, immutable sequence of Python objects which, once assigned, cannot be changed. 
The easiest way to create one is with a comma-separated sequence of values wrapped in parentheses: - -```{python} -tup = (4, 5, 6) -tup -``` - -In many contexts, the parentheses can be omitted - -```{python} -tup = 4, 5, 6 -tup -``` - -You can convert any sequence or iterator to a tuple by invoking - -```{python} -tuple([4,0,2]) - -tup = tuple('string') - -tup -``` - -Elements can be accessed with square brackets [] - -Note the zero indexing - -```{python} -tup[0] - -``` - -Tuples of tuples - -```{python} -nested_tup = (4,5,6),(7,8) - -nested_tup - -``` - -```{python} -nested_tup[0] -``` - -```{python} -nested_tup[1] -``` - -While the objects stored in a tuple may be mutable themselves, once the tuple is created it’s not possible to modify which object is stored in each slot: - -```{python} -tup = tuple(['foo', [1, 2], True]) - -tup[2] - -``` - -```{{python}} - -tup[2] = False - -``` - -```` -TypeError Traceback (most recent call last) -Input In [9], in () -----> 1 tup[2] = False - -TypeError: 'tuple' object does not support item assignment -TypeError: 'tuple' object does not support item assignment -```` - -If an object inside a tuple is mutable, such as a list, you can modify it in place - -```{python} -tup[1].append(3) - -tup -``` - -You can concatenate tuples using the + operator to produce longer tuples: - -```{python} -(4, None, 'foo') + (6, 0) + ('bar',) - -``` - -### Unpacking tuples - -If you try to assign to a tuple-like expression of variables, Python will attempt to unpack the value on the righthand side of the equals sign: - -```{python} -tup = (4, 5, 6) -tup -``` - -```{python} - -a, b, c = tup - -c - -``` - -Even sequences with nested tuples can be unpacked: - -```{python} - -tup = 4, 5, (6,7) - -a, b, (c, d) = tup - -d -``` - -To easily swap variable names - -```{python} -a, b = 1, 4 - -a - -``` - -```{python} -b -``` - -```{python} -b, a = a, b - -a -``` - -```{python} -b -``` - -A common use of variable unpacking is iterating over sequences of tuples or lists - -```{python} -seq = [(1, 2, 3), (4, 5, 6), (7, 8, 9)] - -seq -``` - -```{python} -for a, b, c in seq: - print(f'a={a}, b={b}, c={c}') - -``` - -`*rest` syntax for plucking elements - -```{python} -values = 1,2,3,4,5 - -a, b, *rest = values - -rest - -``` - - As a matter of convention, many Python programmers will use the underscore (_) for unwanted variables: - -```{python} - -a, b, *_ = values - -``` - -### Tuple methods - -Since the size and contents of a tuple cannot be modified, it is very light on instance methods. A particularly useful one (also available on lists) is `count` - -```{python} -a = (1,2,2,2,2,3,4,5,7,8,9) - -a.count(2) - -``` - -## List - -![](https://pynative.com/wp-content/uploads/2021/03/python-list.jpg) - -In contrast with tuples, lists are variable length and their contents can be modified in place. - -Lists are mutable. - -Lists use `[]` square brackts or the `list` function - -```{python} - -a_list = [2, 3, 7, None] - -tup = ("foo", "bar", "baz") - -b_list = list(tup) - -b_list - -``` - -```{python} - -b_list[1] = "peekaboo" - -b_list -``` - -Lists and tuples are semantically similar (though tuples cannot be modified) and can be used interchangeably in many functions. 
- -```{python} -gen = range(10) - -gen -``` - -```{python} - -list(gen) - -``` - -### Adding and removing list elements - -the `append` method - -```{python} - -b_list.append("dwarf") - -b_list -``` - -the `insert` method - -```{python} - -b_list.insert(1, "red") - -b_list -``` - -`insert` is computationally more expensive than `append` - -the `pop` method, the inverse of `insert` - -```{python} - -b_list.pop(2) - -``` - -```{python} - -b_list -``` - -the `remove` method - -```{python} - -b_list.append("foo") - -b_list -``` - -```{python} - -b_list.remove("foo") - -b_list - -``` - -Check if a list contains a value using the `in` keyword: - -```{python} - -"dwarf" in b_list - -``` - -The keyword `not` can be used to negate an `in` - -```{python} -"dwarf" not in b_list - -``` - -### Concatenating and combining lists - -similar with tuples, use `+` to concatenate - -```{python} -[4, None, "foo"] + [7, 8, (2, 3)] - -``` - -the `extend` method - -```{python} -x = [4, None, "foo"] - -x.extend([7,8,(2,3)]) - -x -``` - -list concatenation by addition is an expensive operation - -using `extend` is preferable - -```{{python}} -everything = [] -for chunk in list_of_lists: - everything.extend(chunk) - -``` - -is generally faster than - -```{{python}} - -everything = [] -for chunk in list_of_lists: - everything = everything + chunk - -``` - -### Sorting - -the `sort` method - -```{python} - -a = [7, 2, 5, 1, 3] - -a.sort() - -a - -``` - -`sort` options - -```{python} -b = ["saw", "small", "He", "foxes", "six"] - -b.sort(key = len) - -b - -``` - -### Slicing - -Slicing semantics takes a bit of getting used to, especially if you’re coming from R or MATLAB. - -using the indexing operator `[]` - -```{python} - -seq = [7, 2, 3, 7, 5, 6, 0, 1] - -seq[3:5] - -``` - -also assigned with a sequence - -```{python} - -seq[3:5] = [6,3] - -seq -``` - -Either the `start` or `stop` can be omitted - -```{python} -seq[:5] - -``` - -```{python} -seq[3:] -``` - -Negative indices slice the sequence relative to the end: - -```{python} -seq[-4:] -``` - -A step can also be used after a second colon to, say, take every other element: - -```{python} -seq[::2] - -``` - -A clever use of this is to pass -1, which has the useful effect of reversing a list or tuple: - -```{python} -seq[::-1] -``` - -## Dictionary - -![](https://pynative.com/wp-content/uploads/2021/02/dictionaries-in-python.jpg) - -The dictionary or dict may be the most important built-in Python data structure. 
- -One approach for creating a dictionary is to use curly braces {} and colons to separate keys and values: - -```{python} -empty_dict = {} - -d1 = {"a": "some value", "b": [1, 2, 3, 4]} - -d1 -``` - -access, insert, or set elements - -```{python} - -d1[7] = "an integer" - -d1 - -``` - -and as before - -```{python} -"b" in d1 -``` - -the `del` and `pop` methods - -```{python} - -del d1[7] - -d1 -``` - -```{python} - -ret = d1.pop("a") - -ret -``` - -The `keys` and `values` methods - -```{python} -list(d1.keys()) -``` - -```{python} -list(d1.values()) -``` - -the `items` method - -```{python} -list(d1.items()) - -``` - - the update method to merge one dictionary into another - -```{python} - -d1.update({"b": "foo", "c": 12}) - -d1 - -``` - - ### Creating dictionaries from sequences - -```{python} -list(range(5)) -``` - -```{python} -tuples = zip(range(5), reversed(range(5))) - -tuples - -mapping = dict(tuples) - -mapping - -``` - -### Default values - -imagine categorizing a list of words by their first letters as a dictionary of lists - -```{python} -words = ["apple", "bat", "bar", "atom", "book"] - -by_letter = {} - -for word in words: - letter = word[0] - if letter not in by_letter: - by_letter[letter] = [word] - else: - by_letter[letter].append(word) - -by_letter -``` - -The `setdefault` dictionary method can be used to simplify this workflow. The preceding for loop can be rewritten as: - -```{python} -by_letter = {} - -for word in words: - letter = word[0] - by_letter.setdefault(letter, []).append(word) - -by_letter -``` -The built-in `collections`module has a useful class, `defaultdict`, which makes this even easier. - -```{python} -from collections import defaultdict - -by_letter = defaultdict(list) - -for word in words: - by_letter[word[0]].append(word) - -by_letter -``` - -### Valid dictionary key types - -keys generally have to be immutable objects like scalars or tuples for *hashability* - -To use a list as a key, one option is to convert it to a tuple, which can be hashed as long as its elements also can be: - -```{python} -d = {} - -d[tuple([1,2,3])] = 5 - -d - -``` - -## Set - -![](https://pynative.com/wp-content/uploads/2021/03/python-sets.jpg) - -can be created in two ways: via the `set` function or via a `set literal` with curly braces: - -```{python} - -set([2, 2, 2, 1, 3, 3]) - -{2,2,1,3,3} - -``` - -Sets support mathematical set operations like union, intersection, difference, and symmetric difference. - -The `union` of these two sets: - -```{python} -a = {1, 2, 3, 4, 5} - -b = {3, 4, 5, 6, 7, 8} - -a.union(b) - -a | b - -``` - -The `&`operator or the `intersection` method - -```{python} -a.intersection(b) - -a & b - -``` - -[A table of commonly used `set` methods](https://wesmckinney.com/book/python-builtin.html#tbl-table_set_operations) - -All of the logical set operations have in-place counterparts, which enable you to replace the contents of the set on the left side of the operation with the result. 
For very large sets, this may be more efficient - - -```{python} - -c = a.copy() - -c |= b - -c -``` - - -```{python} - -d = a.copy() - -d &= b - -d -``` - -set elements generally must be immutable, and they must be hashable - -you can convert them to tuples - -You can also check if a set is a subset of (is contained in) or a superset of (contains all elements of) another set - -```{python} -a_set = {1, 2, 3, 4, 5} - -{1, 2, 3}.issubset(a_set) - -``` - -```{python} - -a_set.issuperset({1, 2, 3}) - -``` - -## Built-In Sequence Functions - -### enumerate - -`enumerate` returns a sequence of (i, value) tuples - -### sorted - -`sorted` returns a new sorted list - -```{python} - -sorted([7,1,2,9,3,6,5,0,22]) - -``` - -### zip - -`zip` “pairs” up the elements of a number of lists, tuples, or other sequences to create a list of tuples - -```{python} -seq1 = ["foo", "bar", "baz"] - -seq2 = ["one", "two", "three"] - -zipped = zip(seq1, seq2) - -list(zipped) - -``` - -`zip` can take an arbitrary number of sequences, and the number of elements it produces is determined by the shortest sequence - -```{python} -seq3 = [False, True] - -list(zip(seq1, seq2, seq3)) - -``` - -A common use of `zip` is simultaneously iterating over multiple sequences, possibly also combined with `enumerate` - -```{python} -for index, (a, b) in enumerate(zip(seq1, seq2)): - print(f"{index}: {a}, {b}") - -``` - -`reversed` iterates over the elements of a sequence in reverse order - -```{python} - -list(reversed(range(10))) -``` - -## List, Set, and Dictionary Comprehensions - -``` -[expr for value in collection if condition] -``` - -For example, given a list of strings, we could filter out strings with length 2 or less and convert them to uppercase like this - -```{python} -strings = ["a", "as", "bat", "car", "dove", "python"] - -[x.upper() for x in strings if len(x) > 2] - -``` - -A dictionary comprehension looks like this - -``` -dict_comp = {key-expr: value-expr for value in collection - if condition} -``` - -Suppose we wanted a set containing just the lengths of the strings contained in the collection - -```{python} -unique_lengths = {len(x) for x in strings} - -unique_lengths - -``` - -we could create a lookup map of these strings for their locations in the list - -```{python} -loc_mapping = {value: index for index, value in enumerate(strings)} - -loc_mapping - -``` -## Nested list comprehensions - -Suppose we have a list of lists containing some English and Spanish names. We want to get a single list containing all names with two or more a’s in them - -```{python} -all_data = [["John", "Emily", "Michael", "Mary", "Steven"], - ["Maria", "Juan", "Javier", "Natalia", "Pilar"]] - -result = [name for names in all_data for name in names - if name.count("a") >= 2] - -result -``` - -Here is another example where we “flatten” a list of tuples of integers into a simple list of integers - -```{python} -some_tuples = [(1, 2, 3), (4, 5, 6), (7, 8, 9)] - -flattened = [x for tup in some_tuples for x in tup] - -flattened - -``` - -# Functions - -![](https://miro.medium.com/max/1200/1*ZegxhR33NdeVRpBPYXnYYQ.gif) - -`Functions` are the primary and most important method of code organization and reuse in Python. - -they use the `def` keyword - -Each function can have positional arguments and keyword arguments. Keyword arguments are most commonly used to specify default values or optional arguments. 
Here we will define a function with an optional z argument with the default value 1.5 - -```{python} - -def my_function(x, y, z=1.5): - return (x + y) * z - -my_function(4,25) - -``` - -The main restriction on function arguments is that the keyword arguments must follow the positional arguments - -## Namespaces, Scope, and Local Functions - -A more descriptive name describing a variable scope in Python is a namespace. - -Consider the following function - -```{python} -a = [] - -def func(): - for i in range(5): - a.append(i) - -``` - -When `func()` is called, the empty list a is created, five elements are appended, and then a is destroyed when the function exits. - -```{python} -func() - -func() - -a -``` - -## Returing Multiple Values - -What’s happening here is that the function is actually just returning one object, a tuple, which is then being unpacked into the result variables. - -```{python} -def f(): - a = 5 - b = 6 - c = 7 - return a, b, c - -a, b, c = f() - -a - -``` - -## Functions are Objects - - Suppose we were doing some data cleaning and needed to apply a bunch of transformations to the following list of strings: - -```{python} -states = [" Alabama ", "Georgia!", "Georgia", "georgia", "FlOrIda", - "south carolina##", "West virginia?"] - -import re - -def clean_strings(strings): - result = [] - for value in strings: - value = value.strip() - value = re.sub("[!#?]", "", value) - value = value.title() - result.append(value) - return result - -clean_strings(states) -``` - -Another approach - -```{python} -def remove_punctuation(value): - return re.sub("[!#?]", "", value) - -clean_ops = [str.strip, remove_punctuation, str.title] - -def clean_strings(strings, ops): - result = [] - for value in strings: - for func in ops: - value = func(value) - result.append(value) - return result - -clean_strings(states, clean_ops) - -``` - -You can use functions as arguments to other functions like the built-in `map` function - -```{python} - -for x in map(remove_punctuation, states): - print(x) -``` - -## Anonymous Lambda Functions - - a way of writing functions consisting of a single statement - -suppose you wanted to sort a collection of strings by the number of distinct letters in each string - -```{python} -strings = ["foo", "card", "bar", "aaaaaaa", "ababdo"] - -strings.sort(key=lambda x: len(set(x))) - -strings -``` - -# Generators - -Many objects in Python support iteration, such as over objects in a list or lines in a file. - -```{python} -some_dict = {"a": 1, "b": 2, "c": 3} - -for key in some_dict: - print(key) - -``` - -Most methods expecting a list or list-like object will also accept any iterable object. This includes built-in methods such as `min`, `max`, and `sum`, and type constructors like `list` and `tuple` - -A `generator` is a convenient way, similar to writing a normal function, to construct a new iterable object. Whereas normal functions execute and return a single result at a time, generators can return a sequence of multiple values by pausing and resuming execution each time the generator is used. To create a generator, use the yield keyword instead of return in a function - -```{python} -def squares(n=10): - print(f"Generating squares from 1 to {n ** 2}") - for i in range(1, n + 1): - yield i ** 2 - -gen = squares() - -for x in gen: - print(x, end=" ") -``` - -> Since generators produce output one element at a time versus an entire list all at once, it can help your program use less memory. 
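-
-A rough way to see this (a sketch, not from the book) is to compare the size of a fully materialized list with the size of the equivalent generator object:
-
-```{python}
-import sys
-
-# the list materializes one million results up front
-sys.getsizeof([x ** 2 for x in range(1_000_000)])
-```
-
-```{python}
-# the generator object only stores its execution state
-sys.getsizeof((x ** 2 for x in range(1_000_000)))
-```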
- -## Generator expressions - - This is a generator analogue to list, dictionary, and set comprehensions. To create one, enclose what would otherwise be a list comprehension within parentheses instead of brackets: - -```{python} -gen = (x ** 2 for x in range(100)) - -gen - -``` - -Generator expressions can be used instead of list comprehensions as function arguments in some cases: - -```{python} - -sum(x ** 2 for x in range(100)) - -``` - -```{python} -dict((i, i ** 2) for i in range(5)) -``` - -## itertools module - -`itertools` module has a collection of generators for many common data algorithms. - -`groupby` takes any sequence and a function, grouping consecutive elements in the sequence by return value of the function - -```{python} -import itertools - -def first_letter(x): - return x[0] - -names = ["Alan", "Adam", "Jackie", "Lily", "Katie", "Molly"] - -for letter, names in itertools.groupby(names, first_letter): - print(letter, list(names)) - -``` - -[Table of other itertools functions](https://wesmckinney.com/book/python-builtin.html#tbl-table_itertools) - -# Errors and Exception Handling - -Handling errors or exceptions gracefully is an important part of building robust programs - -```{python} -def attempt_float(x): - try: - return float(x) - except: - return x - -attempt_float("1.2345") - -``` - -```{python} -attempt_float("something") -``` - -You might want to suppress only ValueError, since a TypeError (the input was not a string or numeric value) might indicate a legitimate bug in your program. To do that, write the exception type after except: - -```{python} -def attempt_float(x): - try: - return float(x) - except ValueError: - return x - -``` - -```{python} -#| eval: false -attempt_float((1, 2)) -``` - -``` ---------------------------------------------------------------------------- -TypeError Traceback (most recent call last) -d:\packages\bookclub-py4da\03_notes.qmd in () -----> 1001 attempt_float((1, 2)) - -Input In [114], in attempt_float(x) - 1 def attempt_float(x): - 2 try: -----> 3 return float(x) - 4 except ValueError: - 5 return x - -TypeError: float() argument must be a string or a real number, not 'tuple' - -``` - -You can catch multiple exception types by writing a tuple of exception types instead (the parentheses are required): - -```{python} -def attempt_float(x): - try: - return float(x) - except (TypeError, ValueError): - return x - -attempt_float((1, 2)) - -``` - -In some cases, you may not want to suppress an exception, but you want some code to be executed regardless of whether or not the code in the try block succeeds. To do this, use `finally`: - -```{python} -#| eval: false -f = open(path, mode="w") - -try: - write_to_file(f) -finally: - f.close() - -``` - -Here, the file object f will always get closed. - -you can have code that executes only if the try: block succeeds using else: - -```{python} -#| eval: false -f = open(path, mode="w") - -try: - write_to_file(f) -except: - print("Failed") -else: - print("Succeeded") -finally: - f.close() - -``` - -## Exceptions in IPython - -If an exception is raised while you are %run-ing a script or executing any statement, IPython will by default print a full call stack trace. Having additional context by itself is a big advantage over the standard Python interpreter - -# Files and the Operating System - -To open a file for reading or writing, use the built-in open function with either a relative or absolute file path and an optional file encoding. 
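-
-One habit worth adopting early (an aside, reusing the example file from the code below): a `with` block closes the file automatically when the block exits
-
-```{python}
-#| eval: false
-with open("examples/segismundo.txt", encoding="utf-8") as f:
-    first_line = f.readline()
-```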
- -We can then treat the file object f like a list and iterate over the lines - -```{python} -#| eval: false - -path = "examples/segismundo.txt" - -f = open(path, encoding="utf-8") - -lines = [x.rstrip() for x in open(path, encoding="utf-8")] - -lines - -``` - -When you use open to create file objects, it is recommended to close the file - -```{python} -#| eval: false -f.close() - -``` - -some of the most commonly used methods are `read`, `seek`, and `tell`. - -`read(10)` returns 10 characters from the file - -the `read` method advances the file object position by the number of bytes read - -`tell()` gives you the current position in the file - -To get consistent behavior across platforms, it is best to pass an encoding (such as `encoding="utf-8"`) - -`seek(3)` changes the file position to the indicated byte - -To write text to a file, you can use the file’s `write` or `writelines` methods - -## Byte and Unicode with Files - -The default behavior for Python files (whether readable or writable) is text mode, which means that you intend to work with Python strings (i.e., Unicode). - +# Notes {-} diff --git a/04_exercises.qmd b/04_exercises.qmd new file mode 100644 index 0000000..465ff4d --- /dev/null +++ b/04_exercises.qmd @@ -0,0 +1 @@ +# Exercises {-} diff --git a/04_main.qmd b/04_main.qmd index fef1d88..b6adffc 100644 --- a/04_main.qmd +++ b/04_main.qmd @@ -1,749 +1,7 @@ -# 4. NumPy Basics: Arrays and Vectorized Computation +# 4. Classification ## Learning Objectives -- Learn about NumPy, a package for numerical computing in Python -- Use NumPy for array-based data: operations, algorithms - -## Import NumPy - -```{python} -import numpy as np # Recommended standard NumPy convention -``` - -## Array-based operations -* A fast, flexible container for large datasets in Python -* Stores multiple items of the same type together -* Can perform operations on whole blocks of data with similar syntax - -![Image of an array with 10 length and the first index, 8th element, and indicies denoted by text](https://media.geeksforgeeks.org/wp-content/uploads/CommonArticleDesign1-min.png) - -::: {.panel-tabset} - -## Create an array -```{python} -arr = np.array([[1.5, -0.1, 3], [0, -3, 6.5]]) -arr -``` - -## Perform operation -All of the elements have been multiplied by 10. 
- -```{python} -arr * 10 -``` -::: - -* Every array has a `shape` indicating the size of each dimension -* and a `dtype`, an object describing the data type of the array - -::: {.panel-tabset} - -## Shape -```{python} -arr.shape -``` - -## dtype -```{python} -arr.dtype -``` -::: - -### ndarray - -* Generic one/multi-dimensional container where all elements are the same type -* Created using `numpy.array` function - -::: {.panel-tabset} -## 1D -```{python} -data1 = [6, 7.5, 8, 0, 1] -arr1 = np.array(data1) -arr1 -``` -```{python} -print(arr1.ndim) -print(arr1.shape) -``` - -## Multi-dimensional -```{python} -data2 = [[1, 2, 3, 4], [5, 6, 7, 8]] -arr2 = np.array(data2) -arr2 -``` -```{python} -print(arr2.ndim) -print(arr2.shape) -``` -::: - -#### Special array creation - -* `numpy.zeros` creates an array of zeros with a given length or shape -* `numpy.ones` creates an array of ones with a given length or shape -* `numpy.empty` creates an array without initialized values -* `numpy.arange` creates a range -* Pass a tuple for the shape to create a higher dimensional array - -::: {.panel-tabset} - -## Zeros -```{python} -np.zeros(10) -``` - -## Multi-dimensional -```{python} -np.zeros((3, 6)) -``` -::: - -::: {.column-margin} -`numpy.empty` does not return an array of zeros, though it may look like it. -```{python} -np.empty(1) -``` -::: - -[Wes provides a table of array creation functions in the book.](https://wesmckinney.com/book/numpy-basics.html#tbl-table_array_ctor) - -#### Data types for ndarrays - -* Unless explicitly specified, `numpy.array` tries to infer a good data created arrays. -* Data type is stored in a special `dtype` metadata object. -* Can be explict or converted (cast) -* It is important to care about the general kind of data you’re dealing with. - -::: {.panel-tabset} -## Inferred dtype -```{python} -arr1.dtype -``` -## Explicit dtype -```{python} -arr2 = np.array([1, 2, 3], dtype=np.int32) -arr2.dtype -``` -## Cast dtype -```{python} -float_arr = arr1.astype(np.float64) -float_arr.dtype -``` -## Cast dtype using another array -```{python} -int_array = arr1.astype(arr2.dtype) -int_array.dtype -``` -::: - -::: {.column-margin} -Calling `astype` always creates a new array (a copy of the data), even if the new data type is the same as the old data type. -::: - -[Wes provides a table of supported data types in the book.](https://wesmckinney.com/book/numpy-basics.html#tbl-table_array_dtypes) - -## Arithmetic with NumPy Arrays - -::: {.panel-tabset} -## Vectorization - -Batch operations on data without `for` loops - -```{python} -arr = np.array([[1., 2., 3.], [4., 5., 6.]]) -arr * arr -``` - -## Arithmetic operations with scalars - -Propagate the scalar argument to each element in the array - -```{python} -1 / arr -``` - -## Comparisons between arrays - -of the same size yield boolean arrays - -```{python} -arr2 = np.array([[0., 4., 1.], [7., 2., 12.]]) - -arr2 > arr -``` - -::: - -## Basic Indexing and Slicing - -* select a subset of your data or individual elements - -```{python} -arr = np.arange(10) -arr -``` - -::: {.column-margin} -Array views are on the original data. Data is not copied, and any modifications to the view will be reflected in the source array. If you want a copy of a slice of an ndarray instead of a view, you will need to explicitly copy the array—for example, `arr[5:8].copy()`. 
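-
-A quick sketch of the difference (using a hypothetical `arr_copy`):
-
-```{python}
-arr_copy = arr[5:8].copy()  # an independent copy, not a view
-arr_copy[:] = 0             # modifying the copy...
-arr                         # ...leaves the original unchanged
-```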
-::: - -::: {.panel-tabset} - -## select the sixth element - -```{python} -arr[5] -``` - -## select sixth through eighth - -```{python} -arr[5:8] -``` - -## broadcast data - -```{python} -arr[5:8] = 12 -``` - -::: - -Example of "not copied data" - -**Original** - -```{python} -arr_slice = arr[5:8] -arr -``` - -**Change values in new array** - -Notice that arr is now changed. - -```{python} -arr_slice[1] = 123 -arr -``` - -**Change all values in an array** - -This is done with bare slice `[:]`: - -```{python} -arr_slice[:] = 64 -arr_slice -``` - -Higher dimensional arrays have 1D arrays at each index: - -```{python} -arr2d = np.array([[1,2,3], [4,5,6], [7,8,9]]) -arr2d -``` - -To slice, can pass a comma-separated list to select individual elements: - -```{python} -arr2d[0][2] -``` - -![](https://media.geeksforgeeks.org/wp-content/uploads/Numpy1.jpg) - -Omitting indicies will reduce number of dimensions: - -```{python} -arr2d[0] -``` - -Can assign scalar values or arrays: - -```{python} -arr2d[0] = 9 -arr2d -``` - -Or create an array of the indices. This is like indexing in two steps: - -```{python} -arr2d = np.array([[1,2,3], [4,5,6], [7,8,9]]) -arr2d[1,0] -``` - -### Indexing with slices - -ndarrays can be sliced with the same syntax as Python lists: - -```{python} -arr = np.arange(10) - -arr[1:6] -``` - -This slices a range of elements ("select the first row of `arr2d`"): - -```{python} -# arr2d[row, column] -arr2d[:1] -``` - -Can pass multiple indicies: - -```{python} -arr2d[:3, :1] # colons keep the dimensions -# arr2d[0:3, 0] # does not keep the dimensions -``` - -## Boolean Indexing - -```{python} -names = np.array(["Bob", "Joe", "Will", "Bob", "Will", "Joe", "Joe"]) -names -``` - -```{python} -data = np.array([[4, 7], [0, 2], [-5, 6], [0, 0], [1, 2], [-12, -4], [3, 4]]) -data -``` - -Like arithmetic operations, comparisons (such as `==`) with arrays are also vectorized. - -```{python} -names == "Bob" -``` - -This boolean array can be passed when indexing the array: - -```{python} -data[names == "Bob"] -``` - -Select from the rows where names == "Bob" and index the columns, too: - -```{python} -data[names == "Bob", 1:] -``` - -Select everything but "Bob": - -```{python} -names != "Bob" # or ~(names == "Bob") -``` - -Use boolean arithmetic operators like `&` (and) and `|` (or): - -```{python} -mask = (names == "Bob") | (names == "Will") -mask -``` - -:::{.column-margin} -Selecting data from an array by boolean indexing and assigning the result to a new variable always creates a copy of the data. -::: - -Setting values with boolean arrays works by substituting the value or values on the righthand side into the locations where the boolean array's values are `True`. - -```{python} -data[data < 0] = 0 -``` - -You can also set whole rows or columns using a one-dimensional boolean array: - -```{python} -data[names != "Joe"] = 7 -``` - -## Fancy Indexing - -A term adopted by NumPy to describe indexing using integer arrays. 
- -```{python} -arr = np.zeros((8, 4)) # 8 × 4 array - -for i in range(8): - arr[i] = i - -arr -``` - -Pass a list or ndarray of integers specifying the desired order to subset rows in a particular order: - -```{python} -arr[[4, 3, 0, 6]] -``` - -Use negative indices selects rows from the end: - -```{python} -arr[[-3, -5, -7]] -``` - -Passing multiple index arrays selects a one-dimensional array of elements corresponding to each tuple of indices (go down then across): - -```{python} -arr = np.arange(32).reshape((8, 4)) -arr -``` - -Here, the elements (1, 0), (5, 3), (7, 1), and (2, 2) are selected. - -```{python} -arr[[1, 5, 7, 2], [0, 3, 1, 2]] -``` - -:::{.column-margin} -Fancy indexing, unlike slicing, always copies the data into a new array when assigning the result to a new variable. -::: - -## Transposing Arrays and Swapping Axes - -Transposing is a special form of reshaping using the special `T` attribute: - -```{python} -arr = np.arange(15).reshape((3, 5)) -arr -``` - -```{python} -arr.T -``` - -### Matrix multiplication - -:::{.panel-tabset} -## using `T` - -```{python} -np.dot(arr.T, arr) -``` - -## using `@` infix operator - -```{python} -arr.T @ arr -``` - -::: - -ndarray has the method `swapaxes`, which takes a pair of axis numbers and switches the indicated axes to rearrange the data: - -```{python} -arr = np.array([[0, 1, 0], [1, 2, -2], [6, 3, 2], [-1, 0, -1], [1, 0, 1], [3, 5, 6]]) -arr -arr.swapaxes(0, 1) -``` - -## Pseudorandom Number Generation - -The `numpy.random` module supplements the built-in Python random module with functions for efficiently generating whole arrays of sample values from many kinds of probability distributions. - -* Much faster than Python's built-in `random` module - -```{python} -samples = np.random.standard_normal(size=(4, 4)) -samples -``` - -Can use an explicit generator: - -* `seed` determines initial state of generator -```{python} -rng = np.random.default_rng(seed=12345) -data = rng.standard_normal((2, 3)) -data -``` - -[Wes provides a table of NumPy random number generator methods](https://wesmckinney.com/book/numpy-basics.html#tbl-table_numpy_random) - -## Universal Functions: Fast Element-Wise Array Functions - -A universal function, or ufunc, is a function that performs element-wise operations on data in ndarrays. 
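-
-To make "element-wise" concrete (a sketch, not from the book), a single ufunc call stands in for an explicit Python loop over the elements:
-
-```{python}
-import math
-
-vals = np.arange(4)  # throwaway array for illustration
-
-# one ufunc call...
-np.exp(vals)
-```
-
-```{python}
-# ...does the same work as looping element by element
-np.array([math.exp(x) for x in vals])
-```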
- -Many ufuncs are simple element-wise transformations: - -:::{.panel-tabset} -## unary -One array -```{python} -arr = np.arange(10) -np.sqrt(arr) -``` - -## binary - -```{python} -arr1 = rng.standard_normal(10) -arr2 = rng.standard_normal(10) -np.maximum(arr1, arr2) -``` - -## multiple - -```{python} -remainder, whole_part = np.modf(arr1) -remainder -``` - -::: - -Use the `out` argument to assign results into an existing array rather than create a new one: - -```{python} -out = np.zeros_like(arr) -np.add(arr, 1, out=out) -``` - -## Array-Oriented Programming with Arrays - -Evaluate the function `sqrt(x^2 + y^2)` across a regular grid of values: use the `numpy.meshgrid` function takes two one-dimensional arrays and produce two two-dimensional matrices corresponding to all pairs of (x, y) in the two arrays: - -```{python} -points = np.arange(-5, 5, 0.01) # 100 equally spaced points -xs, ys = np.meshgrid(points, points) -xs -``` - -```{python} -ys -``` - -Evaluate the function as if it were two points: - -```{python} -z = np.sqrt(xs ** 2 + ys ** 2) -z -``` - -### Bonus: matplotlib visualization - -```{python} -import matplotlib.pyplot as plt - -plt.imshow(z, cmap=plt.cm.gray) #, extent=[-25, 10, -10, 10]) -plt.colorbar() -plt.title("Image plot of $\sqrt{x^2 + y^2}$ for a grid of values") -``` - -```{python} -plt.close("all") -``` - -## Expressing Conditional Logic as Array Operations - -The `numpy.where` function is a vectorized version of the ternary expression `x if condition else`. - -* second and third arguments to `numpy.where` can also be scalars -* can also combine scalars and arrays - -```{python} -xarr = np.array([1.1, 1.2, 1.3, 1.4, 1.5]) -yarr = np.array([2.1, 2.2, 2.3, 2.4, 2.5]) -cond = np.array([True, False, True, True, False]) -``` - -Take a value from `xarr` whenever the corresponding value in `cond` is `True`, and otherwise take the value from `yarr`: - -:::{.panel-tabset} - -## `x if condition else` -```{python} -result = [(x if c else y) - for x, y, c in zip(xarr, yarr, cond)] - -result -``` - -## `numpy.where` -```{python} -result = np.where(cond, xarr, yarr) -result -``` -::: - -Can also do this with scalars, or combine arrays and scalars: - -```{python} -arr = rng.standard_normal((4,4)) -arr -``` - -```{python} -np.where(arr > 0, 2, -2) -``` - -```{python} -# set only positive to 2 -np.where(arr > 0,2,arr) -``` - -## Mathematical and Statistical Methods - -Use "aggregations' like `sum`, `mean`, and `std` - -* If using NumPy, must pass the array you want to aggregate as the first argument - -```{python} -arr = rng.standard_normal((5, 4)) - -arr.mean() -``` - -```{python} -np.mean(arr) -``` - -Can use `axis` to specify which axis to computer the statistic - -:::{.panel-tabset} - -## "compute across the columns" -```{python} -arr.mean(axis=1) -``` - -## "compute across the rows" - -```{python} -arr.mean(axis=0) -``` - -::: - -Other methods like cumsum and cumprod do not aggregate, instead producing an array of the intermediate results: - -```{python} -arr.cumsum() -``` - -In multidimensional arrays, accumulation functions like cumsum compute along the indicated axis: - -:::{.panel-tabset} - -## "compute across the columns" -```{python} -arr.cumsum(axis=1) -``` - -## "compute across the rows" - -```{python} -arr.cumsum(axis=0) -``` - -::: - -## Methods for Boolean Arrays - -Boolean values are coerced to 1 (`True`) and 0 (`False`) in the preceding methods. 
Thus, `sum` is often used as a means of counting True values in a boolean array:
-
-```{python}
-(arr > 0).sum() # Number of positive values
-```
-
-`any` tests whether one or more values in an array are True, while `all` checks if every value is True:
-
-```{python}
-bools = np.array([False, False, True, False])
-bools.any()
-```
-
-## Sorting
-
-NumPy arrays can be sorted in place with the `sort` method:
-
-```{python}
-arr = rng.standard_normal(6)
-arr.sort()
-arr
-```
-
-Multidimensional arrays can be sorted in place along an axis by passing the axis number:
-
-```{python}
-arr = rng.standard_normal((5, 3))
-```
-
-:::{.panel-tabset}
-## "sort across the columns"
-```{python}
-arr.sort(axis=1)
-arr
-```
-
-## "sort across the rows"
-
-```{python}
-arr.sort(axis=0)
-arr
-```
-
-:::
-
-The top-level method `numpy.sort` returns a sorted copy of an array (like the Python built-in function `sorted`) instead of modifying the array in place:
-
-```{python}
-arr2 = np.array([5, -10, 7, 1, 0, -3])
-sorted_arr2 = np.sort(arr2)
-sorted_arr2
-```
-
-## Unique and Other Set Logic
-
-`numpy.unique` returns the sorted unique values in an array:
-
-```{python}
-np.unique(names)
-```
-
-`numpy.in1d` tests membership of the values in one array in another, returning a boolean array:
-
-```{python}
-np.in1d(arr1, arr2)
-```
-
-## File Input and Output with Arrays
-
-NumPy is able to save (`np.save`) and load (`np.load`) data to and from disk in text or binary formats.
-
-Arrays are saved by default in an uncompressed raw binary format with file extension .npy:
-
-```{python}
-arr = np.arange(10)
-np.save("some_array", arr)
-```
-
-```{python}
-np.load("some_array.npy")
-```
-
-* Save multiple arrays in an uncompressed archive using `numpy.savez`
-* If your data compresses well, use `numpy.savez_compressed` instead
-
-## 4.6 Linear Algebra
-
-Linear algebra operations, like matrix multiplication, decompositions, determinants, and other square matrix math, can be done with NumPy (`np.dot(x, y)` is equivalent to `x.dot(y)`):
-
-```{python}
-np.dot(arr1, arr)
-```
-
-## Example: Random Walks
-
-```{python}
-import matplotlib.pyplot as plt
-#! blockstart
-import random
-position = 0
-walk = [position]
-nsteps = 1000
-for _ in range(nsteps):
-    step = 1 if random.randint(0, 1) else -1
-    position += step
-    walk.append(position)
-#! blockend
-
-plt.plot(walk[:100])
-plt.show()
-```
diff --git a/05_exercises.qmd b/05_exercises.qmd
new file mode 100644
index 0000000..465ff4d
--- /dev/null
+++ b/05_exercises.qmd
@@ -0,0 +1 @@
+# Exercises {-}
diff --git a/05_main.qmd b/05_main.qmd
index 6e4740d..a40c873 100644
--- a/05_main.qmd
+++ b/05_main.qmd
@@ -1,7 +1,7 @@
-# 5. Getting Started with pandas
+# 5. Resampling Methods

## Learning Objectives

-* Learn about Pandas' two major data structures: Series and DataFrame
-
-* Learn some essential functionality!
\ No newline at end of file
+- item 1
+- item 2
+- item 3
diff --git a/05_notes.qmd b/05_notes.qmd
index 40cb3b9..5f6ce99 100644
--- a/05_notes.qmd
+++ b/05_notes.qmd
@@ -1,413 +1,2 @@
 # Notes {.unnumbered}
-
-## Introduction
-
-:::{.callout-note}
-This is a long chapter, so these notes are intended as a tour of the main ideas!
-
-::: - -![Panda bus tour!](images/pandabus.jpg){width=300px} - -* Pandas is a major tool in Python data analysis - -* Works with Numpy, adding support for tabular / heterogenous data - -## Import conventions: - -```{python} -import numpy as np -import pandas as pd -``` - - -## Panda's primary data structures - -* Series: One dimensional object containing a sequence of values of the same type. - -* DataFrame: Tabular data, similar (and inspired by) R dataframe. - -* Other structures will be introduced as they arise, e.g. Index and Groupby objects. - -### Series - -```{python} -obj = pd.Series([4,7,-4,3], index = ["A","B","C","D"]) -obj -``` - -The `index` is optional, if not specified it will default to 0 through N-1 - -#### Selection - -Select elements or sub-Series by labels, sets of labels, boolean arrays ... - -```{python} -obj['A'] -``` - -```{python} -obj[['A','C']] -``` - -```{python} -obj[obj > 3] -``` - -#### Other things you can do - -* Numpy functions and Numpy-like operations work as expected: - -```{python} -obj*3 -``` - -```{python} -np.exp(obj) -``` - -* Series can be created from and converted to a dictionary - -```{python} -obj.to_dict() -``` - -* Series can be converted to numpy array: - -```{python} -obj.to_numpy() -``` - - -### DataFrame - -* Represents table of data - -* Has row index *index* and column index *column* - -* Common way to create is from a dictionary, but see *Table 5.1* for more! - -```{python} - -test = pd.DataFrame({"cars":['Chevy','Ford','Dodge','BMW'],'MPG':[14,15,16,12], 'Year':[1979, 1980, 2001, 2020]}) -test -``` - -* If you want a non-default index, it can be specified just like with Series. - -* `head(n)` / `tail(n)` - return the first / last n rows, 5 by default - -#### Selecting - -* Can retrieve columns or sets of columns by using `obj[...]`: - -```{python} -test['cars'] -``` - -Note that we got a `Series` here. - -```{python} -test[['cars','MPG']] -``` - -* Dot notation can also be used (`test.cars`) as long as the column names are valid identifiers - -* *Rows* can be retrieved with `iloc[...]` and `loc[...]`: - - - `loc` retrieves by index - - - `iloc` retrieves by position. - - -#### Modifying / Creating Columns - -* Columns can be modified (and created) by assignment: - -```{python} -test['MPG^2'] = test['MPG']**2 -test -``` - -* `del` keyword can be used to drop columns, or `drop` method can be used to do so non-destructively - - -### Index object - -* Index objects are used for holding axis labels and other metadata - -```{python} -test.index -``` - -* Can change the index, in this case replacing the default: - -```{python} -# Create index from one of the columns -test.index = test['cars'] - - # remove 'cars' column since i am using as an index now. s -test=test.drop('cars', axis = "columns") # or axis = 1 -test -``` - -* Note the `axis` keyword argument above, many DataFrame methods use this. - -* Above I changed a column into an index. Often you want to go the other way, this can be done with `reset_index`: - -```{python} -test.reset_index() # Note this doesn't actually change test -``` - - -* Columns are an index as well: -```{python} -test.columns -``` - -* Indexes act like immutable sets, see *Table 5.2* in book for Index methods and properties - -## Essential Functionality - -### Reindexing and dropping - -* `reindex` creats a *new* object with the values arranged according to the new index. Missing values are used if necessary, or you can use optional fill methods. You can use `iloc` and `loc` to reindex as well. 
- -```{python} -s = pd.Series([1,2,3,4,5], index = list("abcde")) -s2 = s.reindex(list("abcfu")) # not a song by GAYLE -s2 -``` - -* Missing values and can be tested for with `isna` or `notna` methods - -```{python} -pd.isna(s2) -``` - -* `drop` , illustrated above can drop rows or columns. In addition to using `axis` you can use `columns` or `index`. Again these make copies. - -```{python} -test.drop(columns = 'MPG') -``` - -```{python} -test.drop(index = ['Ford', 'BMW']) -``` - -### Indexing, Selection and Filtering - -#### Series -* For Series, indexing is similar to Numpy, except you can use the index as well as integers. - -```{python} -obj = pd.Series(np.arange(4.), index=["a", "b", "c", "d"]) -obj[0:3] -``` - -```{python} -obj['a':'c'] -``` - -```{python} -obj[obj<2] -``` - -```{python} -obj[['a','d']] -``` - -* *However*, preferred way is to use `loc` for selection by *index* and `iloc` for selection by position. This is to avoid the issue where the `index` is itself integers. - -```{python} -obj.loc[['a','d']] -``` - -```{python} -obj.iloc[1] -``` - -:::{.callout-note} -Note if a range or a set of indexes is used, a Series is returned. If a single item is requested, you get just that item. -::: - -#### DataFrame - -* Selecting with `df[...]` for a DataFrame retrieves one or more columns as we have seen, if you select a single column you get a Series - -* There are some special cases, indexing with a boolean selects *rows*, as does selecting with a slice: - -```{python} -test[0:1] -``` - -```{python} -test[test['MPG'] < 15] -``` - -* `iloc` and `loc` can be used to select rows as illustrated before, but can also be used to select columns or subsets of rows/columns - - -```{python} -test.loc[:,['Year','MPG']] -``` - -```{python} -test.loc['Ford','MPG'] -``` - -* These work with slices and booleans as well! The following says "give me all the rows with MPG more then 15, and the columns starting from Year" - -```{python} -test.loc[test['MPG'] > 15, 'Year':] -``` - -* Indexing options are fully illustrated in the book and *Table 5.4* - -* Be careful with *chained indexing*: - -```{python} -test[test['MPG']> 15].loc[:,'MPG'] = 18 -``` - -Here we are assigning to a 'slice', which is probably not what is intended. You will get a warning and a recommendation to fix it by using one `loc`: - -```{python} -test.loc[test['MPG']> 15 ,'MPG'] = 18 -test -``` - -:::{.callout-tip} -### Rule of Thumb - -Avoid chained indexing when doing assignments -::: - -### Arithmetic and Data Alignment - -* Pandas can make it simpler to work with objects that have different indexes, usually 'doing the right thing' - -```{python} -s1 = pd.Series([7.3, -2.5, 3.4, 1.5], index=["a", "c", "d", "e"]) -s2 = pd.Series([-2.1, 3.6, -1.5, 4, 3.1], index=["a", "c", "e", "f", "g"]) -s1+s2 -``` - -* Fills can be specified by using methods: - -```{python} -s1.add(s2, fill_value = 0) -``` - -* See *Table 5.5* for list of these methods. - -* You can also do arithmetic between *DataFrame*s and *Series* in a way that is similar to Numpy. - -### Function Application and Mapping - -* Numpy *ufuncs* also work with Pandas objects. 
-
-```{python}
-frame = pd.DataFrame(np.random.standard_normal((4, 3)),
-                     columns=list("bde"),
-                     index=["Utah", "Ohio", "Texas", "Oregon"])
-frame
-```
-
-```{python}
-np.abs(frame)
-```
-
-* `apply` can be used to apply a function on 1D arrays to each column or row:
-
-```{python}
-frame.apply(np.max, axis = 'rows') # 'axis' is optional here; the default is rows
-```
-
-Applying across columns is common, especially to combine different columns in some way:
-
-```{python}
-frame['max'] = frame.apply(np.max, axis = 'columns')
-frame
-```
-
-* Many more examples of this in the book.
-
-### Sorting and Ranking
-
-* `sort_index` will sort with the index (on either axis for a *DataFrame*)
-* `sort_values` is used to sort by values or a particular column
-
-```{python}
-test.sort_values('MPG')
-```
-
-* `rank` will assign ranks from one through the number of data points.
-
-## Summarizing and Computing Descriptive Statistics
-
-```{python}
-df = pd.DataFrame([[1.4, np.nan], [7.1, -4.5],
-                   [np.nan, np.nan], [0.75, -1.3]],
-                  index=["a", "b", "c", "d"],
-                  columns=["one", "two"])
-df
-```
-
-Some examples:
-
-Sum over rows:
-```{python}
-df.sum()
-```
-
-Sum over columns:
-```{python}
-# sum across the columns, one total per row
-df.sum(axis="columns")
-```
-
-Extremely useful is `describe`:
-
-```{python}
-df.describe()
-```
-
-**The book chapter contains *many* more examples and a full list of summary statistics and related methods.**
-
-## Summary
-
-* Pandas' primary data structures:
-
-    - Series
-
-    - DataFrame
-
-* Many ways to access and transform these objects. Key ones are:
-
-    - `[]` : access element(s) of a `Series` or column(s) of a `DataFrame`
-
-    - `loc[r, c]` : access a row / column / cell by the `index`.
-
-    - `iloc[i, j]` : access a row / column / cell by the integer position.
-
-* [Online reference.](https://pandas.pydata.org/docs/reference/index.html)
-
-:::{.callout-tip}
-## Suggestion
-Work through the chapter's code and try stuff!
-:::
-
-## References
-
-* [Chapter's code.](https://nbviewer.org/github/pydata/pydata-book/blob/3rd-edition/ch05.ipynb)
-
-* [pandas reference.](https://pandas.pydata.org/docs/reference/index.html)
-
-## Next Chapter
-
-* Loading and writing data sets!
\ No newline at end of file
diff --git a/06_exercises.qmd b/06_exercises.qmd
new file mode 100644
index 0000000..465ff4d
--- /dev/null
+++ b/06_exercises.qmd
@@ -0,0 +1 @@
+# Exercises {-}
diff --git a/06_main.qmd b/06_main.qmd
index 9fd3ecb..43a859f 100644
--- a/06_main.qmd
+++ b/06_main.qmd
@@ -1,3 +1,7 @@
-# 6. Data Loading, Storage, and File Formats
+# 6. 
Linear Model Selection and Regularization ## Learning Objectives + +- item 1 +- item 2 +- item 3 diff --git a/06_notes.ipynb b/06_notes.ipynb new file mode 100644 index 0000000..901c3c0 --- /dev/null +++ b/06_notes.ipynb @@ -0,0 +1,1069 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Notes {.unnumbered}\n", + "\n", + "Before we can even get to the fun of data analysis, we first need to learn how to load in our data!\n", + "\n", + "![](images/DAmeme.png){dpi=\"300\" width=\"425\"}\n", + "\n", + "Today, we'll learn to work with the following categories of data inputs and outputs:\n", + "\n", + "- Text\n", + "- Binary\n", + "- Web APIs\n", + "- Databases\n", + "\n", + "## Reading and Writing Data in Text Format\n", + "\n", + "### `read_csv` Arguments\n", + "\n", + "[Table 6.1](https://wesmckinney.com/book/accessing-data.html#tbl-table_parsing_functions) lists the various data types pandas can read.\n", + "\n", + "Each function can be called with `pd.read_*` (for example, `pd.read_csv`).\n", + "\n", + "::: callout-note\n", + "Wes points out that the number of arguments can be overwhelming. `pd.read_csv` has about 50. The [pandas documentation](https://pandas.pydata.org/docs/reference/io.html) is a good resource for finding the right arguments.\n", + ":::\n", + "\n", + "[Table 6.2](https://wesmckinney.com/book/accessing-data.html#tbl-table_read_csv_function) lists frequently used options in `pd.read_csv`.\n", + "\n", + "Let's import the [Palmer Penguins dataset](https://github.com/allisonhorst/palmerpenguins/blob/main/inst/extdata/penguins.csv) to explore this function and some of the csv arguments. *Note*: I added random numbers for month and day to demonstrate date parsing.\n" + ], + "id": "f71a23c1" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "import pandas as pd\n", + "\n", + "penguins = pd.read_csv(\"data/penguins.csv\")\n", + "\n", + "penguins.head(5)" + ], + "id": "c02729f1", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Index Columns\n", + "\n", + "**Indexing** gets column names from the file or from this argument\n" + ], + "id": "ce480426" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "penguins_indexed = pd.read_csv(\"data/penguins.csv\", index_col = \"species\")\n", + "penguins_indexed.head(5)" + ], + "id": "e7555fe5", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Infer or Convert Data Type\n", + "\n", + "**Type inference and data conversion** converts values (including missing) to a user-defined value.\n", + "\n", + "If you data uses another string value as the missing placeholder, you can add it to `na_values`.\n" + ], + "id": "044a8f7a" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "penguins_NA = pd.read_csv(\n", + " \"data/penguins.csv\", \n", + " na_values = [\"male\"]\n", + " )\n", + " \n", + "penguins_NA.head(5)" + ], + "id": "c1e6f917", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Parse Date and Time\n", + "\n", + "**Date and time parsing** combines date and time from multiple columns into a single column\n" + ], + "id": "8f038723" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "penguins_dates = pd.read_csv(\n", + " \"data/penguins.csv\", \n", + " parse_dates = {\"date\": [\"month\", \"day\", \"year\"]}\n", + " )\n", + " \n", + 
"penguins_dates[\"date\"] = pd.to_datetime(\n", + " penguins_dates.date, \n", + " format = \"%m%d%Y\"\n", + " )\n", + " \n", + "print(penguins_dates.date.head(5))\n", + "\n", + "print(penguins_dates.date.dtypes)" + ], + "id": "5f938ed5", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Iterate Through Large Files\n", + "\n", + "**Iterating** allows iteration over chunks of very large files\n", + "\n", + "Using `nrows` to read in only 5 rows:\n" + ], + "id": "f265e4ec" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "pd.read_csv(\"data/penguins.csv\", nrows = 5\n", + " )" + ], + "id": "5102110c", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using `chunksize` and the `TextFileReader` to aggregate and summarize the data by species:\n" + ], + "id": "f4981c36" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "chunker = pd.read_csv(\"data/penguins.csv\", chunksize = 10)\n", + "\n", + "print(type(chunker))\n", + "\n", + "tot = pd.Series([], dtype = 'int64')\n", + "for piece in chunker:\n", + " tot = tot.add(piece[\"species\"].value_counts(), fill_value = 0)\n", + "\n", + "tot" + ], + "id": "f7ccd593", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Import Semi-Clean Data\n", + "\n", + "**Unclean data issues** skips rows, comments, punctuation, etc.\n", + "\n", + "We can import a subset of the columns using `usecols` and change their names (`header = 0`; `names = [list]`).\n" + ], + "id": "4f21c751" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "penguins_custom = pd.read_csv(\n", + " \"data/penguins.csv\", \n", + " usecols = [0,1,6],\n", + " header = 0, \n", + " names = [\"Species\", \"Island\", \"Sex\"]\n", + " )\n", + "\n", + "penguins_custom.head(5)" + ], + "id": "70c6d28a", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Writing Data to Text Format\n", + "\n", + "To write to a csv file, we can use pandas DataFrame's `to_csv` method with `index = False` so the row numbers are not stored in the first column. 
Missing values are written as empty strings, we can specify a placeholder with `na_rep = \"NA\"`:\n" + ], + "id": "404eb54f" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "penguins_custom.to_csv(\n", + " \"data/penguins_custom.csv\", \n", + " index = False,\n", + " na_rep = \"NA\"\n", + " )" + ], + "id": "91151eec", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "::: {layout-ncol=\"2\"}\n", + "![](images/penguins_custom_noArgs.png){width=\"317\"}\n", + "\n", + "![](images/penguins_custom.png){width=\"247\"}\n", + ":::\n", + "\n", + "### Working with Other Delimited Formats\n", + "\n", + "#### Reading\n", + "\n", + "In case your tabular data makes pandas trip up and you need a little extra manual processing, you can use Python's built in `csv` module.\n", + "\n", + "Let's read in the penguins dataset the hard, manual way.\n" + ], + "id": "01988f9e" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "import csv\n", + "\n", + "penguin_reader = csv.reader(penguins)\n", + "\n", + "print(penguin_reader)\n" + ], + "id": "51c4e924", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we have the `_csv_reader` object.\n", + "\n", + "Next, Wes iterated through the reader to print the lines, which seems to only give me the row with my headings.\n" + ], + "id": "fc5d18b9" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "for line in penguin_reader:\n", + " print(line)" + ], + "id": "4437660c", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We'll keep following along to wrangle it into a form we can use:\n" + ], + "id": "3a946e1c" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "with open(\"data/penguins.csv\") as penguin_reader:\n", + " lines = list(csv.reader(penguin_reader))\n", + " \n", + "header, values = lines[0], lines[1:]\n", + "\n", + "print(header)\n", + "print(values[5])" + ], + "id": "d7490840", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we have two lists: header and values. We use a dictionary of data columns and the expression `zip(*values)`. This combination of dictionary comprehension and expression is generally faster than iterating through a loop. 
However, Wes warns that this can use a lot of memory on large files.\n" + ], + "id": "747dc06a" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "penguin_dict = {h: v for h, v in zip(header, zip(*values))}\n", + "\n", + "# too big to print and I'm not sure how to print a select few key-value pairs" + ], + "id": "b4e82ae4", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "::: callout-note\n", + "## Recall\n", + "\n", + "For a reminder on dictionary comprehensions, see [Chapter 3](https://wesmckinney.com/book/python-builtin.html#comprehensions).\n", + ":::\n", + "\n", + "Now to finally get this into a usable dataframe we'll use pandas DataFrame `from_dict` method!\n" + ], + "id": "f666c78e" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "penguin_df = pd.DataFrame.from_dict(penguin_dict)\n", + "penguin_df.head(5)" + ], + "id": "0cbc0dc5", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### `csv.Dialect`\n", + "\n", + "Since there are many kinds of delimited files, string quoting conventions, and line terminators, you may find yourself wanting to define a \"Dialect\" to read in your delimited file. The options available are found in [Table 6.3](https://wesmckinney.com/book/accessing-data.html#tbl-table_csv_dialect).\n", + "\n", + "You can either define a `csv.Dialect` subclass or pass dialect parameters to `csv.reader`.\n" + ], + "id": "7bbb126c" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "# option 1\n", + "\n", + "## define a dialect subclass\n", + "\n", + "class my_dialect(csv.Dialect):\n", + " lineterminator = \"\\n\"\n", + " delimiter = \";\"\n", + " quotechar = '\"'\n", + " quoting = csv.QUOTE_MINIMAL\n", + " \n", + "## use the subclass\n", + "\n", + "reader = csv.reader(penguins, dialect = my_dialect)\n", + "\n", + "# option 2\n", + "\n", + "## pass just dialect parameters\n", + "\n", + "reader = csv.reader(penguins, delimiter = \",\")" + ], + "id": "5acc62ad", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "::: callout-tip\n", + "## Recap for when to use what?\n", + "\n", + "For most data, pandas `read_*` functions, plus the overwhelming number of options, will likely get you close to what you need.\n", + "\n", + "If there are additional, minor wrangling needs, you can try using Python's `csv.reader` with either a `csv.Dialect` subclass or just by passing in dialect parameters.\n", + "\n", + "If you have complicated or multicharacter delimiters, you'll likely need to import the string module and use the `split` method or regular expression method `re.split`.\n", + ":::\n", + "\n", + "#### Writing\n", + "\n", + "`csv.writer` is the companion to `csv.reader` with the same dialect and format options. 
The first argument in `open` is the path and filename you want to write to and the second argument `\"w\"` makes the file writeable.\n", + "\n", + "::: callout-note\n", + "[Python documentation](https://docs.python.org/3/library/csv.html#id3) notes that `newline=\"\"` should be specified in case there are newlines embedded inside quoted fields to ensure they are interpreted correctly.\n", + ":::\n" + ], + "id": "8df4c74b" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "with open(\"data/write_data.csv\", \"w\", newline = \"\") as f:\n", + " writer = csv.writer(f, dialect = my_dialect)\n", + " writer.writerow((\"one\", \"two\", \"three\"))\n", + " writer.writerow((\"1\", \"2\", \"3\"))\n", + " writer.writerow((\"4\", \"5\", \"6\"))\n", + " writer.writerow((\"7\", \"8\", \"9\"))" + ], + "id": "c49ec8a9", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### JavaScript Object Notation (JSON) Data\n", + "\n", + "Standard format for HTTP requests between web browsers, applications, and APIs. Its almost valid Python code:\n", + "\n", + "- Instead of `NaN`, it uses `null`\n", + "\n", + "- Doesn't allow trailing commas at end of lists\n", + "\n", + "- Data types: objects (dictionaries), arrays (lists), strings, numbers, booleans, and nulls.\n", + "\n", + "We'll make up a simple file of my pets' names, types, and sex to demonstrate JSON data loading and writing.\n", + "\n", + "![](images/mts.jpg){width=\"319\"}\n", + "\n", + "Import the json module and use `json.loads` to convert a JSON string to Python. There are multiple ways to convert JSON objects to a DataFrame.\n" + ], + "id": "918950b2" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "import json\n", + "\n", + "obj = \"\"\"\n", + "{\"name\": \"Jadey\",\n", + " \"pets\": [{\"name\": \"Mai\", \"type\": \"cat\", \"sex\": \"Female\"},\n", + " {\"name\": \"Tai\", \"type\": \"cat\", \"sex\": \"Male\"},\n", + " {\"name\": \"Skye\", \"type\": \"cat\", \"sex\": \"Female\"}]\n", + "}\n", + "\"\"\"\n", + "\n", + "json_to_py = json.loads(obj)\n", + "\n", + "print(json_to_py)\n", + "type(json_to_py)" + ], + "id": "a802f6ce", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Since this imported the object as a dictionary, we can use `pd.DataFrame` to create a DataFrame of the pets' names, type, and sex.\n" + ], + "id": "d6056885" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "pets_df = pd.DataFrame(json_to_py[\"pets\"], columns = [\"name\", \"type\", \"sex\"])\n", + "\n", + "print(type(pets_df))\n", + "pets_df" + ], + "id": "9aec0779", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Use `json.dumps` to convert from Python (class: dictionary) back to JSON (class: string).\n" + ], + "id": "6b5ed0ba" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "py_to_json = json.dumps(json_to_py)\n", + "\n", + "print(\"json_to_py type:\", type(json_to_py))\n", + "print(\"py_to_json type:\", type(py_to_json))\n", + "py_to_json" + ], + "id": "5f552633", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can use pandas `pd.read_json` function and `to_json` DataFrame method to read and write JSON files.\n" + ], + "id": "05f9689a" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "pets_df.to_json(\"data/pets.json\")" + ], + "id": 
"877e8e59", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can easily import a JSON file using `pandas.read_json`.\n" + ], + "id": "6063159c" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "pet_data = pd.read_json(\"data/pets.json\")\n", + "pet_data" + ], + "id": "f5ac0c50", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Web Scraping\n", + "\n", + "#### HTML\n", + "\n", + "`pd.read_html` uses libraries to read and write HTML and XML:\n", + "\n", + "- Try: xlml \\[faster\\]\n", + "\n", + "- Catch: beautifulsoup4 and html5lib \\[better equipped for malformed files\\]\n", + "\n", + "If you want to specify which parsing engine is used, you can use the `flavor` argument.\n" + ], + "id": "d1c1cad9" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "tables = pd.read_html(\n", + " \"https://www.fdic.gov/resources/resolutions/bank-failures/failed-bank-list/\", \n", + " flavor = \"html5lib\"\n", + " )\n", + "\n", + "print(\"Table Length:\", len(tables))\n", + "\n", + "# since this outputs a list of tables, we can grab just the first table\n", + "\n", + "tables[0].head(5)" + ], + "id": "ffb10492", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### XML\n", + "\n", + "XML format is more general than HTML, but they are structurally similar. See [pandas documentation](https://pandas.pydata.org/docs/reference/api/pandas.read_xml.html) for `pd.read_xml`.\n", + "\n", + "This snippet of an xml file is from [Microsoft](https://docs.microsoft.com/en-us/previous-versions/windows/desktop/ms762271(v=vs.85)).\n", + "\n", + "``` xml\n", + "\n", + " \n", + " Gambardella, Matthew\n", + " XML Developer's Guide\n", + " Computer\n", + " 44.95\n", + " 2000-10-01\n", + " An in-depth look at creating applications \n", + " with XML.\n", + " \n", + "```\n" + ], + "id": "a8af4dd6" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "books = pd.read_xml(\"data/books.xml\")\n", + "\n", + "books.head(5)" + ], + "id": "ae69ec25", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If you'd like to manually parse a file, Wes demonstrates this process in the [textbook](https://wesmckinney.com/book/accessing-data.html#io_file_formats_xml), before demonstrating how the following steps are turned into one line of code using `pd.read_xml`.\n", + "\n", + "1. `from lxml import objectify`\n", + "2. Use `lxml.objectify`,\n", + "3. Create a dictionary of tag names to data values\n", + "4. 
+ "4. Convert that list of dictionaries into a DataFrame.\n",
+ "\n",
+ "### Binary Data Formats\n",
+ "\n",
+ "#### Pickle\n",
+ "\n",
+ "Python has a built-in `pickle` module that converts pandas objects into the pickle format (serializes the data into a byte stream), which is generally readable only in Python.\n",
+ "\n",
+ "More information can be found in Python [documentation](https://docs.python.org/3/library/pickle.html).\n",
+ "\n",
+ "Here's a demo to show pickling and unpickling the penguins dataset.\n"
+ ],
+ "id": "9e11788b"
+ },
+ {
+ "cell_type": "code",
+ "metadata": {},
+ "source": [
+ "print(\"Unpickled penguins type:\", type(penguins))\n",
+ "\n",
+ "penguins.to_pickle(\"data/penguins_pickle\")\n",
+ "\n",
+ "# do some machine learning\n",
+ "\n",
+ "pickled_penguins = pd.read_pickle(\"data/penguins_pickle\")\n",
+ "pickled_penguins"
+ ],
+ "id": "de3838ff",
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "::: callout-warning\n",
+ "`pickle` is recommended only as a short-term storage format (i.e. loading and unloading your machine learning models) because the format may not be stable over time. Also, the module is not secure -- pickle data can be maliciously tampered with. [Python docs](https://docs.python.org/3/library/pickle.html) recommend signing data with `hmac` to ensure it hasn't been tampered with.\n",
+ ":::\n",
+ "\n",
+ "#### Microsoft Excel Files\n",
+ "\n",
+ "The `pd.ExcelFile` class or `pd.read_excel` function uses the packages `xlrd` (for older .xls files) and `openpyxl` (for newer .xlsx files), which must be installed separately from pandas.\n",
+ "\n",
+ "``` bash\n",
+ "conda install xlrd openpyxl\n",
+ "```\n",
+ "\n",
+ "`pd.read_excel` takes most of the same arguments as `pd.read_csv`.\n"
+ ],
+ "id": "9586ea45"
+ },
+ {
+ "cell_type": "code",
+ "metadata": {},
+ "source": [
+ "penguins_excel = pd.read_excel(\n",
+ "    \"data/penguins.xlsx\", \n",
+ "    index_col = \"species\",\n",
+ "    parse_dates = {\"date\": [\"month\", \"day\", \"year\"]}\n",
+ ")\n",
+ "\n",
+ "penguins_excel.head(5)"
+ ],
+ "id": "fdfeaeb8",
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "To read multiple sheets, use `pd.ExcelFile`.\n"
+ ],
+ "id": "c1798029"
+ },
+ {
+ "cell_type": "code",
+ "metadata": {},
+ "source": [
+ "penguins_sheets = pd.ExcelFile(\"data/penguins_sheets.xlsx\")\n",
+ "\n",
+ "print(\"Available sheet names:\", penguins_sheets.sheet_names)\n",
+ "\n",
+ "penguins_sheets"
+ ],
+ "id": "8dc7ec1d",
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Then we can `parse` all sheets into a dictionary by specifying the `sheet_name` argument as `None`. Or, we can read in a subset of sheets.\n"
+ ],
+ "id": "db245c94"
+ },
+ {
+ "cell_type": "code",
+ "metadata": {},
+ "source": [
+ "sheets = penguins_sheets.parse(sheet_name = None)\n",
+ "\n",
+ "sheets"
+ ],
+ "id": "5002b161",
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Then we can subset one of the sheets as a pandas DataFrame object.\n"
+ ],
+ "id": "c8d27da8"
+ },
+ {
+ "cell_type": "code",
+ "metadata": {},
+ "source": [
+ "chinstrap = sheets[\"chinstrap\"].head(5)\n",
+ "chinstrap"
+ ],
+ "id": "d2e0f78f",
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Write one sheet using `to_excel`:\n"
+ ],
+ "id": "95a36f3b"
+ },
+ {
+ "cell_type": "code",
+ "metadata": {},
+ "source": [
+ "chinstrap.to_excel(\"data/chinstrap.xlsx\")"
+ ],
+ "id": "bae11b80",
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "If you want to write to multiple sheets, create an `ExcelWriter` class and then write the data to it:\n"
+ ],
+ "id": "ff23b48d"
+ },
+ {
+ "cell_type": "code",
+ "metadata": {},
+ "source": [
+ "gentoo = sheets[\"gentoo\"].head(5)\n",
+ "\n",
+ "writer = pd.ExcelWriter(\"data/chinstrap_gentoo.xlsx\")\n",
+ "\n",
+ "chinstrap.to_excel(writer, sheet_name = \"chinstrap\")\n",
+ "\n",
+ "gentoo.to_excel(writer, sheet_name = \"gentoo\")\n",
+ "\n",
+ "writer.save()"
+ ],
+ "id": "aff9c5db",
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "#### HDF5 Format\n",
+ "\n",
+ "Hierarchical data format (HDF) is used in Python, C, Java, Julia, MATLAB, and others for storing big scientific array data (multiple datasets and metadata within one file). HDF5 can be used to efficiently read/write chunks of large arrays.\n",
+ "\n",
+ "The PyTables package must first be installed.\n",
+ "\n",
+ "``` bash\n",
+ "conda install pytables\n",
+ "\n",
+ "pip install tables # the package is called \"tables\" in PyPI\n",
+ "```\n",
+ "\n",
+ "pandas provides a dictionary-like class for HDF5 files called `HDFStore`:\n"
+ ],
+ "id": "0de56ffd"
+ },
+ {
+ "cell_type": "code",
+ "metadata": {},
+ "source": [
+ "store = pd.HDFStore(\"data/pets.h5\")\n",
+ "\n",
+ "store[\"pets\"] = pets_df\n",
+ "store[\"pets\"]"
+ ],
+ "id": "beabda71",
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "`HDFStore` can store data as a `fixed` or as a `table` schema. Table allows querying but is generally slower.\n"
+ ],
+ "id": "67419945"
+ },
+ {
+ "cell_type": "code",
+ "metadata": {},
+ "source": [
+ "pets_df.to_hdf(\"data/petnames.h5\", \"pets\", format = \"table\")\n",
+ "pd.read_hdf(\"data/petnames.h5\", \"pets\", where=[\"columns = name\"])"
+ ],
+ "id": "28daf9a4",
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "::: callout-tip\n",
+ "## When should I use HDF5?\n",
+ "\n",
+ "Wes recommends using HDF5 for write-once, read-many datasets that are worked with locally. 
If your data is stored on remote servers, then you may try other binary formats designed for distributed storage (for example, [Apache Parquet](https://parquet.apache.org/)).\n", + ":::\n", + "\n", + "### Interacting with Web APIs\n", + "\n", + "To access data from APIs, Wes suggests using the [requests](http://docs.python-requests.org/) package.\n", + "\n", + "``` bash\n", + "conda install requests\n", + "```\n", + "\n", + "Let's pull from this free zoo animal API.\n" + ], + "id": "081991a8" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "import requests\n", + "\n", + "url = \"https://zoo-animal-api.herokuapp.com/animals/rand\"\n", + "\n", + "resp = requests.get(url)\n", + "\n", + "resp.raise_for_status()\n", + "\n", + "print(\"HTTP status\", resp)\n", + "\n", + "animal = resp.json()\n", + "animal\n", + "\n", + "animal_df = pd.DataFrame([animal]) # important to wrap the dictionary object into a list\n", + "animal_df" + ], + "id": "bf9df9f4", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "::: callout-note\n", + "It is important to note that the dictionary is wrapped into a list. If it isn't, then you will get the following error: `ValueError: If using all scalar values, you must pass an index`.\n", + ":::\n", + "\n", + "### Interacting with Databases\n", + "\n", + "Some popular SQL-based relational databases are: SQL Server, PostgreSQL, MySQL, SQLite3. We can use pandas to load the results of a SQL query into a DataFrame.\n", + "\n", + "Import sqlite3 and create a database.\n" + ], + "id": "b3f2e90c" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "import sqlite3\n", + "\n", + "con = sqlite3.connect(\"data/data.sqlite\")" + ], + "id": "9a254988", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This creates a table.\n" + ], + "id": "a7abca28" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "#| eval: false\n", + "\n", + "query = \"\"\"\n", + " CREATE TABLE states\n", + " (Capital VARCHAR(20), State VARCHAR(20),\n", + " x1 REAL, x2 INTEGER\n", + ");\"\"\"\n", + "\n", + "con.execute(query)\n", + "\n", + "con.commit()" + ], + "id": "9316923c", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This inserts the rows of data:\n" + ], + "id": "1e317b71" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "data = [(\"Atlanta\", \"Georgia\", 1.25, 6), (\"Seattle\", \"Washington\", 2.6, 3), (\"Sacramento\", \"California\", 1.7, 5)]\n", + " \n", + "stmt = \"INSERT INTO states VALUES(?, ?, ?, ?)\"\n", + "\n", + "con.executemany(stmt, data)\n", + "\n", + "con.commit()" + ], + "id": "de7fd5e9", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we can look at the data:\n" + ], + "id": "86488099" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "cursor = con.execute(\"SELECT * FROM states\")\n", + "\n", + "rows = cursor.fetchall()\n", + "\n", + "rows" + ], + "id": "2c546066", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To get the data into a pandas DataFrame, we'll need to provide column names in the `cursor.description`.\n" + ], + "id": "a748eef0" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "print(cursor.description)\n", + "\n", + "pd.DataFrame(rows, columns = [x[0] for x 
in cursor.description])" + ], + "id": "d82971a2", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As per usual, Wes likes to show us the manual way first and then the easier version. Using [SQLAlchemy](http://www.sqlalchemy.org/), we can must less verbosely create our DataFrame.\n" + ], + "id": "0111ddbf" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "import sqlalchemy as sqla\n", + "\n", + "db = sqla.create_engine(\"sqlite:///data/data.sqlite\")\n", + "\n", + "pd.read_sql(\"SELECT * FROM states\", db)" + ], + "id": "05104324", + "execution_count": null, + "outputs": [] + } + ], + "metadata": { + "kernelspec": { + "name": "python3", + "language": "python", + "display_name": "Python 3 (ipykernel)" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} \ No newline at end of file diff --git a/06_notes.qmd b/06_notes.qmd index 636906e..7a77083 100644 --- a/06_notes.qmd +++ b/06_notes.qmd @@ -1,617 +1,2 @@ ## Notes {.unnumbered} -Before we can even get to the fun of data analysis, we first need to learn how to load in our data! - -![](images/DAmeme.png){dpi="300" width="425"} - -Today, we'll learn to work with the following categories of data inputs and outputs: - -- Text -- Binary -- Web APIs -- Databases - -## Reading and Writing Data in Text Format - -### `read_csv` Arguments - -[Table 6.1](https://wesmckinney.com/book/accessing-data.html#tbl-table_parsing_functions) lists the various data types pandas can read. - -Each function can be called with `pd.read_*` (for example, `pd.read_csv`). - -::: callout-note -Wes points out that the number of arguments can be overwhelming. `pd.read_csv` has about 50. The [pandas documentation](https://pandas.pydata.org/docs/reference/io.html) is a good resource for finding the right arguments. -::: - -[Table 6.2](https://wesmckinney.com/book/accessing-data.html#tbl-table_read_csv_function) lists frequently used options in `pd.read_csv`. - -Let's import the [Palmer Penguins dataset](https://github.com/allisonhorst/palmerpenguins/blob/main/inst/extdata/penguins.csv) to explore this function and some of the csv arguments. *Note*: I added random numbers for month and day to demonstrate date parsing. - -```{python} -import pandas as pd - -penguins = pd.read_csv("data/penguins.csv") - -penguins.head(5) - -``` - -#### Index Columns - -**Indexing** gets column names from the file or from this argument - -```{python} -penguins_indexed = pd.read_csv("data/penguins.csv", index_col = "species") -penguins_indexed.head(5) - -``` - -#### Infer or Convert Data Type - -**Type inference and data conversion** converts values (including missing) to a user-defined value. - -If you data uses another string value as the missing placeholder, you can add it to `na_values`. 
- -```{python} -penguins_NA = pd.read_csv( - "data/penguins.csv", - na_values = ["male"] - ) - -penguins_NA.head(5) - -``` - -#### Parse Date and Time - -**Date and time parsing** combines date and time from multiple columns into a single column - -```{python} -penguins_dates = pd.read_csv( - "data/penguins.csv", - parse_dates = {"date": ["month", "day", "year"]} - ) - -penguins_dates["date"] = pd.to_datetime( - penguins_dates.date, - format = "%m%d%Y" - ) - -print(penguins_dates.date.head(5)) - -print(penguins_dates.date.dtypes) - -``` - -#### Iterate Through Large Files - -**Iterating** allows iteration over chunks of very large files - -Using `nrows` to read in only 5 rows: - -```{python} -pd.read_csv("data/penguins.csv", nrows = 5 - ) - -``` - -Using `chunksize` and the `TextFileReader` to aggregate and summarize the data by species: - -```{python} -chunker = pd.read_csv("data/penguins.csv", chunksize = 10) - -print(type(chunker)) - -tot = pd.Series([], dtype = 'int64') -for piece in chunker: - tot = tot.add(piece["species"].value_counts(), fill_value = 0) - -tot - -``` - -#### Import Semi-Clean Data - -**Unclean data issues** skips rows, comments, punctuation, etc. - -We can import a subset of the columns using `usecols` and change their names (`header = 0`; `names = [list]`). - -```{python} -penguins_custom = pd.read_csv( - "data/penguins.csv", - usecols = [0,1,6], - header = 0, - names = ["Species", "Island", "Sex"] - ) - -penguins_custom.head(5) - -``` - -### Writing Data to Text Format - -To write to a csv file, we can use pandas DataFrame's `to_csv` method with `index = False` so the row numbers are not stored in the first column. Missing values are written as empty strings, we can specify a placeholder with `na_rep = "NA"`: - -```{python} -penguins_custom.to_csv( - "data/penguins_custom.csv", - index = False, - na_rep = "NA" - ) - -``` - -::: {layout-ncol="2"} -![](images/penguins_custom_noArgs.png){width="317"} - -![](images/penguins_custom.png){width="247"} -::: - -### Working with Other Delimited Formats - -#### Reading - -In case your tabular data makes pandas trip up and you need a little extra manual processing, you can use Python's built in `csv` module. - -Let's read in the penguins dataset the hard, manual way. - -```{python} -import csv - -penguin_reader = csv.reader(penguins) - -print(penguin_reader) - -``` - -Now we have the `_csv_reader` object. - -Next, Wes iterated through the reader to print the lines, which seems to only give me the row with my headings. - -```{python} -for line in penguin_reader: - print(line) - -``` - -We'll keep following along to wrangle it into a form we can use: - -```{python} -with open("data/penguins.csv") as penguin_reader: - lines = list(csv.reader(penguin_reader)) - -header, values = lines[0], lines[1:] - -print(header) -print(values[5]) -``` - -Now we have two lists: header and values. We use a dictionary of data columns and the expression `zip(*values)`. This combination of dictionary comprehension and expression is generally faster than iterating through a loop. However, Wes warns that this can use a lot of memory on large files. - -```{python} -penguin_dict = {h: v for h, v in zip(header, zip(*values))} - -# too big to print and I'm not sure how to print a select few key-value pairs - -``` - -::: callout-note -## Recall - -For a reminder on dictionary comprehensions, see [Chapter 3](https://wesmckinney.com/book/python-builtin.html#comprehensions). 
-::: - -Now to finally get this into a usable dataframe we'll use pandas DataFrame `from_dict` method! - -```{python} -penguin_df = pd.DataFrame.from_dict(penguin_dict) -penguin_df.head(5) - -``` - -#### `csv.Dialect` - -Since there are many kinds of delimited files, string quoting conventions, and line terminators, you may find yourself wanting to define a "Dialect" to read in your delimited file. The options available are found in [Table 6.3](https://wesmckinney.com/book/accessing-data.html#tbl-table_csv_dialect). - -You can either define a `csv.Dialect` subclass or pass dialect parameters to `csv.reader`. - -```{python} -# option 1 - -## define a dialect subclass - -class my_dialect(csv.Dialect): - lineterminator = "\n" - delimiter = ";" - quotechar = '"' - quoting = csv.QUOTE_MINIMAL - -## use the subclass - -reader = csv.reader(penguins, dialect = my_dialect) - -# option 2 - -## pass just dialect parameters - -reader = csv.reader(penguins, delimiter = ",") - -``` - -::: callout-tip -## Recap for when to use what? - -For most data, pandas `read_*` functions, plus the overwhelming number of options, will likely get you close to what you need. - -If there are additional, minor wrangling needs, you can try using Python's `csv.reader` with either a `csv.Dialect` subclass or just by passing in dialect parameters. - -If you have complicated or multicharacter delimiters, you'll likely need to import the string module and use the `split` method or regular expression method `re.split`. -::: - -#### Writing - -`csv.writer` is the companion to `csv.reader` with the same dialect and format options. The first argument in `open` is the path and filename you want to write to and the second argument `"w"` makes the file writeable. - -::: callout-note -[Python documentation](https://docs.python.org/3/library/csv.html#id3) notes that `newline=""` should be specified in case there are newlines embedded inside quoted fields to ensure they are interpreted correctly. -::: - -```{python} -with open("data/write_data.csv", "w", newline = "") as f: - writer = csv.writer(f, dialect = my_dialect) - writer.writerow(("one", "two", "three")) - writer.writerow(("1", "2", "3")) - writer.writerow(("4", "5", "6")) - writer.writerow(("7", "8", "9")) - -``` - -#### JavaScript Object Notation (JSON) Data - -Standard format for HTTP requests between web browsers, applications, and APIs. Its almost valid Python code: - -- Instead of `NaN`, it uses `null` - -- Doesn't allow trailing commas at end of lists - -- Data types: objects (dictionaries), arrays (lists), strings, numbers, booleans, and nulls. - -We'll make up a simple file of my pets' names, types, and sex to demonstrate JSON data loading and writing. - -![](images/mts.jpg){width="319"} - -Import the json module and use `json.loads` to convert a JSON string to Python. There are multiple ways to convert JSON objects to a DataFrame. - -```{python} -import json - -obj = """ -{"name": "Jadey", - "pets": [{"name": "Mai", "type": "cat", "sex": "Female"}, - {"name": "Tai", "type": "cat", "sex": "Male"}, - {"name": "Skye", "type": "cat", "sex": "Female"}] -} -""" - -json_to_py = json.loads(obj) - -print(json_to_py) -type(json_to_py) - -``` - -Since this imported the object as a dictionary, we can use `pd.DataFrame` to create a DataFrame of the pets' names, type, and sex. 
-
-```{python}
-pets_df = pd.DataFrame(json_to_py["pets"], columns = ["name", "type", "sex"])
-
-print(type(pets_df))
-pets_df
-```
-
-Use `json.dumps` to convert from Python (class: dictionary) back to JSON (class: string).
-
-```{python}
-py_to_json = json.dumps(json_to_py)
-
-print("json_to_py type:", type(json_to_py))
-print("py_to_json type:", type(py_to_json))
-py_to_json
-```
-
-We can use the pandas `pd.read_json` function and the `to_json` DataFrame method to read and write JSON files.
-
-```{python}
-
-pets_df.to_json("data/pets.json")
-```
-
-We can easily import a JSON file using `pandas.read_json`.
-
-```{python}
-pet_data = pd.read_json("data/pets.json")
-pet_data
-```
-
-### Web Scraping
-
-#### HTML
-
-`pd.read_html` uses libraries to read and write HTML and XML:
-
-- Try: lxml \[faster\]
-
-- Catch: beautifulsoup4 and html5lib \[better equipped for malformed files\]
-
-If you want to specify which parsing engine is used, you can use the `flavor` argument.
-
-```{python}
-tables = pd.read_html(
-    "https://www.fdic.gov/resources/resolutions/bank-failures/failed-bank-list/", 
-    flavor = "html5lib"
-    )
-
-print("Table Length:", len(tables))
-
-# since this outputs a list of tables, we can grab just the first table
-
-tables[0].head(5)
-```
-
-#### XML
-
-XML format is more general than HTML, but they are structurally similar. See [pandas documentation](https://pandas.pydata.org/docs/reference/api/pandas.read_xml.html) for `pd.read_xml`.
-
-This snippet of an xml file is from [Microsoft](https://docs.microsoft.com/en-us/previous-versions/windows/desktop/ms762271(v=vs.85)).
-
-``` xml
-<catalog>
-   <book id="bk101">
-      <author>Gambardella, Matthew</author>
-      <title>XML Developer's Guide</title>
-      <genre>Computer</genre>
-      <price>44.95</price>
-      <publish_date>2000-10-01</publish_date>
-      <description>An in-depth look at creating applications 
-      with XML.</description>
-   </book>
-</catalog>
-```
-
-```{python}
-books = pd.read_xml("data/books.xml")
-
-books.head(5)
-```
-
-If you'd like to manually parse a file, Wes demonstrates this process in the [textbook](https://wesmckinney.com/book/accessing-data.html#io_file_formats_xml), before demonstrating how the following steps are turned into one line of code using `pd.read_xml`.
-
-1. `from lxml import objectify`
-2. Use `lxml.objectify`,
-3. Create a dictionary of tag names to data values
-4. Convert that list of dictionaries into a DataFrame.
-
-### Binary Data Formats
-
-#### Pickle
-
-Python has a built-in `pickle` module that converts pandas objects into the pickle format (serializes the data into a byte stream), which is generally readable only in Python.
-
-More information can be found in Python [documentation](https://docs.python.org/3/library/pickle.html).
-
-Here's a demo to show pickling and unpickling the penguins dataset.
-
-```{python}
-print("Unpickled penguins type:", type(penguins))
-
-penguins.to_pickle("data/penguins_pickle")
-
-# do some machine learning
-
-pickled_penguins = pd.read_pickle("data/penguins_pickle")
-pickled_penguins
-```
-
-::: callout-warning
-`pickle` is recommended only as a short-term storage format (i.e. loading and unloading your machine learning models) because the format may not be stable over time. Also, the module is not secure -- pickle data can be maliciously tampered with. [Python docs](https://docs.python.org/3/library/pickle.html) recommend signing data with `hmac` to ensure it hasn't been tampered with.
-:::
-
-#### Microsoft Excel Files
-
-The `pd.ExcelFile` class or `pd.read_excel` function uses the packages `xlrd` (for older .xls files) and `openpyxl` (for newer .xlsx files), which must be installed separately from pandas. 
- -``` bash -conda install xlrd openpyxl -``` - -`pd.read_excel` takes most of the same arguments as `pd.read_csv`. - -```{python} -penguins_excel = pd.read_excel( - "data/penguins.xlsx", - index_col = "species", - parse_dates = {"date": ["month", "day", "year"]} -) - -penguins_excel.head(5) -``` - -To read multiple sheets, use `pd.ExcelFile`. - -```{python} -penguins_sheets = pd.ExcelFile("data/penguins_sheets.xlsx") - -print("Available sheet names:", penguins_sheets.sheet_names) - -penguins_sheets -``` - -Then we can `parse` all sheets into a dictionary by specifying the `sheet_name` argument as `None`. Or, we can read in a subset of sheets. - -```{python} -sheets = penguins_sheets.parse(sheet_name = None) - -sheets -``` - -Then we can subset one of the sheets as a pandas DataFrame object. - -```{python} -chinstrap = sheets["chinstrap"].head(5) -chinstrap -``` - -Write one sheet to using `to_excel:` - -```{python} -chinstrap.to_excel("data/chinstrap.xlsx") -``` - -If you want to write to multiple sheets, create an `ExcelWriter` class and then write the data to it: - -```{python} - -gentoo = sheets["gentoo"].head(5) - -writer = pd.ExcelWriter("data/chinstrap_gentoo.xlsx") - -chinstrap.to_excel(writer, sheet_name = "chinstrap") - -gentoo.to_excel(writer, sheet_name = "gentoo") - -writer.save() -``` - -#### HDF5 Format - -Hierarchical data format (HDF) is used in Python, C, Java, Julia, MATLAB, and others for storing big scientific array data (multiple datasets and metadata within one file). HDF5 can be used to efficiently read/write chunks of large arrays. - -The PyTables package must first be installed. - -``` bash -conda install pytables - -pip install tables # the package is called "tables" in PyPI -``` - -pandas provides an dictionary-like-class for HDF5 files called `HDFStore`: - -```{python} -store = pd.HDFStore("data/pets.h5") - -store["pets"] = pets_df -store["pets"] -``` - -`HDFStore` can store data as a `fixed` or as a `table` schema. Table allows querying but is generally slower. - -```{python} -pets_df.to_hdf("data/petnames.h5", "pets", format = "table") -pd.read_hdf("data/petnames.h5", "pets", where=["columns = name"]) -``` - -::: callout-tip -## When should I use HDF5? - -Wes recommends using HDF5 for write-once, read-many datasets that are worked with locally. If your data is stored on remote servers, then you may try other binary formats designed for distributed storage (for example, [Apache Parquet](https://parquet.apache.org/)). -::: - -### Interacting with Web APIs - -To access data from APIs, Wes suggests using the [requests](http://docs.python-requests.org/) package. - -``` bash -conda install requests -``` - -Let's pull from this free zoo animal API. - -```{python} -import requests - -url = "https://zoo-animal-api.herokuapp.com/animals/rand" - -resp = requests.get(url) - -resp.raise_for_status() - -print("HTTP status", resp) - -animal = resp.json() -animal - -animal_df = pd.DataFrame([animal]) # important to wrap the dictionary object into a list -animal_df -``` - -::: callout-note -It is important to note that the dictionary is wrapped into a list. If it isn't, then you will get the following error: `ValueError: If using all scalar values, you must pass an index`. -::: - -### Interacting with Databases - -Some popular SQL-based relational databases are: SQL Server, PostgreSQL, MySQL, SQLite3. We can use pandas to load the results of a SQL query into a DataFrame. - -Import sqlite3 and create a database. 
-
-```{python}
-import sqlite3
-
-con = sqlite3.connect("data/data.sqlite")
-```
-
-This creates a table.
-
-```{python}
-#| eval: false
-
-query = """
-    CREATE TABLE states
-    (Capital VARCHAR(20), State VARCHAR(20),
-    x1 REAL, x2 INTEGER
-);"""
-
-con.execute(query)
-
-con.commit()
-```
-
-This inserts the rows of data:
-
-```{python}
-
-data = [("Atlanta", "Georgia", 1.25, 6), ("Seattle", "Washington", 2.6, 3), ("Sacramento", "California", 1.7, 5)]
-
-stmt = "INSERT INTO states VALUES(?, ?, ?, ?)"
-
-con.executemany(stmt, data)
-
-con.commit()
-```
-
-Now we can look at the data:
-
-```{python}
-cursor = con.execute("SELECT * FROM states")
-
-rows = cursor.fetchall()
-
-rows
-```
-
-To get the data into a pandas DataFrame, we'll need to provide column names, which we can pull from `cursor.description`.
-
-```{python}
-print(cursor.description)
-
-pd.DataFrame(rows, columns = [x[0] for x in cursor.description])
-```
-
-As per usual, Wes likes to show us the manual way first and then the easier version. Using [SQLAlchemy](http://www.sqlalchemy.org/), we can much less verbosely create our DataFrame.
-
-```{python}
-import sqlalchemy as sqla
-
-db = sqla.create_engine("sqlite:///data/data.sqlite")
-
-pd.read_sql("SELECT * FROM states", db)
-```
diff --git a/07_exercises.qmd b/07_exercises.qmd
new file mode 100644
index 0000000..465ff4d
--- /dev/null
+++ b/07_exercises.qmd
@@ -0,0 +1 @@
+# Exercises {-}
diff --git a/07_main.qmd b/07_main.qmd
index a6c69bf..a991d7d 100644
--- a/07_main.qmd
+++ b/07_main.qmd
@@ -1,467 +1,7 @@
-# 7. Data Cleaning and Preparation
+# 7. Moving Beyond Linearity
 
 ## Learning Objectives
 
-- Know which tools to use for missing data
-- Know how to filter out missing data
-- Understand methods to fill in missing values
-- Know when and how to transform data
-- Know how to use certain `numpy` functions to handle outliers, permute, and take random samples
-- Know how to manipulate strings
-- Understand some useful methods for regular expressions
-- Learn about some helpful methods in `pandas` to explore strings
-- Understand how to handle categorical data more optimally
-
-------------------------------------------------------------------------
-
-```{python}
-#| warning: false
-import pandas as pd
-import numpy as np
-
-food = pd.read_csv("https://openmv.net/file/food-consumption.csv")
-
-print(food.head(5))
-```
-
-*dataset: The relative consumption of certain food items in European and Scandinavian countries. The numbers represent the percentage of the population consuming that food type*
-
-## 7.1 Handling Missing Data
-
-Some things to note:
-
-- ALL DESCRIPTIVE STATISTICS ON `pandas` OBJECTS EXCLUDE MISSING DATA BY DEFAULT
-
-- `NaN` is used for missing values of type: `float64`
-
-- Values like `NaN` are called *sentinel values*
-
-    - a value that is not part of the input but indicates a special meaning; a signal value
-
-    - `NaN` for missing integers, `-1` as a value to be inserted in a function that computes only non-negative integers, etc.
-
-```{python}
-print(food.Yoghurt.isna())
-```
-
-We do have an `NaN` in our midst!
-
-```{python}
-# descriptive stats
-print(np.mean(food['Yoghurt']), "\n versus", np.average(food['Yoghurt']))
-```
-
-Different results! Why? Per the `numpy` documentation, `np.mean` calculates the arithmetic mean along the specified axis and `np.average` computes a (optionally *weighted*) average, and neither skips `NaN` on a plain array -- `sum(food.Yoghurt)` evaluates to `nan`. The difference comes from dispatch: handed a pandas `Series`, `np.mean` falls through to the Series' own `mean` method, which skips missing values by default, while `np.average` multiplies and sums the underlying values, so the `NaN` propagates.
-
-from `average` source:
-
-    avg = avg_as_array = np.multiply(a, wgt,
-                        dtype=result_dtype).sum(axis, **keepdims_kw) / scl
-
-from `mean` source:
-
-    if type(a) is not mu.ndarray:
-        try:
-            mean = a.mean
-        except AttributeError:
-            pass
-        else:
-            return mean(axis=axis, dtype=dtype, out=out, **kwargs)
-
-    return _methods._mean(a, axis=axis, dtype=dtype,
-                          out=out, **kwargs)
-
-FYI: the `statistics` [module](https://docs.python.org/3/library/statistics.html?highlight=mean#statistics.mean) includes `mean()`
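-
-A quick check of that explanation (a minimal sketch; both calls should print `nan` once the Series wrapper -- and with it the pandas dispatch -- is stripped away):
-
-```{python}
-yog_values = food['Yoghurt'].to_numpy()
-
-# on the raw ndarray there is no NaN-skipping mean to dispatch to
-print(np.mean(yog_values))
-print(np.average(yog_values))
-```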
-
-Something weird to consider....
-
-```{python}
-print(np.nan == np.nan)
-
-# apparently, according to the floating-point standard, NaN is not equal to itself!
-```
-
-I digress...
-
-### Filtering Missing Data
-
-```{python}
-# method dropna
-print("`dropna`: option to include `how = all` to only remove rows where every value is NaN \n",food.Yoghurt.dropna().tail(), "\n",
-"`fillna`: pass fillna a dictionary (fillna({1: 0.5, 2: 0})) to specify a different value for each column\n", food.Yoghurt.fillna(0).tail(), "\n",
-"`isna`\n", food.Yoghurt.isna().tail(), "\n",
-"`notna`\n", food.Yoghurt.notna().tail())
-```
-
-## 7.2 Data Transformation
-
-### Removing Duplicates
-
-Check to see if duplicates exist:
-
-```{python}
-food.duplicated()
-```
-
-If you were to have duplicates, you can use the function `drop_duplicates()`.
-
-\*NOTE: by default, `drop_duplicates` will only return the first observed value\*
-
-```{python}
-dup_food = food[['Yoghurt','Yoghurt']]
-dup_food.columns = ['a','b']
-dup_food
-```
-
-```{python}
-# rows 11 and 12 are dropped: their (a, b) value pairs already appeared in earlier rows
-dup_food.drop_duplicates()
-```
-
-```{python}
-# now rows 6 and 10 are dropped instead: `keep = 'last'` keeps the *last*
-# occurrence of each duplicated pair rather than the first
-dup_food.drop_duplicates(keep = 'last')
-```
-
-```{python}
-# rows 11 and 12 again: with `subset=['a']` only column `a` is checked for
-# duplicates, and (since `a` and `b` are identical) the result is the same
-dup_food.drop_duplicates(subset=['a'])
-```
-
-### Transforming Data with a Function or Mapping
-
-Since mapping a function over a series has already been covered, this section will only go over a few more helpful ways to map.
-
-- define your own function - similar to how we would do in `apply` functions or `purrr:map()`
-
-    ```{python}
-    food_sub = food[:5][['Country','Yoghurt']]
-    country_yogurt = {
-    'Germany':'Quark',
-    'Italy':'Yomo',
-    'France':'Danone',
-    'Holland':'Campina',
-    'Belgium':'Activia'
-    }
-    ```
-
-```{python}
-def get_yogurt(x):
-    return country_yogurt[x]
-
-food_sub['Brand'] = food_sub['Country'].map(get_yogurt)
-
-food_sub['Country'].map(get_yogurt)
-```
-
-### Replace Values
-
-```{python}
-print("using `replace`: \n", food_sub.replace([30],50), '\n',
-"using `replace` for more than one value: \n", food_sub.replace([30, 20],[50, 40]))
-```
-
-### Renaming Axis Indices
-
-As we've seen, standard indices are labelled as such:
-
-    >>> food_sub.index
-    RangeIndex(start=0, stop=5, step=1)
-
-That can also be changed with the mapping of a function:
-
-```{python}
-
-print(food_sub.index.map(lambda x: x + 10))
-print('or')
-print(food_sub.index.map({0:'G', 1:'I', 2:'F', 3:'H', 4:'B'}))
-```
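-
-If you want the relabelled object back as a DataFrame rather than just a transformed `Index`, the `rename` method accepts the same kind of mapping (a small sketch reusing the dictionary above; pass `inplace=True` to modify `food_sub` itself):
-
-```{python}
-# returns a new, relabelled copy of the DataFrame
-food_sub.rename(index = {0:'G', 1:'I', 2:'F', 3:'H', 4:'B'})
-```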
-
-### Discretization and Binning
-
-It is common to convert continuous variables into discrete values and group them. Let's group the affinity for yogurt into some (admittedly arbitrary) bins:
-
-```{python}
-
-scale = [0, 20, 30, 50, 70]
-# reasonable, ok, interesting, why
-
-pd.cut(food.Yoghurt, scale)
-```
-
-```{python}
-#|warning: false
-scaled = pd.cut(food.Yoghurt.values, scale)
-scaled.categories
-
-pd.value_counts(scaled)
-```
-
-Apply the labels to the bins to have it make more sense:
-
-```{python}
-#|warning: false
-scale_names = ['reasonable', 'ok', 'interesting', 'why']
-pd.value_counts(pd.cut(food.Yoghurt.values, scale, labels = scale_names))
-```
-
-Finally, let `pandas` do the work for you by supplying a number of bins and a precision point. It will cut your data into equal-sized quantile bins, rounding the bin edges to `precision` decimal places.
-
-```{python}
-#|warning: false
-pd.qcut(food.Yoghurt.values, 4, precision = 2)
-```
-
-### Detecting and Filtering Outliers
-
-We often have to face the decision of how to handle outliers. We can choose to exclude them or to transform them.
-
-```{python}
-
-# let's say any country whose percentage of yogurt consumption is over 50% is an outlier
-
-yog = food.Yoghurt
-yog[yog.abs() > 50]
-```
-
-More interestingly, what if we wanted to know if the consumption of ANY food was over 95%?
-
-```{python}
-food2 = food.drop('Country', axis = 'columns')
-food2[(food2.abs() > 95).any(axis = 'columns')]
-```
-
-### Permutation and Random Sampling
-
-- Permuting = random reordering
-
-    - `np.random.permutation` = takes the length of the axis you want to permute
-
-- Random sampling = each sample has an equal probability of being chosen
-
-Let's randomly reorder yogurt affinity:
-
-```{python}
-
-print(np.random.permutation(5))
-
-food.take(np.random.permutation(5))
-```
-
-This method can be helpful when using `iloc` indexing!
-
-```{python}
-food.take(np.random.permutation(5), axis = 'columns')
-```
-
-Let's try taking a random subset without replacement:
-
-```{python}
-food.sample(n =5)
-# you can always add `replace=True` if you want replacement
-```
-
-### Computing Indicator/Dummy Vars
-
-This kind of transformation is really helpful for machine learning. It converts categorical variables into indicator or *dummy* variables through a transformation that results in 0's and 1's.
-
-```{python}
-#|warning: false
-
-pd.get_dummies(food['Country'])
-```
-
-This example is not the most helpful since this set of countries is *unique*, but I hope you get the idea..
-
-This topic will make more sense in Ch.13 when data analysis examples are worked out.
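-
-In the meantime, as a minimal sketch with *repeated* categories (reusing the `scale` and `scale_names` bins from above), `get_dummies` is often combined with `pd.cut`:
-
-```{python}
-# one indicator column per bin label, one row per country
-pd.get_dummies(pd.cut(food.Yoghurt.values, scale, labels = scale_names))
-```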
-
-## 7.3 Extension Data Types
-
-Extension types address some of the shortcomings brought on by `numpy` such as:
-
-- expensive string computations
-
-- missing data conversions
-
-- lack of support for time related objects
-
-```{python}
-s = pd.Series([1, 2, 3, None])
-s.dtype
-
-```
-
-```{python}
-s = pd.Series([1, 2, 3, None], dtype=pd.Int64Dtype())
-s
-print(s.dtype)
-```
-
-Note that this extension type indicates missing with `<NA>`
-
-```{python}
-print(s.isna())
-```
-
-`<NA>` uses the `pandas.NA` sentinel value
-
-```{python}
-s[3] is pd.NA
-```
-
-Types can be set with `astype()`
-
-```{python}
-df = pd.DataFrame({"A": [1, 2, None, 4],
-"B": ["one", "two", "three", None],
-"C": [False, None, False, True]})
-
-df["A"] = df["A"].astype("Int64")
-df["B"] = df["B"].astype("string")
-df["C"] = df["C"].astype("boolean")
-
-df
-```
-
-Find a table of extension types [here](https://wesmckinney.com/book/data-cleaning.html#pandas-ext-types)
-
-## 7.4 String Manipulation
-
-Functions that are built in:
-
-- `split()` : break a string into pieces
-
-- `join()`
-
-- `strip()` : trim whitespace
-
-- `in` : good for locating a substring
-
-- `count()` : returns the number of occurrences of a substring
-
-- `replace()` : substitute occurrences of one pattern for another
-
-See more functions [here](https://wesmckinney.com/book/data-cleaning.html#text_string_methods)
-
-```{python}
-
-lb = " layla is smart, witty, charming, and... "
-lb.split(" ")
-lb.strip()
-'-'.join(lb)
-'smart' in lb
-lb.count(',')
-lb.replace('...', ' bad at python.')
-```
-
-### Regular Expressions
-
-RegEx is not easy. It takes some getting used to. It is really useful for programmatically applying any of the string functions to a particular pattern.
-
-I often refer to this handy [cheat sheet](https://raw.githubusercontent.com/rstudio/cheatsheets/main/strings.pdf)
-
-To use regular expressions in Python, you must import the module `re`:
-
-```{python}
-import re
-
-text = "layla has lived in philadelphia county, miami-dade county, and rockdale county"
-
-# split on whitespace
-re.split(r"\s+", text)
-```
-
-To avoid repeating a common expression, you can *compile* it and store it as its own object.
-
-    regex = re.compile(r"\s+")
-
-**Don't forget**: there are certain characters you must escape before using, like `\`, `.`, `+`, `:`, and more
-
-What if I wanted to get the counties?
- -```{python} - -regex = re.compile(r"\w+(?=\s+county)") - -regex.findall(text) -``` - -### String Functions - -```{python} - -data = {"Dave": "dave@google.com", "Steve": "steve@gmail.com", -"Rob": "rob@gmail.com", "Wes": np.nan} -# convert to series -data = pd.Series(data) -data - -``` - -To get certain information, we can apply string functions from `Series` array-oriented methods: - -```{python} -# does the string contain something -data.str.contains("gmail") -# change the extension tryp -data_as_string_ext = data.astype('string') -data_as_string_ext -``` - -```{python} -# vectorized element retrieval -pattern = r"([A-Z0-9._%+-]+)@([A-Z0-9.-]+)\.([A-Z]{2,4})" -data.str.findall(pattern, flags=re.IGNORECASE).str[0] -``` - -## 7.5 Categorical Data - -```{python} -values = pd.Series(['apple', 'orange', 'apple', - 'apple'] * 2) - -pd.unique(values) -pd.value_counts(values) -``` - -You can improve performance by creating categorical representations that are numerical: - -```{python} -values = pd.Series([0, 1, 0, 0] * 2) -dim = pd.Series(['apple', 'orange']) - -dim -``` - -Retrieve the original set of strings with `take` - -```{python} -dim.take(values) -``` - -### Computations with Categoricals - -```{python} -rng = np.random.default_rng(seed=12345) -draws = rng.standard_normal(1000) -bins = pd.qcut(draws, 4) -bins -``` - -```{python} -bins = pd.qcut(draws, 4, labels=['Q1', 'Q2', 'Q3', 'Q4']) -bins -# then use groupby -bins = pd.Series(bins, name='quartile') -results = (pd.Series(draws) - .groupby(bins) - .agg(['count', 'min', 'max']) - .reset_index()) -``` - -Leads to better performance +- item 1 +- item 2 +- item 3 diff --git a/07_notes.qmd b/07_notes.qmd new file mode 100644 index 0000000..7a77083 --- /dev/null +++ b/07_notes.qmd @@ -0,0 +1,2 @@ +## Notes {.unnumbered} + diff --git a/08_exercises.qmd b/08_exercises.qmd new file mode 100644 index 0000000..465ff4d --- /dev/null +++ b/08_exercises.qmd @@ -0,0 +1 @@ +# Exercises {-} diff --git a/08_main.qmd b/08_main.qmd index 58b8f9f..01e8618 100644 --- a/08_main.qmd +++ b/08_main.qmd @@ -1,4 +1,7 @@ -# 8. Data Wrangling: Join, Combine, and Reshape +# 8. Tree-Based Methods ## Learning Objectives +- item 1 +- item 2 +- item 3 diff --git a/08_notes.qmd b/08_notes.qmd new file mode 100644 index 0000000..7a77083 --- /dev/null +++ b/08_notes.qmd @@ -0,0 +1,2 @@ +## Notes {.unnumbered} + diff --git a/09_exercises.qmd b/09_exercises.qmd new file mode 100644 index 0000000..465ff4d --- /dev/null +++ b/09_exercises.qmd @@ -0,0 +1 @@ +# Exercises {-} diff --git a/09_main.qmd b/09_main.qmd index d237bf8..5cea955 100644 --- a/09_main.qmd +++ b/09_main.qmd @@ -1,544 +1,7 @@ -# 9. Plotting and Visualization +# 9. Support Vector Machines ## Learning Objectives -::: incremental -- We are going to learn the basic data visualization technique using matplotlib, pandas and seaborn. -::: - -# Introduction - -Making informative visualizations is one of the most important tasks in every exploratory data analysis process and this can be done using **matplotlib.** It may be a part of the exploratory process for example, to help identify **outliers** or needed data transformations, or as a way of generating ideas for models. For others, building an interactive visualization for the web may be the end goal. Python has many add-on libraries for making static or dynamic visualizations, but I'll be mainly focused on **matplotlib** and libraries that build on top of it. 
- -## import the necessary library - -```{python} -import matplotlib.pyplot as plt -import numpy as np -import seaborn as sns - -``` - -::: panel-tabset -## Demo_Dataset - -```{python} -#| echo: fenced -#| eval: false -data = np.arange(10) -``` - -## Print - -```{python} -#| eval: true -#| echo: false -data = np.arange(10) - -data -``` -::: - -::: panel-tabset -## Code - -```{python} -#| echo: fenced -#| eval: false -plt.plot(data) -``` - -## Output - -```{python} -#| eval: true -#| echo: false -plt.plot(data) -``` -::: - -## We can use plt.show() function to display the plot in quarto - -```{python} -plt.show() -``` - -When we are in jupyter notebook we can use **%matplotlib notebook** so that we can display the plot, but when we are in Ipython we can use **%matplotlib** to display the plot. - -## Customization of the visualization - -While libraries like seaborn and pandas's built-in plotting functions will deal with many of the mundane details of making plots, should you wish to customize them beyond the function options provided, you will need to learn a bit about the matplotlib API. - -## Figures and Subplots - -Plots in **matplotlib** reside within a **Figure** object. You can create a new figure with **plt.figure ()** - -```{python} -fig = plt.figure() -``` - -**plt.figure** has a number of options; notably, **figsize** will guarantee the figure has a certain size and aspect ratio if saved to disk. - -You can't make a plot with a blank figure. You have to create one or more subplots using add_subplot - -## Add Subplot - -```{python} -ax1 = fig.add_subplot(2,2, 1) - -ax1 - -``` - -This means that the figure should be 2 × 2, and we're selecting the first of four subplots (numbered from 1). We can add more subplot - -## We can add more subplot - -```{python} -ax2 = fig.add_subplot(2, 2, 2) - -ax3 = fig.add_subplot(2, 2, 3) - -ax2 - -ax3 -``` - -## Adding axis methods to the plot - -These plot axis objects have various methods that create different types of plots, and it is preferred to use the axis methods over the top-level plotting functions like **plt.show()**. For example, we could make a line plot with the plot method. - -```{python} -fig = plt.figure() - -ax1 = fig.add_subplot(2, 2, 1) - -ax2 = fig.add_subplot(2, 2, 2) - -ax3 = fig.add_subplot(2, 2, 3) - -ax3.plot(np.random.standard_normal(50).cumsum(), color="black", -linestyle="dashed") - -``` - -We may notice output like **matplotlib.lines.Line2D** at when we are creating our visualization. matplotlib returns objects that reference the plot subcomponent that was just added. A lot of the time you can safely ignore this output, or you can put a **semicolon** at the end of the line to suppress the output. - -The additional options instruct matplotlib to plot a black dashed line. The objects returned by **fig.add_subplot** here are **AxesSubplot** objects, on which you can directly plot on the other empty subplots by calling each one's instance method. 
```{python}
-ax1.hist(np.random.standard_normal(100),bins=20,color="black", alpha=0.3);
-
-ax2.scatter(np.arange(30), np.arange(30) + 3*np.random.standard_normal(30));
-```
-
-To make creating a grid of subplots more convenient, matplotlib includes a **plt.subplots** method that creates a new figure and returns both it and a NumPy array containing the created subplot objects:
-
-```{python}
-fig, axes = plt.subplots(2, 3)
-
-axes
-
-```
-
-The axes array can then be indexed like a two-dimensional array; for example, **axes\[0, 1\]** refers to the subplot in the top row at the center
-
-## Scatter Plot
-
-```{python}
-plt.style.use('_mpl-gallery')
-
-# make the data
-np.random.seed(3)
-x = 4 + np.random.normal(0, 2, 24)
-y = 4 + np.random.normal(0, 2, len(x))
-# size and color:
-sizes = np.random.uniform(15, 80, len(x))
-colors = np.random.uniform(15, 80, len(x))
-
-# plot
-fig, ax = plt.subplots()
-
-ax.scatter(x, y, s=sizes, c=colors, vmin=0, vmax=100)
-
-ax.set(xlim=(0, 8), xticks=np.arange(1, 8),
-       ylim=(0, 8), yticks=np.arange(1, 8))
-
-plt.show()
-```
-
-## Bar Plot
-
-```{python}
-plt.style.use('_mpl-gallery')
-
-# make data:
-np.random.seed(3)
-x = 0.5 + np.arange(8)
-y = np.random.uniform(2, 7, len(x))
-
-# plot
-fig, ax = plt.subplots()
-
-ax.bar(x, y, width=1, edgecolor="white", linewidth=0.7)
-
-ax.set(xlim=(0, 8), xticks=np.arange(1, 8),
-       ylim=(0, 8), yticks=np.arange(1, 8))
-
-plt.show()
-```
-
-## Box Plot
-
-```{python}
-
-plt.style.use('_mpl-gallery')
-
-# make data:
-np.random.seed(10)
-D = np.random.normal((3, 5, 4), (1.25, 1.00, 1.25), (100, 3))
-
-# plot
-fig, ax = plt.subplots()
-VP = ax.boxplot(D, positions=[2, 4, 6], widths=1.5, patch_artist=True,
-                showmeans=False, showfliers=False,
-                medianprops={"color": "white", "linewidth": 0.5},
-                boxprops={"facecolor": "C0", "edgecolor": "white",
-                          "linewidth": 0.5},
-                whiskerprops={"color": "C0", "linewidth": 1.5},
-                capprops={"color": "C0", "linewidth": 1.5})
-
-ax.set(xlim=(0, 8), xticks=np.arange(1, 8),
-       ylim=(0, 8), yticks=np.arange(1, 8))
-
-plt.show()
-```
-
-[**We can learn more with the matplotlib documentation**](https://matplotlib.org)
-
-| Argument | Description |
-|--------------------|--------------------------------------------------------|
-| nrows | Number of rows of subplots |
-| ncols | Number of columns of subplots |
-| sharex | All subplots should use the same x-axis ticks (adjusting the xlim will affect all subplots) |
-| sharey | All subplots should use the same y-axis ticks (adjusting the ylim will affect all subplots) |
-| subplot_kw | Dictionary of keywords passed to add_subplot call used to create each subplot |
-| **fig_kw** | Additional keywords to subplots are used when creating the figure, such as plt.subplots(2, 2, figsize=(8, 6)) |
-
-: Table 9.1: matplotlib.pyplot.subplots options
-
-## Adjusting the spacing around subplots
-
-By default, matplotlib leaves a certain amount of padding around the outside of the subplots and in spacing between subplots. This spacing is all specified relative to the height and width of the plot, so that if you resize the plot either programmatically or manually using the GUI window, the plot will dynamically adjust itself. You can change the spacing using the subplots_adjust method on Figure objects:
-
-    subplots_adjust(left=None, bottom=None, right=None, top=None, wspace=None, hspace=None)
-
-**wspace** and **hspace** control the percent of the figure width and figure height, respectively, to use as spacing between subplots.
-
- -```{python} -fig, axes = plt.subplots(2, 2, sharex=True, sharey=True) -for i in range(2): - for j in range(2): - axes[i, j].hist(np.random.standard_normal(500), bins=50, - color="black", alpha=0.5) -fig.subplots_adjust(wspace=0, hspace=0) -``` - -## **Colors, Markers, and Line Styles** - -matplotlib's line `plot` function accepts arrays of x and y coordinates and optional color styling options. For example, to plot `x` versus `y` with green dashes, you would execute: - -```{python} -ax.plot(x, y, linestyle="--", color="green") -``` - -```{python} -ax = fig.add_subplot() - -ax.plot(np.random.standard_normal(30).cumsum(), color="black", -linestyle="dashed", marker="o") - -plt.show() -``` - -line plots, you will notice that subsequent points are **linearly interpolated** by default. This can be altered with the **drawstyle** option. - -```{python} -fig = plt.figure() - -ax = fig.add_subplot() - -data = np.random.standard_normal(30).cumsum() - -ax.plot(data, color="black", linestyle="dashed", label="Default"); -ax.plot(data, color="black", linestyle="dashed", -drawstyle="steps-post", label="steps-post"); -ax.legend() -``` - -## Ticks, Labels, and Legends - -Most kinds of plot decorations can be accessed through methods on matplotlib axes objects. This includes methods like **xlim**, **xticks**, and **xticklabels**. These control the **plot range**, **tick locations**, and **tick labels**, respectively. They can be used in two ways: - -- Called with no arguments returns the current parameter value (e.g., **ax.xlim()** returns the current **x-axis** plotting range) - -- Called with parameters sets the parameter value (e.g., **ax.xlim**(\[0, 10\]) sets the x-axis range to 0 to 10) - -## Setting the title, axis labels, ticks, and tick labels - -```{python} -fig, ax = plt.subplots() - -ax.plot(np.random.standard_normal(1000).cumsum()); - -plt.show() -``` - -To change the **x-axis ticks**, it's easiest to use **set_xticks** and **set_xticklabels**. The former instructs matplotlib where to place the ticks along the data range; by default these locations will also be the labels. But we can set any other values as the labels using **set_xticklabels:** - -The rotation option sets the x tick labels at a 30-degree rotation. Lastly, **set_xlabel** gives a name to the x-axis, and set_title is the subplot title. - -## Adding legends - -Legends are another critical element for identifying plot elements. There are a couple of ways to add one. The easiest is to pass the label argument when adding each piece of the plot: - -```{python} -fig, ax = plt.subplots() - -ax.plot(np.random.randn(1000).cumsum(), color="black", label="one"); -ax.plot(np.random.randn(1000).cumsum(), color="black", linestyle="dashed", -label="two"); -ax.plot(np.random.randn(1000).cumsum(), color="black", linestyle="dotted", -label="three"); - - -ax.legend() - -``` - -The legend method has several other choices for the location **loc argument**. See the docstring (with **ax.legend?**) for more information. The **loc legend** option tells matplotlib where to place the plot. The default is **"best"**, which tries to choose a location that is most out of the way. To exclude one or more elements from the legend, pass **no label** or **label="*nolegend*"**. - -## Saving Plots to File - -You can save the active figure to file using the figure object's savefig instance method. 
For example, to save a PNG version of a figure, you need only type:
-
-```{python}
-fig.savefig("figpath.png", dpi=400)
-```
-
-| Argument | Description |
-|-----------------|-------------------------------------------------------|
-| fname | String containing a filepath or a Python file-like object. The figure format is inferred from the file extension (e.g., `.pdf` for PDF or `.png` for PNG). |
-| dpi | The figure resolution in dots per inch; defaults to 100 in IPython or 72 in Jupyter out of the box but can be configured. |
-| facecolor, edgecolor | The color of the figure background outside of the subplots; `"w"` (white), by default. |
-| format | The explicit file format to use (`"png"`, `"pdf"`, `"svg"`, `"ps"`, `"eps"`, \...). |
-
-: Table 9.2: Some fig.savefig options
-
-### **matplotlib Configuration**
-
-matplotlib comes configured with color schemes and defaults that are geared primarily toward preparing figures for publication. Fortunately, nearly all of the default behavior can be customized via global parameters governing figure size, subplot spacing, colors, font sizes, grid styles, and so on. One way to modify the configuration programmatically from Python is to use the **`rc`** method; for example, to set the global default figure size to be 10 × 10, you could enter:
-
-```{python}
-plt.rc("figure", figsize=(10, 10))
-```
-
-The first argument to `rc` is the component you wish to customize, such as **`"figure"`**, **`"axes"`**, **`"xtick"`**, **`"ytick"`**, **`"grid"`**, **`"legend"`**, or many others. After that can follow a sequence of keyword arguments indicating the new parameters. A convenient way to write down the options in your program is as a dictionary:
-
-```{python}
-font_options = {"family": "monospace",
-                "weight": "bold",
-                "size": 8}
-plt.rc("font", **font_options)
-```
-
-For more extensive customization and to see a list of all the options, matplotlib comes with a configuration file ***matplotlibrc*** in the ***matplotlib/mpl-data*** **directory**. If you customize this file and place it in your home directory **titled *.matplotlibrc***, it will be loaded each time you use **matplotlib**.
-
-All of the current configuration settings are found in the **plt.rcParams dictionary**, and they can be restored to their default values by calling the **plt.rcdefaults()** function.
-
-## Plotting with pandas and seaborn
-
-**matplotlib** can be a fairly low-level tool. You assemble a plot from its base components: the data display (i.e., the type of plot: line, bar, box, scatter, contour, etc.), legend, title, tick labels, and other annotations. In **pandas**, we may have multiple columns of data, along with row and column labels. pandas itself has built-in methods that simplify creating visualizations from DataFrame and Series objects. Another library is **seaborn**, a high-level statistical graphics library built on matplotlib. seaborn simplifies creating many common visualization types.
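-
-As a minimal side-by-side sketch of that stack (the same data at three levels of abstraction; the layout choices here are my own):
-
-```{python}
-import pandas as pd
-
-demo = pd.Series(np.random.standard_normal(200))
-
-fig, axes = plt.subplots(1, 3, figsize=(9, 3))
-
-# raw matplotlib: you assemble the pieces yourself
-axes[0].hist(demo, bins=20, color="black", alpha=0.5)
-
-# pandas: a one-liner method on the Series
-demo.plot.hist(bins=20, ax=axes[1], color="black", alpha=0.5)
-
-# seaborn: a higher-level statistical interface
-sns.histplot(demo, bins=20, ax=axes[2], color="black")
-```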
## Plotting with pandas

## Line Plots

```{python}
import pandas as pd
import numpy as np

s = pd.Series(np.random.standard_normal(10).cumsum(),
              index=np.arange(0, 100, 10))

s.plot()
```

The Series object's index is passed to matplotlib for plotting on the x-axis, though you can disable this by passing `use_index=False`. The x-axis ticks and limits can be adjusted with the `xticks` and `xlim` options, and the y-axis respectively with `yticks` and `ylim`.

| Argument | Description |
|----------------|--------------------------------------------------------|
| `label` | Label for plot legend |
| `ax` | matplotlib subplot object to plot on; if nothing is passed, uses the active matplotlib subplot |
| `style` | Style string, like `"ko--"`, to be passed to matplotlib |
| `alpha` | The plot fill opacity (from 0 to 1) |
| `kind` | Can be `"area"`, `"bar"`, `"barh"`, `"density"`, `"hist"`, `"kde"`, `"line"`, or `"pie"`; defaults to `"line"` |
| `figsize` | Size of the figure object to create |
| `logx` | Pass `True` for logarithmic scaling on the x axis; pass `"sym"` for symmetric logarithm that permits negative values |
| `logy` | Pass `True` for logarithmic scaling on the y axis; pass `"sym"` for symmetric logarithm that permits negative values |
| `title` | Title to use for the plot |
| `use_index` | Use the object index for tick labels |
| `rot` | Rotation of tick labels (0 through 360) |
| `xticks` | Values to use for x-axis ticks |
| `yticks` | Values to use for y-axis ticks |
| `xlim` | x-axis limits (e.g., `[0, 10]`) |
| `ylim` | y-axis limits |
| `grid` | Display axis grid (off by default) |

: Table 9.3: Series.plot method arguments

## Line Graph

```{python}
df = pd.DataFrame(np.random.standard_normal((10, 4)).cumsum(0),
                  columns=["A", "B", "C", "D"],
                  index=np.arange(0, 100, 10))

plt.style.use('grayscale')

df.plot()
```

Here I used `plt.style.use('grayscale')` to switch to a color scheme more suitable for black-and-white publication, since some readers will not be able to see the full color plots. The `plot` attribute contains a "family" of methods for different plot types; for example, `df.plot()` is equivalent to `df.plot.line()`.

## Bar Plots

The `plot.bar()` and `plot.barh()` methods make vertical and horizontal bar plots, respectively. In this case, the Series or DataFrame index will be used as the x (`bar`) or y (`barh`) ticks:

```{python}
fig, axes = plt.subplots(2, 1)

data = pd.Series(np.random.uniform(size=16), index=list("abcdefghijklmnop"))

data.plot.bar(ax=axes[0], color="black", alpha=0.7)

data.plot.barh(ax=axes[1], color="black", alpha=0.7)
```

With a DataFrame, bar plots group the values in each row in bars, side by side, for each value:

```{python}
df = pd.DataFrame(np.random.uniform(size=(6, 4)),
                  index=["one", "two", "three", "four", "five", "six"],
                  columns=pd.Index(["A", "B", "C", "D"], name="Genus"))

df

df.plot.bar()
```

We create stacked bar plots from a DataFrame by passing `stacked=True`, resulting in the value in each row being stacked together (horizontally here, since we use `barh`):

```{python}
df.plot.barh(stacked=True, alpha=0.5)
```
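A handy bar-plot recipe in the same vein (a sketch of mine with made-up data, not part of the original notes): `value_counts` pairs naturally with `plot.bar` to visualize a Series's value frequencies.

```{python}
import numpy as np
import pandas as pd

# Made-up categorical data
s = pd.Series(np.random.choice(["a", "b", "c", "d"], size=100))

# value_counts returns category frequencies, indexed by category,
# which plot.bar renders directly as a bar chart
s.value_counts().plot.bar()
```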
## Visualizing categorical data with Seaborn

Plotting functions in seaborn take a `data` argument, which can be a pandas DataFrame. The other arguments refer to column names.

```{python}
import seaborn as sns
tips = sns.load_dataset("tips")
sns.catplot(data=tips, x="day", y="total_bill")
```

```{python}
sns.catplot(data=tips, x="day", y="total_bill", hue="sex", kind="swarm")
```

## Boxplot

```{python}
sns.catplot(data=tips, x="day", y="total_bill", kind="box")
```

## Barplot

```{python}
titanic = sns.load_dataset("titanic")
sns.catplot(data=titanic, x="sex", y="survived", hue="class", kind="bar")
```

## Scatter Plot

```{python}
import pandas as pd
import seaborn as sns

penguin = pd.read_excel('data/penguins.xlsx')

ax = sns.regplot(x="bill_length_mm", y="flipper_length_mm", data=penguin)
ax
```

In exploratory data analysis, it's helpful to be able to look at all the scatter plots among a group of variables; this is known as a *pairs plot* or *scatterplot matrix*.

```{python}
sns.pairplot(penguin, diag_kind="kde", plot_kws={"alpha": 0.2})
```

The `plot_kws` argument enables us to pass configuration options down to the individual plotting calls on the off-diagonal elements.

## Point Plot

```{python}
sns.catplot(data=titanic, x="sex", y="survived", hue="class", kind="point")
```

## Line Plot

```{python}
import numpy as np
import pandas as pd
import seaborn as sns
sns.set_theme(style="whitegrid")

rs = np.random.RandomState(365)
values = rs.randn(365, 4).cumsum(axis=0)
dates = pd.date_range("1 1 2016", periods=365, freq="D")
data = pd.DataFrame(values, dates, columns=["A", "B", "C", "D"])
data = data.rolling(7).mean()

sns.lineplot(data=data, palette="tab10", linewidth=2.5)
```

## Facet Grids and Categorical Data

One way to visualize data with many categorical variables is to use a *facet grid*, which is a two-dimensional layout of plots where the data is split across the plots on each axis based on the distinct values of a certain variable. seaborn has a useful built-in function `catplot` that simplifies making many kinds of faceted plots split by categorical variables:

```{python}
sns.catplot(x="species", y="bill_length_mm", hue="sex", col="island",
            kind="bar", data=penguin)
```

`catplot` supports other plot types that may be useful depending on what you are trying to display. For example, *box plots* (which show the median, quartiles, and outliers) can be an effective visualization type:

```{python}
sns.catplot(x="bill_length_mm", y="island", kind="box", data=penguin)
```

## Other Python Visualization Tools

- There are many other tools for data visualization in Python, such as [Altair](https://altair-viz.github.io/), [Bokeh](http://bokeh.pydata.org/), and [Plotly](https://plotly.com/python).

- For creating static graphics for print or the web, I recommend matplotlib and libraries that build on it, like pandas and seaborn.
+- item 1
+- item 2
+- item 3
diff --git a/09_notes.qmd b/09_notes.qmd
new file mode 100644
index 0000000..49f1e91
--- /dev/null
+++ b/09_notes.qmd
@@ -0,0 +1,3 @@
+## Notes {.unnumbered}
+
+
diff --git a/10_exercises.qmd b/10_exercises.qmd
new file mode 100644
index 0000000..465ff4d
--- /dev/null
+++ b/10_exercises.qmd
@@ -0,0 +1 @@
+# Exercises {-}
diff --git a/10_main.qmd b/10_main.qmd
index 524d7bc..4c5d00a 100644
--- a/10_main.qmd
+++ b/10_main.qmd
@@ -1,4 +1,7 @@
-# 10. Data Aggregation and Group Operations
+# 10.
Deep Learning ## Learning Objectives +- item 1 +- item 2 +- item 3 diff --git a/10_notes.qmd b/10_notes.qmd new file mode 100644 index 0000000..11f07fd --- /dev/null +++ b/10_notes.qmd @@ -0,0 +1 @@ +## Notes {.unnumbered} diff --git a/11_exercises.qmd b/11_exercises.qmd new file mode 100644 index 0000000..465ff4d --- /dev/null +++ b/11_exercises.qmd @@ -0,0 +1 @@ +# Exercises {-} diff --git a/11_main.qmd b/11_main.qmd index 2318c8f..3e32ced 100644 --- a/11_main.qmd +++ b/11_main.qmd @@ -1,4 +1,9 @@ -# 11. Time Series +# 11. Survival Analysis and Censored Data ## Learning Objectives +- item 1 +- item 2 +- item 3 + + diff --git a/11_notes.qmd b/11_notes.qmd new file mode 100644 index 0000000..b65567e --- /dev/null +++ b/11_notes.qmd @@ -0,0 +1 @@ +# Notes {-} diff --git a/12_exercises.qmd b/12_exercises.qmd new file mode 100644 index 0000000..465ff4d --- /dev/null +++ b/12_exercises.qmd @@ -0,0 +1 @@ +# Exercises {-} diff --git a/12_main.qmd b/12_main.qmd index f3dd3d6..9c780f8 100644 --- a/12_main.qmd +++ b/12_main.qmd @@ -1,4 +1,7 @@ -# 12. Introduction to Modeling Libraries in Python +# 12. Unsupervised Learning ## Learning Objectives +- item 1 +- item 2 +- item 3 diff --git a/12_notes.ipynb b/12_notes.ipynb deleted file mode 100644 index a37cf4c..0000000 --- a/12_notes.ipynb +++ /dev/null @@ -1,2796 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "id": "6dfacd91", - "metadata": {}, - "source": [ - "# Notes {-}" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "id": "d45e4a69", - "metadata": { - "scrolled": true - }, - "outputs": [], - "source": [ - "import numpy as np \n", - "import pandas as pd \n", - "import patsy\n", - "import statsmodels.api as sm\n", - "import statsmodels.formula.api as smf\n", - "from statsmodels.tsa.ar_model import AutoReg\n", - "from sklearn.model_selection import train_test_split\n", - "from sklearn.linear_model import LinearRegression\n", - "import matplotlib.pyplot as plot\n", - "from yellowbrick.regressor import prediction_error" - ] - }, - { - "cell_type": "markdown", - "id": "a8161956", - "metadata": {}, - "source": [ - "## Introduction to Modeling Libraries in Python\n", - "\n", - "* Use different libraries depending on the application\n", - "\n", - "## Interfacing Between pandas and Model Code\n", - "\n", - "* Pandas for data loading and cleaning\n", - "* Modeling library for building model" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "id": "d7bba33f", - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
" - ], - "text/plain": [ - " country country_code region \\\n", - "187 Poland POL Europe & Central Asia \n", - "248 United States USA North America \n", - "444 Pakistan PAK South Asia \n", - "274 Australia AUS East Asia & Pacific \n", - "367 Indonesia IDN East Asia & Pacific \n", - "... ... ... ... \n", - "14254 Czech Republic CZE Europe & Central Asia \n", - "14368 Malaysia MYS East Asia & Pacific \n", - "14235 Canada CAN North America \n", - "14381 Oman OMN Middle East & North Africa \n", - "14509 Comoros COM Sub-Saharan Africa \n", - "\n", - " income_group year life_expectancy \n", - "187 High income 1960 67.680488 \n", - "248 High income 1960 69.770732 \n", - "444 Lower middle income 1961 46.223220 \n", - "274 High income 1961 70.973171 \n", - "367 Lower middle income 1961 49.269805 \n", - "... ... ... ... \n", - "14254 High income 2014 78.824390 \n", - "14368 Upper middle income 2014 74.718293 \n", - "14235 High income 2014 81.956610 \n", - "14381 High income 2014 77.085098 \n", - "14509 Low income 2015 63.554024 \n", - "\n", - "[140 rows x 6 columns]" - ] - }, - "execution_count": 2, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "df = pd.read_excel(\"data/Life Expectancy at Birth.xlsx\", engine=\"openpyxl\")\n", - "df.dropna(inplace=True)\n", - "df.columns = df.columns.map(lambda row: \"_\".join(row.lower().split(\" \")))\n", - "sample = df.groupby(\"region\").sample(n=20).sort_values(by=\"year\")\n", - "sample" - ] - }, - { - "cell_type": "code", - "execution_count": 52, - "id": "da90e1ce", - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
" - ], - "text/plain": [ - " year life_expectancy\n", - "187 1960 67.680488\n", - "248 1960 69.770732\n", - "444 1961 46.223220\n", - "274 1961 70.973171\n", - "367 1961 49.269805\n", - "... ... ...\n", - "14254 2014 78.824390\n", - "14368 2014 74.718293\n", - "14235 2014 81.956610\n", - "14381 2014 77.085098\n", - "14509 2015 63.554024\n", - "\n", - "[140 rows x 2 columns]" - ] - }, - "execution_count": 52, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "numeric_cols = [\"year\", \"life_expectancy\"]\n", - "df_num = sample[numeric_cols]\n", - "df_num" - ] - }, - { - "cell_type": "markdown", - "id": "02b11be0", - "metadata": {}, - "source": [ - "Turn a DataFrame into a NumPy array, use the `to_numpy` method:" - ] - }, - { - "cell_type": "code", - "execution_count": 53, - "id": "7e9e3e33", - "metadata": {}, - "outputs": [], - "source": [ - "df_num = df_num.to_numpy()" - ] - }, - { - "cell_type": "code", - "execution_count": 54, - "id": "383f6262", - "metadata": { - "scrolled": true - }, - "outputs": [ - { - "data": { - "text/plain": [ - "True" - ] - }, - "execution_count": 54, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "isinstance(df_num, np.ndarray) # an ndarray of Python objects" - ] - }, - { - "cell_type": "markdown", - "id": "9778cabe", - "metadata": {}, - "source": [ - "To convert back to a DataFrame, as you may recall from earlier chapters, you can pass a two-dimensional ndarray with optional column names:" - ] - }, - { - "cell_type": "code", - "execution_count": 57, - "id": "2a495ab9", - "metadata": { - "scrolled": true - }, - "outputs": [ - { - "data": { - "text/html": [ - "
" - ], - "text/plain": [ - " country country_code region \\\n", - "0 Poland POL Europe & Central Asia \n", - "1 United States USA North America \n", - "2 Pakistan PAK South Asia \n", - "3 Australia AUS East Asia & Pacific \n", - "4 Indonesia IDN East Asia & Pacific \n", - ".. ... ... ... \n", - "135 Czech Republic CZE Europe & Central Asia \n", - "136 Malaysia MYS East Asia & Pacific \n", - "137 Canada CAN North America \n", - "138 Oman OMN Middle East & North Africa \n", - "139 Comoros COM Sub-Saharan Africa \n", - "\n", - " income_group year life_expectancy \n", - "0 High income 1960 67.680488 \n", - "1 High income 1960 69.770732 \n", - "2 Lower middle income 1961 46.22322 \n", - "3 High income 1961 70.973171 \n", - "4 Lower middle income 1961 49.269805 \n", - ".. ... ... ... \n", - "135 High income 2014 78.82439 \n", - "136 Upper middle income 2014 74.718293 \n", - "137 High income 2014 81.95661 \n", - "138 High income 2014 77.085098 \n", - "139 Low income 2015 63.554024 \n", - "\n", - "[140 rows x 6 columns]" - ] - }, - "execution_count": 57, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "df2 = pd.DataFrame(sample.to_numpy(), columns=['country', 'country_code', 'region', 'income_group', 'year', 'life_expectancy'])\n", - "df2" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "id": "5fa581cc", - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "array([['Poland', 'POL', 'Europe & Central Asia', 'High income', 1960,\n", - " 67.680487805],\n", - " ['United States', 'USA', 'North America', 'High income', 1960,\n", - " 69.770731707],\n", - " ['Pakistan', 'PAK', 'South Asia', 'Lower middle income', 1961,\n", - " 46.223219512],\n", - " ['Australia', 'AUS', 'East Asia & Pacific', 'High income', 1961,\n", - " 70.973170732],\n", - " ['Indonesia', 'IDN', 'East Asia & Pacific', 'Lower middle income',\n", - " 1961, 49.269804878],\n", - " ['Kazakhstan', 'KAZ', 'Europe & Central Asia',\n", - " 'Upper middle income', 1962, 59.199073171],\n", - " ['Bangladesh', 'BGD', 'South Asia', 'Lower middle income', 1962,\n", - " 47.08397561],\n", - " ['Cuba', 'CUB', 'Latin America & Caribbean',\n", - " 'Upper middle income', 1962, 65.138219512],\n", - " ['Finland', 'FIN', 'Europe & Central Asia', 'High income', 1962,\n", - " 68.577804878],\n", - " ['Hungary', 'HUN', 'Europe & Central Asia', 'High income', 1962,\n", - " 67.865853659],\n", - " ['Azerbaijan', 'AZE', 'Europe & Central Asia',\n", - " 'Upper middle income', 1963, 62.052],\n", - " ['Nepal', 'NPL', 'South Asia', 'Low income', 1963, 36.425170732],\n", - " ['Macedonia, FYR', 'MKD', 'Europe & Central Asia',\n", - " 'Upper middle income', 1963, 62.459512195],\n", - " ['Micronesia, Fed. 
Sts.', 'FSM', 'East Asia & Pacific',\n", - " 'Lower middle income', 1963, 58.781585366],\n", - " ['Tunisia', 'TUN', 'Middle East & North Africa',\n", - " 'Lower middle income', 1963, 44.100780488],\n", - " ['Antigua and Barbuda', 'ATG', 'Latin America & Caribbean',\n", - " 'High income', 1963, 62.992585366],\n", - " ['Maldives', 'MDV', 'South Asia', 'Upper middle income', 1964,\n", - " 39.806902439],\n", - " ['Saudi Arabia', 'SAU', 'Middle East & North Africa',\n", - " 'High income', 1964, 47.811268293],\n", - " ['Sao Tome and Principe', 'STP', 'Sub-Saharan Africa',\n", - " 'Lower middle income', 1964, 52.674756098],\n", - " ['Maldives', 'MDV', 'South Asia', 'Upper middle income', 1965,\n", - " 40.49602439],\n", - " ['Lebanon', 'LBN', 'Middle East & North Africa',\n", - " 'Upper middle income', 1965, 64.715],\n", - " ['Canada', 'CAN', 'North America', 'High income', 1967,\n", - " 72.207804878],\n", - " ['Lao PDR', 'LAO', 'East Asia & Pacific', 'Lower middle income',\n", - " 1967, 45.316878049],\n", - " ['Egypt, Arab Rep.', 'EGY', 'Middle East & North Africa',\n", - " 'Lower middle income', 1968, 51.568682927],\n", - " ['Iran, Islamic Rep.', 'IRN', 'Middle East & North Africa',\n", - " 'Upper middle income', 1969, 50.158341463],\n", - " ['Samoa', 'WSM', 'East Asia & Pacific', 'Upper middle income',\n", - " 1970, 54.969512195],\n", - " ['Botswana', 'BWA', 'Sub-Saharan Africa', 'Upper middle income',\n", - " 1970, 54.443463415],\n", - " ['Nepal', 'NPL', 'South Asia', 'Low income', 1970, 40.504439024],\n", - " ['Bermuda', 'BMU', 'North America', 'High income', 1970, 70.29],\n", - " ['Mexico', 'MEX', 'Latin America & Caribbean',\n", - " 'Upper middle income', 1971, 61.819170732],\n", - " ['Trinidad and Tobago', 'TTO', 'Latin America & Caribbean',\n", - " 'High income', 1971, 65.260390244],\n", - " ['St. 
Lucia', 'LCA', 'Latin America & Caribbean',\n", - " 'Upper middle income', 1971, 63.443170732],\n", - " ['Sudan', 'SDN', 'Sub-Saharan Africa', 'Lower middle income',\n", - " 1971, 52.570146341],\n", - " ['Nepal', 'NPL', 'South Asia', 'Low income', 1971, 41.088146341],\n", - " ['Canada', 'CAN', 'North America', 'High income', 1972,\n", - " 72.933902439],\n", - " ['Swaziland', 'SWZ', 'Sub-Saharan Africa', 'Lower middle income',\n", - " 1972, 49.151121951],\n", - " ['Bhutan', 'BTN', 'South Asia', 'Lower middle income', 1973,\n", - " 39.287487805],\n", - " ['Peru', 'PER', 'Latin America & Caribbean',\n", - " 'Upper middle income', 1973, 55.758],\n", - " ['United States', 'USA', 'North America', 'High income', 1974,\n", - " 71.956097561],\n", - " ['Romania', 'ROU', 'Europe & Central Asia', 'Upper middle income',\n", - " 1974, 69.499756098],\n", - " ['New Caledonia', 'NCL', 'East Asia & Pacific', 'High income',\n", - " 1975, 65.273170732],\n", - " ['Uruguay', 'URY', 'Latin America & Caribbean', 'High income',\n", - " 1975, 69.142829268],\n", - " ['Papua New Guinea', 'PNG', 'East Asia & Pacific',\n", - " 'Lower middle income', 1975, 49.270512195],\n", - " ['United States', 'USA', 'North America', 'High income', 1976,\n", - " 72.856097561],\n", - " ['Afghanistan', 'AFG', 'South Asia', 'Low income', 1976,\n", - " 39.575707317],\n", - " ['Bhutan', 'BTN', 'South Asia', 'Lower middle income', 1977,\n", - " 42.600829268],\n", - " ['Lao PDR', 'LAO', 'East Asia & Pacific', 'Lower middle income',\n", - " 1977, 48.208390244],\n", - " ['Dominican Republic', 'DOM', 'Latin America & Caribbean',\n", - " 'Upper middle income', 1977, 61.878902439],\n", - " ['Guatemala', 'GTM', 'Latin America & Caribbean',\n", - " 'Lower middle income', 1978, 56.391512195],\n", - " ['Ireland', 'IRL', 'Europe & Central Asia', 'High income', 1978,\n", - " 72.104756098],\n", - " ['Ecuador', 'ECU', 'Latin America & Caribbean',\n", - " 'Upper middle income', 1978, 61.925195122],\n", - " ['Ecuador', 'ECU', 'Latin America & Caribbean',\n", - " 'Upper middle income', 1979, 62.506902439],\n", - " ['United States', 'USA', 'North America', 'High income', 1979,\n", - " 73.804878049],\n", - " ['Sudan', 'SDN', 'Sub-Saharan Africa', 'Lower middle income',\n", - " 1979, 54.149073171],\n", - " ['Bangladesh', 'BGD', 'South Asia', 'Lower middle income', 1979,\n", - " 52.887292683],\n", - " ['Egypt, Arab Rep.', 'EGY', 'Middle East & North Africa',\n", - " 'Lower middle income', 1979, 57.641365854],\n", - " ['Cyprus', 'CYP', 'Europe & Central Asia', 'High income', 1980,\n", - " 74.756804878],\n", - " ['Qatar', 'QAT', 'Middle East & North Africa', 'High income',\n", - " 1980, 72.791658537],\n", - " ['Bermuda', 'BMU', 'North America', 'High income', 1980,\n", - " 72.304634146],\n", - " ['Puerto Rico', 'PRI', 'Latin America & Caribbean', 'High income',\n", - " 1980, 73.702292683],\n", - " ['Myanmar', 'MMR', 'East Asia & Pacific', 'Lower middle income',\n", - " 1981, 55.334390244],\n", - " ['Qatar', 'QAT', 'Middle East & North Africa', 'High income',\n", - " 1981, 73.087365854],\n", - " ['St. 
Martin (French part)', 'MAF', 'Latin America & Caribbean',\n", - " 'High income', 1982, 73.219512195],\n", - " ['Kenya', 'KEN', 'Sub-Saharan Africa', 'Lower middle income',\n", - " 1982, 58.799756098],\n", - " ['Greenland', 'GRL', 'Europe & Central Asia', 'High income', 1982,\n", - " 63.192682927],\n", - " ['Indonesia', 'IDN', 'East Asia & Pacific', 'Lower middle income',\n", - " 1983, 60.82295122],\n", - " ['Algeria', 'DZA', 'Middle East & North Africa',\n", - " 'Upper middle income', 1984, 63.117121951],\n", - " ['Central African Republic', 'CAF', 'Sub-Saharan Africa',\n", - " 'Low income', 1984, 49.825634146],\n", - " ['New Caledonia', 'NCL', 'East Asia & Pacific', 'High income',\n", - " 1984, 68.063414634],\n", - " ['Somalia', 'SOM', 'Sub-Saharan Africa', 'Low income', 1984,\n", - " 45.984365854],\n", - " ['France', 'FRA', 'Europe & Central Asia', 'High income', 1984,\n", - " 75.0],\n", - " ['Lao PDR', 'LAO', 'East Asia & Pacific', 'Lower middle income',\n", - " 1984, 50.544243902],\n", - " ['India', 'IND', 'South Asia', 'Lower middle income', 1985,\n", - " 55.860902439],\n", - " ['Philippines', 'PHL', 'East Asia & Pacific',\n", - " 'Lower middle income', 1985, 63.798560976],\n", - " ['Kosovo', 'XKX', 'Europe & Central Asia', 'Lower middle income',\n", - " 1985, 66.797560976],\n", - " ['United Arab Emirates', 'ARE', 'Middle East & North Africa',\n", - " 'High income', 1985, 70.076634146],\n", - " ['Uzbekistan', 'UZB', 'Europe & Central Asia',\n", - " 'Lower middle income', 1986, 66.942439024],\n", - " ['United States', 'USA', 'North America', 'High income', 1987,\n", - " 74.765853659],\n", - " ['South Sudan', 'SSD', 'Sub-Saharan Africa', 'Low income', 1987,\n", - " 41.50802439],\n", - " ['Bhutan', 'BTN', 'South Asia', 'Lower middle income', 1988,\n", - " 50.898487805],\n", - " ['Canada', 'CAN', 'North America', 'High income', 1988,\n", - " 76.809268293],\n", - " ['Niger', 'NER', 'Sub-Saharan Africa', 'Low income', 1988,\n", - " 42.962073171],\n", - " ['Tunisia', 'TUN', 'Middle East & North Africa',\n", - " 'Lower middle income', 1988, 67.607902439],\n", - " ['Guatemala', 'GTM', 'Latin America & Caribbean',\n", - " 'Lower middle income', 1989, 61.704658537],\n", - " ['Sri Lanka', 'LKA', 'South Asia', 'Lower middle income', 1989,\n", - " 69.534341463],\n", - " ['Uruguay', 'URY', 'Latin America & Caribbean', 'High income',\n", - " 1990, 72.539536585],\n", - " ['United States', 'USA', 'North America', 'High income', 1990,\n", - " 75.214634146],\n", - " ['United States', 'USA', 'North America', 'High income', 1991,\n", - " 75.365853659],\n", - " ['Saudi Arabia', 'SAU', 'Middle East & North Africa',\n", - " 'High income', 1991, 69.592195122],\n", - " ['Philippines', 'PHL', 'East Asia & Pacific',\n", - " 'Lower middle income', 1991, 65.481268293],\n", - " ['Syrian Arab Republic', 'SYR', 'Middle East & North Africa',\n", - " 'Lower middle income', 1991, 70.396],\n", - " ['Libya', 'LBY', 'Middle East & North Africa',\n", - " 'Upper middle income', 1992, 69.275073171],\n", - " ['Albania', 'ALB', 'Europe & Central Asia', 'Upper middle income',\n", - " 1992, 71.900804878],\n", - " ['Bangladesh', 'BGD', 'South Asia', 'Lower middle income', 1993,\n", - " 60.418804878],\n", - " ['United States', 'USA', 'North America', 'High income', 1993,\n", - " 75.419512195],\n", - " ['Cameroon', 'CMR', 'Sub-Saharan Africa', 'Lower middle income',\n", - " 1993, 53.524829268],\n", - " ['Aruba', 'ABW', 'Latin America & Caribbean', 'High income', 1994,\n", - " 73.535756098],\n", - " ['Ghana', 'GHA', 'Sub-Saharan 
Africa', 'Lower middle income',\n", - " 1994, 57.594829268],\n", - " ['Canada', 'CAN', 'North America', 'High income', 1994,\n", - " 77.86195122],\n", - " ['Pakistan', 'PAK', 'South Asia', 'Lower middle income', 1995,\n", - " 61.485292683],\n", - " ['Rwanda', 'RWA', 'Sub-Saharan Africa', 'Low income', 1995,\n", - " 31.634512195],\n", - " ['Somalia', 'SOM', 'Sub-Saharan Africa', 'Low income', 1996,\n", - " 48.077609756],\n", - " ['Sierra Leone', 'SLE', 'Sub-Saharan Africa', 'Low income', 1998,\n", - " 37.046804878],\n", - " ['United Arab Emirates', 'ARE', 'Middle East & North Africa',\n", - " 'High income', 1998, 73.920219512],\n", - " ['Oman', 'OMN', 'Middle East & North Africa', 'High income', 1999,\n", - " 71.910390244],\n", - " ['Virgin Islands (U.S.)', 'VIR', 'Latin America & Caribbean',\n", - " 'High income', 1999, 77.306170732],\n", - " ['Spain', 'ESP', 'Europe & Central Asia', 'High income', 1999,\n", - " 78.717073171],\n", - " ['Swaziland', 'SWZ', 'Sub-Saharan Africa', 'Lower middle income',\n", - " 2001, 47.434097561],\n", - " ['Spain', 'ESP', 'Europe & Central Asia', 'High income', 2001,\n", - " 79.368292683],\n", - " ['United Arab Emirates', 'ARE', 'Middle East & North Africa',\n", - " 'High income', 2003, 75.217219512],\n", - " ['United States', 'USA', 'North America', 'High income', 2004,\n", - " 77.487804878],\n", - " ['Lebanon', 'LBN', 'Middle East & North Africa',\n", - " 'Upper middle income', 2006, 77.246707317],\n", - " ['Indonesia', 'IDN', 'East Asia & Pacific', 'Lower middle income',\n", - " 2006, 67.367487805],\n", - " ['Kazakhstan', 'KAZ', 'Europe & Central Asia',\n", - " 'Upper middle income', 2006, 66.16097561],\n", - " ['Canada', 'CAN', 'North America', 'High income', 2006,\n", - " 80.292682927],\n", - " ['Mauritania', 'MRT', 'Sub-Saharan Africa', 'Lower middle income',\n", - " 2007, 61.139731707],\n", - " ['Puerto Rico', 'PRI', 'Latin America & Caribbean', 'High income',\n", - " 2008, 77.906463415],\n", - " ['Togo', 'TGO', 'Sub-Saharan Africa', 'Low income', 2008,\n", - " 56.005902439],\n", - " ['Bangladesh', 'BGD', 'South Asia', 'Lower middle income', 2008,\n", - " 69.277853659],\n", - " ['Qatar', 'QAT', 'Middle East & North Africa', 'High income',\n", - " 2008, 77.448756098],\n", - " ['Hungary', 'HUN', 'Europe & Central Asia', 'High income', 2008,\n", - " 73.702439024],\n", - " ['Fiji', 'FJI', 'East Asia & Pacific', 'Upper middle income',\n", - " 2009, 69.202853659],\n", - " ['Aruba', 'ABW', 'Latin America & Caribbean', 'High income', 2009,\n", - " 74.818146341],\n", - " ['Malaysia', 'MYS', 'East Asia & Pacific', 'Upper middle income',\n", - " 2009, 74.038414634],\n", - " ['Canada', 'CAN', 'North America', 'High income', 2010,\n", - " 81.197560976],\n", - " ['Nicaragua', 'NIC', 'Latin America & Caribbean',\n", - " 'Lower middle income', 2010, 73.581731707],\n", - " ['United States', 'USA', 'North America', 'High income', 2011,\n", - " 78.641463415],\n", - " ['Bangladesh', 'BGD', 'South Asia', 'Lower middle income', 2011,\n", - " 70.47195122],\n", - " ['Seychelles', 'SYC', 'Sub-Saharan Africa', 'High income', 2011,\n", - " 72.724390244],\n", - " ['Ireland', 'IRL', 'Europe & Central Asia', 'High income', 2012,\n", - " 80.846341463],\n", - " ['India', 'IND', 'South Asia', 'Lower middle income', 2012,\n", - " 67.289878049],\n", - " ['New Zealand', 'NZL', 'East Asia & Pacific', 'High income', 2013,\n", - " 81.407317073],\n", - " ['Bermuda', 'BMU', 'North America', 'High income', 2013,\n", - " 80.572439024],\n", - " ['Myanmar', 'MMR', 'East Asia & Pacific', 'Lower 
middle income',\n", - " 2014, 65.857853659],\n", - " ['Pakistan', 'PAK', 'South Asia', 'Lower middle income', 2014,\n", - " 66.183365854],\n", - " ['Czech Republic', 'CZE', 'Europe & Central Asia', 'High income',\n", - " 2014, 78.824390244],\n", - " ['Malaysia', 'MYS', 'East Asia & Pacific', 'Upper middle income',\n", - " 2014, 74.718292683],\n", - " ['Canada', 'CAN', 'North America', 'High income', 2014,\n", - " 81.956609756],\n", - " ['Oman', 'OMN', 'Middle East & North Africa', 'High income', 2014,\n", - " 77.085097561],\n", - " ['Comoros', 'COM', 'Sub-Saharan Africa', 'Low income', 2015,\n", - " 63.55402439]], dtype=object)" - ] - }, - "execution_count": 9, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "df_mix = sample.to_numpy()\n", - "df_mix" - ] - }, - { - "cell_type": "code", - "execution_count": 10, - "id": "fbcca069", - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "True" - ] - }, - "execution_count": 10, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "isinstance(df_mix, np.ndarray) # an ndarray of Python objects" - ] - }, - { - "cell_type": "markdown", - "id": "875c6f87", - "metadata": {}, - "source": [ - "Use a subset by using `loc` indexing with `to_numpy`:" - ] - }, - { - "cell_type": "code", - "execution_count": 13, - "id": "fc275ddd", - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "array([[1960. , 67.6804878 ],\n", - " [1960. , 69.77073171],\n", - " [1961. , 46.22321951],\n", - " [1961. , 70.97317073],\n", - " [1961. , 49.26980488],\n", - " [1962. , 59.19907317],\n", - " [1962. , 47.08397561],\n", - " [1962. , 65.13821951],\n", - " [1962. , 68.57780488],\n", - " [1962. , 67.86585366],\n", - " [1963. , 62.052 ],\n", - " [1963. , 36.42517073],\n", - " [1963. , 62.4595122 ],\n", - " [1963. , 58.78158537],\n", - " [1963. , 44.10078049],\n", - " [1963. , 62.99258537],\n", - " [1964. , 39.80690244],\n", - " [1964. , 47.81126829],\n", - " [1964. , 52.6747561 ],\n", - " [1965. , 40.49602439],\n", - " [1965. , 64.715 ],\n", - " [1967. , 72.20780488],\n", - " [1967. , 45.31687805],\n", - " [1968. , 51.56868293],\n", - " [1969. , 50.15834146],\n", - " [1970. , 54.9695122 ],\n", - " [1970. , 54.44346341],\n", - " [1970. , 40.50443902],\n", - " [1970. , 70.29 ],\n", - " [1971. , 61.81917073],\n", - " [1971. , 65.26039024],\n", - " [1971. , 63.44317073],\n", - " [1971. , 52.57014634],\n", - " [1971. , 41.08814634],\n", - " [1972. , 72.93390244],\n", - " [1972. , 49.15112195],\n", - " [1973. , 39.2874878 ],\n", - " [1973. , 55.758 ],\n", - " [1974. , 71.95609756],\n", - " [1974. , 69.4997561 ],\n", - " [1975. , 65.27317073],\n", - " [1975. , 69.14282927],\n", - " [1975. , 49.2705122 ],\n", - " [1976. , 72.85609756],\n", - " [1976. , 39.57570732],\n", - " [1977. , 42.60082927],\n", - " [1977. , 48.20839024],\n", - " [1977. , 61.87890244],\n", - " [1978. , 56.39151219],\n", - " [1978. , 72.1047561 ],\n", - " [1978. , 61.92519512],\n", - " [1979. , 62.50690244],\n", - " [1979. , 73.80487805],\n", - " [1979. , 54.14907317],\n", - " [1979. , 52.88729268],\n", - " [1979. , 57.64136585],\n", - " [1980. , 74.75680488],\n", - " [1980. , 72.79165854],\n", - " [1980. , 72.30463415],\n", - " [1980. , 73.70229268],\n", - " [1981. , 55.33439024],\n", - " [1981. , 73.08736585],\n", - " [1982. , 73.21951219],\n", - " [1982. , 58.7997561 ],\n", - " [1982. , 63.19268293],\n", - " [1983. , 60.82295122],\n", - " [1984. , 63.11712195],\n", - " [1984. , 49.82563415],\n", - " [1984. 
, 68.06341463],\n", - " [1984. , 45.98436585],\n", - " [1984. , 75. ],\n", - " [1984. , 50.5442439 ],\n", - " [1985. , 55.86090244],\n", - " [1985. , 63.79856098],\n", - " [1985. , 66.79756098],\n", - " [1985. , 70.07663415],\n", - " [1986. , 66.94243902],\n", - " [1987. , 74.76585366],\n", - " [1987. , 41.50802439],\n", - " [1988. , 50.89848781],\n", - " [1988. , 76.80926829],\n", - " [1988. , 42.96207317],\n", - " [1988. , 67.60790244],\n", - " [1989. , 61.70465854],\n", - " [1989. , 69.53434146],\n", - " [1990. , 72.53953658],\n", - " [1990. , 75.21463415],\n", - " [1991. , 75.36585366],\n", - " [1991. , 69.59219512],\n", - " [1991. , 65.48126829],\n", - " [1991. , 70.396 ],\n", - " [1992. , 69.27507317],\n", - " [1992. , 71.90080488],\n", - " [1993. , 60.41880488],\n", - " [1993. , 75.41951219],\n", - " [1993. , 53.52482927],\n", - " [1994. , 73.5357561 ],\n", - " [1994. , 57.59482927],\n", - " [1994. , 77.86195122],\n", - " [1995. , 61.48529268],\n", - " [1995. , 31.63451219],\n", - " [1996. , 48.07760976],\n", - " [1998. , 37.04680488],\n", - " [1998. , 73.92021951],\n", - " [1999. , 71.91039024],\n", - " [1999. , 77.30617073],\n", - " [1999. , 78.71707317],\n", - " [2001. , 47.43409756],\n", - " [2001. , 79.36829268],\n", - " [2003. , 75.21721951],\n", - " [2004. , 77.48780488],\n", - " [2006. , 77.24670732],\n", - " [2006. , 67.3674878 ],\n", - " [2006. , 66.16097561],\n", - " [2006. , 80.29268293],\n", - " [2007. , 61.13973171],\n", - " [2008. , 77.90646342],\n", - " [2008. , 56.00590244],\n", - " [2008. , 69.27785366],\n", - " [2008. , 77.4487561 ],\n", - " [2008. , 73.70243902],\n", - " [2009. , 69.20285366],\n", - " [2009. , 74.81814634],\n", - " [2009. , 74.03841463],\n", - " [2010. , 81.19756098],\n", - " [2010. , 73.58173171],\n", - " [2011. , 78.64146342],\n", - " [2011. , 70.47195122],\n", - " [2011. , 72.72439024],\n", - " [2012. , 80.84634146],\n", - " [2012. , 67.28987805],\n", - " [2013. , 81.40731707],\n", - " [2013. , 80.57243902],\n", - " [2014. , 65.85785366],\n", - " [2014. , 66.18336585],\n", - " [2014. , 78.82439024],\n", - " [2014. , 74.71829268],\n", - " [2014. , 81.95660976],\n", - " [2014. , 77.08509756],\n", - " [2015. , 63.55402439]])" - ] - }, - "execution_count": 13, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "sample.loc[:, numeric_cols].to_numpy()" - ] - }, - { - "cell_type": "markdown", - "id": "cdc1c5a3", - "metadata": {}, - "source": [ - "There is a nonnumeric column in our example dataset. Let's convert `income_group` to dummy variables. But this can be error prone." - ] - }, - { - "cell_type": "code", - "execution_count": 14, - "id": "5482debd", - "metadata": { - "scrolled": true - }, - "outputs": [ - { - "data": { - "text/html": [ - "
" - ], - "text/plain": [ - " country country_code region year \\\n", - "187 Poland POL Europe & Central Asia 1960 \n", - "248 United States USA North America 1960 \n", - "444 Pakistan PAK South Asia 1961 \n", - "274 Australia AUS East Asia & Pacific 1961 \n", - "367 Indonesia IDN East Asia & Pacific 1961 \n", - "... ... ... ... ... \n", - "14254 Czech Republic CZE Europe & Central Asia 2014 \n", - "14368 Malaysia MYS East Asia & Pacific 2014 \n", - "14235 Canada CAN North America 2014 \n", - "14381 Oman OMN Middle East & North Africa 2014 \n", - "14509 Comoros COM Sub-Saharan Africa 2015 \n", - "\n", - " life_expectancy income_group_High income income_group_Low income \\\n", - "187 67.680488 1 0 \n", - "248 69.770732 1 0 \n", - "444 46.223220 0 0 \n", - "274 70.973171 1 0 \n", - "367 49.269805 0 0 \n", - "... ... ... ... \n", - "14254 78.824390 1 0 \n", - "14368 74.718293 0 0 \n", - "14235 81.956610 1 0 \n", - "14381 77.085098 1 0 \n", - "14509 63.554024 0 1 \n", - "\n", - " income_group_Lower middle income income_group_Upper middle income \n", - "187 0 0 \n", - "248 0 0 \n", - "444 1 0 \n", - "274 0 0 \n", - "367 1 0 \n", - "... ... ... \n", - "14254 0 0 \n", - "14368 0 1 \n", - "14235 0 0 \n", - "14381 0 0 \n", - "14509 0 0 \n", - "\n", - "[140 rows x 9 columns]" - ] - }, - "execution_count": 14, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "dummies = pd.get_dummies(sample.income_group, prefix='income_group')\n", - "data_with_dummies = sample.drop('income_group', axis=1).join(dummies)\n", - "data_with_dummies" - ] - }, - { - "cell_type": "markdown", - "id": "aa5d4bd1", - "metadata": {}, - "source": [ - "## Creating Model Descriptions with Patsy\n", - "\n", - "Patsy is a Python package that allows data transformations using arbitrary Python code.\n", - "\n", - "Patsy describes statistical models with a string-based \"formula syntax\":\n", - "\n", - "```\n", - "y ~ x0 + x1\n", - "```\n", - "\n", - "where `x0`, `x1` are terms in the design matrix created for the model. " - ] - }, - { - "cell_type": "code", - "execution_count": 15, - "id": "126d823a", - "metadata": {}, - "outputs": [], - "source": [ - "outcome, predictors = patsy.dmatrices(\n", - " 'life_expectancy ~ income_group + region + year', data=sample\n", - ")" - ] - }, - { - "cell_type": "markdown", - "id": "4dddf14c", - "metadata": {}, - "source": [ - "`outcome` is a `DesignMatrix` object that represents `life_expectancy` which is the variable we want to predict." 
- ] - }, - { - "cell_type": "code", - "execution_count": 16, - "id": "17476384", - "metadata": { - "scrolled": true - }, - "outputs": [ - { - "data": { - "text/plain": [ - "DesignMatrix with shape (140, 1)\n", - " life_expectancy\n", - " 67.68049\n", - " 69.77073\n", - " 46.22322\n", - " 70.97317\n", - " 49.26980\n", - " 59.19907\n", - " 47.08398\n", - " 65.13822\n", - " 68.57780\n", - " 67.86585\n", - " 62.05200\n", - " 36.42517\n", - " 62.45951\n", - " 58.78159\n", - " 44.10078\n", - " 62.99259\n", - " 39.80690\n", - " 47.81127\n", - " 52.67476\n", - " 40.49602\n", - " 64.71500\n", - " 72.20780\n", - " 45.31688\n", - " 51.56868\n", - " 50.15834\n", - " 54.96951\n", - " 54.44346\n", - " 40.50444\n", - " 70.29000\n", - " 61.81917\n", - " [110 rows omitted]\n", - " Terms:\n", - " 'life_expectancy' (column 0)\n", - " (to view full data, use np.asarray(this_obj))" - ] - }, - "execution_count": 16, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "outcome" - ] - }, - { - "cell_type": "markdown", - "id": "7a7a5a74", - "metadata": {}, - "source": [ - "`predictors` is a `DesignMatrix` object that represents the combination of `income_group`, `region`, `year`. \n", - "\n", - "We have 140 examples, 11 features:\n", - "\n", - "* Intercept (column 0): an array of 1s\n", - "* `income group`: columns 1 - 4\n", - "* `region`: columns 4-10\n", - "* `year`: columns 10\n", - "\n", - "The extra columns for `income_group` and `regions` are from One-Hot encoding, a process by which categorical variables are converted into a form that could be provided to ML algorithms." - ] - }, - { - "cell_type": "code", - "execution_count": 17, - "id": "03d0058d", - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "DesignMatrix with shape (140, 11)\n", - " Columns:\n", - " ['Intercept',\n", - " 'income_group[T.Low income]',\n", - " 'income_group[T.Lower middle income]',\n", - " 'income_group[T.Upper middle income]',\n", - " 'region[T.Europe & Central Asia]',\n", - " 'region[T.Latin America & Caribbean]',\n", - " 'region[T.Middle East & North Africa]',\n", - " 'region[T.North America]',\n", - " 'region[T.South Asia]',\n", - " 'region[T.Sub-Saharan Africa]',\n", - " 'year']\n", - " Terms:\n", - " 'Intercept' (column 0)\n", - " 'income_group' (columns 1:4)\n", - " 'region' (columns 4:10)\n", - " 'year' (column 10)\n", - " (to view full data, use np.asarray(this_obj))" - ] - }, - "execution_count": 17, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "predictors" - ] - }, - { - "cell_type": "markdown", - "id": "553c0994", - "metadata": {}, - "source": [ - "These Patsy DesignMatrix instances are NumPy ndarrays with additional metadata:" - ] - }, - { - "cell_type": "code", - "execution_count": 18, - "id": "a72fa89f", - "metadata": { - "scrolled": true - }, - "outputs": [ - { - "data": { - "text/plain": [ - "array([[1.000e+00, 0.000e+00, 0.000e+00, ..., 0.000e+00, 0.000e+00,\n", - " 1.960e+03],\n", - " [1.000e+00, 0.000e+00, 0.000e+00, ..., 0.000e+00, 0.000e+00,\n", - " 1.960e+03],\n", - " [1.000e+00, 0.000e+00, 1.000e+00, ..., 1.000e+00, 0.000e+00,\n", - " 1.961e+03],\n", - " ...,\n", - " [1.000e+00, 0.000e+00, 0.000e+00, ..., 0.000e+00, 0.000e+00,\n", - " 2.014e+03],\n", - " [1.000e+00, 0.000e+00, 0.000e+00, ..., 0.000e+00, 0.000e+00,\n", - " 2.014e+03],\n", - " [1.000e+00, 1.000e+00, 0.000e+00, ..., 0.000e+00, 1.000e+00,\n", - " 2.015e+03]])" - ] - }, - "execution_count": 18, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - 
"np.asarray(predictors)" - ] - }, - { - "cell_type": "code", - "execution_count": 19, - "id": "2fbbd225", - "metadata": { - "scrolled": true - }, - "outputs": [ - { - "data": { - "text/plain": [ - "array([[67.6804878 ],\n", - " [69.77073171],\n", - " [46.22321951],\n", - " [70.97317073],\n", - " [49.26980488],\n", - " [59.19907317],\n", - " [47.08397561],\n", - " [65.13821951],\n", - " [68.57780488],\n", - " [67.86585366],\n", - " [62.052 ],\n", - " [36.42517073],\n", - " [62.4595122 ],\n", - " [58.78158537],\n", - " [44.10078049],\n", - " [62.99258537],\n", - " [39.80690244],\n", - " [47.81126829],\n", - " [52.6747561 ],\n", - " [40.49602439],\n", - " [64.715 ],\n", - " [72.20780488],\n", - " [45.31687805],\n", - " [51.56868293],\n", - " [50.15834146],\n", - " [54.9695122 ],\n", - " [54.44346341],\n", - " [40.50443902],\n", - " [70.29 ],\n", - " [61.81917073],\n", - " [65.26039024],\n", - " [63.44317073],\n", - " [52.57014634],\n", - " [41.08814634],\n", - " [72.93390244],\n", - " [49.15112195],\n", - " [39.2874878 ],\n", - " [55.758 ],\n", - " [71.95609756],\n", - " [69.4997561 ],\n", - " [65.27317073],\n", - " [69.14282927],\n", - " [49.2705122 ],\n", - " [72.85609756],\n", - " [39.57570732],\n", - " [42.60082927],\n", - " [48.20839024],\n", - " [61.87890244],\n", - " [56.39151219],\n", - " [72.1047561 ],\n", - " [61.92519512],\n", - " [62.50690244],\n", - " [73.80487805],\n", - " [54.14907317],\n", - " [52.88729268],\n", - " [57.64136585],\n", - " [74.75680488],\n", - " [72.79165854],\n", - " [72.30463415],\n", - " [73.70229268],\n", - " [55.33439024],\n", - " [73.08736585],\n", - " [73.21951219],\n", - " [58.7997561 ],\n", - " [63.19268293],\n", - " [60.82295122],\n", - " [63.11712195],\n", - " [49.82563415],\n", - " [68.06341463],\n", - " [45.98436585],\n", - " [75. 
],\n", - " [50.5442439 ],\n", - " [55.86090244],\n", - " [63.79856098],\n", - " [66.79756098],\n", - " [70.07663415],\n", - " [66.94243902],\n", - " [74.76585366],\n", - " [41.50802439],\n", - " [50.89848781],\n", - " [76.80926829],\n", - " [42.96207317],\n", - " [67.60790244],\n", - " [61.70465854],\n", - " [69.53434146],\n", - " [72.53953658],\n", - " [75.21463415],\n", - " [75.36585366],\n", - " [69.59219512],\n", - " [65.48126829],\n", - " [70.396 ],\n", - " [69.27507317],\n", - " [71.90080488],\n", - " [60.41880488],\n", - " [75.41951219],\n", - " [53.52482927],\n", - " [73.5357561 ],\n", - " [57.59482927],\n", - " [77.86195122],\n", - " [61.48529268],\n", - " [31.63451219],\n", - " [48.07760976],\n", - " [37.04680488],\n", - " [73.92021951],\n", - " [71.91039024],\n", - " [77.30617073],\n", - " [78.71707317],\n", - " [47.43409756],\n", - " [79.36829268],\n", - " [75.21721951],\n", - " [77.48780488],\n", - " [77.24670732],\n", - " [67.3674878 ],\n", - " [66.16097561],\n", - " [80.29268293],\n", - " [61.13973171],\n", - " [77.90646342],\n", - " [56.00590244],\n", - " [69.27785366],\n", - " [77.4487561 ],\n", - " [73.70243902],\n", - " [69.20285366],\n", - " [74.81814634],\n", - " [74.03841463],\n", - " [81.19756098],\n", - " [73.58173171],\n", - " [78.64146342],\n", - " [70.47195122],\n", - " [72.72439024],\n", - " [80.84634146],\n", - " [67.28987805],\n", - " [81.40731707],\n", - " [80.57243902],\n", - " [65.85785366],\n", - " [66.18336585],\n", - " [78.82439024],\n", - " [74.71829268],\n", - " [81.95660976],\n", - " [77.08509756],\n", - " [63.55402439]])" - ] - }, - "execution_count": 19, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "np.asarray(outcome)" - ] - }, - { - "cell_type": "markdown", - "id": "794897df", - "metadata": {}, - "source": [ - "Suppress the intercept by adding the term + 0 to the model:" - ] - }, - { - "cell_type": "code", - "execution_count": 21, - "id": "79d9d059", - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "DesignMatrix with shape (140, 11)\n", - " Columns:\n", - " ['income_group[High income]',\n", - " 'income_group[Low income]',\n", - " 'income_group[Lower middle income]',\n", - " 'income_group[Upper middle income]',\n", - " 'region[T.Europe & Central Asia]',\n", - " 'region[T.Latin America & Caribbean]',\n", - " 'region[T.Middle East & North Africa]',\n", - " 'region[T.North America]',\n", - " 'region[T.South Asia]',\n", - " 'region[T.Sub-Saharan Africa]',\n", - " 'year']\n", - " Terms:\n", - " 'income_group' (columns 0:4)\n", - " 'region' (columns 4:10)\n", - " 'year' (column 10)\n", - " (to view full data, use np.asarray(this_obj))" - ] - }, - "execution_count": 21, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "y, X = patsy.dmatrices(\n", - " 'life_expectancy ~ income_group + region + year + 0', data=sample\n", - ")\n", - "X" - ] - }, - { - "cell_type": "markdown", - "id": "a49e4994", - "metadata": {}, - "source": [ - "The Patsy objects can be passed directly into algorithms like `numpy.linalg.lstsq`, which performs an ordinary least squares regression. 
The model metadata is retained in the design_info attribute:" - ] - }, - { - "cell_type": "code", - "execution_count": 22, - "id": "61293370", - "metadata": {}, - "outputs": [ - { - "name": "stderr", - "output_type": "stream", - "text": [ - "/tmp/ipykernel_133/2774571045.py:1: FutureWarning: `rcond` parameter will change to the default of machine precision times ``max(M, N)`` where M and N are the input matrix dimensions.\n", - "To use the future default and silence this warning we advise to pass `rcond=None`, to keep using the old, explicitly pass `rcond=-1`.\n", - " coef, resid, _, _ = np.linalg.lstsq(outcome, predictors)\n" - ] - }, - { - "data": { - "text/plain": [ - "array([[1.51964589e-02, 9.81119953e-04, 4.41412174e-03, 2.65860943e-03,\n", - " 2.40182610e-03, 2.29746257e-03, 2.26362465e-03, 2.58305159e-03,\n", - " 1.77260076e-03, 1.76133204e-03, 3.02004208e+01]])" - ] - }, - "execution_count": 22, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "coef, resid, _, _ = np.linalg.lstsq(outcome, predictors)\n", - "coef" - ] - }, - { - "cell_type": "markdown", - "id": "4e58472e", - "metadata": {}, - "source": [ - "You can reattach the model column names to the fitted coefficients to obtain a Series, for example:" - ] - }, - { - "cell_type": "code", - "execution_count": 23, - "id": "22eb822b", - "metadata": {}, - "outputs": [ - { - "ename": "ValueError", - "evalue": "Length of values (11) does not match length of index (1)", - "output_type": "error", - "traceback": [ - "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", - "\u001b[0;31mValueError\u001b[0m Traceback (most recent call last)", - "Input \u001b[0;32mIn [23]\u001b[0m, in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[0;32m----> 1\u001b[0m coef \u001b[38;5;241m=\u001b[39m \u001b[43mpd\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mSeries\u001b[49m\u001b[43m(\u001b[49m\u001b[43mcoef\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43msqueeze\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mindex\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43moutcome\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mdesign_info\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mcolumn_names\u001b[49m\u001b[43m)\u001b[49m\n", - "File \u001b[0;32m/cloud/lib/lib/python3.9/site-packages/pandas/core/series.py:461\u001b[0m, in \u001b[0;36mSeries.__init__\u001b[0;34m(self, data, index, dtype, name, copy, fastpath)\u001b[0m\n\u001b[1;32m 459\u001b[0m index \u001b[38;5;241m=\u001b[39m default_index(\u001b[38;5;28mlen\u001b[39m(data))\n\u001b[1;32m 460\u001b[0m \u001b[38;5;28;01melif\u001b[39;00m is_list_like(data):\n\u001b[0;32m--> 461\u001b[0m \u001b[43mcom\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mrequire_length_match\u001b[49m\u001b[43m(\u001b[49m\u001b[43mdata\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mindex\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 463\u001b[0m \u001b[38;5;66;03m# create/copy the manager\u001b[39;00m\n\u001b[1;32m 464\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28misinstance\u001b[39m(data, (SingleBlockManager, SingleArrayManager)):\n", - "File \u001b[0;32m/cloud/lib/lib/python3.9/site-packages/pandas/core/common.py:561\u001b[0m, in \u001b[0;36mrequire_length_match\u001b[0;34m(data, index)\u001b[0m\n\u001b[1;32m 557\u001b[0m \u001b[38;5;124;03m\"\"\"\u001b[39;00m\n\u001b[1;32m 558\u001b[0m \u001b[38;5;124;03mCheck the length of data matches the length of the 
index.\u001b[39;00m\n\u001b[1;32m 559\u001b[0m \u001b[38;5;124;03m\"\"\"\u001b[39;00m\n\u001b[1;32m 560\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mlen\u001b[39m(data) \u001b[38;5;241m!=\u001b[39m \u001b[38;5;28mlen\u001b[39m(index):\n\u001b[0;32m--> 561\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mValueError\u001b[39;00m(\n\u001b[1;32m 562\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mLength of values \u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 563\u001b[0m \u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124m(\u001b[39m\u001b[38;5;132;01m{\u001b[39;00m\u001b[38;5;28mlen\u001b[39m(data)\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m) \u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 564\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mdoes not match length of index \u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 565\u001b[0m \u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124m(\u001b[39m\u001b[38;5;132;01m{\u001b[39;00m\u001b[38;5;28mlen\u001b[39m(index)\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m)\u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 566\u001b[0m )\n", - "\u001b[0;31mValueError\u001b[0m: Length of values (11) does not match length of index (1)" - ] - } - ], - "source": [ - "coef = pd.Series(coef.squeeze(), index=outcome.design_info.column_names)" - ] - }, - { - "cell_type": "markdown", - "id": "09225d58", - "metadata": {}, - "source": [ - "### Data Transformations in Patsy Formulas\n", - "You can mix Python code into your Patsy formulas:" - ] - }, - { - "cell_type": "code", - "execution_count": 24, - "id": "151f2be8", - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "DesignMatrix with shape (140, 1)\n", - " np.round(life_expectancy)\n", - " 68\n", - " 70\n", - " 46\n", - " 71\n", - " 49\n", - " 59\n", - " 47\n", - " 65\n", - " 69\n", - " 68\n", - " 62\n", - " 36\n", - " 62\n", - " 59\n", - " 44\n", - " 63\n", - " 40\n", - " 48\n", - " 53\n", - " 40\n", - " 65\n", - " 72\n", - " 45\n", - " 52\n", - " 50\n", - " 55\n", - " 54\n", - " 41\n", - " 70\n", - " 62\n", - " [110 rows omitted]\n", - " Terms:\n", - " 'np.round(life_expectancy)' (column 0)\n", - " (to view full data, use np.asarray(this_obj))" - ] - }, - "execution_count": 24, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "outcome, predictors = patsy.dmatrices(\n", - " 'np.round(life_expectancy) ~ income_group + region + year', data=sample\n", - ")\n", - "\n", - "outcome" - ] - }, - { - "cell_type": "markdown", - "id": "e993d70b", - "metadata": {}, - "source": [ - "Some commonly used variable transformations include standardizing (to mean 0 and variance 1) and centering (subtracting the mean). 
Patsy has built-in `standardize()` and `center()` functions for this purpose:

```{python}
outcome, predictors = patsy.dmatrices(
    'life_expectancy ~ income_group + region + center(year)', data=sample
)

predictors
```

The `patsy.build_design_matrices` function can apply transformations to new out-of-sample data using the saved information from the original in-sample dataset, so the new data is centered with the in-sample mean:

```{python}
outcome, predictors = patsy.dmatrices(
    'life_expectancy ~ income_group + region + center(year)', data=sample
)

new_df = df.groupby("region").sample(n=20).sort_values(by="year")

new_predictors = patsy.build_design_matrices([predictors.design_info], new_df)

new_predictors
```

Because `+` in a Patsy formula means union of model terms rather than addition, arithmetic on columns must be wrapped in the special `I()` function:

```{python}
outcome, predictors = patsy.dmatrices(
    'life_expectancy ~ income_group + region + I(year + year)', data=sample  # doubling year only illustrates I()
)
```
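To see what `I()` changes, here is a small check that is not in the original notebook: without it, `year + year` collapses to a single `year` term, while inside `I()` the addition is ordinary arithmetic:

```{python}
# added check: formula '+' deduplicates terms; I() protects arithmetic
d1 = patsy.dmatrices('life_expectancy ~ year + year', data=sample)[1]
d2 = patsy.dmatrices('life_expectancy ~ I(year + year)', data=sample)[1]
print(d1.design_info.column_names)  # ['Intercept', 'year']
print(d2.design_info.column_names)  # ['Intercept', 'I(year + year)']
```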
## Categorical Data and Patsy

When you use nonnumeric terms in a Patsy formula, they are converted to dummy variables by default; with an intercept present, one level of each categorical is dropped as the reference level:

```{python}
outcome, predictors = patsy.dmatrices(
    'life_expectancy ~ income_group + region + year', data=sample
)

predictors
```

If you omit the intercept from the model (by adding `+ 0` to the formula), then columns for each category value will be included, but only for the first categorical variable; later categoricals still drop a reference level to keep the matrix full rank:

```{python}
outcome, predictors = patsy.dmatrices(
    'life_expectancy ~ region + income_group + year + 0', data=sample
)

predictors
```
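For comparison, as an aside not in the original notebook, `pandas` produces similar indicator columns, just without Patsy's formula metadata; `sample` is the same data frame used above:

```{python}
# added aside: pandas' dummy coding, dropping the first level like Patsy does
pd.get_dummies(sample[['income_group', 'region']], drop_first=True).head()
```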
Numeric columns can be interpreted as categorical with the `C()` function. Here `C(year)` expands each observed year into its own indicator column, growing the design matrix from 11 to 60 columns:

```{python}
outcome, predictors = patsy.dmatrices(
    'life_expectancy ~ income_group + region + C(year)', data=sample
)

predictors
```

Include interaction terms of the form `key1:key2`, which adds one column per pair of non-reference levels of the two categoricals:

```{python}
outcome, predictors = patsy.dmatrices(
    'life_expectancy ~ income_group + region + year + income_group:region', data=sample
)

predictors
```
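A related shorthand, added here as a sketch rather than taken from the original notebook: `a * b` expands to the main effects plus the interaction, that is `a + b + a:b`:

```{python}
# added sketch: '*' is main effects plus interaction
d = patsy.dmatrices('life_expectancy ~ income_group * region', data=sample)[1]
d.design_info.term_names
```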
## Introduction to statsmodels

statsmodels is a Python library for fitting many kinds of statistical models, performing statistical tests, and doing data exploration and visualization. It covers, among other things:

* Linear models, generalized linear models, and robust linear models
* Linear mixed effects models
* Analysis of variance (ANOVA) methods
* Time series processes and state space models
* Generalized method of moments

The `sm.add_constant` function can add an intercept column to an existing matrix:

```{python}
predictor_model = sm.add_constant(predictors)
predictor_model
```

A small helper draws normal noise with a given mean and variance; it is reused in the time series example below:

```{python}
rng = np.random.default_rng(seed=12345)

def dnorm(mean, variance, size=1):
    # draw 'size' samples from a normal with the given mean and variance
    if isinstance(size, int):
        size = size,
    return mean + np.sqrt(variance) * rng.standard_normal(*size)
```

The `sm.OLS` class can fit an ordinary least squares linear regression; its `fit` method returns a regression results object containing estimated model parameters and other diagnostics:

```{python}
outcome, predictors = patsy.dmatrices(
    'life_expectancy ~ income_group + region + year', data=sample
)

model = sm.OLS(outcome, predictors)
results = model.fit()
print(results.summary())
```
The model explains most of the variation (R-squared 0.835), and `year` has a strongly significant coefficient of about 0.34 years of life expectancy per calendar year. The summary also flags a large condition number (about 5e+05), which hints at multicollinearity among the dummy columns.

The formula API in `statsmodels.formula.api` accepts a Patsy formula and a data frame directly and returns named parameters:

```{python}
results = smf.ols('life_expectancy ~ income_group + region + year', data=sample).fit()
results.params
```
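The results object carries more than point estimates. As a small added aside (these attributes are standard statsmodels API, though this chunk is not in the original notebook):

```{python}
# added aside: 95% confidence intervals and two-sided p-values
print(results.conf_int())
print(results.pvalues)
```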
"source": [ - "results.tvalues" - ] - }, - { - "cell_type": "code", - "execution_count": 40, - "id": "9ff1b2fd", - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "187 64.948099\n", - "248 65.827040\n", - "444 46.399354\n", - "274 59.945005\n", - "367 51.109421\n", - "dtype: float64" - ] - }, - "execution_count": 40, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "results.predict(sample[:5])" - ] - }, - { - "cell_type": "markdown", - "id": "bc2bc15c", - "metadata": {}, - "source": [ - "## Estimating Time Series Processes\n", - "\n", - "Another class of models in statsmodels is for time series analysis. " - ] - }, - { - "cell_type": "code", - "execution_count": 41, - "id": "89121562", - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "array([ 0.01141668, 0.8278958 , -0.42492365, -0.01487685, 0.03721318,\n", - " -0.03644228])" - ] - }, - "execution_count": 41, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "init_x = 4\n", - "\n", - "values = [init_x, init_x]\n", - "N = 1000\n", - "\n", - "# AR(2) structure with two lags\n", - "b0 = 0.8\n", - "b1 = -0.4\n", - "noise = dnorm(0, 0.1, N)\n", - "for i in range(N):\n", - " new_x = values[-1] * b0 + values[-2] * b1 + noise[i]\n", - " values.append(new_x)\n", - " \n", - "# fit with more lags\n", - "\n", - "MAXLAGS = 5\n", - "model = AutoReg(values, MAXLAGS)\n", - "results = model.fit()\n", - "results.params" - ] - }, - { - "cell_type": "markdown", - "id": "90200d51", - "metadata": {}, - "source": [ - "## Introduction to scikit-learn\n", - "\n", - "scikit-learn is one of the most widely used and trusted general-purpose Python machine learning toolkits. \n" - ] - }, - { - "cell_type": "code", - "execution_count": 42, - "id": "87bd3ba3", - "metadata": {}, - "outputs": [], - "source": [ - "X_train, X_test, y_train, y_test = train_test_split(\n", - " sample[['income_group', 'region', 'year']], sample[['life_expectancy']], test_size=0.2, random_state=42)" - ] - }, - { - "cell_type": "code", - "execution_count": 43, - "id": "cce1309f", - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "income_group 0\n", - "region 0\n", - "year 0\n", - "dtype: int64\n", - "life_expectancy 0\n", - "dtype: int64\n" - ] - } - ], - "source": [ - "# make sure there are no missing values\n", - "\n", - "print(X_test.isna().sum())\n", - "print(y_test.isna().sum())" - ] - }, - { - "cell_type": "code", - "execution_count": 44, - "id": "363c22d0", - "metadata": {}, - "outputs": [], - "source": [ - "# can also do it with patsy\n", - "\n", - "outcome, predictors = patsy.dmatrices(\n", - " 'life_expectancy ~ income_group + region + year', data=sample\n", - ")\n", - "\n", - "X_train, X_test, y_train, y_test = train_test_split(\n", - " predictors, outcome, test_size=0.2, random_state=42)\n", - "\n", - "y_train = y_train.squeeze()\n", - "y_test = y_test.squeeze()" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "id": "e6dd4ee5", - "metadata": {}, - "outputs": [], - "source": [ - "# impute_value = train['Age'].median()\n", - "# train['Age'] = train['Age'].fillna(impute_value)\n", - "# test['Age'] = test['Age'].fillna(impute_value)" - ] - }, - { - "cell_type": "code", - "execution_count": 45, - "id": "0ff89b5c", - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
## Introduction to scikit-learn

scikit-learn is one of the most widely used and trusted general-purpose Python machine learning toolkits. Hold out 20% of the sample as a test set:

```{python}
X_train, X_test, y_train, y_test = train_test_split(
    sample[['income_group', 'region', 'year']], sample[['life_expectancy']],
    test_size=0.2, random_state=42)
```

```{python}
# make sure there are no missing values
print(X_test.isna().sum())
print(y_test.isna().sum())
```

scikit-learn estimators want numeric matrices, so the split can also be done on the Patsy design matrices, which already encode the categoricals:

```{python}
outcome, predictors = patsy.dmatrices(
    'life_expectancy ~ income_group + region + year', data=sample
)

X_train, X_test, y_train, y_test = train_test_split(
    predictors, outcome, test_size=0.2, random_state=42)

y_train = y_train.squeeze()
y_test = y_test.squeeze()
```

If there were missing values, a simple strategy is to impute with a statistic computed on the training split, for example:

```{python}
# impute_value = train['Age'].median()
# train['Age'] = train['Age'].fillna(impute_value)
# test['Age'] = test['Age'].fillna(impute_value)
```
" - ], - "text/plain": [ - "LinearRegression()" - ] - }, - "execution_count": 45, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "model = LinearRegression().fit(X_train, y_train)\n", - "model.fit(X_train, y_train)" - ] - }, - { - "cell_type": "code", - "execution_count": 46, - "id": "99e189b9", - "metadata": { - "scrolled": true - }, - "outputs": [ - { - "data": { - "text/plain": [ - "array([78.63905681, 41.6710372 , 61.07560873, 77.12346239, 56.24225717,\n", - " 60.82576811, 42.99395169, 41.6710372 , 74.1469048 , 75.46981928,\n", - " 75.04709887, 71.69375575, 63.94037441, 51.82387157, 83.87519146,\n", - " 42.66322307, 35.6275935 , 37.94269386, 63.56521171, 80.89863386,\n", - " 50.99752407, 62.57302584, 58.22662891, 51.61205646, 77.52772236,\n", - " 52.57565731, 79.10783413, 73.92796809])" - ] - }, - "execution_count": 46, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "y_predict = model.predict(X_test)\n", - "y_predict" - ] - }, - { - "cell_type": "code", - "execution_count": 49, - "id": "c5f72a8c", - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "DesignMatrix with shape (140, 1)\n", - " life_expectancy\n", - " 67.68049\n", - " 69.77073\n", - " 46.22322\n", - " 70.97317\n", - " 49.26980\n", - " 59.19907\n", - " 47.08398\n", - " 65.13822\n", - " 68.57780\n", - " 67.86585\n", - " 62.05200\n", - " 36.42517\n", - " 62.45951\n", - " 58.78159\n", - " 44.10078\n", - " 62.99259\n", - " 39.80690\n", - " 47.81127\n", - " 52.67476\n", - " 40.49602\n", - " 64.71500\n", - " 72.20780\n", - " 45.31688\n", - " 51.56868\n", - " 50.15834\n", - " 54.96951\n", - " 54.44346\n", - " 40.50444\n", - " 70.29000\n", - " 61.81917\n", - " [110 rows omitted]\n", - " Terms:\n", - " 'life_expectancy' (column 0)\n", - " (to view full data, use np.asarray(this_obj))" - ] - }, - "execution_count": 49, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "outcome" - ] - }, - { - "cell_type": "markdown", - "id": "5630bc00", - "metadata": {}, - "source": [ - "## Going rogue\n", - "\n", - "Using yellowbrick:\n", - " \n", - "* y is the real avalues\n", - "* y hat is predicted\n", - "* black dotted line is fitted line created by the model\n", - "* grey dotted line is if the predicted values == real values" - ] - }, - { - "cell_type": "code", - "execution_count": 48, - "id": "5d0b74fd", - "metadata": {}, - "outputs": [ - { - "data": { - "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAQ4AAAEXCAYAAABcaKr2AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjYuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/av/WaAAAACXBIWXMAAAsTAAALEwEAmpwYAAA+cUlEQVR4nO2dd3hUxfrHP7ubRgotlBCCCAGG5kVFLggIEYNIaAIiIAhcmogK4vWi2OAqItafoqBiA1EBKdJBBQ1YKN4oIhFmIdICGKohdZMtvz+2mISU3WRbkvk8Dw+758zOvOdk93veeWfmHY3FYkGhUChcQetrAxQKReVDCYdCoXAZJRwKhcJllHAoFAqXUcKhUChcRgmHQqFwmQBfG+BphBAWIAUwYhXKdOBxKeWOCtb7FNBCSjlOCLED+I+U8udSyk+SUr5ne11meRfsiAO+Av4oek5K2bqi9ZfR9jxgHPCklPKjctYxDhgtpYwv5txhoKeUMq0idjppx3FAA+TYDgUA+4EHpZR/erp9VxFCPAg0lFI+7Yv2q7xw2IiTUqYCCCG6ARuFEEJKed4dlUspbyvtvBBCB7wMvOdM+XJw0tMiUQLDgXsrKsIl4YNrGiWl/B4cf7PXgVeBUV62o0yklG/5sv3qIhwOpJQ/CCGOAjcLIQ4APwIrgRullD1twvI6UAe4ANwjpfxDCFEDWAJ0AY4Dh+112p5Wo6WU3wshxgBP2U7tBSYCW4FatidoX+DbAuWHAbOx/i3OAJOklClCiDlAPaAx0MFmyyAp5VlXrtf2RB8I1AKSgC3APCAVyJdSjirDBnv7n0kpXy9Q76fANcCHQoi5wBrgHVtZE7BUSvmirawFeAKrd9JWSmly0nYL0ARoAbwAJAJ3AiHAOCnlTiFEMFZRvgMIAhZLKefZPn8z8BYQBpiBaVLK7UKIaynydy/atpTSJITYBLxmq0sDPI1VREKAdcAjtnI32uoC+AQYCkzD+j1x9vvVGPgYaAQEAyuklE+WcnwOECOlnCiEuAbrQ+laIB94SUr5se06d9vu3SSgrs1mu63lprrGOAIBg+11PWC/7Y8aAWwEnpBStgDeAD63lfsXEAXEAkOA24tWavtDvQLEAQLrF3YaMB4wSSlbSymPFShv/4PfaXu6bgbeLVDlMOBhW5vnbPWUh9uBKVLKmbb3NwDv2ESjLBsSgISCogEgpRwFnMb6lH4PqxhdllIKoDswVQjRvcBHNFJK4axoFMMNwB4pZRtgEX+L80ygLXAd0A64SwjR33ZuMfCy7brmYxU2O46/e3GN2R4U47H+8AFGA3cD/8T694gF7i/QzmtSypZYu8KtimunjO/Xw8AuKaX9WpoLIRqVcrwgi4FE273vByywfRft7ZullNfZ6ppb3PW6SrUTDiFEX6wC8IPtUCDwhe31LUCqlPJrACnlcqCF7cfVA1grpTRKKS8Cm4qp/nbgRynlGSmlBbgH+L9SzOkNfCulPGp7/z5wqxDC7gnuklKesNX1C9YnfHFcI4Q4XOTfqwXO66WURwq8z5FSfuOkDXullBdKuQY7/bD+oJFSXgLWUlhci7tfrpAhpVxve/0zf9+LAcAiKaVBSpmF9ek8xHbuev7+YX4HNC9QX8G/u51PbffuCHAJq/f1cIF2PpRSpkspjVjv0xCbwHQEltvKLcQaKymundK+X+eAPjaxNUgpR9q8y5KOAyCECMT6N7Tf+xNYPdpetiIBgD3+VPC+VYjq0lVJFELYg6PHgb5SykwhRD2snsAVW7naQKytS2HHANTH6ualFzh+GYgo0k494C/7GyllLoAQoiS76tvqsZdPt7nE9WyHCrZnAnQl1FNWjONSKe/LsqHoZ0uiUD2219Gl2OAqJd2L2sD/2QK1YHXn99lejwKm2Z70Ogr/oAv+3e2MsnUfgwA9sNEmRvZ2HhVCTLa9DwDOY+1yWKSUfwFIKfOFEOdKaKc2JX+//s9m4yIgWgixEJhTynE7kVi9uaLfzQYF2rdfQ2nfIZeoLsLhCI6WwRngkJTypqInhBCXscYJ7NQv5vMXgK4FPlMTqFFKe2nAzQXK18HaF3fmCe8u3GVDGtYv8Unb+0jbMU9zBnhFSlnIo7HFBt4DOksp9wshWmIVgzKRUubZYgivCCFuklKabe1sKBqUFEKEAxohRKiUMtvmqRX33bDbWuz3y8Z8YL4QohXWuNj3Nu/kquMFPnMBMAsh6kgp7cLt8Xtf7boqZbAXaCSE6AwghGguhFhmewLvBgYKIXQ2TyWhmM9vAboJIa61feYdYALWgJXW9uQryNdADyGE3YWeAnxlc4W9hbts2ARMBrDdnyFY4yWeZj0w0fZ30QghnhJC3IH1x5sFHLb9mO22hTtZ7zKsQdB7C7RzrxAi1FbPfUKIsVLKTOAQ1vgHwH1ASUvOS/x+CSHeFUL0tpVLAf4ELCUdt1do+zt9aWsXIUQs1m71dievs1wo4SiAlDIHuAt4UwhxCGvfdJUtxvAeVnf5D6z996L9Y2xezWTgG6xPNwvWqPxZrE+Jk0KIrkXKTwTW29zXHti+AC5SXIzjsBDin05cs7tseAqoY6tjFzBfSrmvjM/YubmI3d+50O5C4ASQjHWkqw3We/0rViHXYxX9jcAeYKczldqCuE8Dc21xjHW2On62XeNArD9YgKnAk0KIZKwB8dMUIx5lfL/eAZ631f27zeYdpRwvyBQgzlbmC2CilPKUM9dZXjQqH4dCUXGEEBqbACCEOA/ESyl/9bFZHkN5HApFBRFCrMI6LIwQohfWIKxT8ZTKivI4FIoKIoRog3XIsy6Qh3U5wVbfWuVZlHAoFAqXqZTDsUlJScFAJ6xBx/LORFQoFCWjwzrN/aeOHTsaip6slMKBVTRcibwrFIrycQuF540AlVc4zgK0atWKoKAgt1R48OBB2rdv75a6vIWy2TtUJpvzTCYeWLOXc5euEB4WVuhcaLCOhUM7E6QrefKoxWJBo9GQl5eHXq8H22+tKJVVOEwAQUFBBAcHu61Sd9blLZTN3qGy2BwM3NCkAZ+evkh+oNlx3Gyx0K1FIyJCQ0v8rF6v5+DBgwwcOLDgA7nYUIAajlUoqhjTerShR0xNwoMDyDOZCA8OYGC7GKb1aFPiZ/R6PevWrUOv13Po0KEy26isHkeJGI1GzGZz2QWLIS8vz83WeB5f2azVagkIqHJfnyqBTqtlVJtI2ne4notZBiLDggkOKLl7YhcNs9lM586dad++fZnfqyrlcWRkZJT7hxQbG+tmazyPL23Oy8sjIyPDZ+0ryiY4QEd0rVCXRCMuLo6TJ0+WWN5OlXlkGI1GdDodoaX04UojPz/fbYFWb+FLm4OCgsjOzsZoNCrPo5JSVDR69uzJc889x7vvvsuaNWsIDAws8bNVxuMwm83qC+xldDpdubuFCt9isVj47bffHKJxyy238Oijj/L666+Tl5dHWlrpq/LVL01RbjQaTdmFFH6JRqNh0KBB/P777wghmDx5MuvWrSMkJISPPvqIuLg4Dh48WOLnq4zHoVAoyubUqVMYjdZUKwEBAb
Ro0YLRo0ezbt06IiIiWLVqFX369CmzHiUcCoWfYzCaOJOejcFYsdUVer2e5cuX88UXX2AymbBYLIwePZodO3ZQr149NmzYQLdu3ZyqS3VVFAo/xWQ2s2DXIRJT0hzDqnGxDZnWow06rWvP/IKB0Hr16qHVatFoNNx///0cP36clStX0rJlS6frU8LhRbZv305iYiKZmZncdddddO/evewPKaotC3YdYkNyKlqNhuAAHZkGIxuSralzZ8S1c7qeoqMn3bp1c8SnevfuTc+ePV0enVNdFQ+wYsUKunXrxsCBA4mPj2fdunUAxMfHM3fuXP773/+yZcsWt7S1a9cu+vTpQ+/evVm8eHGxZZYsWUK/fv3o378/jzzyCAbD34sdly5dSv/+/enXrx9LliwB4OzZs9x7770kJCTQr18/li5d6hZbFc5jMJpITElDWyQArdVoSExJc7rbUlQ0oqKi6NKlC7t27XKUKc+QvhIOD6DX63nwwQfZsGEDr732Gi+88EKh82+//TajRlV8V0GTycSzzz7L+++/z+bNm9m0aRNHjx4tVCYtLY2PP/6YNWvWsGnTJkwmE5s3b3bYuWrVKlatWsX69etJTEzkxIkT6HQ6Hn/8cbZs2cLKlSv57LPPrqpX4VkuZhm4mHXVanYALmWXfK4gqamphUQjIiKC/v37c/z4cRYtWlQh+7zeVbFlmf4Y634UwcB/sWZufhtrgtcDUsr7S67BvRiMJi5mGQjBRFjZxZ1CSsntt1v3IoqJiXFMpLFYLLzyyiv06NGDdu2cdzVL4uDBgzRt2pQmTZoA0K9fP3bs2EGLFi0KlTOZTOTm5hIQEEBubi4NGli33EhJSeEf//gHNWpYd3Do1KkTX331FZMmTXKUCQ8Pp3nz5qSlpV1Vr8JzRIYFExkWTKbh6mTzdUOt58oiKiqK5s2bExkZiUajYfDgwWRlZdGnTx8+/PDDCtnnixjHOEBKKWcJIaKxZgQ/C0yXUv4khPhMCNHX06nXigaeaoUEEN8qulyBp6Lo9XqaNWuGxWLhk08+YcaMGQAsW7aM3bt3k5GRwYkTJxg5cmSJddxzzz1kZWVddfyxxx6ja1drovTz588TFRXlONewYUMOHDhQqHzDhg0ZP348t956K8HBwXTr1s0RW2nVqhWvv/46ly9fJiQkhF27dl21fDw1NZVDhw7RoUOH8t0MRbkIDtARF9vQEeOwY7ZYiIttWOo0cntWv4CAAAYPHsyWLVuYNGkSeXl53H333bz55pulzgp1Bl8IxwXgH7bXdbDu8NVMSvmT7dhGIB7rxjMeo2jgKaucgaeinD17lqysLCZPnkxaWhpCCB566CEAxowZw5gxY5yq57PPPiu3DQVJT09nx44d7Nixg4iICKZPn8769esZNGgQsbGxTJw4kQkTJlCjRg1at26NtoBoZmVlMW3aNJ544gnCw53djkThLuyrWRNT0riUbaBu6N+jKiWh1+s5cOAA11xj3elx9erVPPDAA5jNZiZNmsQLL7xQ6G9cXrwuHFLKFUKIccK6Y3wdrHtyLixQ5BzWlGVlUnRmW2xsLPn5+WV+zmA0sV1/BovZXCjZgMVsZrv+DGNvaFKqopfGgQMHuOGGG1i8eDFXrlxh2LBh7N692+Un9vjx48nOzr7q+IwZM+jcuTMA9evXJzU11eGZnDp1ijp16hTyVL799luioqIIDg4mLy+Pnj178tNPPxEfHw9AQkICCQnWvaXefPNNGjZsSFZWFvn5+UyfPp0+ffrQrVu3Yr2f/Px8UlJSXLougKSkJJc/42t8ZXOXUDMtWoSAJYR6oYEE6XLZ/8svxZY9efIkO3fuxGKxcPPNN6PT6bhy5Qo6nY577rmHYcOG8UsJn3UVX8Q4RmPd6/QOIUQHrBvIFNz30ul5zO3bt3ckWLGvinUmQpyenk16rrGQOJhMJuuNNhjJJYC6YeVbLHfixAmuu+46wsLCCAsLY8CAAezdu9fRvXCWlStXllmmXbt2pKamcunSJRo2bMjXX3/Nq6++SliBzE/NmjXjvffeQ6vVEhISws8//0z79u0dZS5evEhkZCRnzpwhMTGRzz//nNDQUB577DFatWrFffeVvDdTXl4e1113nUtR+aSkJDp27Oh0eX/AFzYXP4ejVoldab1ez5EjR4iKinIEQjt27EjHjh2Jj4+nWbNmLrVvMBj8bsp5N2w7YNk2rKnB3xscAzTGusemx7AHnorD2cBTSUgpadPmb1eyV69e7Nzp1OZhLhMQEMAzzzzDxIkTSUhIoG/fvo5JPJMmTSItLY0OHTrQp08fBg8ezIABAzCbzQwfPtxRx0MPPURCQgJTpkxh9uzZ1KxZk6SkJNavX8+ePXsYNGgQgwYN8tg1KIrH3pXONBgLzeFYsOvqJDt6vZ41a78gPcfADR1vYtu2bezb9/cmeq6KhjN4fXsEIcS/gYZSyplCiKZY9y49Djxr2yl8PfCmlLLEvS+TkpKuBY6V1+MA+L/E5EKBJ5PJhEarZWC7mArFOLxJVlZWIe/C27h6z0F5HM5gMJq4e+nOYkdUwoMD+HxsT4e3fFhKZr3xHscuZmBpcC05yT9yYf8PhIdHcPDgb9SsWbN8NvztcTTr2LHj8aLnfREcfRf4UAix09b+FKzDse8KIbTA3tJEw10UDTzVDP57VEWh8CX2ORzFxdnscziia4Vah/dXfcnhtHTCGsdyft9XZOj3ow0Kodu4h8stGs7gi+BoJn/v7F2QW7xph06rZUZcO6Z2b22bx2Gkbi3P3WiFwlmcncORZzJzrkEbasbC6a+Wk31SoguNIHbM45ypFYXBaCp3kL8sqv3MUWfSqykU3sQ+h8NcJIxgn8Nx7uwZ8vPzuZhl4NyFi6RuXkL2SUlgzbq0mjiHsJhYruSZnJpdWl6qvXAoFP7ItB5tGNgu5qpM5X0bBbB8+XLWrl1L7ZAAauRcJvfCGYIjo2g5aQ4hDRoDUDNId1WQ313L80GtjlUo/JKiXenIsGBO/JHiWHvSoEEDagQFMvC2HmRmz6RGwxgCw2sDVs+kY8MwhxftzuX5dpRwKBR+THCA1XP46UAy3321FQ0W6tWrR25uLhqNpsTZpd3Cchx1uGt5fkGUcCgUfordU9iy+2dO/rSLGoFaWkXXZ/fCheTn57Nt2zauv/76qzyT4ACdY6ZrWcvzp3ZvXa74nopxuJG1a9fy4osvVqiObdu2XXUsJSWFPn36sGzZMp5//nlOnTpFZmYmu3fvrlBbCv9mwa5DrP5hP6eTvkengTyThe2fLCYrK4v+/fvTtm1bR9mSgvzuWJ5fHEo4/Ii8vDxHMp2C/Pbbb/To0YN7772XJ598kiZNmpCcnMyePXu8b6TCK9g9heBakQRHRmHOz+PPb9eAyUhUl9tZsHCRUxPvPDVLWnVV3ExqaiqTJk3izz//ZOzYsdx1113873//47XXXiMgIIBGjRrx3HPPYTAYePjhh8nLyyMvL49nnnmG1atXI6Vkzpw5zJkzB4BLly7xzjvvkJOTQ0xMDNu3b+fpp5/m2WefJSMjg5YtW
xaaQq6oGlzIzHVMAjNmXeHc9xsBaNjzTureehd/5RoJDS5bOCqyPL80qrRw1K1bt8Rzr732GuPGjQOsqfUeeeSREsteunTJ6TaPHz/O2rVryczMZNCgQQwdOpS5c+eyZMkSateuzUsvvcS2bdsICQmhYcOGzJs3j1OnTnHs2DEmTJjAr7/+6hAN+zVMnjyZI0eOMHbsWLZvt06qnTBhAr///rsSjSqIlJKkX/ZTJ6QO6ZcucXb7CgCi7xhNw+79CQ8OcMlTKM/y/LKo0sLhC2688UYCAwOpU6cO4eHhXLx4kRMnTjhycmRnZ1OnTh0GDRrE66+/zjPPPMPtt99Ojx49SE1N9bH1Cl8jpWT9+vWYzWba1A/kp4g6NLvnUfLTLxDZ8dZyeQrFDe1WdMJjlRYOZz2FcePGMWzYMLcsGCu6u5lOp6NBgwYsW7bsqrLr169n7969LF++nP3793PnnXdWuH1F5cGettL+Q7aLhtFopG7dujw6bhBvfneYxJBALmUbCA8OqJCnYA+guoMqLRy+YP/+/ZhMJtLT08nJyaF27doAHD16lBYtWrBs2TI6derEpUuXyM/Pp2fPnrRo0YI5c+YwZMgQTCbnZvVptVrHjlyKykVxE7LaBedQ6/RB8vPz+e677/j5559p0aIFM+Lj3eopuAslHG6mefPmTJ8+nRMnTvDwww+j0Wh4/vnnmTVrFoGBgTRo0IDhw4cTHh7Of/7zH95//33rRJ5p06hfvz75+flMmzaNBQsWlNpO27Ztefnll2nSpAkTJkzw0tVVHoo+zf2JohOyLpw6wRcHdxNbK5h0/X6Sk5OpXbs2tWrVAtzrKbgLr+fjcAfuyMdRFF/ntigPvrbZH/NxeGJ6tTttLi7XxuWDe8hKPUrG4f+Re+FPoqKiWL16daF5Gq5SUZvLyseh5nEoqhSuZM7yBcVNyAptHMtfyXvJvfAnMdc0ZcuWLRUSDW+ghENRZXDX7meexD4hy3D5HBaTEYvZzB+fvkz+5fOERl3Dps2buPbaa31tZpko4VBUGTw1vbo0XF2qHhygo31wLhd/2cWlX38Ai4Um/ccT3rw9U196h2saN3a7jZ5ABUcVVQZ37H7mLM7GUooGafV6PTVP/0bzmoFkRNYjz2IhSrRneN9eTO/p392TgijhUFQZPDW9ujjKWqpenLC0D86l5unfOHr0KL9s28zCt9+hc1ycX478lIUSDkWVwhPTq4vizFL1Rd8fLjzkmnqCtb/tpk72efT7vsNsNrP7h+8ZPGig2+zyJirG4UZ27dp11daNQ4YMcXkqeWZmJt9//z0Aixcvduy+9eWXX7rH0CqMfXr152N7snJMTz4f25MZce0qvB9wQcqKpZxJzy4kLHl/XeDyb3vITj3K4T07MZvNTJs2rcIpGHyJ8jjcSI8ePdxST3JyMj/88APdu3dn8uTJgHXV7ebNm+nTp49b2qjqeHLSVFmxFDQU2t4gIKIOuedTydDvB2DGY7N4+rH/eMQ2b6GEw42sXbuWI0eOkJ+fzy+//EKzZs0ce9mmpaXx5JNPkp+fj06nY+7cuURHR9O7d2/i4+P5+eefiYiIYPHixTz77LNkZmZy7bXX8ssvv9CnTx+WL1/OgQMHeOutt1i3bh3r168HrBN9PvroI9566y1fXnq1oqxYSnTNUCLDgsnIzUej0ZCWuJa/frMmXWox9D5m/rvkldiVhSotHPPnzy/x3B133MH1118PWNeXbNy4kcDAwGLLPv744063eerUKc6cOcPq1atJS0ujd+/eALzxxhuMHz+erl27snPnThYtWsTcuXM5deoUgwYN4rHHHuPuu+9GSsmECRM4cuQIw4cPd3RTJkyYwKeffsqDDz5IVlYW33zzDb169WLHjh3079/fafsU7qG0WIpOq6V9cC6b9vxI3eu6UrPV9Zzfs42YARMYOWJYpQuEFkeVFg5fIKWke/fuaLVaGjVqRJMmTQD45ZdfOHbsGG+//TYmk8mRKyQ8PJzWrVsDEBUVRUZGRpltDBo0iDfeeINevXqxb98+pk+f7rkLUhRLaUvV9Xo9Ead/o5kuhyuXTxMY3Zx/zlpEfPvmVWanwCotHM56Ctdffz0tW7Z0y7oPi8WCtkAgzmw2AxAYGMgbb7xBgwYNCpXX6XRXfb4sWrduzYULF0hOTqZly5aO9ToK71M0lqLX6/n8889Zu3YtgwcP5j+P/4tL2XmVcsi1NNSoiptp3bo1ycnJWCwWTp8+zenTpwHo0KGDI3vX7t272bhxY4l1FLdkvuixvn37Mn/+fAYMGOCBq1CUB71ez/Lly/nss884fvw4K1euxGLMr5I7BSrhcDNNmjShVatWDB8+nDfeeMPRDXnwwQfZsWMHo0aNYuHChY74SnG0bduWrVu38sEHHziOxcbG8vvvvzNv3jwAEhISSEtLo0uXLh69HoVz6PV6li1bxieffMKff/5JkyZN2LBhAzVq1LiqrDt3VPMVVbqr4m2GDBlS6vmCQmBn7969jtcFc3DY53EUJDEx0fH6hx9+YPDgwYW6RQrfsXPnTpYtW8aVK1do1aoVa9euJTo6ulAZTyz59xVKOCohTz31FKdOneLll1/2tSkKrPNuXnnlFa5cucINN9zAqlWrik2U7Ykd1XxF5ZI5BQBz585l6dKlPk88VBmTQLmTEydOkJeXh8lkIjc3l549e7Ju3bpiRaMyLPl3hSrjcWi1WvLy8sqdAUzhOiaTqdreb71ez7p162jcuDHDhw9ny5YtxMbGEhISUmx5+zT14oKk9iX//pYesDSqjHAEBASQk5NDdnY2Op3uqmzjZZGfn+9IhVdZ8JXNFosFk8mEyWQiIKDKfIWcRq/X88ILL2A0GunUqRM6nY527Urvanhzyb83qFJdlYiICIKCglwWDbDuz1rZ8JXNGo2GoKAgIiIifNK+L9Hr9cyePZv169ezdetWGjRo4NT3zT5N3Vyke+eJJf/eoMo9LiryBKyMbndltLmyotfrefzxxx2jW48//rhLuUG9seTfW1Q54VAoPEFqairTp093DJ+/9NJLTJw40aU6PLGjmq/wunAIISYA9xY4dBPQDXgbsAAHpJT3e9suhaIkTCYTL7/8Mnv37kWr1fLOO+9w1113lbs+f9wnxVW8HuOQUn4gpYyTUsYBs4GlwOvAdCllN6CWEKKvt+1SKIrDYrFw/vx5Nm3aRHBwMJ9++mmFRKOq4OuuyjPAv4BdUsqfbMc2AvHAVp9ZpVBgjWkkJSXRrFkzVq5cSX5+PjfffLOvzfILfCYcQohOwCnACFwucOoc0MiZOmw7TbmNpKQkt9bnDZTNnuHQoUOsWbOGa665BovFghCCoKCgSmG7HU/a6kuPYyKwpJjjTo+lFtwCsqJ4emtCT6Bs9gw//vgjzz33HOfPn2fmzJm0atXK720uihu3gCwWX87jiAN+BM4DkQWONwbO+MIghSIxMZHRo0dz/vx5oqOjGT16dLnmBVV1fCIcQohoIFNKmSelzAcOCyG6204PAbb5wi5F9Wbbtm2MGTOGv/76i9jYWBITE4mJ
ifG1WX6Jr7oqjbDGMuw8DLwrhNACe6WU231ilaLa8tNPPzFx4kSys7Np164dmzZtolatWr42y2/xiXBIKZOAvgXe/w7c4gtbFIr8/Hzuu+8+srOz6d69OytXriw2AY/ib6rUWhWFwlVOnDiB2WxmyZIlTJgwgbVr1yrRcAJfz+NQKHzGl19+ya+//kp0dDQjRoxQiZFcQHkcimrJ7NmzGTlyJAcOHKBx48ZXZZtXlI7yOBTVCovFwr///W+WLFkCQKNGjYiLi1NDri6ihENRbTCbzUyePJm1a9ei0Wh46KGHmD17thKNcqCEQ1EtyMvL41//+hdbt25Fp9Mxc+ZMHn30USUa5UQJh6Ja8Mgjj7B161ZCQkKYNWsWDz74oBKNCqCEQ1HlsVgsTJ06lX379vH2229z4403KtGoIEo4FFWWnJwcTp06xf/+9z+GDh3K7t271eiJm1DDsYoqycmTJ+nSpQuzZ8/m5MmTJCcnK9FwI0o4FFWOw4cP07t3b06dOsX+/fu56aabuOGGG3xtVpVCCYeiSvHzzz/Tt29fzp8/T0xMDK+88gq33Xabimm4GSUciirDd999x8CBA0lPT6d58+a89NJLJCQkKNHwAEo4FFWCr7/+mrvvvpvs7GzatGnD/Pnz6dOnjxIND6FGVRRVgmbNmhEREcHo0aMZN24cbdu2VaLhQZRwKCo9x48fp0mTJiQmJtKoUSMlGF5ACYeiUmKxWJg/fz75+fmEhIQQFRXFyJEjlWh4CSUcCr/HYDQV2jLRbDbzxBNPsHjxYrRaLRMnTqRTp04V2jdY4RrqTiv8FpPZzIJdh0hMSXMIxy1NIzmyahGrVq1Cp9MxcOBA+vTpo5bGexklHAq/ZcGuQ2xITkWr0RAcoONKZjZvPT2PK/JngoKCGDx4MHfffbdXRKOo11PdUcKh8EsMRhOJKWlobYJgys3mj09eJvP4IbSBQdw17G6GDhnscdEwmc18eugiKft3OoQjLrYh03q0QaetvrMZqu+VK/yai1kGLmYZHO+NOZnkXvyTgPDaNIgfQVyfBK94Ggt2HWJX6hUyDUaCA3RkGoxsSE5lwa5DHm3X31Eeh8IviQwLJjIsmEyDEYCg2vVpMe4JtIHB1I1qRP8+Pb3SPSno9djRajQkpqQxtXvratttUR6Hwi8JDtDRPsTAub1fkXv+NJd+TiS4bkMC69Tn1hZRhAR65plnMJo4k57tiGkU9HoKcim75HPVAeVxKPySX3/9lc/nTOPixYsYOnRFUyeKupdOM+DW7kzr0cbt7RU3gtO9WQPqhgbxZ072VeXrhlo9ouqKEg6F3/HDDz8wcuRIMjMzadasGQPiOtG+Y2f694n3mKdRdAQn02Bky6HT1AoJxGyxFCprtliIi21YbbspoIRD4Wd8+eWX/Otf/yI3N5c2bdqQkJBA165dPRoILS2WYbZA1+gITuQHcCnbQN3Qv0dVqjNKOBR+w6pVq5g6dSomk4nrr7+e3r1706VLF4+PnthjGcV5EJdzDEwVdbj15k5qHkcBlHAo/IKcnBzmzp2LyWRi4MCBtG7dms6dO3tlyLXoCE5B6oYGUytYR3CAjuhaoR61ozKhhEPhF9SoUYM1a9awY8cOJk+ezOHDh2ndurVXppEHB+iIi23oiHHYsccygnS5HrehsqGGYxU+w2w2s337dsC6ND4mJob77rsPjUZDmzZtColGwWFSTzCtRxsGtoshPDiAPJOJ8OAABraLqfaxjJJQHofCJxiNRqZPn87y5cuZMWMGwcHBNGjQgFGjRhEYGOiYR1G7RiDv/qgvNEzqiSnfOq2WGXHtmNq9tYplOIESDoXXyc3NZdKkSWzevJmQkBDOnDlD06ZNadq0KRqtlv9LTHYIRXpOHjlGE9fUDis05RtgRlw7t9umYhnOoboqCq+SnZ3NiBEj2Lx5M+Hh4QwbNoymTZs6AqFvfneYDcmpZBqMBOq0pGXmcinLQOpff0/Csk/59lS3RVE2yuNQeI2LFy8yc+ZMpJRERkZy5513EhkZ6RCNPJO50HyKfJOFfJMFrUbD5dw8GltCHefsU76Vd+AblHAovMaDDz6IlJLGjRvTv39/atWqVWjIteh8ikCdhkCdBpPZYhMRs+NcdZ/y7Wt8IhxCiFHATMAIPAMcAJYBOuAscK+UsvquIKqivPDCC1y4cIGlS5eyf/9+atSoUWieRtH5FFqNhto1griQZbCJiLVnraZ8+x6vxziEEJHAbKA70B8YBDwLLJRS3gIcBcZ72y6FZzh79qzjddOmTZk/fz7R0dH07dv3qsld9vkUBdeGxNQOIzI0iIYRIRjNZjVM6ie47HEIIYZIKddWoM14YLuUMgPIACYLIY4BU2znNwKPAm9XoA2FH7Bnzx5GjBjBjBkz6Nu3L3v37iU2NhagxIlddkFITElzrA2Z2k1wX9dW/JWTr4ZJ/QSNpcjKv6IIIWoBL0opp9jebwHMwFQp5UlXGxRCPAa0AeoCdYA5wHIpZQPb+VhgmZSya0l1JCUlXQscc7VthffYt28fzz77LAaDgY4dO9K2bVsAOnXqRJs2ZXsLeSYz6QYTtYJ1BOnU4J8PadaxY8fjRQ8643H8ANxjfyOlTBBC3A3sEEJ8ALwspXRlXEwDRAKDgabAt7ZjBc87Rfv27QkOdk+ALCkpiY4dO7qlLm/hrzavWbOG2bNnYzQa6d+/v0M0OnfuTEREhF/aXBr+ep9Lo6I2GwwGDh48WOJ5Z6R8BfBwwQNSys+BG4Fo4GchRHcXbEoDfpRSGqWUKVi7KxlCiBq2842BMy7Up/AjPvroIyZPnozRaGTUqFG0b98ewGsL1hTeoUzhkFLOBV4ueEwI0R4YCdTE+kPfIoRYLIRwZlD9K6CXEEJrC5SGA9uBobbzQ4Ftzl+Cwl9YsmQJ//73v7FYLNx///00adIEs9msRKMK4lTnUUrpSOkshPgLWAV0Ar6x/V8bOAysdqKu07Zye4CtwENYR1nGCiG+wxr7WOrCNSg8QHkWlcXHx3PNNdfw2muv0bNnTyUaVZjyzONoKaU8X8zx14QQE52pQEr5LvBukcO9y2GLws0Ul3uztEVlJpMJrVaLRqMhJiaGH3/8kdDQUCwWC9dccw1CCCUaVRCXw9UliIadwRWwReEH2HNvOrOPiMFgYPz48bz00ksAHDt2zCESGo3Ga/k0FN7HreNcUkrpzvoU3qWsfUQKdlsyMzMZMWIEGzdu5O233+bHH39k1apVrFy5kry8PG+brvAyaoBc4cDZfUQuX77M4MGD2blzJw0aNGDhwoV8//33mM1mmjZtSmBgoDfNVvgAJRwKB/a1IsVhX1R29uxZ+vXrR1JSEk2aNGHhwoUkJyerQGg1QwmHwkFxa0Xg70VlZ1NPkZCQwOHDhxFCsHDhQpKSkpRoVEOUcCgKUVruzZCQEDQaDTfeeCMfffQRP/74oxKNaorKx6EoRGm5N6Oioli/fj21a9cmPDycDh06EBQUpESjGqKEo5JgT97rrdWh9tyb33zzDXv37mXWrFkAxMTEOETi9ttvB0pe6aq
ouijh8HNKm5DladavX8/kyZPJz8+nY8eOXHvttezevZu7776bGjVqKMGoxijh8HOK2wzZnuW7R4Tn2l22bBkzZszAbDYzZcoUmjZtyrp16zCbzSQnJ3PTTTd5rnGF36OCo35MWROy8kxmj7S7YMECpk+fjtls5oknnmDs2LGsX7/eEQitbEvMFe5HeRx+TGmbIV/KNpBucG+Gb4vFwnPPPcfrr78OwIsvvkjPnj0dnoYaPVHYUcLhxzizGbI7ycjIYPPmzeh0OhYtWkSHDh2UaCiKRXVV/JiyJmS5O6VezZo1WbNmDcuXL2fYsGGcOHFCiYaiWJRw+Dme3gw5KyuLJUuWYM89GxMTQ3x8PGDNrzFkyBAlGoqrUF0VP8eTmyH/9ddfjBgxgn379nHlyhWmTZvGsWPHaNiwIaGhoWg0Glq1auWWthRVC+VxVBLsE7LcJRppaWkMGDCAffv20bhxY/r27Yter2fVqlWsWLFCLY1XlIoSjmrIiRMnSEhIIDk5mRYtWrB161YsFosjENqsWTO1NF5RKko4qhmHDh0iISGBY8eO0aFDBzZv3kx2drYaPVG4hBKOaoTFYmHmzJmcPXuWbt26sX79ei5fvqxEQ+EySjiqERqNhvfee4/x48fz+eefk5OTo0RDUS7UqEo14Ndff+Uf//gHGo2GqKgoXnnlFQBCQkK44YYbCAgIUKKhcAnlcVRxPv30U2677Taee+45xzGz2brGRaPREB8fr0RD4TJKOKowixYt4qGHHsJsNhMUFITFYkGv1/Pxxx+TlZUFWMVDo9GUawMmRfVFdVWqIBaLhXnz5vHqq68CMG/ePKZMmYJer3fENH7//Xc6derk8gZMCgUo4ahymM1mHnvsMT744AN0Oh1vvvkmI0aMKCQanTt3duTTKC3fx4y4dr68FIUfox4pVYwXX3yRDz74gODgYJYuXVqsaNhjGq5swKRQFEQJRwGqQj9/0qRJ3HjjjXz++eckJCSUKBrg/AZMCkVRVFcFa17PTw9dJGX/Tr/t55eWrDgzM5PQ0FC0Wi316tXj66+/dojDqVOnSpynUVa+j5I2Z1IolHBg7efvSr1CrZo1/a6fX1ay4nPnzjFs2DBuvvlmXnjhBccoiZ1evXpxzTXX0KJFi6uGXO35PuwxDjv2fB/eyKauqJxUe+Eoq58/tXtrn/6ASgteiuw0pkyZQkpKCtnZ2aSnp1O7dm3++OMPGjZsSFhYGBqNhpYtW5ZYv12AElPSuJRtoG6o97KoKyov1VI4Crr9zvTzo2u5N7ens5Qmalt2/8zbHz3HhQsXuO6661i1ahW1a9d2xDQiIyMZPXo0wcGldzc8me9DUXWpVsJRnNvfvVkD6oYG8WdO9lXlfd3PLylZcfbpFI4unY8pO4MuXbqwYsUKatasWSgQ2rx5c4KCgpxuy57vQ6FwBv+I/HkJu9ufaTA63P4th06j1WhKzOvpy6dvcbvHZ506wpEP52LKzuCmTp1YvXr1VaKhFqwpPE21EY7S3H6zBbpGR3gsr2d5KS5ZcXBkI4Jq1aNd99t49r//JTQ0VImGwutUm65KaXuUXM4xMFXU4dabO/ldP98uXt8e/ZPLOXnUrVObB19+h0fvuInfDhzg3LlzSjQUXsfrwiGEiANWAcm2Q78BLwHLAB1wFrhXSunW2UfO7FHij/18nVZLqPyeZocO8/acudQLDykkavXr1+fGG29Ep9Mp0VB4DV95HDullHfZ3wghPgIWSilXCSHmAeOBt93ZYFlzFoJ0ue5szi1YLBZeeuklXnzxRQCG3z2Mxl27AoWXxt92222O1wqFN/CXGEccsMH2eiMQ74lGPL1HiTsxm83MmjWLF198Ea1Wy4IFC+hqEw29Xs/SpUvJyckBuGrSl0LhaXzlcbQVQmwA6gL/BcIKdE3OAY2cqeTgwYMuN9wjArpcF0G6IZRawTqCdLns/+UXAJKSklyuzxMYjUZeeeUVduzYQWBgILNmzaJNmzYkJSVx8uRJdu7cicViISAggBo1avjaXJfxl/vsCsrmwvhCOI5gFYvPgebAt0XscPrR2b59+zInODlLUlKSX+zCnpOTw4QJE9ixYwdhYWEsW7aMuLg4wOppHDlyhKioKDp37kxERIRf2OwK/nKfXaE62mwwGEp9MHu9qyKlPC2lXCmltEgpU4A/gTpCCPujszFwxtt2+Qv5+fmcPXuW2rVrs3bt2kKioUZPFP6CL0ZVRgGNpJSvCCGigIbAR8BQ4BPb/9u8bZe/ULNmTVatWsWFCxdo3bo1oERD4X/4Iji6AegphPgOWA/cDzwJjLUdqwss9YFdPuP06dM8//zzjpGSevXqOUQD4MyZM0o0FH6F1z0OKWUGMKCYU729bYs/cPToUYYMGUJqaioRERFMmzbtqjI9e/YkJiaG2NhYJRoKv8BfhmOrJQcOHCAhIYHU1FQ6derEmDFjHOf++OMPMjMzAetwa3H5NBQKX6GEw0fs3r2bAQMGcOHCBXr16sXatWupXbs2YI1prF69mhUrVmAwqPR9Cv9DCYcP+Oqrrxg6dCgZGRkMGjSIzz77jLCwMKBwIDQ2NtalpfEKhbdQwuFlLBYLb7zxBrm5uYwZM4b333/fIQ5q9ERRWag2q2P9BY1GwyeffMKKFSuYMmWKQxiUaCgqE8rj8AIWi4WNGzdiMlm3XahTpw7333+/QxjOnz+vRENRqVDC4WHMZjNPPfUUY8eOZebMmcWWqVevHp06dVKioag0qK6KBzEajUyfPp3ly5cTGBjILbfcUui82WxGq9Wi0WgcU8uVaCgqA8rj8BC5ubmMHz+e5cuXExoaymeffcadd97pOK/X61myZAkZGRmAWhqvqFwo4fAAGRkZjBw5kk2bNlGrVi3WrFnjSLYDfwdCz507x++//+5DSxWK8qG6Kh7g+eefZ+fOnTRo0IA1a9bQrt3fu8EVHT355z//6UNLFYryoYTDAzzxxBOcOXOGOXPm0Lx5c8dxNeSqqCoo4XATqampREVFERAQQM2aNfn4448LnVeioahKqBiHGzh48CDx8fFMmzbNsTS+KGfPnlWioagyKI+jguzZs4eRI0eSnp7O2bNnMRgMxeYB7dGjBzExMTRv3lyJhqLSozyOCrB9+3aGDh1Keno6/fv3Z8WKFYVEIyUlpdBwq8qnoagqKOEoJ1988QWjRo0iJyeHUaNG8eGHHxZKnKzX61mzZg3Lly8nN9f/9mxRKCqCEo5y8OWXXzJx4kTy8/N54IEHWLBgAQEBf/f6CgZCW7Zs6bZM7AqFv6BiHOWgW7dudOzYkTvuuIMZM2YU6n6o0RNFdUAJh5NYLBaMRiOBgYGEh4ezadOmq5LsKNFQVBdUV8UJTCYTDz/8MPfdd59jaXxR0VBL4xXVCeVxlIHBYOC+++5jw4YNhISEcOjQIdq3b39VuXr16tG5c2fMZrMSDUWVRwlHKWRlZTFmzBi+/fZbIiIiWLFihUM0DEYTF7MM1KkRSI2gQDQaDT169ADU0nhF1UcJRwlcvnyZ4cNH8L///US9evVZvXoV//jHPz
CZzSzYdYjElDTOHD8Gpw8z9K5h/OeOm9BpVc9PUT1QwlEMf6alEXdHP86d+IPAWpE0Hf80Oy7paGcTjQ3JqeRdOEPWoX1gMbM2cQ/BoWHMiGtXduUKRRVACUcxfLz/NFmaIILrR9Ni3JOYa0ayITkVo9nC98fOkXfhDJd/2wMWM2FNBeFNBYkpaUzt3prgAJ2vzVcoPI4SjiIYjCZ+SP2L2NEzsZiMBITVBECr0bBd/ycXTh8n59BPDtGIiL0OjUbDpWwDF7MMRNcK9fEVKBSeR3XKbfz+++888MADpKVncTHLgC4k1CEadi6ePoHh8NWiAVA3NJjIMDVDVFE9UB4H8O233/LYY4+Rm5tLu+uuIzKsFZkG41XlwoxZRNcN40xoo0KiYbZYiItt6FQ3xT4aExkWXO5ujb2OPFPxS/gVCk9T7YVjw4YNTJ48mby8PEaMGMHkiRMxfC/ZkJyKtsCwqtliYdDtvRjQJISNp3LZ+cc5LmUbqBsaTFxsQ6b1aFNqOwVHY+zCYf+cs6MxRevQ5edyZ3aoS3UoFO6gWgvHsmXLmDFjBmazmTvvvJO33noLrVbrEIHElDTOnjpB/fr1iW/XzPEDfaQFPHCLa56DfTRGq9EQHKAj02BkQ3IqgNOjMUXryMgxu1yHQuEOqu1j6s0332T69OmYzWYef/xxpk6ditb21NZptcyIa8fzXaPpoztDgvYU93eJLfRUDw7QEV0r1OnuSWJKWiEPBqwB18SUNAxGk1fqUCjcRbUUDqPRyDfffAPA/PnzmTlz5lWzPfV6PVs2biAsUEfb1q0rtDT+YpZ1xKU47KMx3qhDoXAX1bKrEhAQwLJly/jhhx/o06fPVefdvco1Msw64lJcwNXZ0Rh31KFQuItq43Hk5eWxYMEC8vLyAAgPD/eKaIC1WxMX2xCzxVLouCujMe6oQ6FwF9XC48jOzmbcuHFs376dI0eO8OabbxZb7uLFix5bGl8w4OrKaExpddQI0DKwXYxLdSgU7sBnwiGEqAEcBJ4DdgDLAB1wFrhXSumWTnt6ejojRoxg7969REZGMmHChBLLRkZG0rVrV/Lz892+NN4ecJ3avXW553EUreOETObmf6rRFIX38WVX5Sngku31s8BCKeUtwFFgvDsaOHfuHAMGDGDv3r1ER0ezefNmrr/++qvK2ZPzAHTv3t2j+TRcGY0pq44gXbXpaSr8DJ9884QQrYG2wGbboThgg+31RiC+om2cPHmShIQEDh48SIsWLdi2bRutWrW6qpxer+fDDz8kKyvLcUzl01AoSsdXXZVXgQeBsbb3YQW6JueARs5UcvDgwZIbePVV/vjjD1q0aMG8efNIS0sjLS2tUJmTJ0+yc+dOLBYLQUFBhIWFuXwhviYpKcnXJriMstk7eNJmrwuHEGIMsFtKeUwIUVwRpx/37du3L3F+xQcffMDzzz/PzJkzqVmz5lXn9Xo9R44cISoqis6dOxMREUHHjh2dbdovSEpKUjZ7gepos8FgKPXB7IuuSj9gkBBiDzAReBrItAVLARoDZ8pTcVJSkmPzoxo1ajB37twSRUMlFlYoyo/XhUNKOVxK2UlK2QV4H+uoynZgqK3IUGCbq/Vu3ryZfv36MWHCBIzGqydJ2VGioVBUHH8Jy88GxgohvgPqAktd+fDy5csZO3YseXl5xMTEONacFMeFCxeUaCgUFcSnE8CklHMKvO1dnjreeecdnnjiCQAeffRRZs2aVaoYdO3alejoaJo2bXpVOXfkylAoqgOVeubookWLeO655wB4/vnnuf/++4stl5KSQr169ahVqxYA1157baHzJrOZTw9dJGX/znLnylAoqhOV+lexePFidDodb731VomiUXDX+JycnGLLLNh1iF2pV8g0GAvlyliw65AnzVcoKi2VWjh69OjBkiVLuOeee4o9XzAQKoQgJCTkqjIqz4VC4TqVuqvyxhtvFCsG4PzoiTN5LlTmcoWiMJXa4ygpCOrKkKs9z0VxqDwXCkXxVGrhKA5Xl8arPBcKhetU6q5KcURGRtKtWzfy8vKcnqcxrUcbTqWeJsUQUO5cGQpFdaLKCIfJZEKns3oH3bp1w2KxOD25S6fVMqpNJO07XK/mcSgUTlAluip6vZ4PPviA9PR0x7HyzAh1R64MhaI6UOmFwx4IvXTpEocOqXkXCoU3qNTCcezYMdas/YL0HAM33tSJzp07+9okhaJaUKljHPM/XM7+E39CVCy/HYfknb+raeIKhReo1MKRciGDwMYtiYi9jqw8k9oOUaHwEpVVOHQAjVu2Ib/+tWgKJA375dQ5MrKbEaRzPcBpMFS+3dCUzd6hutls338I22+tKBpLkYlPlYGkpKTuwHe+tkOhqAbc0rFjx++LHqysHsdPwC1Y92BRq9AUCvejw5o0/KfiTlZKj0OhUPgWNfygUChcRgmHQqFwGSUcCoXCZZRwKBQKl1HCoVAoXKayDsdWGNvOcQexbgi1A1iGdQjqLHBvgb1sfY4QIg5YBSTbDv0GvIQf22xHCDEKmAkYgWeAA/ix3UKICcC9BQ7dBHQD3gYswAEpZfGZsX2EECIc+BioAwQD/wX+xIM2V2eP4yngku31s8BCKeUtwFFgvM+sKpmdUso427+HqAQ2CyEisW621R3oDwzCz+2WUn5gv89YbV8KvA5Ml1J2A2oJIfr60MTiGAdIKeWtwF3AG3jY5mopHEKI1kBbYLPtUBywwfZ6IxDvA7NcJQ7/tzke2C6lzJBSnpVSTqZy2G3nGeBFoJmU0j4Ryh9tvgBE2l7XwfpA9KjN1bWr8irwIDDW9j6sgLt8DuuMOX+jrRBiA9YtMv9L5bD5WiDUZncdYA6Vw26EEJ2AU1i7WJcLnPI7m6WUK4QQ44QQR7He5wHAwgJF3G5ztfM4hBBjgN1SymMlFPHHzWSPYBWLQVjF7gMKi74/2gxWuyKBIVjd6Y8obKu/2g0wEVhSzHG/s1kIMRo4KaVsAfQCPilSxO02VzvhAPoBg4QQe7B+OZ4GMm3BUoDGwBlfGVccUsrTUsqVUkqLlDIFa+Crjj/bbCMN+FFKabTZnQFkVAK7wdql+hE4z9/dAPBPm7sBXwJIKX8FagD1Cpx3u83VTjiklMOllJ2klF2A97GOqmwHhtqKDAW2+cq+4hBCjBJCPGp7HQU0xPr09lubbXwF9BJCaG2B0nD8/F4DCCGigUwpZZ6UMh84LITobjs9BP+z+SjQGUAI0RSrQB/ypM3VNcZRlNnAx0KI+4ATWCPp/sQG4DMhxCAgCLgf+AX/thkp5WkhxGpgj+3QQ1hXW/q13VjjAecKvH8YeFcIoQX2Sim3+8SqknkX+FAIsRPrb3oKVq/UYzar1bEKhcJlql1XRaFQVBwlHAqFwmWUcCgUCpdRwqFQKFxGCYdCoXAZJRwKhcJllHAoFAqXUcKh8BhCiAZCiHTbJCT7sa1CiLt8aZei4ijhUHgMKeU5rDMY2wMIIe4GLFLK1T41TFFh1JRzhaf5DugqhDgOzAN6+9YchTtQwqHwNN9hXerdDviwlHQGikqEEg6Fp
/kOeA3rsu4bfWyLwk2oGIfC05zAuqL3QdsSdUUVQAmHwtNMB1ZKKXf62hCF+1BdFYVHsCWE/gKrx6GGX6sYKh+HQqFwGdVVUSgULqOEQ6FQuIwSDoVC4TJKOBQKhcso4VAoFC6jhEOhULiMEg6FQuEy/w8lZL4zuTD9gQAAAABJRU5ErkJggg==\n", - "text/plain": [ - "
" - ] - }, - "metadata": { - "needs_background": "light" - }, - "output_type": "display_data" - } - ], - "source": [ - "model = LinearRegression()\n", - "visualizer = prediction_error(model, X_train, y_train, X_test, y_test)" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3 (ipykernel)", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.9.13" - } - }, - "nbformat": 4, - "nbformat_minor": 5 -} diff --git a/12_notes.qmd b/12_notes.qmd new file mode 100644 index 0000000..b65567e --- /dev/null +++ b/12_notes.qmd @@ -0,0 +1 @@ +# Notes {-} diff --git a/13_exercises.qmd b/13_exercises.qmd new file mode 100644 index 0000000..465ff4d --- /dev/null +++ b/13_exercises.qmd @@ -0,0 +1 @@ +# Exercises {-} diff --git a/13_main.qmd b/13_main.qmd index f392870..c49b092 100644 --- a/13_main.qmd +++ b/13_main.qmd @@ -1,4 +1,7 @@ -# 13. Data Analysis Examples +# 13. Multiple Testing ## Learning Objectives +- item 1 +- item 2 +- item 3 diff --git a/13_notes.qmd b/13_notes.qmd new file mode 100644 index 0000000..b65567e --- /dev/null +++ b/13_notes.qmd @@ -0,0 +1 @@ +# Notes {-} diff --git a/ISLP_data/Advertising.csv b/ISLP_data/Advertising.csv new file mode 100644 index 0000000..9547f43 --- /dev/null +++ b/ISLP_data/Advertising.csv @@ -0,0 +1,201 @@ +,TV,radio,newspaper,sales +1,230.1,37.8,69.2,22.1 +2,44.5,39.3,45.1,10.4 +3,17.2,45.9,69.3,9.3 +4,151.5,41.3,58.5,18.5 +5,180.8,10.8,58.4,12.9 +6,8.7,48.9,75,7.2 +7,57.5,32.8,23.5,11.8 +8,120.2,19.6,11.6,13.2 +9,8.6,2.1,1,4.8 +10,199.8,2.6,21.2,10.6 +11,66.1,5.8,24.2,8.6 +12,214.7,24,4,17.4 +13,23.8,35.1,65.9,9.2 +14,97.5,7.6,7.2,9.7 +15,204.1,32.9,46,19 +16,195.4,47.7,52.9,22.4 +17,67.8,36.6,114,12.5 +18,281.4,39.6,55.8,24.4 +19,69.2,20.5,18.3,11.3 +20,147.3,23.9,19.1,14.6 +21,218.4,27.7,53.4,18 +22,237.4,5.1,23.5,12.5 +23,13.2,15.9,49.6,5.6 +24,228.3,16.9,26.2,15.5 +25,62.3,12.6,18.3,9.7 +26,262.9,3.5,19.5,12 +27,142.9,29.3,12.6,15 +28,240.1,16.7,22.9,15.9 +29,248.8,27.1,22.9,18.9 +30,70.6,16,40.8,10.5 +31,292.9,28.3,43.2,21.4 +32,112.9,17.4,38.6,11.9 +33,97.2,1.5,30,9.6 +34,265.6,20,0.3,17.4 +35,95.7,1.4,7.4,9.5 +36,290.7,4.1,8.5,12.8 +37,266.9,43.8,5,25.4 +38,74.7,49.4,45.7,14.7 +39,43.1,26.7,35.1,10.1 +40,228,37.7,32,21.5 +41,202.5,22.3,31.6,16.6 +42,177,33.4,38.7,17.1 +43,293.6,27.7,1.8,20.7 +44,206.9,8.4,26.4,12.9 +45,25.1,25.7,43.3,8.5 +46,175.1,22.5,31.5,14.9 +47,89.7,9.9,35.7,10.6 +48,239.9,41.5,18.5,23.2 +49,227.2,15.8,49.9,14.8 +50,66.9,11.7,36.8,9.7 +51,199.8,3.1,34.6,11.4 +52,100.4,9.6,3.6,10.7 +53,216.4,41.7,39.6,22.6 +54,182.6,46.2,58.7,21.2 +55,262.7,28.8,15.9,20.2 +56,198.9,49.4,60,23.7 +57,7.3,28.1,41.4,5.5 +58,136.2,19.2,16.6,13.2 +59,210.8,49.6,37.7,23.8 +60,210.7,29.5,9.3,18.4 +61,53.5,2,21.4,8.1 +62,261.3,42.7,54.7,24.2 +63,239.3,15.5,27.3,15.7 +64,102.7,29.6,8.4,14 +65,131.1,42.8,28.9,18 +66,69,9.3,0.9,9.3 +67,31.5,24.6,2.2,9.5 +68,139.3,14.5,10.2,13.4 +69,237.4,27.5,11,18.9 +70,216.8,43.9,27.2,22.3 +71,199.1,30.6,38.7,18.3 +72,109.8,14.3,31.7,12.4 +73,26.8,33,19.3,8.8 +74,129.4,5.7,31.3,11 +75,213.4,24.6,13.1,17 +76,16.9,43.7,89.4,8.7 +77,27.5,1.6,20.7,6.9 +78,120.5,28.5,14.2,14.2 +79,5.4,29.9,9.4,5.3 +80,116,7.7,23.1,11 +81,76.4,26.7,22.3,11.8 +82,239.8,4.1,36.9,12.3 +83,75.3,20.3,32.5,11.3 +84,68.4,44.5,35.6,13.6 +85,213.5,43,33.8,21.7 +86,193.2,18.4,65.7,15.2 
+87,76.3,27.5,16,12 +88,110.7,40.6,63.2,16 +89,88.3,25.5,73.4,12.9 +90,109.8,47.8,51.4,16.7 +91,134.3,4.9,9.3,11.2 +92,28.6,1.5,33,7.3 +93,217.7,33.5,59,19.4 +94,250.9,36.5,72.3,22.2 +95,107.4,14,10.9,11.5 +96,163.3,31.6,52.9,16.9 +97,197.6,3.5,5.9,11.7 +98,184.9,21,22,15.5 +99,289.7,42.3,51.2,25.4 +100,135.2,41.7,45.9,17.2 +101,222.4,4.3,49.8,11.7 +102,296.4,36.3,100.9,23.8 +103,280.2,10.1,21.4,14.8 +104,187.9,17.2,17.9,14.7 +105,238.2,34.3,5.3,20.7 +106,137.9,46.4,59,19.2 +107,25,11,29.7,7.2 +108,90.4,0.3,23.2,8.7 +109,13.1,0.4,25.6,5.3 +110,255.4,26.9,5.5,19.8 +111,225.8,8.2,56.5,13.4 +112,241.7,38,23.2,21.8 +113,175.7,15.4,2.4,14.1 +114,209.6,20.6,10.7,15.9 +115,78.2,46.8,34.5,14.6 +116,75.1,35,52.7,12.6 +117,139.2,14.3,25.6,12.2 +118,76.4,0.8,14.8,9.4 +119,125.7,36.9,79.2,15.9 +120,19.4,16,22.3,6.6 +121,141.3,26.8,46.2,15.5 +122,18.8,21.7,50.4,7 +123,224,2.4,15.6,11.6 +124,123.1,34.6,12.4,15.2 +125,229.5,32.3,74.2,19.7 +126,87.2,11.8,25.9,10.6 +127,7.8,38.9,50.6,6.6 +128,80.2,0,9.2,8.8 +129,220.3,49,3.2,24.7 +130,59.6,12,43.1,9.7 +131,0.7,39.6,8.7,1.6 +132,265.2,2.9,43,12.7 +133,8.4,27.2,2.1,5.7 +134,219.8,33.5,45.1,19.6 +135,36.9,38.6,65.6,10.8 +136,48.3,47,8.5,11.6 +137,25.6,39,9.3,9.5 +138,273.7,28.9,59.7,20.8 +139,43,25.9,20.5,9.6 +140,184.9,43.9,1.7,20.7 +141,73.4,17,12.9,10.9 +142,193.7,35.4,75.6,19.2 +143,220.5,33.2,37.9,20.1 +144,104.6,5.7,34.4,10.4 +145,96.2,14.8,38.9,11.4 +146,140.3,1.9,9,10.3 +147,240.1,7.3,8.7,13.2 +148,243.2,49,44.3,25.4 +149,38,40.3,11.9,10.9 +150,44.7,25.8,20.6,10.1 +151,280.7,13.9,37,16.1 +152,121,8.4,48.7,11.6 +153,197.6,23.3,14.2,16.6 +154,171.3,39.7,37.7,19 +155,187.8,21.1,9.5,15.6 +156,4.1,11.6,5.7,3.2 +157,93.9,43.5,50.5,15.3 +158,149.8,1.3,24.3,10.1 +159,11.7,36.9,45.2,7.3 +160,131.7,18.4,34.6,12.9 +161,172.5,18.1,30.7,14.4 +162,85.7,35.8,49.3,13.3 +163,188.4,18.1,25.6,14.9 +164,163.5,36.8,7.4,18 +165,117.2,14.7,5.4,11.9 +166,234.5,3.4,84.8,11.9 +167,17.9,37.6,21.6,8 +168,206.8,5.2,19.4,12.2 +169,215.4,23.6,57.6,17.1 +170,284.3,10.6,6.4,15 +171,50,11.6,18.4,8.4 +172,164.5,20.9,47.4,14.5 +173,19.6,20.1,17,7.6 +174,168.4,7.1,12.8,11.7 +175,222.4,3.4,13.1,11.5 +176,276.9,48.9,41.8,27 +177,248.4,30.2,20.3,20.2 +178,170.2,7.8,35.2,11.7 +179,276.7,2.3,23.7,11.8 +180,165.6,10,17.6,12.6 +181,156.6,2.6,8.3,10.5 +182,218.5,5.4,27.4,12.2 +183,56.2,5.7,29.7,8.7 +184,287.6,43,71.8,26.2 +185,253.8,21.3,30,17.6 +186,205,45.1,19.6,22.6 +187,139.5,2.1,26.6,10.3 +188,191.1,28.7,18.2,17.3 +189,286,13.9,3.7,15.9 +190,18.7,12.1,23.4,6.7 +191,39.5,41.1,5.8,10.8 +192,75.5,10.8,6,9.9 +193,17.2,4.1,31.6,5.9 +194,166.8,42,3.6,19.6 +195,149.7,35.6,6,17.3 +196,38.2,3.7,13.8,7.6 +197,94.2,4.9,8.1,9.7 +198,177,9.3,6.4,12.8 +199,283.6,42,66.2,25.5 +200,232.1,8.6,8.7,13.4 diff --git a/ISLP_data/Auto.csv b/ISLP_data/Auto.csv new file mode 100644 index 0000000..cbac5ff --- /dev/null +++ b/ISLP_data/Auto.csv @@ -0,0 +1,398 @@ +mpg,cylinders,displacement,horsepower,weight,acceleration,year,origin,name +18,8,307,130,3504,12,70,1,chevrolet chevelle malibu +15,8,350,165,3693,11.5,70,1,buick skylark 320 +18,8,318,150,3436,11,70,1,plymouth satellite +16,8,304,150,3433,12,70,1,amc rebel sst +17,8,302,140,3449,10.5,70,1,ford torino +15,8,429,198,4341,10,70,1,ford galaxie 500 +14,8,454,220,4354,9,70,1,chevrolet impala +14,8,440,215,4312,8.5,70,1,plymouth fury iii +14,8,455,225,4425,10,70,1,pontiac catalina +15,8,390,190,3850,8.5,70,1,amc ambassador dpl +15,8,383,170,3563,10,70,1,dodge challenger se +14,8,340,160,3609,8,70,1,plymouth 'cuda 340 +15,8,400,150,3761,9.5,70,1,chevrolet monte 
carlo
[... remaining ISLP_data/Auto.csv rows elided: 397 comma-separated records in total, with fields mpg,cylinders,displacement,horsepower,weight,acceleration,year,origin,name, model years 70-82, and `?` marking missing horsepower values ...]
diff --git a/ISLP_data/Auto.data b/ISLP_data/Auto.data
new file mode 100644
index 0000000..236ae22
--- /dev/null
+++ b/ISLP_data/Auto.data
@@ -0,0 +1,398 @@
+mpg cylinders displacement horsepower weight acceleration year origin name
[... 397 whitespace-delimited rows elided: the same records as Auto.csv, with car names quoted and `?` marking missing horsepower values ...]
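`Auto.data` is a whitespace-delimited variant of `Auto.csv`, with car names quoted and `?` standing in for missing horsepower values, so it needs slightly different `read_csv` arguments than the comma-separated file. A minimal sketch, assuming the file lives in `ISLP_data/` like the other files in this document:

```{python}
import pandas as pd

# sep=r'\s+' splits on runs of whitespace (pandas treats '\s+' as
# whitespace delimiting) while still respecting the quoted car names;
# na_values=['?'] turns the '?' placeholders into NaN.
Auto_data = pd.read_csv('ISLP_data/Auto.data',
                        sep=r'\s+',
                        na_values=['?'])
Auto_data.shape  # expected: (397, 9)
```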
15.5 71 1 "plymouth satellite custom" +17.0 6 250.0 100.0 3329. 15.5 71 1 "chevrolet chevelle malibu" +19.0 6 250.0 88.00 3302. 15.5 71 1 "ford torino 500" +18.0 6 232.0 100.0 3288. 15.5 71 1 "amc matador" +14.0 8 350.0 165.0 4209. 12.0 71 1 "chevrolet impala" +14.0 8 400.0 175.0 4464. 11.5 71 1 "pontiac catalina brougham" +14.0 8 351.0 153.0 4154. 13.5 71 1 "ford galaxie 500" +14.0 8 318.0 150.0 4096. 13.0 71 1 "plymouth fury iii" +12.0 8 383.0 180.0 4955. 11.5 71 1 "dodge monaco (sw)" +13.0 8 400.0 170.0 4746. 12.0 71 1 "ford country squire (sw)" +13.0 8 400.0 175.0 5140. 12.0 71 1 "pontiac safari (sw)" +18.0 6 258.0 110.0 2962. 13.5 71 1 "amc hornet sportabout (sw)" +22.0 4 140.0 72.00 2408. 19.0 71 1 "chevrolet vega (sw)" +19.0 6 250.0 100.0 3282. 15.0 71 1 "pontiac firebird" +18.0 6 250.0 88.00 3139. 14.5 71 1 "ford mustang" +23.0 4 122.0 86.00 2220. 14.0 71 1 "mercury capri 2000" +28.0 4 116.0 90.00 2123. 14.0 71 2 "opel 1900" +30.0 4 79.00 70.00 2074. 19.5 71 2 "peugeot 304" +30.0 4 88.00 76.00 2065. 14.5 71 2 "fiat 124b" +31.0 4 71.00 65.00 1773. 19.0 71 3 "toyota corolla 1200" +35.0 4 72.00 69.00 1613. 18.0 71 3 "datsun 1200" +27.0 4 97.00 60.00 1834. 19.0 71 2 "volkswagen model 111" +26.0 4 91.00 70.00 1955. 20.5 71 1 "plymouth cricket" +24.0 4 113.0 95.00 2278. 15.5 72 3 "toyota corona hardtop" +25.0 4 97.50 80.00 2126. 17.0 72 1 "dodge colt hardtop" +23.0 4 97.00 54.00 2254. 23.5 72 2 "volkswagen type 3" +20.0 4 140.0 90.00 2408. 19.5 72 1 "chevrolet vega" +21.0 4 122.0 86.00 2226. 16.5 72 1 "ford pinto runabout" +13.0 8 350.0 165.0 4274. 12.0 72 1 "chevrolet impala" +14.0 8 400.0 175.0 4385. 12.0 72 1 "pontiac catalina" +15.0 8 318.0 150.0 4135. 13.5 72 1 "plymouth fury iii" +14.0 8 351.0 153.0 4129. 13.0 72 1 "ford galaxie 500" +17.0 8 304.0 150.0 3672. 11.5 72 1 "amc ambassador sst" +11.0 8 429.0 208.0 4633. 11.0 72 1 "mercury marquis" +13.0 8 350.0 155.0 4502. 13.5 72 1 "buick lesabre custom" +12.0 8 350.0 160.0 4456. 13.5 72 1 "oldsmobile delta 88 royale" +13.0 8 400.0 190.0 4422. 12.5 72 1 "chrysler newport royal" +19.0 3 70.00 97.00 2330. 13.5 72 3 "mazda rx2 coupe" +15.0 8 304.0 150.0 3892. 12.5 72 1 "amc matador (sw)" +13.0 8 307.0 130.0 4098. 14.0 72 1 "chevrolet chevelle concours (sw)" +13.0 8 302.0 140.0 4294. 16.0 72 1 "ford gran torino (sw)" +14.0 8 318.0 150.0 4077. 14.0 72 1 "plymouth satellite custom (sw)" +18.0 4 121.0 112.0 2933. 14.5 72 2 "volvo 145e (sw)" +22.0 4 121.0 76.00 2511. 18.0 72 2 "volkswagen 411 (sw)" +21.0 4 120.0 87.00 2979. 19.5 72 2 "peugeot 504 (sw)" +26.0 4 96.00 69.00 2189. 18.0 72 2 "renault 12 (sw)" +22.0 4 122.0 86.00 2395. 16.0 72 1 "ford pinto (sw)" +28.0 4 97.00 92.00 2288. 17.0 72 3 "datsun 510 (sw)" +23.0 4 120.0 97.00 2506. 14.5 72 3 "toyouta corona mark ii (sw)" +28.0 4 98.00 80.00 2164. 15.0 72 1 "dodge colt (sw)" +27.0 4 97.00 88.00 2100. 16.5 72 3 "toyota corolla 1600 (sw)" +13.0 8 350.0 175.0 4100. 13.0 73 1 "buick century 350" +14.0 8 304.0 150.0 3672. 11.5 73 1 "amc matador" +13.0 8 350.0 145.0 3988. 13.0 73 1 "chevrolet malibu" +14.0 8 302.0 137.0 4042. 14.5 73 1 "ford gran torino" +15.0 8 318.0 150.0 3777. 12.5 73 1 "dodge coronet custom" +12.0 8 429.0 198.0 4952. 11.5 73 1 "mercury marquis brougham" +13.0 8 400.0 150.0 4464. 12.0 73 1 "chevrolet caprice classic" +13.0 8 351.0 158.0 4363. 13.0 73 1 "ford ltd" +14.0 8 318.0 150.0 4237. 14.5 73 1 "plymouth fury gran sedan" +13.0 8 440.0 215.0 4735. 11.0 73 1 "chrysler new yorker brougham" +12.0 8 455.0 225.0 4951. 
11.0 73 1 "buick electra 225 custom" +13.0 8 360.0 175.0 3821. 11.0 73 1 "amc ambassador brougham" +18.0 6 225.0 105.0 3121. 16.5 73 1 "plymouth valiant" +16.0 6 250.0 100.0 3278. 18.0 73 1 "chevrolet nova custom" +18.0 6 232.0 100.0 2945. 16.0 73 1 "amc hornet" +18.0 6 250.0 88.00 3021. 16.5 73 1 "ford maverick" +23.0 6 198.0 95.00 2904. 16.0 73 1 "plymouth duster" +26.0 4 97.00 46.00 1950. 21.0 73 2 "volkswagen super beetle" +11.0 8 400.0 150.0 4997. 14.0 73 1 "chevrolet impala" +12.0 8 400.0 167.0 4906. 12.5 73 1 "ford country" +13.0 8 360.0 170.0 4654. 13.0 73 1 "plymouth custom suburb" +12.0 8 350.0 180.0 4499. 12.5 73 1 "oldsmobile vista cruiser" +18.0 6 232.0 100.0 2789. 15.0 73 1 "amc gremlin" +20.0 4 97.00 88.00 2279. 19.0 73 3 "toyota carina" +21.0 4 140.0 72.00 2401. 19.5 73 1 "chevrolet vega" +22.0 4 108.0 94.00 2379. 16.5 73 3 "datsun 610" +18.0 3 70.00 90.00 2124. 13.5 73 3 "maxda rx3" +19.0 4 122.0 85.00 2310. 18.5 73 1 "ford pinto" +21.0 6 155.0 107.0 2472. 14.0 73 1 "mercury capri v6" +26.0 4 98.00 90.00 2265. 15.5 73 2 "fiat 124 sport coupe" +15.0 8 350.0 145.0 4082. 13.0 73 1 "chevrolet monte carlo s" +16.0 8 400.0 230.0 4278. 9.50 73 1 "pontiac grand prix" +29.0 4 68.00 49.00 1867. 19.5 73 2 "fiat 128" +24.0 4 116.0 75.00 2158. 15.5 73 2 "opel manta" +20.0 4 114.0 91.00 2582. 14.0 73 2 "audi 100ls" +19.0 4 121.0 112.0 2868. 15.5 73 2 "volvo 144ea" +15.0 8 318.0 150.0 3399. 11.0 73 1 "dodge dart custom" +24.0 4 121.0 110.0 2660. 14.0 73 2 "saab 99le" +20.0 6 156.0 122.0 2807. 13.5 73 3 "toyota mark ii" +11.0 8 350.0 180.0 3664. 11.0 73 1 "oldsmobile omega" +20.0 6 198.0 95.00 3102. 16.5 74 1 "plymouth duster" +21.0 6 200.0 ? 2875. 17.0 74 1 "ford maverick" +19.0 6 232.0 100.0 2901. 16.0 74 1 "amc hornet" +15.0 6 250.0 100.0 3336. 17.0 74 1 "chevrolet nova" +31.0 4 79.00 67.00 1950. 19.0 74 3 "datsun b210" +26.0 4 122.0 80.00 2451. 16.5 74 1 "ford pinto" +32.0 4 71.00 65.00 1836. 21.0 74 3 "toyota corolla 1200" +25.0 4 140.0 75.00 2542. 17.0 74 1 "chevrolet vega" +16.0 6 250.0 100.0 3781. 17.0 74 1 "chevrolet chevelle malibu classic" +16.0 6 258.0 110.0 3632. 18.0 74 1 "amc matador" +18.0 6 225.0 105.0 3613. 16.5 74 1 "plymouth satellite sebring" +16.0 8 302.0 140.0 4141. 14.0 74 1 "ford gran torino" +13.0 8 350.0 150.0 4699. 14.5 74 1 "buick century luxus (sw)" +14.0 8 318.0 150.0 4457. 13.5 74 1 "dodge coronet custom (sw)" +14.0 8 302.0 140.0 4638. 16.0 74 1 "ford gran torino (sw)" +14.0 8 304.0 150.0 4257. 15.5 74 1 "amc matador (sw)" +29.0 4 98.00 83.00 2219. 16.5 74 2 "audi fox" +26.0 4 79.00 67.00 1963. 15.5 74 2 "volkswagen dasher" +26.0 4 97.00 78.00 2300. 14.5 74 2 "opel manta" +31.0 4 76.00 52.00 1649. 16.5 74 3 "toyota corona" +32.0 4 83.00 61.00 2003. 19.0 74 3 "datsun 710" +28.0 4 90.00 75.00 2125. 14.5 74 1 "dodge colt" +24.0 4 90.00 75.00 2108. 15.5 74 2 "fiat 128" +26.0 4 116.0 75.00 2246. 14.0 74 2 "fiat 124 tc" +24.0 4 120.0 97.00 2489. 15.0 74 3 "honda civic" +26.0 4 108.0 93.00 2391. 15.5 74 3 "subaru" +31.0 4 79.00 67.00 2000. 16.0 74 2 "fiat x1.9" +19.0 6 225.0 95.00 3264. 16.0 75 1 "plymouth valiant custom" +18.0 6 250.0 105.0 3459. 16.0 75 1 "chevrolet nova" +15.0 6 250.0 72.00 3432. 21.0 75 1 "mercury monarch" +15.0 6 250.0 72.00 3158. 19.5 75 1 "ford maverick" +16.0 8 400.0 170.0 4668. 11.5 75 1 "pontiac catalina" +15.0 8 350.0 145.0 4440. 14.0 75 1 "chevrolet bel air" +16.0 8 318.0 150.0 4498. 14.5 75 1 "plymouth grand fury" +14.0 8 351.0 148.0 4657. 13.5 75 1 "ford ltd" +17.0 6 231.0 110.0 3907. 
21.0 75 1 "buick century" +16.0 6 250.0 105.0 3897. 18.5 75 1 "chevroelt chevelle malibu" +15.0 6 258.0 110.0 3730. 19.0 75 1 "amc matador" +18.0 6 225.0 95.00 3785. 19.0 75 1 "plymouth fury" +21.0 6 231.0 110.0 3039. 15.0 75 1 "buick skyhawk" +20.0 8 262.0 110.0 3221. 13.5 75 1 "chevrolet monza 2+2" +13.0 8 302.0 129.0 3169. 12.0 75 1 "ford mustang ii" +29.0 4 97.00 75.00 2171. 16.0 75 3 "toyota corolla" +23.0 4 140.0 83.00 2639. 17.0 75 1 "ford pinto" +20.0 6 232.0 100.0 2914. 16.0 75 1 "amc gremlin" +23.0 4 140.0 78.00 2592. 18.5 75 1 "pontiac astro" +24.0 4 134.0 96.00 2702. 13.5 75 3 "toyota corona" +25.0 4 90.00 71.00 2223. 16.5 75 2 "volkswagen dasher" +24.0 4 119.0 97.00 2545. 17.0 75 3 "datsun 710" +18.0 6 171.0 97.00 2984. 14.5 75 1 "ford pinto" +29.0 4 90.00 70.00 1937. 14.0 75 2 "volkswagen rabbit" +19.0 6 232.0 90.00 3211. 17.0 75 1 "amc pacer" +23.0 4 115.0 95.00 2694. 15.0 75 2 "audi 100ls" +23.0 4 120.0 88.00 2957. 17.0 75 2 "peugeot 504" +22.0 4 121.0 98.00 2945. 14.5 75 2 "volvo 244dl" +25.0 4 121.0 115.0 2671. 13.5 75 2 "saab 99le" +33.0 4 91.00 53.00 1795. 17.5 75 3 "honda civic cvcc" +28.0 4 107.0 86.00 2464. 15.5 76 2 "fiat 131" +25.0 4 116.0 81.00 2220. 16.9 76 2 "opel 1900" +25.0 4 140.0 92.00 2572. 14.9 76 1 "capri ii" +26.0 4 98.00 79.00 2255. 17.7 76 1 "dodge colt" +27.0 4 101.0 83.00 2202. 15.3 76 2 "renault 12tl" +17.5 8 305.0 140.0 4215. 13.0 76 1 "chevrolet chevelle malibu classic" +16.0 8 318.0 150.0 4190. 13.0 76 1 "dodge coronet brougham" +15.5 8 304.0 120.0 3962. 13.9 76 1 "amc matador" +14.5 8 351.0 152.0 4215. 12.8 76 1 "ford gran torino" +22.0 6 225.0 100.0 3233. 15.4 76 1 "plymouth valiant" +22.0 6 250.0 105.0 3353. 14.5 76 1 "chevrolet nova" +24.0 6 200.0 81.00 3012. 17.6 76 1 "ford maverick" +22.5 6 232.0 90.00 3085. 17.6 76 1 "amc hornet" +29.0 4 85.00 52.00 2035. 22.2 76 1 "chevrolet chevette" +24.5 4 98.00 60.00 2164. 22.1 76 1 "chevrolet woody" +29.0 4 90.00 70.00 1937. 14.2 76 2 "vw rabbit" +33.0 4 91.00 53.00 1795. 17.4 76 3 "honda civic" +20.0 6 225.0 100.0 3651. 17.7 76 1 "dodge aspen se" +18.0 6 250.0 78.00 3574. 21.0 76 1 "ford granada ghia" +18.5 6 250.0 110.0 3645. 16.2 76 1 "pontiac ventura sj" +17.5 6 258.0 95.00 3193. 17.8 76 1 "amc pacer d/l" +29.5 4 97.00 71.00 1825. 12.2 76 2 "volkswagen rabbit" +32.0 4 85.00 70.00 1990. 17.0 76 3 "datsun b-210" +28.0 4 97.00 75.00 2155. 16.4 76 3 "toyota corolla" +26.5 4 140.0 72.00 2565. 13.6 76 1 "ford pinto" +20.0 4 130.0 102.0 3150. 15.7 76 2 "volvo 245" +13.0 8 318.0 150.0 3940. 13.2 76 1 "plymouth volare premier v8" +19.0 4 120.0 88.00 3270. 21.9 76 2 "peugeot 504" +19.0 6 156.0 108.0 2930. 15.5 76 3 "toyota mark ii" +16.5 6 168.0 120.0 3820. 16.7 76 2 "mercedes-benz 280s" +16.5 8 350.0 180.0 4380. 12.1 76 1 "cadillac seville" +13.0 8 350.0 145.0 4055. 12.0 76 1 "chevy c10" +13.0 8 302.0 130.0 3870. 15.0 76 1 "ford f108" +13.0 8 318.0 150.0 3755. 14.0 76 1 "dodge d100" +31.5 4 98.00 68.00 2045. 18.5 77 3 "honda accord cvcc" +30.0 4 111.0 80.00 2155. 14.8 77 1 "buick opel isuzu deluxe" +36.0 4 79.00 58.00 1825. 18.6 77 2 "renault 5 gtl" +25.5 4 122.0 96.00 2300. 15.5 77 1 "plymouth arrow gs" +33.5 4 85.00 70.00 1945. 16.8 77 3 "datsun f-10 hatchback" +17.5 8 305.0 145.0 3880. 12.5 77 1 "chevrolet caprice classic" +17.0 8 260.0 110.0 4060. 19.0 77 1 "oldsmobile cutlass supreme" +15.5 8 318.0 145.0 4140. 13.7 77 1 "dodge monaco brougham" +15.0 8 302.0 130.0 4295. 14.9 77 1 "mercury cougar brougham" +17.5 6 250.0 110.0 3520. 16.4 77 1 "chevrolet concours" +20.5 6 231.0 105.0 3425. 
16.9 77 1 "buick skylark" +19.0 6 225.0 100.0 3630. 17.7 77 1 "plymouth volare custom" +18.5 6 250.0 98.00 3525. 19.0 77 1 "ford granada" +16.0 8 400.0 180.0 4220. 11.1 77 1 "pontiac grand prix lj" +15.5 8 350.0 170.0 4165. 11.4 77 1 "chevrolet monte carlo landau" +15.5 8 400.0 190.0 4325. 12.2 77 1 "chrysler cordoba" +16.0 8 351.0 149.0 4335. 14.5 77 1 "ford thunderbird" +29.0 4 97.00 78.00 1940. 14.5 77 2 "volkswagen rabbit custom" +24.5 4 151.0 88.00 2740. 16.0 77 1 "pontiac sunbird coupe" +26.0 4 97.00 75.00 2265. 18.2 77 3 "toyota corolla liftback" +25.5 4 140.0 89.00 2755. 15.8 77 1 "ford mustang ii 2+2" +30.5 4 98.00 63.00 2051. 17.0 77 1 "chevrolet chevette" +33.5 4 98.00 83.00 2075. 15.9 77 1 "dodge colt m/m" +30.0 4 97.00 67.00 1985. 16.4 77 3 "subaru dl" +30.5 4 97.00 78.00 2190. 14.1 77 2 "volkswagen dasher" +22.0 6 146.0 97.00 2815. 14.5 77 3 "datsun 810" +21.5 4 121.0 110.0 2600. 12.8 77 2 "bmw 320i" +21.5 3 80.00 110.0 2720. 13.5 77 3 "mazda rx-4" +43.1 4 90.00 48.00 1985. 21.5 78 2 "volkswagen rabbit custom diesel" +36.1 4 98.00 66.00 1800. 14.4 78 1 "ford fiesta" +32.8 4 78.00 52.00 1985. 19.4 78 3 "mazda glc deluxe" +39.4 4 85.00 70.00 2070. 18.6 78 3 "datsun b210 gx" +36.1 4 91.00 60.00 1800. 16.4 78 3 "honda civic cvcc" +19.9 8 260.0 110.0 3365. 15.5 78 1 "oldsmobile cutlass salon brougham" +19.4 8 318.0 140.0 3735. 13.2 78 1 "dodge diplomat" +20.2 8 302.0 139.0 3570. 12.8 78 1 "mercury monarch ghia" +19.2 6 231.0 105.0 3535. 19.2 78 1 "pontiac phoenix lj" +20.5 6 200.0 95.00 3155. 18.2 78 1 "chevrolet malibu" +20.2 6 200.0 85.00 2965. 15.8 78 1 "ford fairmont (auto)" +25.1 4 140.0 88.00 2720. 15.4 78 1 "ford fairmont (man)" +20.5 6 225.0 100.0 3430. 17.2 78 1 "plymouth volare" +19.4 6 232.0 90.00 3210. 17.2 78 1 "amc concord" +20.6 6 231.0 105.0 3380. 15.8 78 1 "buick century special" +20.8 6 200.0 85.00 3070. 16.7 78 1 "mercury zephyr" +18.6 6 225.0 110.0 3620. 18.7 78 1 "dodge aspen" +18.1 6 258.0 120.0 3410. 15.1 78 1 "amc concord d/l" +19.2 8 305.0 145.0 3425. 13.2 78 1 "chevrolet monte carlo landau" +17.7 6 231.0 165.0 3445. 13.4 78 1 "buick regal sport coupe (turbo)" +18.1 8 302.0 139.0 3205. 11.2 78 1 "ford futura" +17.5 8 318.0 140.0 4080. 13.7 78 1 "dodge magnum xe" +30.0 4 98.00 68.00 2155. 16.5 78 1 "chevrolet chevette" +27.5 4 134.0 95.00 2560. 14.2 78 3 "toyota corona" +27.2 4 119.0 97.00 2300. 14.7 78 3 "datsun 510" +30.9 4 105.0 75.00 2230. 14.5 78 1 "dodge omni" +21.1 4 134.0 95.00 2515. 14.8 78 3 "toyota celica gt liftback" +23.2 4 156.0 105.0 2745. 16.7 78 1 "plymouth sapporo" +23.8 4 151.0 85.00 2855. 17.6 78 1 "oldsmobile starfire sx" +23.9 4 119.0 97.00 2405. 14.9 78 3 "datsun 200-sx" +20.3 5 131.0 103.0 2830. 15.9 78 2 "audi 5000" +17.0 6 163.0 125.0 3140. 13.6 78 2 "volvo 264gl" +21.6 4 121.0 115.0 2795. 15.7 78 2 "saab 99gle" +16.2 6 163.0 133.0 3410. 15.8 78 2 "peugeot 604sl" +31.5 4 89.00 71.00 1990. 14.9 78 2 "volkswagen scirocco" +29.5 4 98.00 68.00 2135. 16.6 78 3 "honda accord lx" +21.5 6 231.0 115.0 3245. 15.4 79 1 "pontiac lemans v6" +19.8 6 200.0 85.00 2990. 18.2 79 1 "mercury zephyr 6" +22.3 4 140.0 88.00 2890. 17.3 79 1 "ford fairmont 4" +20.2 6 232.0 90.00 3265. 18.2 79 1 "amc concord dl 6" +20.6 6 225.0 110.0 3360. 16.6 79 1 "dodge aspen 6" +17.0 8 305.0 130.0 3840. 15.4 79 1 "chevrolet caprice classic" +17.6 8 302.0 129.0 3725. 13.4 79 1 "ford ltd landau" +16.5 8 351.0 138.0 3955. 13.2 79 1 "mercury grand marquis" +18.2 8 318.0 135.0 3830. 15.2 79 1 "dodge st. regis" +16.9 8 350.0 155.0 4360. 
14.9 79 1 "buick estate wagon (sw)" +15.5 8 351.0 142.0 4054. 14.3 79 1 "ford country squire (sw)" +19.2 8 267.0 125.0 3605. 15.0 79 1 "chevrolet malibu classic (sw)" +18.5 8 360.0 150.0 3940. 13.0 79 1 "chrysler lebaron town @ country (sw)" +31.9 4 89.00 71.00 1925. 14.0 79 2 "vw rabbit custom" +34.1 4 86.00 65.00 1975. 15.2 79 3 "maxda glc deluxe" +35.7 4 98.00 80.00 1915. 14.4 79 1 "dodge colt hatchback custom" +27.4 4 121.0 80.00 2670. 15.0 79 1 "amc spirit dl" +25.4 5 183.0 77.00 3530. 20.1 79 2 "mercedes benz 300d" +23.0 8 350.0 125.0 3900. 17.4 79 1 "cadillac eldorado" +27.2 4 141.0 71.00 3190. 24.8 79 2 "peugeot 504" +23.9 8 260.0 90.00 3420. 22.2 79 1 "oldsmobile cutlass salon brougham" +34.2 4 105.0 70.00 2200. 13.2 79 1 "plymouth horizon" +34.5 4 105.0 70.00 2150. 14.9 79 1 "plymouth horizon tc3" +31.8 4 85.00 65.00 2020. 19.2 79 3 "datsun 210" +37.3 4 91.00 69.00 2130. 14.7 79 2 "fiat strada custom" +28.4 4 151.0 90.00 2670. 16.0 79 1 "buick skylark limited" +28.8 6 173.0 115.0 2595. 11.3 79 1 "chevrolet citation" +26.8 6 173.0 115.0 2700. 12.9 79 1 "oldsmobile omega brougham" +33.5 4 151.0 90.00 2556. 13.2 79 1 "pontiac phoenix" +41.5 4 98.00 76.00 2144. 14.7 80 2 "vw rabbit" +38.1 4 89.00 60.00 1968. 18.8 80 3 "toyota corolla tercel" +32.1 4 98.00 70.00 2120. 15.5 80 1 "chevrolet chevette" +37.2 4 86.00 65.00 2019. 16.4 80 3 "datsun 310" +28.0 4 151.0 90.00 2678. 16.5 80 1 "chevrolet citation" +26.4 4 140.0 88.00 2870. 18.1 80 1 "ford fairmont" +24.3 4 151.0 90.00 3003. 20.1 80 1 "amc concord" +19.1 6 225.0 90.00 3381. 18.7 80 1 "dodge aspen" +34.3 4 97.00 78.00 2188. 15.8 80 2 "audi 4000" +29.8 4 134.0 90.00 2711. 15.5 80 3 "toyota corona liftback" +31.3 4 120.0 75.00 2542. 17.5 80 3 "mazda 626" +37.0 4 119.0 92.00 2434. 15.0 80 3 "datsun 510 hatchback" +32.2 4 108.0 75.00 2265. 15.2 80 3 "toyota corolla" +46.6 4 86.00 65.00 2110. 17.9 80 3 "mazda glc" +27.9 4 156.0 105.0 2800. 14.4 80 1 "dodge colt" +40.8 4 85.00 65.00 2110. 19.2 80 3 "datsun 210" +44.3 4 90.00 48.00 2085. 21.7 80 2 "vw rabbit c (diesel)" +43.4 4 90.00 48.00 2335. 23.7 80 2 "vw dasher (diesel)" +36.4 5 121.0 67.00 2950. 19.9 80 2 "audi 5000s (diesel)" +30.0 4 146.0 67.00 3250. 21.8 80 2 "mercedes-benz 240d" +44.6 4 91.00 67.00 1850. 13.8 80 3 "honda civic 1500 gl" +40.9 4 85.00 ? 1835. 17.3 80 2 "renault lecar deluxe" +33.8 4 97.00 67.00 2145. 18.0 80 3 "subaru dl" +29.8 4 89.00 62.00 1845. 15.3 80 2 "vokswagen rabbit" +32.7 6 168.0 132.0 2910. 11.4 80 3 "datsun 280-zx" +23.7 3 70.00 100.0 2420. 12.5 80 3 "mazda rx-7 gs" +35.0 4 122.0 88.00 2500. 15.1 80 2 "triumph tr7 coupe" +23.6 4 140.0 ? 2905. 14.3 80 1 "ford mustang cobra" +32.4 4 107.0 72.00 2290. 17.0 80 3 "honda accord" +27.2 4 135.0 84.00 2490. 15.7 81 1 "plymouth reliant" +26.6 4 151.0 84.00 2635. 16.4 81 1 "buick skylark" +25.8 4 156.0 92.00 2620. 14.4 81 1 "dodge aries wagon (sw)" +23.5 6 173.0 110.0 2725. 12.6 81 1 "chevrolet citation" +30.0 4 135.0 84.00 2385. 12.9 81 1 "plymouth reliant" +39.1 4 79.00 58.00 1755. 16.9 81 3 "toyota starlet" +39.0 4 86.00 64.00 1875. 16.4 81 1 "plymouth champ" +35.1 4 81.00 60.00 1760. 16.1 81 3 "honda civic 1300" +32.3 4 97.00 67.00 2065. 17.8 81 3 "subaru" +37.0 4 85.00 65.00 1975. 19.4 81 3 "datsun 210 mpg" +37.7 4 89.00 62.00 2050. 17.3 81 3 "toyota tercel" +34.1 4 91.00 68.00 1985. 16.0 81 3 "mazda glc 4" +34.7 4 105.0 63.00 2215. 14.9 81 1 "plymouth horizon 4" +34.4 4 98.00 65.00 2045. 16.2 81 1 "ford escort 4w" +29.9 4 98.00 65.00 2380. 20.7 81 1 "ford escort 2h" +33.0 4 105.0 74.00 2190. 
14.2 81 2 "volkswagen jetta" +34.5 4 100.0 ? 2320. 15.8 81 2 "renault 18i" +33.7 4 107.0 75.00 2210. 14.4 81 3 "honda prelude" +32.4 4 108.0 75.00 2350. 16.8 81 3 "toyota corolla" +32.9 4 119.0 100.0 2615. 14.8 81 3 "datsun 200sx" +31.6 4 120.0 74.00 2635. 18.3 81 3 "mazda 626" +28.1 4 141.0 80.00 3230. 20.4 81 2 "peugeot 505s turbo diesel" +30.7 6 145.0 76.00 3160. 19.6 81 2 "volvo diesel" +25.4 6 168.0 116.0 2900. 12.6 81 3 "toyota cressida" +24.2 6 146.0 120.0 2930. 13.8 81 3 "datsun 810 maxima" +22.4 6 231.0 110.0 3415. 15.8 81 1 "buick century" +26.6 8 350.0 105.0 3725. 19.0 81 1 "oldsmobile cutlass ls" +20.2 6 200.0 88.00 3060. 17.1 81 1 "ford granada gl" +17.6 6 225.0 85.00 3465. 16.6 81 1 "chrysler lebaron salon" +28.0 4 112.0 88.00 2605. 19.6 82 1 "chevrolet cavalier" +27.0 4 112.0 88.00 2640. 18.6 82 1 "chevrolet cavalier wagon" +34.0 4 112.0 88.00 2395. 18.0 82 1 "chevrolet cavalier 2-door" +31.0 4 112.0 85.00 2575. 16.2 82 1 "pontiac j2000 se hatchback" +29.0 4 135.0 84.00 2525. 16.0 82 1 "dodge aries se" +27.0 4 151.0 90.00 2735. 18.0 82 1 "pontiac phoenix" +24.0 4 140.0 92.00 2865. 16.4 82 1 "ford fairmont futura" +36.0 4 105.0 74.00 1980. 15.3 82 2 "volkswagen rabbit l" +37.0 4 91.00 68.00 2025. 18.2 82 3 "mazda glc custom l" +31.0 4 91.00 68.00 1970. 17.6 82 3 "mazda glc custom" +38.0 4 105.0 63.00 2125. 14.7 82 1 "plymouth horizon miser" +36.0 4 98.00 70.00 2125. 17.3 82 1 "mercury lynx l" +36.0 4 120.0 88.00 2160. 14.5 82 3 "nissan stanza xe" +36.0 4 107.0 75.00 2205. 14.5 82 3 "honda accord" +34.0 4 108.0 70.00 2245 16.9 82 3 "toyota corolla" +38.0 4 91.00 67.00 1965. 15.0 82 3 "honda civic" +32.0 4 91.00 67.00 1965. 15.7 82 3 "honda civic (auto)" +38.0 4 91.00 67.00 1995. 16.2 82 3 "datsun 310 gx" +25.0 6 181.0 110.0 2945. 16.4 82 1 "buick century limited" +38.0 6 262.0 85.00 3015. 17.0 82 1 "oldsmobile cutlass ciera (diesel)" +26.0 4 156.0 92.00 2585. 14.5 82 1 "chrysler lebaron medallion" +22.0 6 232.0 112.0 2835 14.7 82 1 "ford granada l" +32.0 4 144.0 96.00 2665. 13.9 82 3 "toyota celica gt" +36.0 4 135.0 84.00 2370. 13.0 82 1 "dodge charger 2.2" +27.0 4 151.0 90.00 2950. 17.3 82 1 "chevrolet camaro" +27.0 4 140.0 86.00 2790. 15.6 82 1 "ford mustang gl" +44.0 4 97.00 52.00 2130. 24.6 82 2 "vw pickup" +32.0 4 135.0 84.00 2295. 11.6 82 1 "dodge rampage" +28.0 4 120.0 79.00 2625. 18.6 82 1 "ford ranger" +31.0 4 119.0 82.00 2720. 
19.4 82 1 "chevy s-10" diff --git a/ISLP_data/Ch12Ex13.csv b/ISLP_data/Ch12Ex13.csv new file mode 100644 index 0000000..4da47c6 --- /dev/null +++ b/ISLP_data/Ch12Ex13.csv @@ -0,0 +1,1000 @@ +-0.9619334,0.4418028,-0.9750051,1.417504,0.8188148,0.3162937,-0.02496682,-0.063966,0.03149702,-0.3503106,-0.7227299,-0.2819547,1.337515,0.7019798,1.007616,-0.4653828,0.6385951,0.2867807,-0.2270782,-0.2200452,-1.242573,-0.1085056,-1.864262,-0.5005122,-1.325008,1.063411,-0.2963712,-0.1216457,0.08516605,0.6241764,-0.5095915,-0.2167255,-0.05550597,-0.4844491,-0.5215811,1.949135,1.324335,0.4681471,1.0611,1.65597 +-0.2925257,-1.139267,0.195837,-1.281121,-0.2514393,2.511997,-0.9222062,0.05954277,-1.409645,-0.6567122,-0.1157652,0.8259783,0.3464496,-0.5695486,-0.1315365,0.690229,-0.9090382,1.302642,-1.672695,-0.525504,0.79797,-0.689793,0.8995305,0.4285812,-0.6761141,-0.5340949,-1.732507,-1.603447,-1.08362,0.03342185,1.700708,0.007289556,0.09906234,0.5638533,-0.2572752,-0.5817805,-0.1698871,-0.5423036,0.3129389,-1.284377 +0.2587882,-0.9728448,0.5884858,-0.8002581,-1.820398,-2.058924,-0.06476437,1.592124,-0.173117,-0.1210874,-0.187579,-1.500163,-1.228737,0.855989,1.249855,-0.8980815,0.8702058,-0.2252529,0.4502892,0.5514404,0.1462943,0.12974,1.304229,-1.661908,-1.630376,-0.07742528,1.306182,0.7926002,1.559465,-0.6885116,-0.615472,0.009999363,0.94581,-0.3185212,-0.1178895,0.6213662,-0.07076396,0.4016818,-0.01622713,-0.5265532 +-1.152132,-2.213168,-0.8615249,0.6309253,0.9517719,-1.165724,-0.3915586,1.063619,-0.350009,-1.489058,-0.2432189,-0.433034,-0.03879128,-0.05789677,-1.397762,-0.1561871,-2.735982,0.7756169,0.6141562,2.019194,1.081139,-1.076618,-0.2434181,0.5134822,-0.5128578,2.551676,-2.314301,-1.27647,-1.229271,1.434396,-0.2842774,0.1989456,-0.0918332,0.3496279,-0.2989097,1.513696,0.6711847,0.0108553,-1.043689,1.625275 +0.1957828,0.5933059,0.2829921,0.2471472,1.978668,-0.871018,-0.989715,-1.032253,-1.109654,-0.3851423,1.650957,-1.744909,-0.3788853,-0.6798261,-2.131584,-0.2301718,0.4661243,-1.800449,0.6262904,-0.09772305,-0.2997108,-0.5295591,-2.023567,-0.5108402,0.04600274,1.26803,-0.7439868,0.2231319,0.8584628,0.2747261,-0.6929984,-0.8457072,-0.1774968,-0.1664908,1.483155,-1.687946,-0.1414296,0.2007785,-0.6759421,2.220611 +0.03012394,-0.6910143,-0.4034258,-0.729859,-0.3640986,1.125349,-1.404041,-0.8061304,-1.237924,0.5776018,-0.2720642,2.176562,1.436407,-1.025781,0.2981582,-0.5559659,0.2046529,-1.191648,0.2350916,0.6709647,0.1307988,1.068994,1.230987,1.134469,0.556368,-0.3587664,1.079865,-0.2064905,-0.00616453,0.1642547,1.156737,0.2417745,0.08863952,0.182954,0.9426771,-0.2096004,0.5362621,-1.185226,-0.4227476,0.6243603 +0.08541773,-1.113054,-0.6779688,-0.562929,0.9381944,0.1188091,-2.192225,0.6850726,0.2623043,-1.229459,-0.4883662,-0.7410539,0.2535037,-0.7490539,0.8542319,0.3547439,2.651606,-0.3035108,-1.686913,-0.1424553,-1.155041,-1.663616,0.4012225,-0.4246822,1.375743,-0.7549781,-0.0913072,0.07828002,0.9699861,1.052131,-1.706679,-0.2728831,-1.767506,0.4122611,0.7079067,1.046001,-0.2757717,-0.1802863,0.3356578,-0.4892649 +1.11661,1.3417,0.1032784,0.3909632,-1.927491,0.4516918,-1.34507,0.6253362,0.8163053,-0.3580809,-0.3046185,0.985715,-0.7731103,2.017286,-0.6052647,0.5203902,0.2669838,0.3189837,-0.3471575,0.07221104,-0.02392177,0.6408948,0.1347361,-1.447761,-0.6551162,-1.368402,1.281157,0.3999539,1.766725,0.1088888,0.7425302,0.9062667,-0.6284529,-0.5508992,-0.7105038,0.7347727,0.7289549,-0.02231221,0.9659412,-1.2982 
+-1.218857,-1.277279,-0.5589246,-1.344493,1.159115,-1.501044,-0.5541203,0.6914985,-0.8816731,0.4541205,0.195527,1.378114,-0.3705605,0.8048829,0.7401933,0.07093143,0.1225537,0.01202907,-0.6259848,-0.411758,-0.4795317,2.071019,2.147213,-0.2714811,0.5421695,0.5835736,-0.7127743,-1.05329,-0.792966,-1.219865,-1.20116,-0.190726,-0.4345159,-1.365349,0.1386375,0.9568554,-1.052269,-0.4186067,-1.134572,-0.3337191 +1.267369,-0.918349,-1.2535,-1.067114,-0.2406378,1.163889,0.5650894,0.9493293,2.372266,1.706587,1.576088,0.9160682,0.2642956,0.1407703,-1.672722,1.568677,-0.3904991,0.2262634,1.324377,-0.5026163,-0.1757905,-1.069898,-0.4649283,0.2792395,-0.3631731,-0.7092818,-1.398798,0.1238113,0.3403061,-0.1447735,-0.7926979,-0.2496071,-0.6709566,-1.050847,0.06428231,1.413005,0.2904095,0.519561,-0.9492208,0.7225716 +-0.7447816,0.7960816,0.5785438,-1.128211,-1.070943,-0.4067883,1.423884,0.5825686,0.4386859,-1.481767,-0.2775346,1.367359,-0.2825588,1.096762,0.6452979,0.610196,-0.07635609,0.6070032,-0.3567831,-0.965128,3.810641,3.524993,2.198272,2.346375,3.135629,1.574138,1.473596,1.680235,3.015593,2.280847,0.6812332,0.1469941,2.572552,3.430821,2.278654,1.649749,1.685545,1.696202,1.898409,2.156966 +-1.131219,0.5335755,1.486437,0.4258659,0.088809,0.7972244,0.6782569,-0.4321131,1.300891,0.3502131,-1.936635,0.4153989,0.6072336,-1.091904,-0.2287216,0.4223089,1.493098,-0.2895715,-1.216269,-3.191302,2.130251,2.974344,3.326521,1.482598,2.59436,5.0694,3.505138,2.65494,1.138081,1.499542,1.362987,1.141709,0.9923764,3.076109,0.351894,1.581641,1.741696,2.735668,2.424388,3.630031 +-0.7163585,1.22884,0.7424597,0.6872972,-1.126001,-1.055018,1.081147,-1.023514,0.4376163,1.068868,-0.3311311,0.3931545,1.328074,1.166919,0.5311782,-0.5915161,-0.1043765,1.169652,0.06482427,-1.453793,1.497115,3.484277,0.110146,3.193833,0.893435,2.374892,2.208073,1.625861,2.434435,3.404111,3.853985,2.16526,2.773377,3.374466,1.25642,2.37575,0.900051,1.594337,1.829814,2.607872 +0.2526524,-0.806926,-0.02958559,1.116365,-0.9054184,0.2043188,-0.3768665,1.645244,0.1928211,2.107748,1.95071,-0.8103248,0.2613593,0.07657565,1.540524,-1.561735,0.1143961,0.5960547,-2.523109,1.566453,0.6759015,2.075784,2.200879,1.493281,3.137509,3.689008,1.147235,2.835638,3.226493,2.007422,2.40828,2.373131,1.927478,1.862904,2.322372,2.213668,0.7071982,1.399347,-0.1397285,3.401425 +0.1520457,1.353509,-1.406995,0.03240639,-1.628687,-1.131968,-1.341698,-0.1112326,0.4916261,-0.6998577,0.1952951,-0.2096145,0.3148721,0.1035945,-0.644774,-1.359984,0.5773159,-0.0964901,-1.413189,0.7441464,0.5435144,3.227171,0.3801661,2.170058,4.007507,1.926569,-0.5611607,3.203452,1.847779,0.9568465,2.549661,1.901848,3.115676,1.470712,1.558983,2.505967,1.706376,1.597566,1.328015,1.096132 +-0.3076564,-1.764702,1.09164,0.1803118,-0.4416847,-0.1568716,-0.7669616,-1.857714,-0.6414762,0.1448449,1.062511,-0.3403476,-0.03580902,0.04526966,-0.7168134,1.257269,0.8488496,0.0486066,0.751352,-0.2413405,2.66613,3.585697,1.019379,0.3776509,2.960444,0.4697732,0.7920091,3.127578,2.78033,3.346864,1.413924,1.114353,2.911165,1.48487,1.428044,-0.02193937,2.513132,2.013316,1.875078,1.89575 +-0.9530173,-0.005029033,-1.038571,-1.83799,1.56925,-0.00916165,0.2791491,1.174014,-1.465443,0.938107,0.8014606,0.7945867,0.394829,0.2360855,1.401715,-0.1414771,-0.4298357,-0.7095744,-1.913986,-0.7629428,3.163928,3.012489,0.8187804,2.694877,1.071782,-0.05599793,0.5218649,2.194555,3.051663,1.580901,0.3346845,1.268219,2.015522,0.9922643,-0.4301881,0.816671,2.544959,3.307193,3.935368,1.809976 
+-0.6482428,0.4271488,-1.342906,0.9419534,-0.1211663,0.962023,1.095579,-0.3873946,-0.3738028,-1.3261,2.570899,-1.205545,0.1051308,1.73155,0.8655443,0.07296553,-0.5456131,-0.09742298,0.3092214,-1.639621,0.8885266,1.049442,2.502193,-0.02595528,1.606582,1.697671,2.492714,3.175114,1.80984,2.08991,2.672099,2.781323,2.159233,2.452704,2.552366,0.5727647,3.044855,1.964097,1.376095,1.708566 +1.224314,0.2104399,-0.7959764,1.12481,1.155308,-0.846015,0.889522,-0.09384116,-0.7114847,-0.7908875,1.145258,-1.345425,1.091153,-0.2316139,-0.6736463,0.8613082,1.779914,1.891259,-0.893873,-0.991037,2.126653,2.354143,-0.3201383,1.957518,2.010197,1.645403,3.147666,0.252028,2.316782,1.548019,1.834547,1.710323,4.145461,2.707454,0.7011277,1.426131,3.529138,0.6580514,1.339901,1.84674 +0.1998116,-1.666185,2.124443,-0.300517,0.5888714,-0.1146617,-0.08151489,0.4807244,-0.636062,0.4313116,1.962565,0.4817488,-1.244769,-0.9433685,0.1064052,-0.4132172,-1.597358,-1.281836,2.013157,0.6571174,2.204357,-0.335341,2.1306,2.609903,1.814043,3.10422,1.524013,2.742184,1.671687,2.947503,0.1756477,2.543901,1.39063,1.172432,0.3871181,1.957585,3.449103,3.714091,2.694902,1.81435 +-0.5784837,1.184845,-0.5499333,-0.003648889,1.001318,2.105999,1.122142,-1.411689,-2.037054,1.615409,1.492406,-0.5869178,-0.4324824,0.1284159,-1.006053,0.7989348,1.176487,-0.80286,-0.6546137,-0.2530673,0.893736,0.8492042,0.4332806,-0.2992884,0.4499105,0.01985191,0.02262036,0.8408059,-0.6909805,-0.7355618,0.2124125,0.2874404,-0.2789698,0.7343264,0.4303332,-0.7632957,-1.010868,1.518754,-1.186824,-0.01631223 +-0.9423007,0.9942695,1.994798,0.3709738,-0.7142866,0.8372602,0.3510056,-1.019296,-0.5739876,-0.2144674,1.559133,-1.473706,-0.7835962,-0.7459324,0.7387083,-0.4153104,0.2528742,-0.4105197,1.489082,-0.3738095,2.015995,0.8580653,1.951216,-1.130419,-0.7597812,0.1356801,-0.2724263,-0.4438066,1.135211,0.9315272,-0.9014917,-0.6419308,0.9368316,-1.368114,0.08182648,-2.771127,-0.4898521,0.2147905,-1.295492,0.07179837 +-0.2037282,0.926747,-0.3763795,-0.7325388,1.88669,-0.7453392,0.1748927,0.1355237,0.1928391,2.372396,-0.5657255,1.227259,0.3252453,-2.253822,1.426236,-1.523174,0.8160235,-1.34616,-1.07883,-0.1558619,0.37109,-0.8056755,-1.224916,-0.5790335,1.499167,2.039603,-0.7682762,1.031504,0.3115557,0.5440675,0.8483886,1.216646,-0.2635194,0.5044073,0.6304813,0.1336823,0.3619509,-1.203108,-0.9893377,-1.400979 +-1.666475,-0.2646699,1.575234,-1.397106,0.3968301,0.06545948,0.2826129,3.138083,1.545819,0.08137703,0.1181462,-0.7672708,1.599108,0.5790222,0.2626949,2.809967,-0.7310548,0.6071439,0.3009599,-0.00773749,-1.959176,-0.1672496,-0.9086798,-1.500797,-0.3587943,1.105709,0.3167045,0.418486,0.3992921,0.74118,1.776504,1.068158,0.7812127,1.541803,-0.6261572,0.1620932,0.365301,0.1309488,-1.289759,0.187917 +-0.4844551,-0.2302489,-0.02104012,0.06435464,-0.590937,-0.5709284,-0.3183618,0.7874643,0.2599389,-1.852654,0.3517153,1.493059,0.300837,0.2260382,-2.533009,-2.305811,-0.9251271,-1.239023,0.05620241,-0.4249067,-1.393251,-0.5430338,0.7696922,-0.6455177,2.147829,-0.3728735,0.1923777,-0.1964858,-2.243007,1.155213,-0.3280407,0.4313631,-1.398011,-1.631817,-0.05313933,0.8916985,0.2380238,-0.0901644,1.055049,1.480796 
+-0.7410727,-0.5044715,-2.155544,0.3962633,-1.654507,-0.2634981,-0.5684507,0.9634401,0.02987674,0.2967988,-1.363987,-1.27491,0.8907722,0.7092061,0.6387705,-0.08641581,0.6022891,0.1311318,1.222827,0.1130396,0.7751232,1.711086,0.4861718,0.3261966,0.74843,1.057766,2.002599,0.2699995,-1.254299,-1.015607,-0.89084,1.209258,-0.4066165,-0.1074371,1.315812,-0.02192485,-0.5481005,-0.6357574,-1.307048,-0.6921273 +1.160616,-1.990839,1.379523,-0.6098107,0.4353578,-0.3231758,-0.9575343,0.2655117,0.6818054,-0.2428316,0.695902,-0.9374275,0.6549937,1.117971,0.2592632,0.5322193,0.727165,-0.1039562,-0.4476738,0.05094866,-0.4790969,-1.260408,0.83056,0.1529408,-1.349746,2.202841,-1.196626,-0.6172453,1.488675,-0.6365147,1.054153,0.7072436,0.2445818,0.09493836,-0.01611507,1.121116,0.150795,0.131323,-1.58438,0.2860781 +1.012067,0.7439787,-0.129313,0.4066957,0.108621,-0.2648087,-1.220276,-0.933222,1.309131,1.362685,-0.9651866,-0.07952381,-0.3792578,0.8657286,-0.3687509,1.130287,-1.560244,-0.3346921,0.1061331,-0.5764204,-2.208407,0.2749471,-1.095645,-0.6726534,0.1515074,1.561277,-1.841001,-0.8810678,0.4238655,0.9031426,1.417369,-0.3746408,-0.3416505,-0.6057527,-0.6863806,-1.161231,-0.728425,2.041208,-0.6444013,0.3943438 +-0.07207847,-0.5568699,2.250049,-0.45575,-0.0678266,-0.9535903,-1.439945,-0.6861152,-0.9531928,1.706177,-0.4669951,1.352063,0.1638103,-0.8133175,-0.1244347,0.4818082,0.5853373,0.4350472,-0.2540168,-0.5543044,-0.4348156,-1.074268,0.09065793,-0.09701916,0.8205806,1.446513,0.05444575,1.11723,0.3267211,1.44531,0.08988833,-1.391166,-0.1296247,1.871138,-0.4897032,1.635559,0.613229,-0.6959329,-1.051389,0.9643427 +-1.136782,-0.4793098,0.897206,-0.6498382,-1.45181,-1.135071,0.3801932,-1.013814,-0.05634763,0.767054,-0.4926454,-0.9644017,1.567966,0.304231,-0.03719808,-0.01395693,-0.8354372,0.960726,0.5021633,-0.2102694,0.5363416,0.4328446,0.3002221,-0.3656979,-0.1873387,-0.6454806,2.278492,-1.562365,-2.001911,-1.216278,1.01176,0.8409221,1.290264,-1.365146,0.7166507,-0.6230085,-0.4816144,1.251516,-0.1823503,-0.5459129 +0.9006247,2.06205,-1.218953,0.01622336,1.274693,1.19447,-0.8899291,-0.07837995,1.481901,0.8473389,0.02952964,-0.3163083,-0.8319134,-1.253904,0.4903611,0.4330656,-0.4945696,0.07111833,0.6686318,-0.5043141,-1.209556,-1.758967,-0.6728243,-0.5248611,0.2000481,-0.01604878,0.3399991,0.6185291,0.3415352,0.6569358,-0.4934845,0.3608237,-1.502304,-1.872112,0.1150174,1.746671,1.51748,0.6505427,-1.731365,-0.5548055 +0.8517704,0.3777836,0.2176338,-0.1694064,1.112912,-0.00252012,0.6464126,-0.6363218,0.0166894,-1.244798,0.7932835,-0.1441388,0.6285353,0.2612898,-0.1454291,1.518492,-1.16427,0.6139458,1.173906,-0.1000772,0.9605064,-1.666743,0.0876299,1.880049,0.8481783,0.6185891,-1.042138,0.2957461,0.3503913,0.571172,0.4714169,-0.9093746,-0.1922366,-0.8735476,-1.845017,0.0275507,-0.1957928,-0.3924465,-0.0884799,-0.6361567 +0.7277152,-0.774312,0.112014,-0.9092526,-0.9868791,0.3201157,-0.3696699,-0.3653393,0.3928957,-0.002152637,1.966436,1.059502,0.3360981,2.471858,0.1337036,-0.5636224,-0.8450418,0.738979,-0.4373714,-1.516527,-0.4944382,-0.9899987,0.169141,1.394034,0.5605813,-0.5301229,-0.2431764,-0.4160405,-1.539535,0.3541083,0.7669146,-0.8338445,-0.05789907,-0.1491494,-0.2743979,-0.3942909,0.1921027,0.4187058,-0.8525582,-1.270061 
+0.7365021,1.385666,-0.9150514,1.360003,-0.06501109,-1.190947,-1.117472,1.119314,-0.8701722,0.01950151,1.494079,0.3088535,-0.139972,1.332056,1.339434,-0.6052584,-0.2568965,-0.974706,1.323138,0.2162455,0.8201894,0.1173536,-0.2628307,0.5650495,-0.1001319,0.08561021,-1.376565,2.232219,-0.4329239,1.108252,-0.2848149,-1.087232,1.90824,-0.01054931,-1.150491,0.416177,-0.2073955,-0.1578878,-1.006207,0.9328323 +-0.3521296,-0.3824607,-1.193399,-1.664865,1.076963,1.102529,1.635943,1.207881,0.6380713,1.573046,1.179061,-1.479621,-1.005506,-1.432105,-1.749733,-0.3901695,1.241572,-0.3749123,-1.511067,-1.403761,1.112062,2.11086,1.354151,0.02370517,1.02461,0.9741893,0.1442328,0.9490573,2.126939,0.5248454,-0.9359314,0.824257,-0.7509165,-0.1299838,0.3835345,-0.2529661,-0.8603352,-1.10378,-0.4981529,-0.7554363 +0.7055155,1.09818,0.8365308,0.9107369,1.174588,0.2985715,0.5936244,0.5456308,-0.8505715,1.19447,-0.7859565,-0.8859055,0.2494174,1.261879,0.3285886,-1.010634,-1.551305,0.3308089,0.8381064,2.425664,-1.277778,0.6918862,1.324225,0.4584866,0.5740684,-1.845328,0.9665577,-0.4420438,-0.328384,0.6480078,0.2500468,0.3004222,-0.2631401,0.711665,-0.8517509,-1.667811,-0.4511248,-0.7765371,-0.3665714,-0.7119524 +1.300358,0.8196898,1.490957,1.684782,-1.509113,-1.062827,2.060229,0.1213824,-0.4664545,-1.015623,-2.042014,-0.1893221,-1.571146,-0.8513373,0.05712967,0.4580363,0.2091046,-1.354154,0.4927617,1.043788,-0.8876407,0.4400645,-0.4941654,-0.457061,-0.7204402,1.043424,1.067285,-0.9246517,-0.9286122,0.3472081,-0.1340199,0.8311067,-1.325612,-0.5638431,-1.447337,-0.07916657,1.026008,1.682128,-0.3375278,0.03167592 +0.03825201,0.5190384,-1.727514,-0.08545033,0.1283017,0.05773285,-0.2655301,1.619925,1.480575,-2.492903,-0.726188,0.3237122,0.8623476,1.522811,-0.6556967,1.72475,0.6929036,1.576587,0.8117433,-0.7635001,-0.6026167,-1.308941,0.2969869,0.5255466,-1.24399,-0.618802,-1.215295,-0.06540179,-1.265031,1.170274,-1.726831,-0.5508598,0.9714834,-0.6601466,-0.7140841,0.8583833,0.5581876,-0.9989405,0.03222696,0.1630182 +-0.9792838,-1.314318,0.3668783,0.4034652,1.619863,0.2613481,-0.7089389,-0.5265674,0.2699198,-0.161773,-0.3006321,-0.06091156,2.126365,-1.129122,-0.02757238,0.6336284,0.6965225,0.2886509,-1.204227,1.050755,-0.1537168,-0.7718744,-0.2350103,1.497298,-0.91831,-0.2387945,2.032843,1.662595,0.4669969,-0.2091563,2.276972,0.2420266,0.3388086,0.308363,0.02517714,-0.9322796,1.123349,-0.2839858,-0.07773471,0.8653868 +0.7937612,0.2570269,0.1306008,-1.053028,-0.003775532,-0.2727945,-0.4409795,1.043976,-2.471141,2.332648,-0.07765335,0.5444023,0.0120315,0.8610552,0.344296,-0.1343776,-0.110112,1.388118,-0.3254291,0.9191739,0.5485643,-1.862154,0.2424929,0.8252912,0.4536433,0.3829995,0.5024815,-1.076093,0.368919,-1.362373,-0.4495549,-0.550991,1.431036,-0.4105199,2.002819,1.246912,-0.9908101,0.480317,-0.2427061,-0.4412224 +0.7865069,0.3486049,-1.081115,2.259135,-2.189539,-1.760301,0.1397191,-1.379464,2.496226,0.2111719,-0.732856,-0.0759815,0.4640188,1.127945,0.6126224,-0.9663962,-1.784827,0.13312,-0.8200972,0.4830998,1.213516,0.2082814,-0.1100306,-1.316384,-1.830888,1.212793,0.1442573,-1.004605,0.09624428,-0.6387385,-0.7735206,1.176383,0.8788719,0.8767893,-0.09395272,0.6231024,-0.9196247,0.131377,0.0897155,-0.6667667 
+-0.3104631,0.1587819,-1.626491,0.2745213,0.5036481,-0.2892407,0.1421482,-0.04618723,1.06269,-0.0829484,1.155412,0.1094996,-0.4346237,-0.9467153,0.4314989,0.3588942,-0.7556844,-0.06247193,0.4241155,-0.2358314,-0.6928053,-0.8104984,0.1080485,-0.5475006,0.693094,0.2363369,-0.8778116,0.6941243,-1.123286,-0.0855731,-1.862489,0.5104888,-1.045072,0.1192021,-1.148185,0.9096224,-1.578378,-2.364987,-1.129909,1.214311 +1.698885,-0.1849131,0.6093191,-1.408806,-1.6956,-0.2335841,-0.899203,-0.06957702,-0.505706,-0.3424353,0.9227335,0.2913394,0.2068268,-0.01974178,-0.6739792,-2.578768,-0.2421374,-1.435733,-0.8712425,-0.5697265,-0.9832361,0.4058641,-0.4186079,0.2941307,1.19043,-1.117004,-0.3621521,1.013424,-0.2973135,-0.4854955,-0.3221369,0.7165936,-0.7686977,0.6699888,0.6926823,-1.107252,-0.008808944,-0.6573414,-0.6940314,1.754694 +-0.7945937,1.950198,-0.3837559,-1.234806,0.7846392,1.530752,-0.7441157,0.4991056,0.1942185,3.548017,-0.34498,0.04815635,-1.211628,-0.1854318,-0.5531648,0.9778711,0.4257559,-1.544418,-0.3967763,-0.5934603,-1.971339,1.731593,0.1215293,-0.4673042,-1.184141,0.494403,-0.4169794,-1.582738,-1.392108,0.5969168,1.862286,1.539193,1.471404,-1.280391,-1.426542,-1.048205,0.1609362,-0.01068728,0.1882839,0.4139709 +0.3484377,1.731419,0.2647067,-0.1200581,1.302191,-0.7390133,0.8818015,-2.231556,-1.571295,0.1952887,-0.02053485,0.1911802,0.2975138,1.830771,-0.6073049,-0.5673285,-0.4108202,-0.07970051,1.10111,-1.562462,0.924214,-0.3863267,0.8707452,0.1631364,1.189754,1.744427,1.111959,0.2099866,1.333163,-0.3415307,0.5762299,-0.5169975,2.016992,0.7538394,-0.5597526,-0.2100932,2.152405,0.3819966,0.2043224,-0.5133833 +-2.265401,-0.007093533,-1.075917,1.56151,-0.7558855,0.7826384,0.01474156,0.5319782,0.1440657,0.6689188,-0.027379,-0.5023424,0.4018851,0.524951,-1.49328,1.446502,-0.564079,0.1267697,-0.2449609,-0.3375946,-0.5890547,1.180549,-0.2479551,0.7573566,3.008111,3.565679,0.01029978,0.9635586,-0.3031993,-0.7750776,0.1756539,2.05065,-0.0578306,0.9503282,-2.335808,-0.1481007,0.2898428,0.5300804,-2.005062,1.690765 +-0.1622053,0.5052937,-1.70492,-0.445592,-1.286909,-0.5288534,0.7037361,1.199604,-0.2590582,-0.7512628,0.03902721,-1.250502,0.2264897,-0.83371,0.05098268,0.1388001,1.204852,-1.451273,0.655162,1.730388,-1.182838,-2.054295,-0.2349709,-0.4115265,1.093675,-1.644476,0.369504,-0.4194156,-2.285313,-0.3300115,-0.4388723,0.8716717,1.003739,1.095349,0.1152202,1.06218,-1.767279,-1.134888,0.3905588,-0.778794 +1.130865,1.375937,-0.0900827,-0.867734,-1.043716,0.4844302,0.0887216,-0.3493132,0.9577919,-0.04370464,-0.1323543,0.517023,-1.465564,-0.4448143,0.2078436,-0.7249397,-0.8675073,-1.736041,1.271513,0.1262193,-0.4605093,0.370421,-1.726724,-0.5053015,1.221296,0.5570055,3.263411,-0.01882729,0.1828907,0.4064395,0.9138635,-1.268199,0.716548,-0.5812218,1.285761,-1.62459,-0.04659618,2.13579,0.7454413,-1.877619 +-0.455546,2.166524,0.2846788,0.1486715,-0.2973123,-1.262704,1.045366,0.5252803,0.3856205,-0.7868726,-1.251039,-0.444307,-1.119575,-0.08956308,-0.9840282,-0.5846243,0.2536638,0.6532875,-0.8133761,1.18527,-0.5545128,-0.8268409,-0.006568352,2.150546,0.5344639,-0.9544268,3.112121,1.301465,1.50228,0.8233718,0.2671249,-0.7124693,0.7195066,1.26316,0.6858665,1.69569,0.1126784,1.519924,1.164513,-0.4428308 
+-0.8991663,0.2045406,-0.4430617,-0.02878769,0.9216518,0.5430125,0.0990666,-0.3175242,-1.572822,0.005867953,-0.3366069,-0.6516611,-0.5888239,0.2749147,-0.1994862,0.7031086,1.535729,-2.899789,2.405605,0.5221175,1.473265,-0.4608785,1.369525,-0.1311666,3.540289,1.234585,1.209291,0.4565819,0.09265819,-1.219204,-0.6519219,-0.6454935,0.8094742,-0.5825668,1.834648,-0.509876,-0.9815417,-0.3310543,0.3810245,1.724701 +0.7268389,-0.3685858,-0.2991162,-2.14046,-1.137196,2.876359,1.765146,-0.3341439,-0.6442878,-0.4045349,0.413366,0.5595178,-1.06707,-1.118156,-1.528953,0.05446838,-1.417132,-0.3742207,-0.2562946,0.3520323,-1.062899,0.4347082,0.4666324,-0.157007,0.3777844,0.498163,-0.5553751,-0.7163031,-0.4541748,0.2637208,-0.8885743,-1.812592,0.04827278,-0.1564584,0.3657967,-0.1323741,1.368492,1.619644,0.6921469,-0.2332798 +-0.8094409,1.714665,-0.7739517,0.1123969,-0.0648509,0.1866578,0.1384743,-2.095744,0.8419374,-2.206464,-1.803962,-0.8540951,-0.1476932,-0.3176832,-0.1162324,-0.2799527,-0.04926362,-0.7933144,0.7849893,-1.816424,0.635375,-0.1994647,0.8312015,0.2507402,-0.5079173,1.148932,1.086094,-0.009816327,-1.503533,-1.314329,0.03994071,0.8131987,-0.2970124,-0.4927281,-1.746251,-0.6921627,1.962506,0.08895188,1.163495,0.7285652 +0.2670851,0.8315306,0.241081,-0.0127944,0.9892672,-0.2051988,-1.405478,-0.5845218,-0.4868743,1.117698,0.2471393,1.485915,0.3254064,-1.151021,-1.113404,-1.221917,0.693118,-0.3275051,0.4905356,0.7380092,-0.3192619,0.8264332,-1.339322,1.525062,-1.211212,0.4153972,0.8766505,-0.3541354,1.304006,0.1382727,-1.719282,-0.9487071,-1.540213,-0.5795103,-0.03736999,0.05668318,-0.08443898,-0.1838988,-1.189313,-0.3271567 +-1.737264,-1.196727,-0.3375843,0.04702676,-1.100233,-0.6407227,0.3125345,-3.627529,0.7013215,-1.710656,1.406826,-0.4409403,0.02730033,-1.563362,0.3101302,0.6176599,-0.3927792,0.4158435,-1.628321,0.9103822,0.6629662,-0.8014066,0.5416531,0.1397443,0.02347822,-1.091829,-0.7237454,0.01421243,0.8927964,0.8535591,0.1199788,2.218379,1.366187,0.8998253,0.4079212,1.841334,-0.930561,0.3650003,0.8732356,-1.830881 +-1.411425,-0.2008006,1.249134,-0.1636117,0.9485218,-1.048626,1.815532,-0.1087722,-1.187695,1.926978,0.906115,1.211018,0.2554187,-0.02785028,0.6082475,0.5627235,0.5647193,1.837537,0.3068569,0.1473984,1.88754,0.1849457,2.701473,-0.0278634,0.1713661,2.326326,-0.4278628,0.3861796,-0.7588646,-0.2437137,-0.8875403,0.1869728,-1.074891,1.515833,0.6044213,1.580978,-0.2150378,0.5230431,-1.680529,0.1478898 +-0.4535512,1.365201,0.7086719,-1.119037,0.9693492,0.1893329,0.6787914,1.105003,0.5110681,0.4993083,-2.595223,0.5553113,1.279235,1.032919,0.679289,0.8516994,-0.165217,1.076507,0.04997314,0.7123337,-0.002816868,0.08949857,-0.1290334,-0.4831876,0.9227308,1.363511,-0.8480393,-0.6511501,-1.116606,0.8841621,0.2487995,0.6341142,-1.012383,2.584299,-0.07058665,-0.663057,0.3884592,-1.070233,-0.2063986,-1.692048 +-1.035491,-0.02816577,1.318507,0.2888444,0.08228936,0.6125448,-2.074467,0.1736034,1.736238,0.02401023,0.01363487,-1.18958,-1.192922,0.8563035,-1.833494,1.20878,1.741304,-0.02961985,1.570607,-1.088059,-0.4616785,-0.5976448,-0.2692638,-0.3696843,-1.204031,-0.5322308,1.715185,1.916895,0.191542,0.3498385,0.171908,-0.2731403,-1.21162,-0.899457,-0.3748367,1.66755,1.106238,0.1684324,-0.6062556,-0.7446295 
+1.362143,-1.759968,3.458551,0.9281802,0.8664452,-0.7754966,0.6883382,-0.9033587,0.337786,-0.3919069,-1.59192,-0.3588545,-1.619895,0.05301489,-0.3592627,0.8428688,0.3437278,2.024253,-0.005867516,0.2239052,-0.920713,1.701997,-2.887398,0.243883,-0.3956033,0.3307968,-0.5037724,1.284279,0.53102,0.2547471,0.939288,-0.8534879,0.7460765,0.1298672,0.6041524,-0.6625136,-0.8290446,-1.180994,-0.2933093,-0.366737 +0.9174567,-0.6009205,-0.1949492,-2.423952,1.372275,0.03994449,-0.6382612,-0.835978,-1.271368,-0.775446,-0.188795,0.8055235,0.06373321,0.04073363,0.6855496,0.2141559,1.94609,-0.571372,-1.576129,0.2498237,0.9969349,-1.195082,-0.3348164,0.3723295,-0.4232004,1.405888,0.272855,-1.160249,-1.949912,1.220429,-0.9611371,-0.4801038,-0.6757058,-0.7565159,-0.5479252,0.6937402,0.791902,0.655277,-0.4994561,-0.688523 +-0.7851422,0.6210088,0.4287486,1.557878,2.2344,0.2251232,1.989245,-0.2772042,-0.3589223,-0.1799903,0.1604443,-1.041639,-0.6416557,-0.9511776,-0.9988015,-0.1544098,0.5697919,0.1762414,-0.1854771,0.1931995,0.7361332,0.09149527,-0.7040933,0.225741,-0.1781781,1.514491,2.509891,-0.7871716,-0.05572368,0.2555273,-0.3288873,-0.02048022,0.7688556,1.083385,-0.6026733,-0.2579116,-0.2949727,-0.3293879,-0.3746788,0.7530878 +0.5735182,-1.404984,-1.492522,-0.5461443,-1.116817,0.558027,0.1306628,-0.05633005,1.335672,-1.395117,-2.018645,-2.103797,0.1128778,0.5220917,-0.5287563,-1.359938,-0.6085203,-0.2484622,-1.36922,0.6633945,0.9068413,1.246248,-0.02698808,-0.06762562,0.8115901,-0.3171243,0.9649408,-0.9135214,0.3442981,1.11707,-1.078611,0.5891761,-0.6893603,0.3916892,-0.6355018,-1.841481,0.7216341,-0.2661154,0.7968767,0.7579533 +0.9181962,-1.904084,-1.683701,-1.460203,-0.679744,0.183504,-0.2161343,-0.1007334,-0.1998144,-0.5532596,0.2871584,-0.350412,-0.5671544,-0.7700515,0.2420618,0.9395627,-0.2267437,-0.9689475,0.3780411,0.1837503,1.818536,0.06867725,-0.2078776,0.02992952,-1.235592,-1.479346,-0.1303668,-0.4847117,-1.02197,0.6483461,-0.6269407,0.6870816,-0.9618638,1.099295,1.161061,1.436041,0.384681,-0.1499804,0.3794495,-1.096991 +0.2562873,-0.4907771,-0.5178761,-0.8990806,-1.533507,-0.09222776,0.9355509,0.09625995,-0.3637038,0.3269,1.034507,-1.04903,-0.4192102,0.4568866,-0.380388,-0.5994591,-1.107533,-1.682126,1.08278,1.104593,1.635063,-0.8680402,0.6747137,1.877262,0.7834916,-1.508318,2.339138,-0.527795,0.5316686,-0.7266722,-0.367321,0.9657018,-0.9736222,-0.8829712,0.06766261,1.787692,0.3877624,0.8134422,1.168956,-1.363222 +0.3519666,-0.5351684,-0.1491479,0.06304456,-2.892614,0.4357179,0.2207959,-0.08463651,1.673197,0.946043,-1.109061,-1.219149,-0.3303407,1.119284,-0.1329603,0.5154061,-1.080841,-0.03417905,0.6676726,-1.05585,-1.170226,-0.424564,2.293384,-0.3669762,1.794454,-0.1771233,1.032199,-0.979453,1.084332,-0.95486,0.05029544,0.2718236,0.2459581,0.8159176,-2.008355,-0.5750048,1.707974,-1.286414,0.6341174,-1.398058 +1.174337,-0.3543507,-1.086527,-1.528156,-0.3418698,0.3996504,2.454311,1.631736,-0.3084107,1.350537,0.7495808,0.9318616,0.01645455,-0.9370882,-0.1867362,-0.3968147,0.436204,0.2169375,-1.641035,-0.3920626,-0.4799631,0.1515526,-0.06271854,0.1746916,0.6517372,0.4804253,1.099438,-0.9680308,0.4826966,0.1441153,-0.3235745,-0.4555806,-0.3414674,0.2732974,-0.2930346,-0.6154018,0.6587236,-0.877892,-0.5057673,-0.2932624 
+-0.4808464,-1.801861,0.1936917,0.2055542,0.9418432,0.9158531,-1.033274,-0.7474515,1.26331,0.8492998,1.228781,-0.01120451,-0.5011666,0.300504,1.162198,-0.9229699,1.97171,0.8779133,0.1426466,0.8849624,1.453196,1.476979,-0.358642,1.97016,-1.247717,-1.274308,0.959817,-1.519953,0.6609836,1.405494,-1.438619,0.3839904,0.88861,-0.1954744,0.8223276,0.3290668,1.81836,-1.196966,-1.087535,0.8090748 +-0.4188297,-0.1418732,0.3299104,0.5758395,-0.3478696,-1.179089,-0.4339671,1.202591,1.476427,0.2457126,0.3404072,-0.4895515,-0.08798899,-0.03465765,1.038114,0.3764918,-0.3265693,-0.6827583,0.2036939,0.3486705,-2.115191,1.5038,0.9035976,-0.5628519,0.5013996,-0.06552015,-0.3174451,-0.121373,0.4939432,-0.09524562,-0.1841865,-1.071532,-0.9497286,-2.636622,-0.1988273,-0.8970025,0.5776333,-1.082657,-0.5322023,-1.321972 +0.9551128,-1.531457,0.492487,1.178192,-0.577062,-0.3400896,0.4027364,-1.007427,-0.05052466,-1.681127,0.6700464,-1.379353,-0.6891009,-1.47214,-0.2861721,1.871179,1.429931,-2.247631,0.1896265,-0.1144726,2.337245,-0.5559726,1.497056,-2.305296,0.4241863,1.445241,-0.3692953,0.8648087,1.470471,-0.669268,0.5964407,-0.6658678,0.2240379,0.4613764,0.5721705,-0.4213531,0.4604297,-0.5624483,1.108668,0.2684938 +-1.289007,0.747724,2.175987,-0.7784894,0.7215801,0.989907,0.8821119,-0.1943275,0.285721,-0.5596171,-0.3071088,-0.5720146,0.5292884,0.3886827,-3.143676,0.2363436,-0.6814432,-0.2609711,-0.3096465,0.7623438,0.6496054,-1.594618,1.858089,-0.485244,0.1231821,0.6759179,-1.02686,-1.539931,2.612379,0.6668742,1.217263,-0.172614,0.5197499,-0.8048824,-0.9452827,-0.6589836,1.460557,0.06853905,0.002005336,0.5954352 +0.1861974,-0.9268624,-1.25683,-0.95757,0.2978428,-0.04238906,0.7475203,-0.3028764,0.8756253,-0.3185186,-0.4013172,-0.4868987,1.107667,-2.626366,-0.5336328,0.5462619,0.1981702,-1.134637,0.5434634,-1.283966,0.2046603,0.2001381,-0.4862957,0.8968453,2.605118,1.1464,-0.6587497,-2.011227,-1.868094,-0.1722973,0.3335993,0.5805082,-0.942274,-0.1601973,0.3393414,-0.9712678,-0.7838391,1.58384,0.07833034,-0.1226866 +-0.0313255,0.5263548,0.9514733,1.08156,-0.252678,1.058437,-1.44189,-1.139123,-0.2241227,-0.2113667,-0.4755539,1.032658,0.7473872,-1.166551,-0.1856199,1.742952,0.7654219,-0.819288,-2.155339,0.9660806,1.63067,1.160136,0.1855338,-0.4207426,-1.377596,-0.7854812,-0.5049669,-1.740156,-0.8398698,0.770539,-1.119338,0.422415,-0.6133127,1.176961,2.309224,-0.1460819,-1.123249,1.713298,0.2607247,-0.6109905 +0.4670973,-0.403464,0.3939671,-1.618498,0.9781416,-0.511895,0.08327686,0.1541133,1.06856,0.1546696,0.418225,-0.6683883,-1.548675,0.285917,1.698462,-0.7002287,-1.017667,1.141493,-0.4544228,-1.563814,0.2938491,-0.7232649,-0.2058309,-0.07407882,-0.690645,0.9530289,-0.08584463,0.03793999,-0.6974025,-0.3270483,0.2573818,-0.1243294,-1.399447,-1.162502,-1.496985,0.9758395,0.04300753,0.7959672,-1.753781,0.6449593 +1.024198,-0.03102188,-0.3243308,0.1742353,-0.5803846,-0.03142884,-0.82367,-1.15675,0.3248136,-0.8214052,0.378242,0.04929396,-1.790092,-1.76467,-0.7033445,-0.6271972,0.1104038,0.8331777,1.12283,-1.127918,1.202976,-0.5779483,-0.1992787,-0.1541892,1.072055,-1.649976,0.261839,0.9417996,-0.3451566,-2.084318,-0.7871012,0.3869096,-0.05978559,-0.2555773,2.192058,-0.3604817,-0.3155811,-1.324476,-1.411832,1.470588 
+0.2673585,0.5240287,-0.9252948,-0.09508596,-0.5817083,1.05488,0.05097295,0.205232,0.04855525,1.10439,-0.4446359,-0.7342757,-0.3828476,1.220679,-0.3972738,-0.563121,1.477175,1.24846,-1.632882,0.349837,-1.074859,-0.2754561,-0.9047152,-1.709094,-0.2098073,-0.4673099,-0.8049511,-0.2660949,0.157216,0.35225,0.5855835,1.01748,-0.5143829,-1.180432,0.03285825,0.1037776,-0.9420953,0.1383986,0.957001,-0.9120315 +0.2318261,0.9513305,0.299561,1.990841,-1.571063,0.4179885,0.1466042,0.1873141,0.3853326,1.846768,0.1092157,-0.06549914,0.2493503,-0.302525,0.3197385,-0.6919255,0.3166584,1.596053,0.05807138,1.563859,-0.316988,-0.6326412,0.1966419,-0.09479553,-0.694022,0.5713797,0.09381224,0.3616531,-0.9773562,1.03052,0.0832235,-0.5875794,-0.7651681,-0.7302539,0.9081598,1.554032,0.1628849,-0.8203908,0.2791645,0.4449899 +0.7475925,1.434041,1.21646,0.3648151,-0.6049416,-1.264002,-1.284285,-0.5228387,0.854001,1.012642,-0.1610891,-0.6028438,-0.2289314,0.5130871,-0.1769741,2.540101,-0.830431,0.6732317,0.6725787,1.744419,-1.201651,-2.592356,0.09182853,-1.754941,0.1756612,0.1673159,0.7274229,-0.2630692,-1.138523,-1.002582,0.3011085,0.1265724,2.825591,0.5901034,0.1274404,0.8303247,-0.4364426,1.025433,-2.496853,-0.3417772 +1.217069,1.211814,1.585629,-1.139858,1.331901,0.1641926,0.4223695,1.504745,0.3536735,1.946112,0.6313095,-0.4928388,0.234649,-1.105708,-0.4654024,0.7822463,0.6743235,-0.1310948,-0.6060997,-0.1422506,-1.344521,-1.36301,-0.8483423,-0.648167,-1.319002,-0.717065,-1.465826,-1.898808,0.3148309,1.06451,-0.4439349,2.497175,-0.2765811,0.2340367,-0.9679357,0.06290501,0.4465909,0.3327135,-0.5960449,-0.2275127 +0.3833583,-0.678845,0.4057452,-1.149927,-1.014591,-0.1634669,0.7828013,0.08551937,-0.1918214,-1.252448,-2.026557,-1.006993,0.5030307,1.595393,0.8457007,0.1577406,1.690844,-0.9626267,-0.500139,-1.944679,0.1705765,-0.8349297,1.859967,-1.343308,-1.484946,-1.738827,-0.6931463,-0.9841683,-0.7454088,-0.8155868,-0.1742606,-2.456277,-1.865276,-1.240233,-1.032555,1.657295,0.4013802,-0.2242892,-1.05531,-1.010964 +-0.9880528,-3.24049,-0.7834583,-1.943323,0.5031153,0.2736059,0.1650369,0.7146004,0.5531767,0.9620128,-0.04668716,1.442798,0.9022123,-0.3568813,-0.009518634,0.05385321,-0.8689235,-0.7357214,-0.3270745,0.1020249,-1.433916,0.6176986,-0.09953703,0.3981514,1.253936,0.5204374,-0.8671026,-0.1613718,-0.4554171,1.22482,0.06175668,0.8039733,-0.7399866,0.5414668,-1.814087,-1.010687,0.3300032,1.027475,0.05203607,0.1732738 +-0.1568529,0.756353,-1.280674,0.1490092,-1.8229,-1.533513,-0.975644,-1.134508,0.01720767,-1.79979,-1.38183,-0.7183156,-0.6320968,-0.5053965,-0.2901609,-0.07100879,1.537283,-0.9226529,-0.495237,0.7987654,-0.633652,0.4736393,1.582834,0.6755595,0.06918938,0.771828,-0.6695709,0.1696559,-1.360845,0.7749629,-0.1240705,0.7235213,-0.2386281,0.6420811,-2.482315,-0.7938814,-1.211966,-0.8383569,0.4155206,-0.4991846 +1.735535,2.136581,-2.316516,1.064753,0.5401993,0.935449,1.13558,0.6599548,-1.032107,-0.4787275,0.1431146,-0.0589945,-0.01884983,0.5417073,-0.5851714,-0.8088863,-0.9094115,-0.2299236,-0.0689191,0.1132837,-0.06211203,1.503284,0.7337852,0.1481613,0.9357622,0.5226355,-0.5967693,1.082223,1.316457,-0.3173847,-1.310234,-0.3573102,-0.9748852,0.0996512,-0.8017929,0.2286509,0.1672774,1.332884,-1.885516,-0.3235928 
+-0.3522983,1.032402,0.1149,0.4098704,-0.5840188,-1.801413,0.3892579,-0.5856292,-0.3752973,0.2678109,1.048317,-0.904843,0.9974462,-0.4402325,0.5624912,2.16414,-1.191659,0.6211167,0.5895416,0.2411645,2.60027,0.284066,-1.497692,1.437747,0.2134891,1.048589,-0.01854809,0.7847914,-0.5635842,1.221055,0.611831,-0.646674,1.556181,0.3791728,0.7802606,0.5885215,-0.9630954,0.2134635,-0.9021008,-0.721093 +0.68864,-2.350898,-0.08096164,-0.4000142,0.4456904,-0.2474869,-1.474436,2.540809,-0.1411824,1.380972,0.08233659,0.9258932,0.7580653,0.211236,-1.713466,1.073278,-0.008735042,-0.6770787,0.6011049,1.391643,-0.6889877,-1.188257,1.005961,-0.4198868,-0.4428315,-0.311835,0.690336,0.8970395,0.2715069,-0.634871,0.7189807,-0.81544,-1.013333,0.7663582,-0.09570784,1.465377,-0.4006828,-0.1737463,0.6222464,-1.48897 +1.224406,0.5101664,-0.5867379,-1.149701,1.185817,-0.4182793,-0.2715284,1.861715,-0.3575462,-0.2129559,0.9847606,-0.2565714,0.5455263,1.276495,1.933728,-0.1524934,-0.2514474,-0.09782937,0.04140036,0.2561082,-0.2984681,-1.277523,-0.5934478,0.7637635,-1.542544,-0.7793138,1.278597,0.04614636,-0.785729,-0.1728421,1.192113,-0.5625802,0.6185346,0.07782438,-0.8411456,0.05063839,-0.5787844,0.8920338,0.2372331,0.3124459 +0.7942963,1.386289,1.356455,-0.941,1.241372,0.2047216,3.0283,-1.377182,1.440621,-1.392366,-0.1684344,-0.9814726,-1.694418,-0.9429639,-0.8027007,-0.5112176,0.2945099,0.3616289,-1.192824,1.733982,-0.7375745,1.218865,-0.9009435,0.216411,-0.1063302,-0.2684234,0.704886,-0.3895773,-0.8639096,0.3123071,-0.004668907,0.9060324,0.7561289,0.4028008,0.6146074,-0.3374999,-0.1192738,1.270299,0.3574591,-0.3306031 +-0.006402398,-1.803563,0.4982943,0.1329993,-0.06111639,-1.35827,-0.3079043,-0.566454,-0.5946257,0.722726,1.432742,0.1448068,-1.012714,-0.848369,-0.3746849,0.1119932,1.545818,2.039388,-1.029765,-0.9670014,1.583246,0.9254735,-0.7541634,0.4733858,0.7810652,-2.256637,-0.1258124,0.8629921,-0.3022865,-0.8066692,-0.06838632,0.4753285,0.555574,1.113928,0.9814186,-1.489919,-2.046046,0.5771862,-2.283813,-0.2945456 +0.2191506,-1.526538,0.4036648,-0.4920219,-0.8418074,-1.272403,-1.198727,0.06222364,1.165755,-1.232425,-1.265797,-0.5474795,0.004739934,0.3051055,-0.429679,-0.1577457,0.07684792,0.5980016,0.7138827,-0.01660854,1.168854,-0.4436431,-3.105944,0.8466058,0.3072545,1.27876,1.134993,-0.996519,1.256194,-2.460178,-0.6149452,0.8214165,-0.554375,-0.5239989,1.489625,1.327614,0.1222107,0.01590414,1.317716,1.494402 +-0.8864638,-1.296871,0.5480717,0.1117493,-1.603772,0.892239,-1.311234,-0.6367194,-0.7210188,-0.1852503,0.7273811,0.7397728,-0.3183624,0.3625228,1.95212,0.8952415,-0.2790082,0.6804825,-1.696709,1.622917,-0.291447,-1.279826,-0.715302,-1.494512,-1.393975,-0.07578884,0.4882804,-0.06708207,1.252124,0.9423884,-0.9104378,0.1798319,0.6465232,0.3842036,-0.790542,-0.2300728,-0.8899908,-1.253495,0.06676757,0.7547419 +0.4397603,-1.767471,0.6900035,-0.05991141,1.062743,1.119622,0.8351726,1.831439,0.4986606,0.9786868,0.03983874,-1.059745,1.24851,1.764831,0.8158379,-1.939136,1.056447,-0.1599035,0.07928431,-0.6726712,-0.08343298,-0.008908435,-2.987727,0.248644,0.4663941,0.1998807,0.5208602,1.309796,-0.2595481,0.5789363,0.9139915,-0.486198,1.724975,-1.30626,0.4911544,-0.3796254,0.9301124,1.218849,-0.317431,0.9209807 
+-0.8863898,0.8653283,-0.2874406,-0.979592,0.2365342,-1.615542,0.3250318,-0.4440106,-0.3579852,1.176626,0.8343709,-0.08974431,-2.378942,0.3033631,-0.7341737,0.02109155,0.1035593,1.277071,-0.0542985,-0.2339987,2.074983,0.388466,1.139313,0.1680085,0.5838581,-2.020404,0.9437953,0.3705996,-0.2355083,1.588981,0.5979879,0.2138659,0.310034,-0.824531,0.1263448,-0.4531483,-0.4970156,-0.09920595,-1.004589,-0.8243001 +-0.8538185,-0.7724048,-0.5189004,0.4825913,-1.529293,0.4822002,0.04588054,0.587779,0.2972349,2.3926,-0.3557517,0.07863995,0.7357726,-0.6535245,0.4750473,-1.511597,1.82361,-0.8983274,-0.853264,0.3519842,1.073282,-2.414035,0.0981426,0.1604314,0.9741096,-0.2341151,-1.67958,-0.9529903,-0.4394775,-2.313102,0.8514615,0.6868325,-0.5553411,-0.985844,0.2075981,0.365058,-0.5374016,-0.1355659,-0.2429012,-0.2982057 +-0.9899943,-0.1108457,1.250462,-0.3595148,-0.2550374,0.1779343,-0.04781101,-1.187452,0.2411686,-0.4526586,1.11429,-0.5127513,-1.124349,1.563372,-1.100186,1.859847,-1.465428,-1.348646,-0.2716187,0.2343775,1.388918,-1.286528,0.4482617,0.5431436,1.315296,0.3562432,-1.421438,-1.264095,0.4259104,-0.336069,0.8593984,2.365351,-0.8801344,-0.9128973,-0.2907021,2.090371,-0.5612451,-0.288171,-1.001121,0.25757 +-0.6508777,-1.233645,-0.211777,0.995577,0.6582017,1.418623,-0.02361628,-2.020753,-1.468316,-0.8542611,0.8507572,0.38044,0.6523611,-0.2724346,0.02847891,-0.7223853,-0.8802198,1.841958,1.661456,-2.753031,-0.8143866,0.6072514,-0.3510463,0.09439142,-0.4376393,1.00929,0.8100885,1.168702,-1.739061,-0.4330541,0.7826762,-1.473707,0.8561068,-1.669518,0.1074915,0.6945382,-2.082175,1.481668,1.643466,-0.7035056 +1.053947,1.330162,0.1399598,-0.09369478,-0.2851725,-0.735151,-1.176563,-1.189795,-1.506802,-0.743841,-0.6976071,-0.838423,-0.927466,-0.5772549,-0.7741065,-0.06832301,-0.8055702,0.0280363,0.4828595,-1.743545,-1.179219,1.401859,-2.097261,-0.3077367,-0.6486873,-0.7901336,0.4007533,0.7501239,0.7609003,-0.5971615,-0.4817788,1.674782,-0.09188997,-1.117439,0.6877634,0.6321632,-0.314924,-0.675761,-0.1898141,-0.07299789 +-0.390878,-0.8954067,0.9855528,-0.8902725,0.859099,0.276297,0.2231468,1.14863,-0.1939695,0.9839178,-0.03542041,0.1134285,0.6085281,-0.4351779,1.691392,1.602007,0.4393212,0.308688,-1.741444,0.696422,-0.6423056,-1.434639,1.912702,0.5774774,0.3012331,-0.2495548,0.9808891,0.7517611,1.159423,-0.06868111,0.6155491,0.1527405,0.411939,0.01388568,1.149236,0.9156392,1.22011,0.5487831,1.388269,-0.02850287 +-0.0705864,-0.001279981,0.5095033,-0.4353885,1.143988,0.1904806,-0.5776972,0.7936442,1.182272,-0.3876318,0.2983649,-0.3631842,-0.1318069,2.052228,-0.02233159,-0.5350842,-1.408026,-0.8942574,1.026129,0.4354711,-0.4441461,0.784875,0.1179949,-0.2619878,0.2843887,-0.4689603,-1.123767,-1.427364,0.004895345,-0.3842415,-0.6803156,-0.2049619,1.145362,-0.7288145,0.629185,0.7696591,-0.1765092,-2.191843,-0.125045,0.01573781 +-0.4620508,1.075759,-0.5419024,-1.246157,-0.7997656,-0.4432717,0.6900894,2.312411,-0.02172099,-0.2790226,1.036891,-0.402756,0.3309052,-0.9583213,0.9730256,0.9738238,-0.09747137,-0.8792682,-1.321427,-1.886602,-0.3672569,0.5505697,-0.01669701,-0.7138181,0.003720478,-0.1093475,-1.20041,-1.233983,0.3692707,-1.157671,-0.4352751,0.5075427,-1.76149,-0.1993775,-0.730475,-0.7260359,-1.491311,0.2765981,1.195129,-0.4985135 
+0.5409083,0.2767932,1.38308,1.089861,0.3928749,-0.922327,0.9188244,0.4467008,1.077806,-0.1893278,-0.2801661,0.3889101,0.814729,-0.3243688,0.8793665,0.477965,-1.406424,-0.02221719,0.5841775,0.3527073,0.8079961,0.2758564,0.799583,-0.624826,-0.1188497,0.5045022,-0.8241864,0.1409389,-1.165617,-0.2610761,-1.501341,-1.182398,0.3367287,0.04903033,0.2437513,-0.6656943,-0.9165832,-1.242852,0.4533513,0.2322153 +0.931635,2.231401,-0.3277551,-0.4874441,-0.03855792,1.394414,0.000137599,-1.617739,0.7411442,-0.4305478,0.8833484,-0.1068893,-1.106182,-0.3313733,1.871321,-0.3387507,-1.623697,-1.123807,2.392963,-0.4312879,-0.9467334,0.09533607,-0.02171917,0.9269803,0.9841207,0.4404383,-2.323872,-0.5117954,1.112921,0.888261,0.6398023,-0.01649902,-0.07152446,-0.2474706,0.7966065,-0.2345342,-0.9014088,0.3377663,-0.7961203,0.01896071 +-0.2092743,-0.316859,-0.4726869,0.6866396,-0.1208278,-1.294984,1.049792,0.739706,-1.914767,-0.4059391,-1.141891,-1.510852,-0.2907334,1.222511,-0.2444358,-0.04222331,0.7569781,0.1431406,-1.681769,0.2746936,-0.04009084,-0.0918332,-1.985667,1.083059,-1.236259,0.2734993,0.3120626,0.5853436,2.747179,1.069049,0.67911,-0.04384835,-0.1911935,0.291198,-1.271504,-0.05266964,-0.03525177,0.5435023,0.4909832,0.2695141 +0.61735,-0.2957037,-0.906588,-0.0684791,0.3232668,0.5998703,0.08547694,0.2639106,0.02183871,-0.2033636,-1.316901,0.7666942,-0.685656,0.4026581,0.4302301,-0.5003967,0.6873859,-2.404678,-0.5510886,-0.653899,-0.7849677,0.7132983,-0.8044154,0.7030224,0.07275352,0.8115652,-0.658768,0.4027569,1.228026,1.073708,-0.1424529,0.918231,-0.9157044,-0.2359459,1.174047,1.089738,-1.145319,-1.97335,-0.872363,-1.243334 +-0.4050775,-0.2813054,0.7888865,-0.469054,-0.224378,0.03136775,0.8033103,-0.8680958,1.270064,0.0436703,-0.2569079,-0.1132449,0.9405332,0.911887,0.7507907,-0.0822654,-1.649017,0.4521386,1.006583,-2.018172,-0.1299473,-1.188473,0.3357483,-1.043365,-0.1877919,-0.5594068,-1.217977,0.1124117,0.03423045,-1.763338,-1.439013,-0.3035083,-2.34953,-0.6257675,-0.2840114,0.893372,-0.02836635,0.08551235,0.354566,-0.627949 +1.053104,0.04116174,0.2281924,0.4492978,-0.503572,1.81875,0.3441484,-1.53459,-1.169927,0.9910093,-0.8719836,-0.1174215,-0.04844153,0.2555837,0.155488,0.7179088,0.5872713,-2.241754,-0.4687208,-0.08564226,-0.785674,1.082532,0.131412,0.7929627,-1.101256,1.175624,0.5750237,1.484894,-0.3318219,-0.4219983,-1.246283,1.356883,0.9207057,-0.6889624,1.022516,-0.5941042,0.816315,0.7459865,1.215934,-0.05233939 +0.6022842,-0.7145895,0.4525711,0.03740589,-1.311985,-0.2927431,-0.1737089,0.3093728,-1.595319,0.07157813,2.078212,1.005685,0.8923242,1.6556,1.57065,0.7382092,0.8106575,1.109236,-0.0894716,0.6475196,-0.04665275,-0.5127541,-0.3824028,0.448915,-1.379543,0.9063908,-0.7240701,-0.557526,-1.090925,0.8645187,-1.795657,0.6470359,-0.5960021,1.10025,-1.298173,-1.177174,-1.107278,-0.4302532,1.19175,0.4172196 +1.017461,2.523586,-1.860143,1.207554,0.5197868,0.9539449,-1.637961,2.926861,-0.7034045,1.043732,0.01656966,-0.1787533,0.179081,-0.876742,1.444119,0.359274,0.8499174,0.4651972,0.07971386,0.1728613,1.022922,1.335571,-1.075608,1.212212,-0.7365463,-0.808253,-0.931998,0.5142047,1.012379,-0.1163176,0.3837124,2.798655,-1.887168,-0.5927467,-0.9538978,1.919614,-0.2951008,0.9806229,0.01079989,1.574426 
+0.6081673,0.2333733,0.3917267,0.2961469,0.4956351,0.5256714,2.797897,0.9119851,0.4896263,0.9543799,1.685866,-1.858133,-1.836656,0.5084699,0.3952121,0.1662209,-1.29549,0.5955918,-0.6451741,-0.9311136,0.4641191,-0.06697461,1.2375,0.5298843,-0.3088785,0.5570949,0.3508581,-0.8830775,0.09940694,0.3532178,0.3259633,-0.3674458,-0.5097278,0.4116409,-1.5813,-1.614567,-0.798768,-2.091669,-0.7294646,1.22123 +0.206736,0.05322626,-1.417476,0.4120779,-0.03756048,1.200849,0.348129,-0.06820718,-0.02192403,1.695897,-0.6244306,-0.7807424,1.644626,-1.887835,0.8917454,0.2362482,0.947965,-0.01643077,-0.769196,-0.01469028,-1.374169,-0.03548818,2.001683,0.9060743,1.980245,1.245289,0.4877494,-0.3604898,0.356262,-0.7233251,-0.5443773,0.7790023,-1.209913,0.2495881,-0.4842256,0.3409266,-2.126231,0.6258254,0.5967234,-0.2850515 +-1.897727,0.007317735,-0.1802705,0.5620988,0.8588838,-0.6929581,0.006830077,-1.151931,0.2178457,0.4212137,-1.006466,-1.027534,0.7769687,0.6770733,1.308965,0.5359832,0.9943253,-0.4122798,-0.7531632,0.4568857,0.4504643,-0.1876292,-1.647099,-0.1365731,0.7568025,-2.913905,-0.1294109,1.318387,0.9217794,0.5931101,-0.3466198,0.3363405,-1.760356,0.4046718,0.0755606,-0.3928197,-0.0813674,-2.246856,-0.3434676,0.4193866 +-0.6825828,-1.896124,0.4983746,0.5236366,0.4154162,2.172212,1.471125,0.3343142,0.5861595,0.3355555,-0.7930646,-0.1098845,1.47153,-0.2711941,-2.238027,0.353409,0.7641599,-0.2169733,-0.7898399,0.6772994,1.608442,-2.433896,1.072159,-0.7328611,-0.003868847,-0.720009,0.3336168,-0.4374537,-0.05423015,0.4019821,0.4525996,0.7151073,-0.2941223,-0.2042808,-0.2719034,-0.7942417,-0.5478072,0.9076628,-0.6910896,-1.425624 +0.4813384,0.4810913,0.6026013,-0.08827406,0.7752958,0.02746565,0.4106676,-0.4247143,1.055541,1.034505,1.921676,-0.0340287,-0.3873934,0.2846269,-1.362347,-0.4094895,0.1954252,-0.4490692,-0.4011114,1.158584,0.5916457,-0.9576532,0.3062784,-0.6793648,-0.6718078,1.020393,0.6485023,-0.2594907,-0.172694,-1.198308,-0.3465141,0.7166628,-0.7894432,-0.8747644,0.3033422,0.7329114,-1.012041,-0.3938312,2.104274,0.8929687 +-0.463031,0.3320065,-0.4820308,-1.199814,-0.05204835,0.1824712,-0.05785357,0.2550863,-0.6629185,1.146505,0.03956834,-1.275032,0.8860169,-0.5918463,-0.2974507,0.7510724,-0.5064328,-1.186189,1.937572,1.239583,-2.007453,1.128467,0.1239044,0.3078691,0.4486408,-1.762642,-0.495451,0.9746325,-1.727645,0.8114384,0.9917054,1.230618,-2.017818,-0.3319409,0.9711618,1.446205,1.583721,-0.6463112,0.2502403,-1.177285 +-0.2797417,-0.6454991,-0.126249,0.6185945,0.8967021,0.1520082,-0.6590815,0.1938208,0.1637302,-0.1327639,-1.461071,-0.7812951,-1.07313,0.3049873,0.5648955,-0.8349383,0.7263023,1.054671,0.5793312,0.9158425,-0.8200566,-0.03901709,-1.198676,0.7104932,-0.01220045,1.001464,-0.3534013,0.2839111,-0.322829,1.309931,0.3909525,1.083922,-1.449984,-0.03065797,-1.675509,-0.2741803,-1.974057,0.01511904,-0.1128929,-1.915916 +-0.4136901,-0.3692699,0.874537,-0.9645172,0.07418931,-1.915157,-1.171583,0.07945943,0.2846821,0.2020524,0.5958032,-0.798379,1.189257,0.2388668,-0.9575272,-1.231308,-0.1538235,-1.349389,1.820357,0.272459,-0.4444517,0.8669787,-1.319592,0.7112083,-0.9508707,-1.11563,-0.5702212,1.281438,1.468516,0.4259677,0.4978593,-1.186937,0.9431263,-0.1782244,-0.3825225,-1.040985,-0.3484318,-1.675436,0.2949135,0.5621256 
+1.618767,-0.416026,-1.245032,0.8510994,0.05677277,0.5556235,-1.268997,0.36591,-1.228784,-0.1958918,0.7590719,-0.1949478,-0.5428058,-0.7018565,-1.219165,0.08968862,-0.2116603,0.1449419,-0.1912686,-0.2353424,-1.052328,1.042668,0.4623233,1.848975,0.2798797,-1.35697,2.270494,-0.6905741,0.7854085,-0.7423161,-1.003383,0.02367262,-0.6758721,-0.2237105,0.9663614,1.860325,-0.499294,-1.695103,-0.5587881,-0.2302536 +-0.7210557,0.575634,-1.349126,-1.054175,0.1820294,-1.651253,-0.1578196,1.150731,-2.352025,-1.161695,0.4250045,-1.731427,-1.402039,-2.387151,-0.1606524,-0.1257599,1.453987,-0.295446,0.2471267,-0.1189187,-1.460899,0.1424936,-0.07959036,-1.455216,-0.1943264,-1.987395,-2.437751,-0.2578236,-2.040998,-1.423332,-0.1523404,-0.2245315,1.489937,0.5099063,0.9586621,-1.832143,0.3657129,-1.553123,-0.5865197,0.3756301 +-0.4530932,-0.8793241,1.016944,-0.5175424,0.8278325,0.1413645,-0.2968275,0.1501102,0.06644538,-0.1393115,1.603328,0.1513141,2.188735,0.5584125,-0.7755025,1.377894,-1.456579,1.481315,-1.252205,0.4776468,-1.302422,0.2600608,-0.5270398,-0.9033299,0.106432,-0.9016884,-0.1273952,0.2351796,-1.339039,0.8388572,-1.205047,-0.8250595,-0.9026929,-0.09439934,-1.705552,-0.5862319,1.309366,-1.745594,-0.148417,-1.347129 +0.01425716,-0.809559,-1.824999,-0.3121542,-0.1396969,1.431528,0.9037996,1.412254,-0.3099932,-0.2903777,1.148378,1.913236,2.236973,-0.5846191,0.5998706,-1.174977,0.2104355,0.03383547,-1.337394,0.01166865,3.367104,1.324193,0.7159006,0.6271972,-1.29623,-0.06687397,0.199848,-1.976656,0.267691,-1.175235,0.3903723,-0.6030772,-0.5105908,-0.7371762,-0.7896042,-0.5009031,-1.860494,0.4762851,-1.083907,1.254982 +0.2157646,0.4926698,-0.4376353,-1.570045,0.3642789,-0.01345597,0.2693104,-0.01439835,-0.6375396,-1.109039,-0.4616258,1.962162,-3.110967,0.8238142,-0.7814564,-0.7180092,-0.4930081,-1.345179,0.5916409,-0.03875493,2.116772,0.1277405,-1.279853,-0.8275903,0.4354227,0.360908,0.1145651,0.7075091,0.2018127,-1.347053,0.3893716,0.6991967,0.8002973,1.327523,-0.0893016,0.4551012,-0.1905523,-1.062766,-0.4575999,1.504613 +0.1888702,-0.8038873,-1.378599,2.007363,0.9059131,-1.165784,-0.4233758,0.6600925,0.6760583,-0.1620685,-0.6565443,1.31283,1.432817,-0.7204871,0.6673281,1.370486,0.2120296,-1.380063,-1.682222,-1.150728,0.8318387,1.233718,-1.367749,-0.1341183,0.6763883,-2.497917,0.5519518,0.766889,1.235539,-0.09141155,-0.7976609,1.639761,0.02904048,-0.9477937,0.5425994,0.4483875,1.987252,-0.7673432,-0.7486844,-2.000969 +-0.05014849,-0.344689,-1.821933,-0.6570185,-0.9073548,-0.1228093,-0.5929752,-0.07174838,0.909487,-0.5707826,-1.380015,0.7483015,-0.819112,0.001322486,0.8198494,0.3748494,-0.8749434,-0.8828255,-2.790899,-0.1145206,0.2401081,-0.6661079,-2.663868,-1.41808,-1.54916,0.7495282,0.3412198,0.2071126,0.350747,-1.037965,-0.2182624,0.8136472,0.08930977,0.06037804,-0.01954074,-0.7379327,-2.116559,1.4701,-0.7902534,0.4690851 +-1.49542,0.192679,-0.7954872,-1.721539,1.143409,0.5228269,-0.2193762,-0.2972873,-1.465172,-0.5174789,-0.6157836,-2.353325,1.379008,0.09229948,-0.7065764,-0.8170549,1.566227,-0.3163496,0.03780792,0.7170915,-0.8838706,0.8433,0.231261,0.6710496,-1.892499,-0.4360921,-1.283824,-1.08431,-0.6238862,1.356859,-1.722207,0.8015313,0.08375952,1.015151,1.463073,-0.2540895,-0.1722487,2.021592,0.1748226,-0.1295669 
+0.3678378,0.4648077,-0.5672741,0.3921659,-1.047482,-0.7645738,-2.042506,0.7498538,-0.633945,-1.215067,0.7404873,-0.6754569,1.279504,0.03671894,0.6821149,-0.8012012,-1.513931,0.3662163,-0.1828,0.1537035,1.403315,-0.2955042,0.02368342,-0.268375,1.219026,-0.3329148,1.246495,-0.2233164,2.594285,-0.3787041,0.2356227,0.0009237,-2.213395,-0.8393039,-2.185349,-1.166206,0.5559434,0.08949886,0.7650956,-2.301839 +0.517144,-0.9306044,1.293843,1.685983,0.003567543,0.1550376,2.713178,-1.163588,0.765923,-0.356809,-0.7906215,0.3367685,0.508267,1.947845,-0.7388198,-0.1046399,0.8924945,0.4045997,-1.554781,0.2537949,0.6256284,0.2313926,-0.208159,-1.728594,-0.0450908,1.974945,0.161339,-1.764474,-0.2050513,-0.002528684,0.06869892,0.6800674,1.661217,1.300573,0.04770453,-0.9536175,-1.421838,0.6702874,0.1682428,0.2807413 +-0.4843355,1.339021,1.159232,0.1726345,-0.730539,0.3167954,0.8325782,1.445385,0.7348688,-0.930979,-1.146301,0.140856,-1.916941,-0.2961794,2.176342,-1.284885,-0.9337869,-1.894549,1.175842,-0.3267209,-0.788895,-0.4718262,-0.7893947,-0.6856501,0.5504009,-0.3894079,-0.9510605,0.6258518,-0.9747448,-1.683845,0.5597305,-1.067827,0.3309359,-0.7589705,0.7384077,1.185595,-2.866989,-0.5317721,-1.700985,-0.4889624 +0.6748556,0.1679515,0.2204652,-1.960053,-0.8770991,0.369403,0.4155725,-1.653387,-0.7239938,1.491266,-0.9145114,0.6350604,-1.253968,-1.349625,0.8277541,-0.1739991,1.704735,0.5126244,-0.0716106,2.235136,0.9847274,-0.4094601,0.5411619,1.754902,2.2884,0.9206545,1.991747,0.05474595,-0.004599396,1.330502,1.290344,-1.327079,0.7213981,-0.7637826,-1.41757,-0.3397645,1.933341,-0.1316073,-0.935193,-0.2890469 +-0.7624486,-1.295384,-0.8536304,0.4156749,-0.94293,-0.6483243,0.3594357,-0.1869003,0.1546421,-0.2151881,0.5573668,1.485612,0.2711338,-0.7793992,0.3910491,1.66471,0.135166,-2.003161,0.6219757,-0.4133654,1.400193,-1.10115,0.08205559,-0.9788809,1.610334,-1.101358,1.153475,-0.2658562,-0.4595905,1.055607,-2.067957,-0.7187299,-0.3037785,-0.5668895,0.1883409,-0.3523315,0.0551401,-1.47095,1.561681,0.3704036 +0.3860738,-0.8979096,1.339069,-0.8714537,0.7661303,0.5713007,-0.8174571,-0.9305272,1.126186,0.3679706,0.3445075,0.3077729,0.5239629,0.1010051,-0.6882109,0.4853184,0.4790607,-0.04934345,1.618848,-1.254425,0.4428527,-0.5263861,-0.758763,-0.3411003,1.283108,-1.324537,-0.1576416,1.321869,-0.7209536,0.9000293,0.2358486,-1.039014,-0.3603809,0.3245835,-1.420972,0.07559591,0.4147952,-2.235345,-1.213401,0.9584462 +-0.6640033,-0.1991474,-1.3226,0.8115584,-2.75763,0.2472737,-1.809666,0.009726441,0.6975773,0.2727096,-0.6234897,-0.05682715,0.8511875,-0.0171832,0.4787102,-0.08367329,0.3621084,-1.702301,0.8336054,-1.703237,-0.8406628,0.2103065,0.7372764,-0.02563988,0.6833503,-0.3624431,1.073296,-2.064022,-0.5392738,-0.5688285,0.1874718,-0.5711777,1.85605,-0.09693309,1.189057,-0.8319965,0.988883,1.136955,0.7862848,1.36425 +-1.724344,0.01347128,-0.3713737,-1.033601,-0.693357,-1.515593,-1.383133,0.5131512,0.01664915,-0.05787889,0.9001532,2.030557,2.200848,-0.6922625,1.712712,0.4893764,0.1365716,0.3715544,1.109052,-0.9645601,1.316457,-1.219859,0.6053841,1.948648,-1.373297,0.104511,0.04497591,0.9041496,0.6305422,-0.9995124,-0.1403121,-0.04657576,0.531668,0.6676503,-1.00507,-0.5076362,0.7972146,-1.013466,1.384138,0.7750627 
+1.156319,-0.1953723,-0.3137379,-0.6197514,1.579572,2.226515,2.332157,-0.8349754,1.490071,-0.888293,-1.378225,1.541248,0.6230538,-0.741753,1.679373,0.520229,0.936073,0.5096714,0.118661,-0.8585207,-1.557027,-1.376615,0.3112027,-1.190076,-0.3208572,-0.4495141,-0.8023066,-1.878526,-0.9826098,1.893303,1.252649,0.0915663,0.3623916,1.300618,0.6435842,0.9679126,0.1586441,-0.7307148,-1.184524,-0.9264862 +0.6935066,-2.005082,-1.263741,-1.264833,0.5000376,-0.8660515,-0.2679559,-0.5499116,-0.0963032,-1.065134,-0.5238646,0.0345859,-0.9111363,1.021448,3.554124,1.815166,0.7892227,-1.410274,-0.2720436,0.1011693,-1.159894,-0.06106217,-2.728107,-2.095984,1.085413,-0.05655378,-0.3302514,-1.7212,0.7753054,-1.433657,-1.884101,0.5338009,-0.02981512,-0.3836255,-0.7941145,0.6206113,-0.6308645,0.7851673,0.6174791,-0.5471758 +0.1431564,0.2780552,0.8111377,-0.3879487,0.3151851,0.5985495,2.085716,-1.017725,-0.322985,-0.4501801,-0.8185465,-0.783026,-0.442145,0.1874804,-0.4482631,-2.359662,1.090657,0.5928379,-0.195456,1.515223,-0.8985858,-0.333588,1.361327,-0.665131,0.3735808,-0.6980578,1.658458,0.6764802,1.085063,0.01358374,0.2904739,0.2498043,1.153057,0.5072621,0.3667837,0.1545222,1.524714,-0.433719,-0.5863372,-0.7724107 +1.492814,-0.7009808,-1.566914,-1.122846,0.706528,-0.173705,-0.4215919,-1.331806,2.720418,0.694529,1.090643,-0.4958664,-0.05860545,0.9352828,-0.04049809,-0.833363,-0.3143246,-1.392325,-0.4154882,1.328081,1.203441,1.566451,-1.098444,1.103759,1.34969,-1.242992,-1.64527,-0.3137343,-2.086935,-0.3212597,0.4785073,1.048166,0.0118531,-0.2030342,1.231999,-0.08247383,0.8878141,-0.5054286,1.326875,-1.063408 +-1.632153,0.6449347,-1.450444,0.8806802,1.457861,0.0325864,1.790223,0.2362463,2.646131,0.1677322,-1.376841,0.0900948,-0.2473166,-0.4222328,-1.258888,-0.934917,2.818132,0.16931,0.9470855,0.1165622,-0.2421013,1.987346,-0.4188148,0.805846,-0.7424503,-0.1477932,-1.808992,-0.7924413,-1.12213,-0.1192148,-2.520808,-0.8454557,0.4586969,2.574548,-0.50235,0.6792296,-0.5436548,0.4723664,0.4921267,-0.3866668 +0.127846,0.6981556,0.9797361,1.906537,-0.6694734,-0.1017262,0.9460606,0.305435,-0.5274615,1.870213,0.6741586,1.248119,0.2647399,-0.740942,0.3701118,0.541495,1.595926,1.184929,0.4210655,0.4944262,-0.5991057,0.580931,-0.7386874,0.5795048,0.06157796,-1.462398,1.07603,-1.57419,0.3304016,-0.01141098,-1.056759,-0.5453335,-0.8451952,-0.4095414,-0.649117,-0.9097839,-0.7954893,-1.087026,-0.5214125,1.259723 +-2.403664,0.997271,0.3701666,0.2442495,-1.148496,1.484491,0.8375365,0.782877,0.1175337,1.852928,-0.53235,0.4297116,-0.7797121,0.6314025,0.3544833,-1.112044,-0.2949826,1.274828,1.194103,0.9667314,1.761339,-1.350116,-0.01041573,1.399293,1.115887,0.2182123,0.1636219,0.6997072,-0.5258208,2.485065,0.9072089,1.375151,0.8017431,-1.060736,0.5921045,-1.882508,-1.346334,-1.485317,1.353698,1.314235 +1.443928,-1.643284,0.1733165,-1.135373,-0.6382386,1.44934,-0.6785213,0.9495092,-1.971681,-0.3565076,0.7796902,1.599995,0.4665238,0.3134125,1.185363,-0.06978535,1.427786,-0.3998606,-0.09624288,1.764262,-0.2058368,-0.7247946,-2.713294,-0.07361656,-0.9660578,-0.6692666,-0.824446,-0.4408538,0.6695648,-0.8631034,-0.8284855,0.7940492,0.5614017,-0.08076912,0.6573699,-0.1569758,1.034381,-0.5147751,0.4238962,0.9680003 
+-0.878893,-0.8007625,-0.2735915,1.404655,-0.999188,-0.5752857,-0.2126611,-0.1885136,-0.2935878,-1.294053,0.0245867,1.503013,-0.206254,-0.6877295,0.6057627,-1.408143,-1.287606,0.3513233,0.3434354,1.226782,0.972576,-1.675263,-0.5950923,0.3908467,-0.1749575,-1.325065,-0.1078835,0.8587165,-0.2215024,0.7854883,-0.09777876,0.7780883,2.047857,0.2350455,-0.4724167,-0.677984,0.4292764,0.5134219,-0.7875298,-2.119573 +-1.306438,0.242734,0.04525164,-0.2259192,-1.385003,0.5891885,0.3342372,-0.09594732,-1.201158,-1.715001,-0.4572348,-0.8258735,0.72386,0.1486251,0.9136293,-1.081531,0.001623172,-1.629076,0.749625,-2.998225,-0.05356339,-1.358316,0.936144,0.3089053,-2.81527,0.4190146,-0.871922,-0.07125201,-0.07197645,-0.02478967,-1.008549,0.09808151,-0.2191527,0.2370697,-0.8484579,-0.2892249,1.313789,-1.340688,0.09573065,-0.9302306 +-0.877199,-0.105719,1.570806,0.1305281,0.715127,0.4185815,0.6161115,0.900392,-1.277765,-0.4963435,0.1272074,-0.0493185,1.647461,0.9166148,1.074008,-1.893498,1.828603,0.07526374,0.6912167,0.6852019,-0.2038325,1.443858,-0.916023,2.129629,-1.875723,0.3551776,0.6127713,1.488324,0.5115789,-0.5326624,0.2539355,0.7507837,-0.5630785,1.703214,1.436579,-0.1221627,0.3741568,0.4795378,-0.5350149,-1.506282 +-1.16438,-0.1790229,-0.6983562,-1.179827,0.2393708,0.4925463,-0.1707148,0.004056834,0.8484878,-0.1006489,-0.4310825,-0.5457021,-0.3306798,-0.6464718,0.2295259,0.6324204,-0.3071329,0.9458635,0.2258505,-1.424889,0.3382662,0.454015,-1.285283,0.1360886,-0.8814312,-1.130837,-3.426028,-0.3635639,-0.4276277,0.8527237,-0.385516,-0.1846566,0.1228342,-0.724022,-1.376222,-0.2802839,0.08313377,-1.372667,-0.3416446,0.760085 +-1.982348,-1.419072,-0.4272629,-0.8287056,0.1164812,1.014431,-0.1341332,0.05936647,-0.6722437,0.04968599,0.5764007,-0.943769,-0.3528281,0.01151775,-3.12958,-0.4050274,-0.8843702,0.6556913,0.06714934,1.148607,0.3702423,0.4974983,1.475763,1.931261,1.812392,-0.3750889,0.7084582,-0.3340214,-0.3057681,1.341293,-0.5927457,0.5538085,-1.83366,1.07665,-0.5512168,0.1514892,0.5240782,1.090267,-1.047123,0.6050645 +-0.9899442,0.6308903,0.08298622,-1.155484,-1.025905,-0.6242414,-0.487671,-1.164429,-0.9986531,-0.3528707,2.445793,1.089387,-0.2694517,0.8016824,0.9566385,-1.199388,-1.11516,0.7826625,0.06606078,-1.992284,-0.383239,0.01445895,-0.4733161,-0.2702164,-0.852797,0.9543671,-0.2166478,0.3000456,1.635664,0.3763525,1.124272,-1.21548,-0.9114118,-1.187337,-0.2770951,-0.4517749,-0.0417958,-0.25122,0.5379805,1.770776 +-0.1516846,0.4198107,1.454363,1.10391,-2.084404,0.220936,0.2845774,0.58689,-0.1777819,0.1183115,0.6355814,0.7302613,2.537724,1.127578,0.1533079,-0.5222487,0.6012617,1.229808,-0.03866817,-0.02579833,1.442283,-1.056191,0.7412798,-0.02286563,-1.673114,0.1275731,1.889048,-0.940843,1.147455,-1.563294,0.5111549,-0.4852451,-0.2505319,-0.8043822,-0.983497,0.01146862,0.3692725,-1.418088,2.076859,0.955553 +0.9125068,-0.7764052,1.265366,-1.415666,-1.812221,-0.0910377,-1.264464,0.167834,-1.655672,1.134738,-0.4659306,-1.502645,0.160072,-0.4208025,-0.6675652,-1.188431,1.086513,1.125657,0.4118491,-1.288874,0.07329637,-1.574418,0.4526229,1.127255,1.844854,-0.05688944,-1.300565,0.1661494,0.06501717,0.1635806,0.6854992,-0.7256756,-1.227867,-1.560387,0.605273,-0.328864,-0.083747,-0.1464706,1.147505,0.699894 
+0.4076698,-0.7766187,-0.3608104,1.3297,-0.3242543,-1.207956,-0.4302414,-0.4745689,-0.282905,-0.791283,1.010326,0.8463619,0.3032152,1.912569,1.636795,-1.68093,0.8674485,-1.175177,1.515217,0.4130539,-0.8797614,0.6295347,-0.1787619,0.9821765,-3.218199,1.053828,1.295535,0.8000571,0.1243728,0.8037721,-1.82922,0.2614482,2.864849,1.059206,-1.501144,-0.7723926,0.456752,0.05667141,1.201587,1.990335 +-1.242184,-1.020168,-0.9766893,-0.3029013,-0.8964194,0.01371212,0.9241838,1.913609,-0.1927227,-0.6352086,1.2901,0.5011166,-1.071453,0.4125865,-1.238873,-0.7784867,1.284378,-2.290645,1.299704,0.5689834,-0.7409213,0.1681686,-1.126167,-0.7426129,-0.3316019,-0.2231559,-0.1521666,1.004133,0.755524,0.3656353,-1.443214,-0.07534419,0.6394425,-1.174,1.648096,-0.002096138,0.3706955,0.5346252,-0.4090461,0.5652727 +-0.6426944,0.1675418,-0.389943,-0.131608,0.7556421,-0.4980104,0.2813029,-1.784346,-0.1011858,0.04219682,-0.1249959,0.3087585,-1.490353,-1.306189,-0.4388994,-0.8322356,0.3816092,-1.186369,0.05359036,0.4610154,0.8691647,-0.5392194,0.2197964,-0.8152101,-1.0865,-1.554078,-0.3655183,-0.6405002,-0.2387331,-0.250239,-0.5436043,0.738873,0.09549265,0.6896223,-1.065431,-0.1342571,-1.938029,0.03403801,0.4886932,0.9724909 +1.930244,-1.247604,0.4179004,-1.703542,1.804803,0.0460844,-0.3145566,-0.1514579,0.07815343,-0.9575136,-0.2721066,0.5471546,-0.08848995,-1.760553,-0.6523371,0.3814843,-0.000706276,-0.4894646,-0.794713,1.727494,-0.6215419,1.340758,-0.347094,-0.1926771,-1.801749,-0.2413596,-1.692998,0.3777801,-0.7913565,-1.126952,-0.4887105,0.6481156,-1.229723,0.8302509,0.04539357,0.4104464,-0.7457222,-0.1468996,-1.319291,-0.8702658 +0.4101994,-1.258033,0.09682857,1.385222,-1.550478,0.3729059,0.4716555,0.9892033,-0.4449705,-1.358603,-0.1248626,0.631965,-0.7598479,0.5300031,-1.395578,-0.05307235,2.266444,-1.09524,0.7561498,2.212044,1.248591,-0.8379197,-2.408987,2.698075,0.0967107,-0.179297,2.827858,1.413385,1.657236,-0.1542623,0.06568358,-0.3895888,-0.7932249,1.072193,0.8205705,0.635947,-1.390647,-1.888654,-0.2399708,-1.034715 +-1.291349,0.2400204,0.2339679,2.339104,-0.1006846,0.9191041,0.528198,0.5111019,0.3902742,1.261739,-0.3368015,-1.326168,-0.4849521,-0.6183358,-0.3610408,-0.1770483,0.5308012,0.6698929,1.14266,-0.4956349,-0.4991483,1.319884,-0.2416706,-1.6521,-0.4646026,1.156859,0.4442398,-1.22287,-0.8134998,0.4604609,-0.1881816,1.141111,2.320001,-1.746631,0.07853343,0.5122565,0.2454536,-0.2958694,0.7964273,-0.4493783 +2.635045,0.3537255,-0.003517701,0.8373573,0.4121953,-0.2367585,0.86406,-1.055282,-0.05752386,-1.339385,-0.7242957,2.75591,-1.607187,1.059739,-1.030828,0.7154662,-1.023325,-0.04318859,1.153478,-0.8456173,0.7827711,-0.2156038,1.808307,-1.902659,0.928527,-0.2173259,-2.45139,0.765105,-0.01509842,0.0591651,0.7848745,-0.02934656,0.3337205,-0.7628738,0.01816081,0.8181264,-0.2019331,-0.4829486,-0.1462439,-0.04883229 +0.4870723,-0.4813691,-2.091426,-1.785015,-0.04477948,-2.339387,-0.8410644,-0.60266,-1.161276,-0.5630767,1.30917,2.045881,-2.15328,-0.6157261,-0.5694721,0.7712914,0.8261777,-1.210571,0.1553875,-1.200547,2.305751,0.1157469,-0.5002413,-1.250409,-0.7187815,0.8210137,-0.7525227,-0.04135289,2.161159,-0.6416537,0.08147294,1.839036,0.461307,0.939566,2.110389,-2.176385,0.4167993,0.6257114,0.3746746,-2.312146 
+0.8538923,-1.848133,0.1092008,0.7459048,0.9657262,-2.079157,-1.560338,-0.1237286,0.09381353,1.281028,-0.1994629,-1.246692,-1.486027,0.6959804,0.08001262,-0.1450899,0.3158853,2.015637,-0.5612582,0.8282719,0.6016893,-0.9140131,0.8692069,0.502349,-1.106129,-1.160234,-1.295165,0.3819267,-1.163277,0.1254232,-1.084408,0.252293,-0.4839096,0.7655403,1.861803,-0.6238026,-0.6143545,0.3444116,-1.603399,0.8286573 +1.088443,-0.5639334,-1.114422,-2.067839,1.863605,0.1913018,-0.3327562,-1.294499,-1.647925,2.394945,0.5306486,-0.515803,-0.3702679,-1.538152,-1.508654,1.215225,-0.7011313,1.376264,1.158051,-1.545006,-0.6089442,1.320668,-1.028933,-1.307768,0.5819859,-0.4820592,0.6732108,-0.3884038,-0.006388688,-2.060946,1.495568,0.7863587,0.04237295,0.9723151,-0.2785077,0.2017627,0.4557345,1.545779,0.8033347,-0.5558714 +0.226014,-0.9702911,-0.1114619,0.2407248,-1.83308,0.2160934,-0.7947617,-0.8118307,0.1654719,-1.928625,-0.4635866,-1.981509,-0.3977675,-0.7116593,0.03661129,0.4657963,-2.382379,-1.07244,-1.124419,0.5677278,0.5658172,0.8585999,0.9065388,-0.4656176,-0.3504488,-0.3253498,0.4482964,0.5287532,1.919453,-0.4418257,0.5005289,0.7575802,1.209792,-0.1607456,-1.425609,1.055094,1.946121,0.5247941,-0.1112375,1.257717 +0.06819884,-0.3975606,-0.1442594,0.1509398,0.4060252,0.6337611,1.34539,-1.814753,-1.195772,-1.113294,1.553314,-0.603419,-0.3672889,-0.4572879,0.3734229,0.9077597,0.4522053,-1.577773,-1.39153,0.762226,-0.79787,-2.485987,1.598613,0.6511837,-0.4811609,-0.5604498,-0.4666708,-0.3844609,0.2262838,0.4082631,-1.062128,0.5201262,1.49366,-0.6617422,-0.5819282,-1.377867,-1.020536,-1.471227,1.176682,0.4409889 +-0.9848155,0.4190441,-1.005571,2.049432,-0.7067314,0.4215432,0.5384528,0.4552758,-0.3161609,-1.255374,-1.904983,1.512115,-0.8626427,-1.236819,-1.466885,-1.123885,-0.3088317,0.6092863,1.10583,-0.4157419,0.2190566,-0.5386338,0.02751464,-0.207018,-0.04439759,0.1982157,-0.6100871,0.6874858,1.374526,-0.4762003,0.6106105,-0.02097285,-0.549432,0.5782683,1.065591,-1.700919,0.2631034,-0.55839,0.5280273,0.1050178 +-1.310854,-0.01006552,0.5985845,0.602419,1.927392,-1.218213,-0.29173,0.3037121,2.057555,0.7349145,-0.3357125,0.1025842,0.01770182,0.02555803,0.9679417,-0.5257265,0.3787359,1.615145,-0.2717789,0.1914814,0.03807136,0.1916898,0.6999029,0.4719374,-0.6558695,-1.852444,0.1066784,-1.179322,-0.4871721,-1.090667,-0.163822,-0.8949793,-0.2879593,1.146625,-1.296878,-0.5664922,0.8799135,0.4087121,0.8377621,-1.268535 +2.464055,0.3544643,0.04301727,-0.2288542,1.565171,0.7123957,0.5565567,-1.087641,-1.32876,0.6262865,-0.06405531,-0.6608462,-0.7042631,-0.3895734,-1.999414,2.1886,0.2785203,1.118843,1.424586,1.393447,0.000440786,1.1681,0.9569187,-0.6020843,-0.09777792,-0.836957,1.375971,-2.275172,0.7073082,-0.1894233,1.661967,1.088626,-0.3826742,0.03214816,-0.7829124,0.6545863,-0.2468149,-0.1433669,0.6340833,0.9067661 +-0.6654281,-0.5498684,0.3614288,0.6126973,-0.3946728,-0.7512104,-0.3068004,0.3192403,-0.6196412,-0.848217,-1.035458,-0.899959,-0.7810163,2.072894,-1.181569,-0.6579035,-0.5701468,0.6310074,1.024614,-0.02191256,0.6925641,-2.335683,1.216291,2.983571,-0.04035698,0.2572309,0.3457543,-0.5532642,0.6105338,-0.5950278,-0.5630547,0.9573384,0.8274917,-1.475223,-1.53954,-0.1786546,-0.8172757,0.08247632,1.224706,0.138489 
+0.9128626,0.7311192,0.136305,-0.1312517,0.9314308,1.182303,-0.2949112,0.7804417,0.2815179,1.084047,-0.5756945,-0.1690081,-0.4638573,0.059379,0.9035276,-0.8452598,1.304076,1.54281,-0.2273242,-0.4734271,-1.924815,0.2707546,0.6833835,0.2023928,-0.4629197,0.9273513,-0.3333329,0.283984,-0.9228957,-0.07837714,-0.1346229,-0.571082,0.9495626,0.7158472,0.4492502,0.7655127,1.597901,-0.2354673,-1.856652,-0.451977 +0.9646642,0.3131712,-1.761578,0.4665767,-1.040387,-0.8570967,-0.4707757,-0.3507871,0.04833043,0.8156495,-0.2544681,0.1432911,0.9336052,-1.439503,-1.003851,0.5246638,-1.560923,-1.879785,3.03802,-1.309087,0.1430144,0.8600417,-0.1955225,-0.04550714,-1.018488,0.8882222,1.020438,-0.9379776,-1.021864,-2.515649,-0.6993046,-0.8404053,0.02506857,-1.410576,0.1120935,-0.9062432,-0.5556012,2.377724,0.3159528,0.4706662 +1.608003,1.540053,0.6231604,0.5779889,-2.65494,-0.05695113,-0.8814009,0.05470677,0.6155159,-0.4895321,1.31104,0.4125754,0.7887126,-1.323101,-0.614166,1.032662,-1.960672,-0.3952412,-0.3676364,0.262641,-0.1128682,1.270014,-0.1877121,1.021342,-0.5712966,-0.9376007,-1.182474,0.08789391,1.203349,0.3008092,-0.3508078,-0.02391086,-1.755881,0.4711969,-0.4129634,-0.5060355,0.40327,-0.3662533,0.1978068,0.9641451 +1.8354,0.3020303,-0.2490861,-0.3166106,0.3902921,1.129204,-0.9088993,0.3903783,0.2349899,1.605518,-1.033512,1.274219,-1.054967,-0.1124146,-0.008370327,1.215618,0.3655303,-0.3186228,1.186036,1.617887,1.186027,-0.434305,-0.07603606,-0.308749,0.4482352,0.5323165,-0.3169575,-0.7163595,0.7222997,0.3664887,0.3710938,-0.1601732,-1.441013,0.4709959,-0.6111621,0.6622975,0.4149865,0.3623335,0.4678918,0.6434615 +0.7024627,-0.4090037,0.0931224,0.5354361,1.048508,-0.2947133,-0.5952464,-0.256424,-0.3008951,-0.01002792,0.6851604,-2.10224,-0.08813743,-0.00473137,0.6674561,-0.05946853,-0.1251135,-0.125558,0.9925046,0.7541903,-0.2942556,1.569779,-1.255116,0.8650623,0.2478507,-0.961649,0.8608857,1.597585,-0.8215657,1.208851,-0.5441086,0.6140321,0.3636537,0.2490854,-0.04324698,0.6420457,0.2705311,0.1441272,-1.444519,0.9885208 +1.217854,0.4545041,0.0229701,-0.2197846,1.141921,-0.8233888,0.6477268,0.183409,1.51733,-0.9595713,-1.766895,-2.141338,0.5471572,1.414553,-0.06745351,-0.01383649,-0.2267996,-0.1270243,0.5204182,1.166341,1.947542,1.793622,-0.3086271,1.222507,-0.9217885,1.043977,-1.114335,-0.878713,-0.4890416,0.3566784,0.3510035,-0.3759586,1.642901,1.237441,0.5348325,-0.0317216,-0.8683716,-0.5136849,0.2591392,0.03312244 +-1.123654,0.05235751,-1.181833,1.094751,-0.7048012,0.8948622,1.046955,-2.165586,-0.6232483,0.3500613,1.373951,1.747008,0.2505657,-0.9241647,-1.889878,-0.4250167,-1.657212,-1.637282,0.1208898,0.7950804,-0.8476339,-0.02340271,-0.2888365,1.069375,-1.421616,-0.9546454,0.8135053,0.03083782,-0.04131537,1.471589,0.84384,0.8540956,0.953223,-1.333939,0.2154188,0.5427136,-0.3077009,-1.008394,0.1851927,-2.110107 +0.6683301,-1.423834,0.07072333,0.7071896,1.69613,0.707656,-0.7658572,1.298114,2.049302,0.1954158,0.3480181,-0.7729467,-0.5455248,0.1339297,-1.129803,0.4256374,-0.8555766,-0.3152767,1.383104,-0.5039054,0.9182134,-0.5055847,-0.4096059,0.07502142,0.6197019,0.2267762,0.5374429,-0.4753591,0.70862,-0.4228761,-0.95008,-0.5490527,-1.055132,-0.4367013,-1.54339,0.5200778,-0.3158507,0.3173063,0.4744357,-1.006122 
+1.216411,1.338651,0.5695853,-0.9456515,-1.362054,-1.420268,-1.656033,-0.4317016,-0.7533625,-0.05363257,0.6413286,-0.979699,-1.598028,0.4268922,3.000633,1.34227,1.775127,-1.694579,-0.791342,-0.7612707,0.2266386,0.01689461,1.25503,-1.212906,1.4313,-0.7465841,0.124188,0.05469968,-2.421892,0.2807495,1.529716,1.045568,1.483493,0.5658135,-1.640693,1.088284,0.2316654,-1.104949,0.6702502,-0.420626 +0.2345754,-0.8700074,1.676093,1.295376,-0.2491367,1.161205,0.8440296,-1.00644,0.9215967,-0.1250391,-0.466238,0.5703874,-0.3122418,0.2085879,-0.4718624,1.01297,0.7957008,-0.4340202,-0.5938136,-0.4632908,-0.5832007,0.1336176,-0.5445903,-1.67021,-1.379975,-0.9791089,0.6485006,-1.209926,0.2995453,-1.002872,1.329837,-1.16956,1.21324,1.228174,0.4634899,1.651767,1.033088,-0.5688865,-0.2688467,-1.951882 +-0.4186966,0.06074699,0.803025,0.2591651,-0.122321,0.06785694,-0.8772493,1.241165,1.737385,-1.176887,0.77533,1.247307,-0.2089773,0.000262633,-0.02544143,1.388705,0.1354294,0.5380928,-0.8628266,0.09588533,1.277258,-1.236791,-1.089643,-1.396922,-2.308722,-1.173322,0.7591366,-0.6559374,-0.240216,-1.246788,0.2607796,-0.2502555,-1.048672,-1.515079,-0.6433393,-0.7053206,-1.305809,1.149587,-0.2894632,-0.1760086 +0.2382201,1.686845,-0.856123,0.5060116,-0.1344096,-1.754255,-1.152441,-0.2110212,-1.566881,0.6130496,-1.139488,-0.6625011,0.3983425,-1.205835,0.2630506,-0.05468113,-0.3746061,-0.2830932,1.674902,0.04374227,1.381347,-0.7935688,-0.4338131,-2.973223,0.07220986,-0.6210847,0.6241278,1.396546,-0.4697152,-1.255814,-0.04117677,-0.9132548,-0.5923349,-0.05061346,-0.1299614,-0.5569036,-0.1417726,0.5357625,-1.544199,-0.1538437 +-0.5505882,-0.463645,0.0453635,-0.4098308,-1.232619,-0.6405494,1.683385,-0.3252629,0.5783226,-0.7232077,-0.8026548,-0.1909582,0.3068097,0.2014514,0.6488076,-0.7814485,0.0784779,-0.6857908,1.301157,-0.8523321,-1.308506,0.6019841,-1.586333,0.9388556,1.797681,-0.2054735,-0.3503643,-1.00046,-0.6199803,0.09356256,-0.3996831,-0.2546693,-1.754701,1.786275,0.04247806,-1.207896,-0.4162312,0.03367258,-0.6886956,1.709951 +-0.5006028,0.8789056,-0.226853,0.4353847,0.4087219,-1.14294,-0.7585139,0.114983,-0.0652218,-0.1264638,-0.5974609,0.9777462,-0.5754411,0.06552377,-0.2745341,-2.360588,0.5882597,0.9160039,-0.5112796,0.03789206,-1.095127,0.1579764,1.732928,0.9164393,1.01265,-0.6266491,0.4932787,-0.2290319,-0.9630172,-1.42711,-0.3008726,-0.2062992,0.7314594,0.7653902,-1.732834,-0.1157084,0.52225,-2.171272,1.308527,1.76363 +1.163897,-1.202652,-1.332063,-0.1634118,-0.5481565,-0.2104632,1.710819,-0.6179278,-0.0903036,1.496922,1.947748,-0.06011508,-0.611559,-0.08425417,0.9369186,1.506208,-0.4746355,-1.740901,1.226616,-1.303575,1.059671,2.592165,-1.12131,-0.0273374,1.141426,-0.6498112,0.5917718,0.4360191,-1.969368,1.388221,2.391839,-0.0667738,0.1230622,-0.4606303,1.683836,0.2883724,0.2349416,-1.07839,0.9955429,-2.246501 +2.155537,1.047238,-0.01731969,0.3645467,0.8125014,0.4673648,-0.3514361,-0.2700385,0.9400272,-0.4041381,-1.008239,-0.6711938,-0.08947251,-0.5346221,-0.1630639,1.2845,0.716148,-1.056283,-1.384017,-0.5306734,-0.413271,-0.3839145,0.1808662,0.6924967,1.513068,1.282016,2.128228,-0.794625,1.334454,0.0628463,-0.5372733,1.897399,0.2072807,-0.5951862,-0.5458555,0.8245152,-0.74484,-0.3997493,-0.1071329,-0.6685892 
+-1.709157,0.5177404,0.01618453,1.525125,0.2673568,0.3413148,0.2097596,-0.6842006,-0.0897425,-0.02566968,-0.9808353,-0.7595639,-1.054345,0.3097499,0.3883144,1.370221,-0.3714719,1.569404,-0.4797104,-1.598796,-0.2386062,1.263999,0.3929872,-0.5736283,1.349884,1.55894,0.07098176,0.1064263,0.4762365,-0.09780268,0.5167802,-0.2858449,-0.7875076,-1.289067,0.405454,0.06478327,1.123208,0.9238945,0.934757,0.3361331 +-1.600823,-0.6395313,0.6276257,0.4550406,1.252515,0.2800277,-0.794616,-0.008655512,-2.471396,-0.6423625,-2.110578,0.2789303,2.435073,3.028498,0.6251072,0.9368024,0.4267038,0.5677552,-0.5606914,0.3964004,-2.048692,0.02714243,-0.716148,-0.942567,-1.142837,0.8150733,0.4576756,-0.7290075,-0.8758758,0.9875533,-1.948845,-0.3444031,0.1080229,-0.5584281,0.1977467,0.6159064,0.06372396,0.5201033,-0.5053734,-0.01651563 +-1.038553,-1.12191,0.1584738,-1.568379,-0.1881953,0.1845084,0.4699047,1.72394,-0.02290601,1.753615,-1.128392,-0.8221546,0.1157568,-0.6632653,-0.1734884,0.5030601,-0.905541,-2.059535,0.5476462,-0.514766,-0.3866806,-0.6593369,1.517368,-0.2885863,-1.452416,0.2308839,0.9366343,-0.04171376,-0.07834722,1.300868,0.4849613,-1.114286,0.5636652,0.1000263,-0.4756349,0.6288057,-0.3777466,1.373353,-0.8662566,1.576415 +0.3230942,-2.241049,1.49298,-1.528661,1.414555,1.274788,1.084742,0.9444138,-1.164662,0.5546293,-1.023625,0.2751953,-0.5413938,-1.610705,0.8536547,0.08522645,1.169526,1.396487,1.037081,0.2132255,-0.3089974,-0.333532,-0.03933895,1.247544,0.4150628,-1.132636,-1.846395,-1.430153,-0.62895,0.0763028,1.143605,-0.06689404,-0.6930507,-1.027624,0.3522596,-0.2404705,-1.229657,0.1411961,0.8447603,1.054553 +-0.8888472,1.502026,1.634858,-1.150307,-0.7001405,-0.09585824,0.3186545,1.122479,0.08220506,0.5015589,0.3504832,0.3947851,0.202569,0.9594504,0.9114663,2.269808,-1.429595,-1.210497,-0.4335194,-0.6672135,-0.5572725,-1.619531,-0.3970662,0.3559477,0.1212372,-0.08056368,0.8698363,-0.3036652,1.303293,1.34339,0.2782463,-1.454392,-0.1690848,-0.105265,0.4586354,1.031368,-0.3959911,0.4788168,-0.1611389,-1.35475 +0.393679,-1.105561,-1.25926,-1.201834,0.2640331,-1.225362,0.3352738,-0.1066308,0.07637351,-1.850068,0.4523358,-1.997774,0.2871307,0.8086893,-0.9343955,-0.2568269,0.4706302,-0.8693069,1.594976,-0.8671817,1.473372,0.8519577,-1.453945,-0.5092017,-2.103682,-0.07247907,0.9914143,-0.4390896,0.4883266,2.031804,1.055518,-0.1217906,1.194271,0.7079433,-1.260951,-0.04673708,0.6037955,0.8423278,-0.5612606,-0.3860779 +0.2365415,0.4078813,-0.2492927,0.9932333,-0.07365947,1.308033,-0.3763208,1.509093,-1.386966,0.1189573,0.6904818,0.7978214,0.1275072,0.4859854,-0.4721638,0.6217434,-0.2279089,-0.3522012,0.19196,0.6291965,1.322636,-0.2663065,-0.1574269,1.673723,0.6318529,0.3390617,-0.8965017,0.638269,0.5734232,1.052049,-0.6137083,-0.170642,2.504841,-0.05347834,-0.8319932,-0.4873905,0.79282,-1.105017,-1.183128,1.839476 +-0.4304968,-0.489039,0.9591378,-0.2236106,0.953849,0.822736,0.0454194,-0.865347,1.597564,0.8967648,-0.3562006,-0.3148437,0.2660166,-1.516199,0.1734365,0.4245589,-1.03957,0.1751209,1.025468,-0.2584954,0.99804,-0.3386467,1.318426,-0.4580694,-1.13513,0.7374153,-0.9740518,0.07832446,1.091434,0.214537,0.2010326,0.08315937,0.4895405,0.9707381,0.91991,-1.866354,-0.09281816,-0.6859636,0.3416733,1.489661 
+-0.5479331,-1.302789,0.5413121,-0.2864519,-0.6290312,-0.5796001,-0.9921678,-1.029456,-1.271543,-1.788415,-1.28806,-0.5763668,1.250459,-0.5545002,-1.026239,1.388428,0.1825618,0.345803,-1.622005,-1.010356,0.7374495,-1.031199,-0.2198874,0.09143,0.1799582,0.3924017,-1.123531,-0.3366569,0.1648686,0.5651926,-0.4846877,0.3269186,0.4997185,-1.736631,-2.526,-1.223635,-1.120342,1.878687,1.64178,-0.9042495 +-1.322252,-1.02161,1.130788,0.8840154,-0.1265069,0.517941,-0.7569179,-1.738104,-0.584159,-0.7761728,-1.323616,1.238183,-0.2954423,1.759257,-0.8144726,1.313355,1.391404,-0.7147952,0.6333166,-0.6826964,-1.327618,-1.39309,0.354172,1.617131,-0.8497452,-1.636435,1.790203,1.685328,-0.8055767,-0.6269419,-0.1632875,1.368671,0.6574046,0.3330833,0.5044933,0.487489,-0.5535376,0.6985563,-0.447193,-1.18699 +0.6821267,0.3571815,-0.4524806,0.5370022,0.9402107,-1.195285,0.907594,0.2410623,-0.8288507,-0.07875931,0.1543771,-1.496925,-0.4931189,0.3867188,0.608266,-0.4446125,-1.534936,0.1199665,1.108721,-1.09024,-2.801002,-0.3595025,-0.7253366,0.3078136,1.461925,1.570694,1.454563,-1.477703,0.08199994,0.1047782,-0.6327816,-1.440342,0.1437303,-0.5602102,-2.136365,0.9556909,-2.318319,-1.82682,0.06025587,1.740098 +2.162789,-0.6410703,-1.047744,0.7846147,-1.914673,0.5119641,-2.532364,-0.8440128,-0.4620012,0.8376503,-0.2493363,0.5565506,-0.2610403,1.450116,-1.47039,0.4244687,-0.8081227,0.685332,-0.3968414,-1.701355,-1.31025,0.03989609,0.3434239,0.3424714,-0.2408352,0.7433369,-0.648279,-0.6851902,0.8901264,-1.058897,-0.9898788,0.9590566,-0.6579394,-0.1739291,0.3143434,0.7079181,0.4452103,0.6594265,0.8214313,-0.6053317 +-0.4166696,-1.01507,-1.17224,-1.144563,-0.1169741,0.188916,-1.317159,0.7568717,0.5518678,-1.656481,2.119877,-0.7854275,-0.8114075,-0.1155658,1.605627,-1.041631,-0.8156291,-0.1147435,-0.03541355,-0.5012995,0.07571896,-2.889864,2.54023,-1.103211,-0.05864184,-1.024034,0.3092297,0.2811418,-0.5168658,0.211009,0.4780202,1.610703,0.4810055,-1.622075,-1.68009,0.5122122,1.268366,-0.1696115,-0.5533796,-0.2264667 +-1.357318,-0.1112854,-0.8142653,0.6577703,-0.6252379,0.3592326,0.2955018,0.191601,0.6770821,-0.1297353,0.001360476,0.639054,0.8290064,-1.069263,0.7232465,1.186588,0.3853772,0.06756792,0.6202307,-1.081112,-0.8195871,-1.616727,0.826665,-0.2565171,-1.449061,-1.0135,0.444965,0.1281297,-0.4734205,-1.809098,0.6749107,0.4251941,-0.08686487,-0.6559256,0.2790229,-1.220776,1.691225,0.3571568,-0.4983048,-0.777557 +-0.6712265,-0.1920835,-0.3141614,-0.3656765,-0.05588486,0.8471195,2.896773,-0.2972406,1.317485,0.6086802,-1.719877,-1.124805,0.06684865,-1.277645,-1.480444,1.146843,-0.6244114,-0.4436877,-1.931755,-0.9173265,2.03685,-1.252236,-0.7243645,-1.183186,-0.7070854,0.5255143,-0.5545369,-0.01387094,2.326301,0.8675695,0.5109523,0.6296963,-0.4863505,-0.9375777,0.5041549,-1.257988,0.3004964,0.828316,-0.09153012,2.479717 +0.6499182,-0.9098442,0.548228,-0.570528,2.021838,-0.7952944,1.289116,-0.3569958,0.1365639,-0.03261653,0.5263885,0.5014416,0.1624349,-0.8542522,0.8518285,-1.184496,-1.23265,-1.717943,0.2722894,0.05413117,-2.193654,-1.627025,-0.6293427,0.2443552,0.6481842,0.7873742,-0.2143016,2.242434,-1.214289,2.60414,-0.06205845,0.3216666,0.423742,-0.6469326,0.4990856,1.113969,-0.2089231,0.2853165,0.7070079,0.9096645 
+-0.1939354,1.228837,-1.159106,-0.1932489,0.3672703,0.4904138,-1.479577,-0.4289105,-0.65537,-0.7604117,-0.044244,-1.945046,0.8242406,-0.7312262,-0.08452153,-0.09811314,-0.7327796,-0.99898,1.133271,0.444435,0.3945699,-1.385584,-0.4698211,0.9016065,-1.810385,-0.2047001,-0.7773928,0.7627114,2.222239,-0.8562073,0.108874,1.676657,0.781844,-1.510162,1.369367,0.0897136,0.68259,-0.4385586,-1.457719,-1.395621 +1.370223,0.7140187,-2.702398,-1.676765,-0.4101741,-2.341325,1.67568,-0.09376718,0.04443634,0.7410935,0.6595039,-1.461047,-0.3949612,-1.275203,2.042457,-1.167097,0.8230122,-0.1131344,1.343901,-0.4299534,1.015681,-2.331495,0.5426333,-0.9097603,1.925656,-1.420492,0.2324756,-0.7302284,-0.8678232,-0.57049,-0.151212,-1.655854,0.623394,-0.6416815,-1.547659,-0.8356488,0.4546564,-0.5165349,0.03326926,0.3849062 +-0.5153573,1.085459,-0.5383421,0.04397581,-1.439092,0.4711028,0.8754111,0.6251624,-1.050832,3.02374,-0.2085767,-1.860014,-1.453618,0.845618,-0.422955,0.1729473,0.1702188,0.01749665,1.35185,-0.6891397,0.8308723,0.3595775,0.6422139,-0.2445728,0.4513255,-1.136867,0.4072145,1.758424,-0.9851947,-0.8715924,0.1119321,0.2091193,-0.7646839,-0.6937157,0.648842,-0.997421,-0.5046331,-1.412958,0.7395,-0.4534625 +-0.7182328,1.534283,-0.1189226,0.5063283,-0.5912901,0.5853374,0.9807624,-0.4179639,-1.281196,0.3238057,1.148122,0.8641973,-0.9692873,-0.3677344,-0.5714933,-1.096721,0.3788316,0.5597868,0.7010584,-0.4648398,-2.314133,-1.364663,-1.545479,1.311289,0.2487733,-0.6120858,-0.7182653,-0.09341796,0.1714535,-0.849478,-0.1735877,-1.78439,-0.9322339,0.2527729,-1.301772,-0.79365,0.6274052,0.1186924,-0.6233292,-0.9679404 +-0.1772423,-1.839801,1.614656,1.046151,-2.490486,-0.2694558,1.204872,-0.99897,-1.197109,0.1998611,-0.4016193,-0.1165159,-0.1016521,1.878012,0.854863,-0.2042931,0.07055064,0.4115245,0.3662655,-0.662767,-0.9017235,1.264986,-0.6421991,-0.5124144,0.3567982,0.4927312,-1.735479,-1.362732,0.606627,0.4752698,-1.918996,-0.6355674,-0.9893205,0.4514094,0.7599696,0.2322365,-1.071692,1.921831,-0.5345935,0.3424539 +-0.4015765,0.04050863,0.311566,0.3077907,-0.8297747,-0.05145059,-0.03666272,-1.53976,0.7156289,-0.2388918,-2.360066,0.4146988,0.6339132,0.6515122,0.7449979,0.7394711,-0.8828132,2.564664,0.0295938,0.5890999,-0.5776012,-1.777972,-1.591969,0.3264507,-0.7095787,-0.3788179,-0.5698458,-0.1554877,-0.470147,-1.196836,-0.4796378,0.4495059,-0.563136,-1.020608,1.857387,-0.2780894,-0.8436616,-1.366618,1.394316,-0.4424175 +0.8298836,-2.150074,0.2927493,-0.3397066,-0.4554903,0.001035457,-0.751791,-1.426459,0.2452138,1.795649,-1.16361,0.2088904,1.71357,0.4827643,0.666941,-1.594908,-0.2059279,0.7823381,0.8251214,-0.4020874,-0.5486148,-0.02113,-1.190797,1.919213,0.818699,0.09295744,0.7536564,-0.4686544,0.742219,-1.460893,-2.359741,1.099341,-0.7716495,0.5415327,-0.4025531,0.1942125,-1.174603,-2.112701,0.8615868,0.5111853 +-0.7858066,0.2168707,-1.403742,-1.017951,1.750167,-0.1165226,0.3886935,-0.545762,-0.178216,-0.765796,-0.02592828,0.7537007,-2.189254,1.594501,1.485191,-0.5195804,0.3466412,-0.9452727,-0.2766708,0.1905319,-0.3027085,-0.9873269,0.06988376,0.2976584,-0.5296394,-0.8118774,-0.3217052,-0.263434,-0.4336403,0.5990392,-2.412977,-0.5842237,-0.6534225,-1.263922,0.1568016,0.4783302,-0.8200956,-1.059226,-0.6989662,1.384464 
+-0.9385012,-0.5978683,-0.6257315,0.8265614,-0.2344639,-0.4049239,0.6262611,-0.844156,1.036055,-1.427553,-1.954523,-0.6094386,1.660331,-1.041414,-0.6824143,0.3745886,-0.6352977,-0.7843429,0.1967295,-0.9664295,0.07307892,-0.4321093,0.5393529,0.2364827,-0.5808159,0.3056284,2.40931,-0.1320342,0.00936998,-0.0333482,0.8673556,0.5113558,0.4409492,-0.6883362,0.9933951,-1.074146,-1.983405,-0.1744128,-1.340334,-0.7125185 +0.4178068,-0.07602373,-0.02420947,2.084314,-1.299383,-0.4270475,-0.994879,-0.1829703,-0.1112808,0.1972629,0.3768992,0.7987283,0.2447267,1.052097,-0.01418731,0.4572685,1.395673,-0.5642132,-0.5679928,-1.108996,-1.533478,-1.850919,-0.555688,-0.692029,-0.860761,-0.122349,-0.03529024,0.9953555,1.155785,-1.010433,0.5511051,-0.7003298,0.3294363,-0.689629,0.3567351,-0.4955493,1.049272,0.4126494,1.077688,-1.526071 +2.098098,-1.059674,0.006136925,1.477442,-0.9698451,1.121721,0.7536338,1.28866,-0.3065058,0.6302357,-1.11968,-0.5809749,0.5506194,1.083936,-0.624602,-1.274843,0.6697044,-1.039967,1.067858,-0.2575783,-0.507245,-0.8693427,-1.778505,1.203781,0.3298878,0.1633836,0.5129508,0.2222023,0.3436957,1.423643,-0.2671519,1.288669,-0.3893058,0.6202851,-1.305788,-0.6459852,-0.6714492,0.5558598,-0.492904,0.4573786 +-0.4488623,-0.5475545,0.2961484,0.4635137,1.743532,-0.6584189,-0.1819729,-1.230875,-1.285065,0.09706811,-0.1143808,0.6236018,-0.05409099,-0.6924221,-1.29282,-0.6596877,0.7219569,0.2151821,-1.313848,0.906478,-0.2585824,-0.821741,1.395415,-0.2481665,1.035055,-0.4441195,0.5840731,0.8189293,-2.331039,-0.5689894,1.335425,-1.089605,0.223264,-0.2564635,-0.09469934,0.2094217,1.211903,0.2197366,-2.534788,0.1760449 +-1.095288,0.09534096,1.018094,2.462525,0.2806284,0.5793983,-0.153899,-1.579377,0.8247796,-0.1996461,-0.3122689,-0.7698656,-1.569829,0.7636012,-2.16116,-0.1462541,-2.377359,0.3453533,0.9044867,1.789588,-0.8035564,-0.2297193,0.2973638,-1.695003,-0.5805547,-0.0289293,-1.745393,-0.06222086,0.6563163,0.5900228,0.6157717,0.1490069,0.5828182,-0.5016257,-0.05141401,-0.5060997,0.3566996,0.03675512,-0.3045948,0.5223221 +-0.5289728,1.368866,2.471637,0.8572509,0.5914403,-1.848338,1.569437,-0.4172702,0.940001,-0.02749034,-0.815835,1.348143,0.6317047,-1.366692,-2.895259,-0.3599444,-1.155862,-0.3637451,0.5815732,0.4443446,-0.9573785,0.7398373,-0.6550463,-1.334823,-0.6661352,0.6746258,-0.3753236,-0.1014871,-0.5520552,-1.146725,-0.08959061,1.043,0.5375678,-0.3514907,0.5438636,-2.684712,-0.3365051,0.6731982,-0.978768,0.9542127 +-0.4847939,0.4420535,-0.656795,1.156168,-0.3236406,2.346841,1.277845,-0.9993984,-0.7416322,-0.8124573,-0.2011682,-0.8897332,-0.286638,0.4454689,-0.3498855,0.01727666,1.00575,0.5038434,-0.7093687,1.265471,-0.4015582,0.7971286,0.4970518,-0.09040133,0.3372049,-0.01713813,0.7179481,-1.413466,-0.9473663,0.3137003,0.04143388,-1.320771,0.1875392,-1.216695,1.014925,-1.524654,-0.6628252,0.102579,-1.120128,0.4682726 +0.127757,-2.075206,0.07741261,1.577984,0.6534517,-0.792182,-0.7637652,0.04651592,0.4343112,0.8442895,-0.02200158,2.771272,-0.3840892,-0.3233727,-0.9076824,1.641504,-1.175083,0.6872937,0.5125021,-0.4308002,-0.1170533,-0.2443473,-1.366535,-0.0336135,-1.577042,0.512758,0.6549617,-0.8234358,0.04518019,0.4465972,0.9729493,0.6266952,0.3264854,-1.017629,-0.07137766,1.055798,0.1259185,1.694331,-1.365464,-1.83559 
+-0.0746731,-0.3078097,-0.8428604,0.0184765,0.3078103,1.037716,0.1190489,-1.034839,1.83384,-1.526763,0.5861117,-0.908472,-0.792873,-0.962877,-1.384463,1.111248,1.163036,0.6586009,0.4141739,0.3331632,-0.5073853,0.5178965,-0.2544482,-0.9197155,0.2897267,1.271612,-0.1789726,-1.792211,-1.12519,-0.7287066,0.4671865,0.5856003,1.620895,0.4576349,0.6639818,0.8157481,-0.00118731,-0.6335277,-0.005152862,-0.2245014 +-0.3591621,-1.45413,0.2469591,-0.2943356,-0.7890811,0.7380478,-0.0771615,0.5387591,-0.4515531,1.811279,0.7124127,-0.8660151,0.1122984,0.9542626,0.3215295,-0.06319462,-0.4474133,-1.010216,-0.8332395,-1.69313,-1.263391,1.552225,-2.219039,-0.2682109,-0.3029294,-1.101283,0.4259012,-1.595875,0.6905423,-1.455964,-0.5495263,-0.646288,0.3821613,-2.772802,0.502557,-0.3993529,-0.4462389,-0.8643605,-0.09528278,1.30125 +-2.048481,1.309616,-0.3787776,1.488226,-0.3546811,1.652728,-1.434351,-0.2181465,0.6304216,0.5832535,0.2825382,-1.605496,0.24706,-0.139258,-2.179153,0.001207574,-0.1076137,2.712752,0.6041879,0.3296778,0.609813,-1.29881,0.8654422,-0.1861204,-0.3393326,0.3327688,-0.3034853,-1.421474,1.941336,-1.143433,-0.7711378,-0.6592028,0.9381236,-1.093657,0.04856186,-0.4554943,-0.6336901,-0.9256758,1.100038,0.07503668 +-0.75118,1.470087,1.866272,0.3315873,0.864689,0.4198822,-0.1752155,0.196478,-0.0515087,-0.2615173,1.504237,0.584046,-0.149514,-0.0369563,-0.04293746,1.442791,1.443634,0.626322,0.4275363,0.5931275,-0.4073273,-0.6344886,1.334337,-1.653667,-0.8147423,-0.5602565,-0.6236122,-0.887386,0.3572339,2.429428,0.7810276,-1.153867,-0.3798772,0.09705174,0.4525624,0.6860388,-0.7050472,-0.5517868,0.6491958,-0.6683385 +-1.581383,1.510429,1.002726,0.8497741,0.891798,0.1919998,-1.093004,0.872945,-1.0814,0.6968154,-1.92971,-0.7983398,-1.238607,-0.2631545,0.2214123,-0.1629631,-0.1801412,0.7899622,1.246359,1.897851,-0.3400075,1.600949,-1.302253,-0.9927275,-0.8810772,-0.0137109,1.767034,0.3542767,-0.6658862,0.7783715,-1.013425,-0.05264313,-0.4011113,-0.2024018,0.4041868,0.4625368,0.4905671,-1.293943,-1.247984,1.330134 +-0.2897761,-0.0576553,1.177859,0.6596057,-0.2480658,0.09016752,-1.287443,-0.07064647,1.439293,2.362008,-0.09752214,0.2003367,1.201468,-1.118143,0.6024723,-0.8668299,0.8429724,2.073774,-0.5081612,0.7973352,0.2820277,0.2088531,-1.092051,0.8981494,-0.6532369,-1.551898,-1.35094,0.06540692,0.6452637,-1.104402,-0.1101977,0.4672233,-0.9799883,0.123878,-0.07694058,-0.8789515,0.05156867,0.8619316,-0.05658448,0.1131836 +-0.6919545,0.1380716,-0.5891046,-0.8728608,-0.7054614,-1.5944,-0.1042234,1.042013,0.3498317,1.577361,0.260759,1.001263,-0.3320769,-0.4328898,0.09102899,-0.4739885,-0.9829128,1.063089,-0.0817554,0.3327249,1.366092,-2.859577,1.229952,-0.2660716,-1.529734,-1.46747,-0.940648,-0.570264,1.407535,-0.3830463,-1.10239,0.1492046,-0.5267169,-0.1625303,0.3447252,-0.3508788,0.8766137,1.298789,0.4051806,0.1077197 +-0.1719879,1.009344,-0.5855291,0.4288513,0.8143653,-0.7423833,1.300691,0.4550452,2.327501,1.01335,0.0160762,0.4580151,0.3420284,2.195063,0.1652199,-0.9160583,0.819843,0.9880255,-0.5925327,1.089939,0.632537,-1.21387,-1.041934,-0.137297,0.5398053,0.5200459,-0.2980635,0.8228995,-0.8994916,-0.8979951,0.0777989,0.7119009,-0.01941987,-0.01303168,-0.7016898,1.080825,-0.6709117,-0.4531859,0.2996466,-0.932134 
+-0.222564,-0.09618196,0.2781355,-0.8767307,0.1670905,0.2857291,1.016319,0.2809117,1.906161,0.5205394,-1.069573,0.1562365,-0.4379074,1.362578,1.905594,-0.3578751,-0.2550999,0.7743931,-0.2088635,-0.5894174,0.95736,0.2635021,0.6292257,-0.7066938,-1.015445,0.8029541,-0.000361113,-0.7279602,0.5686671,-1.348293,0.5688315,-2.126811,-0.5609892,-0.232582,0.5801354,0.004348886,-0.0442193,-1.007703,-0.02910733,0.7544239 +0.9192413,2.078037,-1.231661,-1.710208,0.3734085,-1.713572,-0.1638356,0.4337718,-0.8596743,2.570139,0.9762867,-0.358614,-0.850575,-1.427698,-0.2526963,-0.3122714,2.585663,-0.74244,-1.223048,-0.6137101,0.3435459,-0.2445721,-0.01079594,1.474492,-1.485113,0.2477273,0.3708857,0.8560124,1.775505,-0.6425558,-0.3053235,1.387922,0.6500262,0.886925,-1.510656,1.323377,0.2996441,0.5808143,-1.592743,1.69048 +-1.110065,0.04750139,-0.09925392,0.1777723,0.01880045,-0.2921124,-1.143426,-0.0721739,0.978036,-1.11137,1.304926,-0.7320965,-0.003437543,0.5738595,0.1962933,-1.761672,0.05417304,-0.9639456,1.050668,-0.9802547,-0.6473756,-0.7755843,1.349228,-0.6016806,0.3281614,-2.758022,1.535291,-0.99635,-1.976946,-1.548661,-0.2961464,0.178969,-0.875827,-0.07115995,-1.85664,-0.105789,-0.3319516,2.455376,-2.100429,0.2426189 +-0.660889,1.165511,1.496909,0.9401438,1.045894,1.734275,0.04400298,-0.08494047,-0.3780723,-0.9203296,0.8730825,1.418219,-0.6400618,-0.3992124,-1.319328,1.171872,-1.262666,0.8632278,0.5276494,0.3318594,0.5401479,0.1725886,0.1306527,0.5053446,0.1914662,-0.9166972,-1.341284,0.1117951,-0.586357,1.577034,-0.1884574,0.4908324,-0.3730769,1.399297,-0.07179216,-0.1141654,2.245909,-0.3807953,-0.4027332,-0.5924847 +-0.8737749,0.5927729,1.493886,0.3799607,0.6559319,-1.29366,0.4722078,-1.056361,-1.189893,-0.5093818,1.07628,0.6947358,0.211077,-0.003992903,-0.599351,-0.851438,0.4808406,1.63522,-1.366705,-0.1720818,-0.1903031,1.733688,-1.96196,1.51592,0.9592737,-0.585771,0.641703,-0.0506966,1.381385,-0.207607,0.1195091,1.443635,1.093378,-0.3000621,-0.3840668,-0.01518685,-0.07608788,-0.2211737,-0.2496722,0.5131082 +-0.1328097,-0.3230584,1.791011,-0.404598,0.9263784,0.1148715,-0.5518284,0.4759379,-0.717221,-0.2840246,-2.210356,-0.1523493,0.8268863,0.2442241,0.5697003,-0.9583584,0.4017332,0.05969752,-1.090689,-0.0444386,0.1822954,-0.7413461,1.54654,-0.9581277,-1.242009,0.4388507,-1.818876,-0.03210837,0.7972801,-0.1861148,-1.071322,0.003106399,-0.7307215,-0.4802591,-1.538546,2.630245,0.3667009,-0.8010664,1.747359,-0.6186274 +0.3054891,-0.07651839,-0.800101,-1.977789,1.514081,1.216414,-2.00569,-0.8151114,2.025992,0.06409434,2.318499,0.881826,-1.700631,0.8331052,-0.3662069,0.4534546,2.328703,0.3315364,-1.522662,-0.3179981,0.5887085,0.730288,-0.7718173,1.106264,-0.2472299,-0.3606266,0.3659957,0.1541722,0.8081382,0.6065228,1.283063,1.260977,-1.807505,-0.6932494,2.217701,0.4494208,-0.2438193,-0.07265222,-0.9017145,1.591327 +2.163842,-1.649038,1.675512,-1.099723,0.05771982,-1.717987,-0.1898932,-0.1476062,-0.5829734,-0.8188695,-0.7193217,0.8837705,-1.483124,1.278858,-0.6645896,0.1955412,-0.5077006,-1.831107,-0.02804558,1.342481,-0.009701257,-1.986049,0.06573029,0.5077371,-1.510215,-1.992987,0.1143794,-0.4699418,-0.2412685,-3.034959,-1.799324,1.834506,0.6706195,-1.651512,-0.3525574,-0.8513457,-2.355437,-0.3488044,1.525798,0.8323766 
+0.8803785,0.06120169,-1.231889,0.5465488,-1.135967,1.294808,2.023536,0.6384289,0.1177433,1.686652,-0.5899976,-1.570088,0.9261708,-1.308968,0.3204637,0.4314875,0.8934258,1.049105,0.8156035,-1.048202,-0.3182507,-0.6344909,-0.9387679,-0.7325027,0.3632654,-0.2316277,-2.526357,1.02349,-0.8138459,0.6872137,-0.1876022,-0.5135684,-0.05694956,-0.7065065,1.156722,0.8034203,-0.142158,-0.09729285,-0.7922974,0.4321555 +0.5051163,0.182151,0.2108499,-0.7902651,1.056117,-0.6169977,-0.7336647,0.4934163,-1.902449,0.001148574,-0.8686466,0.3574535,-1.127873,0.3757868,-0.1497014,-1.370371,0.9991144,-1.637224,0.49675,-0.2056437,0.6314508,-1.176841,0.8967773,0.9943536,-1.23869,0.009158919,1.285769,-0.936059,-1.915475,-0.1176728,0.6575151,-0.8192504,0.8499762,0.1812497,0.1643306,-0.1672759,0.7501259,0.3618628,0.7006649,1.185211 +0.1105902,-0.6582669,0.2532629,0.2857857,1.19686,0.8212279,0.470107,-0.7240713,0.1450898,-1.029038,-0.501673,-1.606059,0.5102454,0.09714411,-0.3601545,0.1634874,1.772031,-0.2695469,-1.140715,2.259175,0.6776797,0.8339273,0.2228924,-1.110729,1.491564,-1.497004,-0.6625869,0.3343717,-0.7783055,-0.3487591,-1.523484,0.2133569,-1.217435,0.9850235,-0.1662247,0.8711168,1.150166,0.3247081,0.7928933,-0.5434534 +-0.2928586,-0.8660844,-1.004358,1.796435,-1.187471,2.085638,0.6789906,1.240415,-0.7877213,-0.01645467,0.7633771,-0.4890914,-0.4776701,-0.4063459,-0.5918644,0.9883053,0.5582452,-0.1070981,0.2553093,-0.5002756,1.060886,0.6597849,-1.58724,1.315105,0.7187902,0.0967638,-0.6859233,-1.049488,1.617452,0.7676985,1.310831,-1.365955,-0.4934144,0.9061806,-1.82924,-1.568468,-0.590546,0.3256533,-0.03220045,-1.642533 +0.77232,-0.0633766,0.05610082,1.347658,0.7979841,0.5290523,0.3163751,0.8545153,-0.9526072,-0.2622099,-0.6118645,0.7354856,0.03760726,-1.641215,0.4186706,0.1068783,-1.283193,1.886679,-2.780292,0.2283446,-0.3760931,0.9377962,-0.9246233,0.5662146,-0.2172057,0.4739495,-0.1447337,0.2949921,-0.4663859,0.2101623,0.4998853,-1.675574,-1.099123,0.05621487,0.3798723,-0.5095599,-0.5550091,1.350515,-0.01304732,2.265172 +2.128794,0.0327901,1.048822,-0.1673823,-1.206299,-1.090675,-1.887,0.4356306,0.617957,-0.937813,-0.397776,-0.6924738,0.1306633,-0.1862207,1.607885,1.040349,0.4807429,-0.9045599,-1.239525,1.390174,1.139692,-2.245382,0.4236813,-0.8161695,-1.601442,0.02304978,0.08456156,1.646254,-0.5993563,-0.1554402,0.2807718,0.7537693,0.2347238,-0.04420234,-0.6254807,1.74866,-1.573628,1.101055,0.2154964,0.3769138 +0.547968,1.532263,0.166088,1.166263,0.3733508,0.4721306,0.2960565,-0.9095477,0.7884724,-0.5825195,-0.1478063,1.693902,0.706523,0.2046544,0.5868935,0.3603613,0.1016651,-1.186084,-0.1693457,0.3086295,0.6394043,-0.1658965,0.2810555,0.3788498,-0.3132606,-0.2593462,-0.2009764,-0.004768928,0.8393606,-0.8855025,-0.9035194,1.050536,0.4743661,1.688974,0.2325668,0.3730485,1.157168,-0.05700406,-1.32116,1.837574 +1.532129,0.4553214,1.270559,0.2910317,-0.3061602,-0.3010973,1.19252,0.7980203,-1.291672,0.3141322,-0.6121131,-1.501809,-1.458844,0.4106367,-0.177361,0.4374791,0.6024053,-0.1378595,-1.380215,-0.7460308,0.6002848,-1.733107,0.3227594,0.456473,1.897142,-0.2379548,-0.6861516,2.87835,-0.09120726,-1.053786,0.8050847,0.5954887,-0.7989315,-1.148702,-1.125188,-1.198091,-0.211668,0.6481398,0.4204593,-0.1130662 
+1.666431,-0.5640012,1.434283,0.6658383,1.415615,0.7646785,-0.5678377,1.313968,0.2479297,0.477483,1.052092,-0.5896461,-0.2566682,-1.249435,0.4120567,-0.2131678,-1.839697,0.5838681,-0.07006594,0.8426912,-0.670377,0.2870641,-2.340223,1.248788,-1.362842,0.2161571,0.7746065,-0.6093793,1.234509,-0.4512903,-0.1692797,-0.7381936,0.6488215,-2.422778,0.5261458,-0.5494979,-1.302949,0.5361625,0.712336,-0.8023909 +-0.478,-0.7649419,-0.07773389,-0.1814424,0.3294547,-0.5046282,-0.09430999,1.592583,-1.194787,-1.519144,-1.420327,1.074368,-1.291091,0.2040547,-0.09837569,1.506904,0.969519,0.9442605,-1.29198,-0.6455764,-0.5639535,-0.1456516,0.8425304,1.424801,0.3913242,-0.665111,-0.790715,0.3885541,1.1623,-0.2452658,0.2058875,-0.7643156,0.3933217,-2.359118,1.307168,0.3693774,-0.6599512,0.06717022,1.497082,0.6758814 +-0.3686816,0.5090528,0.3264012,-1.61287,0.7742416,-0.3152287,-1.044518,-1.965446,0.0294056,1.103286,0.8195077,-0.03304973,0.9938017,-0.1491718,-0.4607008,-1.006248,-0.04950375,-1.376796,2.338115,-0.2721036,0.6564532,-0.3399974,-0.09090936,-2.458433,1.127516,-1.252148,-0.4206962,-0.0332868,-1.147121,1.11441,-0.2116566,2.661412,1.232269,-0.7365212,0.8374177,1.483597,-0.7523016,0.3758089,2.373473,-0.08414806 +-0.5161661,1.568214,0.24397,0.1503741,1.012154,0.8392895,0.7920099,0.4641305,-1.386291,-0.2163785,-1.588327,-1.004094,0.1139168,0.5212814,1.338036,0.420552,-0.6568813,1.668885,-1.588964,-1.113702,0.983661,-1.591765,-0.9818282,0.654179,-2.572731,2.220769,-1.638731,0.1645909,0.9048898,0.09201734,-0.711665,0.1615274,0.7686335,0.7030712,3.027454,0.8730714,0.3218081,-0.03072788,-0.6280251,0.6551339 +0.3969377,2.103946,-0.1019252,-0.137263,-2.153746,0.5394922,1.034292,0.8558928,0.09414409,1.060591,-0.7077073,0.1652724,-0.9492312,-1.478729,-0.3964868,0.5100722,0.005242819,-0.568208,-2.724882,0.4537178,1.083906,-1.130948,0.6957391,-0.3274337,0.9728097,1.126511,0.4138261,-0.2502237,-1.822264,0.3419568,-1.106332,-1.664991,0.3724695,0.5442793,-0.1317598,-1.475433,0.243922,0.1766022,-1.227544,-0.4588898 +-0.3148718,0.8333607,-1.821512,-2.170212,0.7127172,0.3221748,1.343251,-0.4908316,0.3270532,-1.081227,-0.9296778,-1.422488,-0.1363121,0.3369732,1.09192,-0.2814366,2.311319,0.703432,-0.5145121,-1.525215,0.2974055,0.6306959,-0.02237525,-0.5533262,-0.2465964,0.8172313,-0.11629,-1.161077,-1.583207,-1.719228,-0.249068,-0.4811849,-1.042144,-1.551778,0.2380742,-1.256418,-0.2624315,-0.1993776,1.576004,0.862335 +-0.6846898,-1.532249,-0.3644836,-0.8129608,-0.736804,0.4206429,-0.854735,-0.5879092,-0.05557971,1.2437,-0.5111085,0.37659,-2.403793,0.7742602,0.1463235,-2.19229,-1.072111,-0.4378605,0.2184434,-1.561687,-0.2112888,-1.844584,0.1584737,-0.3229168,2.302649,0.164367,-1.198344,-1.648508,0.2452679,0.02848302,-1.100481,-0.5091468,-0.03816467,-0.965637,2.495091,0.08047699,-1.619515,-1.330714,-1.128244,1.992094 +1.632248,1.082336,-0.6345094,0.06329308,-1.170193,-1.356446,0.5973047,0.0824217,0.6190849,1.381618,0.3827245,-1.05328,0.4777686,1.302233,-0.6740435,-0.9380487,1.010993,1.411783,0.01881723,0.8114758,-0.7798463,-1.167514,0.2969716,-2.622154,0.2130729,0.3331633,-0.1742573,2.387168,1.310774,-0.9612275,-0.9128009,1.210513,-0.3635785,-1.134811,0.9036577,-0.1795267,0.8272287,-1.129364,0.2633656,-0.1091082 
+-0.6845338,-0.03639905,1.63324,1.18874,0.3872153,-1.501251,1.196035,0.8375061,-0.5305062,-0.8930148,-0.262175,1.224756,0.1163836,0.9979848,0.6046733,0.03955983,0.6359591,-0.1979028,0.3752303,-0.4602749,-0.7862953,0.8143704,1.638914,0.3695647,-1.345765,-0.1833604,-0.2246196,-0.3991013,-0.1404091,-0.5083674,-0.02660162,-0.9423494,2.061443,-0.001947129,-0.06271757,-1.305605,1.022905,1.152379,-0.005378667,0.2203379 +-0.313077,-0.5524403,0.5280031,-0.4134103,0.1654828,0.1784296,0.05784625,0.6114815,-1.228204,0.4570938,-0.03645029,0.5716228,0.3364961,0.9458234,-0.2035511,-0.8079144,1.047555,0.3982908,0.4395153,-0.8507286,1.061488,0.1354723,1.973785,-0.3898779,0.4296691,-0.0642685,1.077176,-0.4689739,-0.6901988,-0.713355,-1.40786,-0.2088937,0.7624016,0.1164607,0.04764137,-0.6340951,1.630678,0.5455299,-0.08864796,-0.6066424 +-0.6400327,-1.419163,2.493001,-1.968345,0.08188284,-0.4118144,-1.389224,-0.2226323,-2.627673,-0.9069305,-1.599437,0.4540017,-0.7498475,0.6712076,0.7421328,1.256218,0.575623,-0.2808871,0.8486757,0.2592699,1.026443,0.2863208,1.031687,-1.321654,0.6441375,1.094276,-1.076651,-0.1740644,-0.4625478,-0.4725319,0.5979318,-2.650167,2.150029,0.1198522,1.810142,-1.207571,-0.7814293,-0.9402756,0.9369079,-0.1017863 +0.6353044,0.0779916,0.7032355,0.201099,0.698222,-0.6187303,-0.1816395,-0.4074145,-0.1863941,1.667054,-0.8902815,-1.144441,-1.091223,1.134626,-0.5987958,0.3025136,-0.3174811,-0.2800901,0.5761248,-0.6987377,2.455739,0.3794494,1.640809,1.934639,-0.224562,0.6318406,-2.121505,0.009482551,-1.283769,-0.1921867,0.2672856,-0.2810362,0.4259426,0.5662014,1.039354,-1.249659,-0.3182009,-0.3053951,0.01396323,-0.6389776 +0.02484837,0.8403234,0.2005169,0.3371388,0.4064781,-1.632751,-0.1511384,-1.048579,-0.4790101,-0.388077,1.42324,-0.2958605,-0.004053645,0.4613658,-0.2049377,0.2709542,-0.1650705,-0.3605082,-0.1788825,-0.7541077,-0.6672266,-0.3105903,0.0970149,-0.0362969,-0.5247602,0.5872901,0.1472439,-0.5012706,-2.758892,-1.261602,0.3713637,1.570863,-1.816049,-1.239249,0.2544076,0.1260698,0.6220834,1.043288,0.2863706,0.5768415 +0.2276222,-1.709856,-0.2754812,0.4166243,-0.8686266,0.9341833,1.754935,-0.9343015,2.267454,-0.9934763,-1.006941,-0.1847419,0.4329863,0.7579187,-1.527341,0.1016022,0.8347074,0.9029003,-0.6411596,0.607618,-1.085885,0.3967245,0.4972844,-0.4637002,2.126173,-0.7807151,-0.1767457,-0.727837,-1.327211,-0.721621,0.09795704,-0.7654334,-0.3766239,-0.9652315,0.3283745,0.2580744,1.967096,-0.9841765,0.7066238,0.3026243 +-0.2460682,0.27776,-0.8379037,-1.846862,-0.00735442,1.862427,-0.7033395,0.5800126,0.09463166,0.9850494,-1.651332,0.09160808,-2.313708,-0.4854289,-1.697409,-0.7008734,-0.847593,-0.6919412,-0.6428731,0.5954051,1.529222,0.2515613,-1.417824,0.09702565,-1.223368,0.5810809,-0.09731145,0.242028,1.602961,0.1080559,1.231076,0.4366634,1.005752,-0.9215642,1.057673,-0.7267031,-1.033979,0.8109625,1.072642,-0.6556757 +0.02422383,0.7401122,0.4244971,-0.4282414,0.6500799,-1.977495,-0.784386,-0.09799113,2.769713,-0.5420691,0.4263863,-1.120135,1.118786,0.1577891,-0.6547517,-1.641155,-1.863595,2.294954,-1.620962,-1.322521,-0.7458273,-0.2630346,1.81563,-0.5844504,0.2963747,-0.2871735,-0.2220472,-0.6031358,0.9373067,-0.07275164,1.021526,0.2936286,1.141205,0.2967243,0.1392598,1.323777,0.0015446,-1.553833,-0.4130994,-0.1728405 
+-1.244582,-1.27317,0.5148536,-0.2494561,-1.446142,-2.10596,0.9314006,0.09319833,1.815645,-0.6125447,1.087765,0.004937136,0.2669107,-0.6371018,-0.7606234,1.578,-1.12345,-2.388006,0.05834154,1.053569,-1.918908,-0.1029449,0.04728237,-0.3138958,-1.3797,-1.556009,0.3111051,1.098424,0.19599,0.5532679,-2.565298,-0.2670854,-0.974462,-0.6423971,0.05191582,-0.2327884,-0.3236152,2.139708,1.234525,-1.483021 +-1.205975,1.131976,-2.01557,0.6990657,0.003029073,0.1999343,2.395855,-0.2717643,2.020997,0.1062528,1.961454,0.03161296,2.522629,1.287385,1.037632,0.7542588,-0.6170588,0.2432055,0.2725201,0.0987468,-0.6085819,-1.01422,-1.03611,-0.9909296,0.6447767,1.453884,-0.4179502,-0.201181,-0.4029028,-0.7786554,-1.501934,0.919355,-0.08887326,0.2019628,1.270179,-0.174512,-0.3723283,-0.6258926,0.9641214,1.314858 +0.273094,0.235314,-2.037087,-0.3257139,-0.9236295,-1.267357,-0.51774,-0.5975394,1.551691,1.438918,2.434655,1.994164,-0.6147264,0.780946,-0.16315,-0.9754503,0.01956671,2.855393,0.1654067,0.5227739,0.05140821,0.2042344,-0.6499701,0.4426667,0.1728924,0.2823104,1.967057,-0.3583596,0.7697389,0.9121265,0.2444261,0.3871388,0.05982797,-1.401233,1.101299,-0.1680189,1.727561,-1.814355,-0.4741522,-1.614487 +1.729414,-0.1315957,1.259285,0.3945505,0.7246028,-2.182002,0.5184561,-1.246677,-0.6509937,0.6430582,-0.6156218,-0.000697787,0.1778672,0.7291073,-1.18247,-1.910347,-0.51772,0.1274572,0.3063509,0.02928851,0.8899001,0.1158931,-0.4354744,1.079711,0.04777569,0.05203748,0.6555626,0.2513935,0.7822793,0.6586576,-1.025443,0.6051472,-1.818774,0.09029654,1.659247,-0.3737971,-0.6456852,0.5099921,1.501637,1.903859 +-1.064951,-0.5609298,-0.1231664,-0.2801631,0.6640256,-0.669868,-1.259182,0.1334131,1.441346,1.08814,-0.5469693,-0.2938502,0.2583401,-0.4964507,1.765922,-0.8150449,0.6144707,-0.2511927,0.922635,0.3656475,-1.36367,-0.4496783,0.1784369,-2.048878,-0.5967425,-1.077496,0.6137898,-0.5188228,1.112365,-0.7206656,-0.5289114,1.498302,0.6441587,0.4542647,-0.1087632,0.000388236,-1.568033,0.2085552,1.43993,-1.742802 +-0.1137367,1.593347,0.4727724,-0.7259151,-0.6433904,0.5870458,-1.538734,0.4422444,-0.5390388,-1.266575,-0.3901613,0.6936958,-1.265638,-0.527017,-0.709335,0.9198102,-0.07639257,-0.2125085,-1.379822,-0.9393818,0.2380136,0.7603066,2.499907,0.1578259,-0.05818407,-1.720022,-0.903934,-0.07945245,-0.4979091,0.1377598,0.1870136,0.1707886,0.7594103,-0.9360592,-0.6425769,0.01914781,0.9053601,0.06754509,0.5247999,0.5792801 +-0.6437515,-0.5486336,0.4004884,0.3699031,0.3201251,1.081165,-0.08899086,-1.583981,-0.4215866,0.6759692,-0.005593502,-0.03692218,0.9399803,-0.386477,0.8923362,-2.199412,-0.1348021,-0.5484222,-0.4758906,1.035535,0.8271794,0.0922482,-0.5334845,-0.1602914,-1.724335,-0.1430894,-1.600994,1.122577,-0.4300799,0.7049565,-1.970329,-1.680875,0.8716739,-2.208303,0.3765051,1.607694,0.7052145,1.205988,0.4882486,0.8181983 +0.9406484,-0.073261,-1.912398,-0.5407352,0.7114017,-1.062341,-0.8129916,0.1619786,-0.789998,0.3165887,-1.377332,-0.9991758,0.0373191,-0.3289766,-0.8273612,1.489753,0.9405423,-0.363792,0.1545968,-0.05964402,1.020507,-2.015171,1.072196,-0.2294877,0.566688,-1.210576,0.3979385,0.561418,0.6482549,0.4186572,-1.524694,0.1113418,0.1982177,-2.072732,0.4662966,-0.1244362,-1.920745,-0.6911847,-1.197397,-0.1029481 
+1.91881,1.702125,0.5758911,-0.1884987,0.7172262,1.025522,0.9588276,-1.276005,-0.3397718,0.1368773,-0.1901829,0.03955725,-0.1162818,-0.7472763,-2.320268,0.0801424,0.0486258,0.4377486,-0.2666708,-2.015406,-0.2430195,0.9581222,0.9563167,0.7665736,0.1789386,1.95024,-0.1299087,-0.09338197,0.561043,-1.355241,-0.3212729,-0.7899088,0.7299612,0.07979919,-0.5776606,1.420681,0.8632713,-0.287702,0.9876137,-1.027251 +-1.006046,0.9338393,-1.271906,0.8697955,0.6598879,-0.4184728,1.050006,-0.2497503,-0.5874278,0.2321587,0.1462804,0.7776656,0.1135019,1.546663,1.331038,-0.5577546,-0.5676113,-2.200091,0.3797604,-1.81153,-0.1363782,0.1853336,1.784886,-0.9650059,-0.761656,0.3556525,1.382204,0.9161287,1.237764,0.1441711,-1.633371,-0.1561234,-0.9749786,-1.737201,2.239668,1.286265,-0.235324,0.2723983,-0.6125814,-0.3421937 +0.5470826,-0.3862376,-1.133927,0.2244141,-0.4134207,-0.9386603,1.033735,-0.6153467,-0.7741632,0.3289132,-0.02945548,0.9532286,0.8382382,1.083815,-0.7523153,1.0766,-0.9951425,-0.3481149,-0.8276762,-0.1066839,2.151105,-0.2043345,-1.191662,-0.5445343,-0.9980228,1.03789,-0.4583705,0.2012402,-1.054928,-0.3550913,0.478877,0.3331566,0.7564884,-1.016795,2.109486,0.1041938,0.1692909,1.134444,-1.004758,-0.001228738 +0.03384706,1.002925,0.1703804,0.5694614,0.3948011,1.030234,-0.6647211,-0.6429474,-1.664205,-0.3567026,-1.30782,-2.051482,-0.1339818,-0.897125,-0.725236,0.2035721,1.347876,-1.039743,1.567315,-1.137513,-0.5787588,0.2982472,0.7280332,0.0982271,2.859893,0.0509393,-0.2210994,-0.02220618,0.7660716,-1.002387,1.744704,-0.000518966,-0.4766876,0.3688758,-1.205654,0.3624262,-1.241285,0.2731272,1.585078,0.7513277 +-0.3158289,-2.088146,0.4535735,0.2266684,-0.5724395,-0.7877408,-0.4253389,-0.2680861,-2.16939,-0.3168363,1.092502,-0.3608938,0.949719,-0.6894144,2.010765,0.8591392,0.1554191,-1.430345,1.373418,0.4199663,-0.9709456,-1.312961,-0.1402478,-0.4883628,-1.171231,-0.0296686,-0.002642084,1.061976,-1.360115,-0.5717291,-0.007733864,-0.4854498,-1.61285,-0.1626454,-0.7501676,1.325862,0.6389812,0.07022735,0.2343838,-0.16186 +0.2885446,-0.8136357,-0.9379423,1.578887,-0.875787,1.597522,-0.7504779,-0.5015085,-1.23296,-0.2307264,-0.8142914,2.048546,0.4288996,-0.6004802,-0.5330178,0.01137791,0.2560332,-1.423066,-1.088663,-0.1971891,0.4784929,-0.7668543,-0.503915,-0.6479878,1.648024,-0.1836649,-0.5009679,1.119246,1.362075,-0.2691619,-0.7245571,-0.842657,-0.2615018,-1.500666,-1.199562,0.1831216,-0.133516,0.2706128,0.4496773,-1.466774 +-0.3443922,-0.7267547,-0.745655,1.683777,-0.8423195,-1.019685,1.226381,-0.1969128,-0.4771853,-0.1046019,0.2361461,0.1130426,0.269738,0.4994753,0.1729076,-0.02677618,0.102742,0.3244221,-0.1092922,-1.851927,0.5377468,-1.038938,1.043067,0.3387147,2.157634,-2.091537,-1.884623,0.00076861,-0.262259,-0.182041,-0.8992548,0.3811819,0.3121976,-0.9627239,0.8846175,0.3813061,0.4074681,-1.22349,0.139873,1.904093 +0.1166314,0.03470017,-0.7239693,0.6232055,-1.429166,1.404935,-0.293127,0.3049411,-0.5168421,0.1482338,0.5791181,-0.2660671,0.9279704,-1.971964,0.9402009,0.1195601,-2.206706,-1.199221,0.37386,-0.2481836,0.4883891,-0.04135916,-0.05398326,1.546974,1.064088,1.706391,-0.274398,-0.91462,0.5063593,0.9524438,1.627665,-0.5629228,-1.303612,0.3511296,1.167422,-0.0326952,-1.525668,1.007306,-0.04319396,1.18887 
+0.542966,-0.5570555,-0.01206468,-0.7372352,0.2863638,2.571398,-0.7236017,2.021079,0.4128773,0.6045542,-0.4954537,2.122543,-1.413974,0.000803635,0.4985174,0.8811595,-0.2647832,-0.2742049,-0.7058422,-0.3753504,2.074605,1.078228,0.6943727,0.7360196,0.972579,-0.3751274,0.2698221,0.9781223,-0.9692583,0.1377651,1.34993,-0.2095671,-0.2494484,1.156542,-0.4573689,0.9545712,0.2892509,-0.5671529,-1.326306,0.582797 +1.143948,-0.2362688,-0.1305829,1.380677,-1.711838,-0.5794973,-0.6063638,1.023388,-0.2847327,-0.5381695,-0.2411929,-1.845254,-0.03122859,2.300315,-1.83547,0.1623696,0.4874745,-0.3397505,0.5036999,0.9196565,-1.17221,-0.3149341,0.2622982,-0.4895276,1.280609,0.1204659,-0.5071968,-0.0809225,-0.436116,0.6600096,-0.5789454,0.3029919,0.8290381,1.963821,-0.8459243,0.3791349,0.3017656,1.647717,0.5496287,0.03984431 +-1.284549,0.7363927,0.001629246,1.536579,-0.8062396,-0.1797274,0.3596941,1.457184,0.2625749,1.118884,0.5509234,0.04473182,0.8840385,-0.2277697,-0.5383067,-0.1889842,0.2821468,-1.114442,1.163413,1.440464,0.2832376,-1.306386,-0.05371729,0.07002532,-0.5769153,0.3885615,1.196137,-1.181819,-0.2408785,0.0305698,-1.570064,1.392907,-0.2699721,0.04091144,-0.1089257,0.6042481,-0.2477602,-1.848408,-0.9108354,0.03036773 +0.9198118,-0.1058024,0.6879437,-0.4918793,-0.003757133,-0.5343497,-0.5133498,-1.151304,-0.745178,-0.4806131,1.170257,2.384795,-0.2387445,-2.193194,0.6512582,-0.4070204,0.842031,-1.136032,-0.3800601,-0.2782175,0.0922479,0.419772,-2.507282,-0.2301669,0.02112081,0.2040859,0.6255213,-0.006681845,-1.850284,-1.454995,1.398264,0.9215115,2.558275,0.07376616,0.3377986,1.91809,2.138951,0.245875,-1.095425,-0.1976172 +-0.8460902,-0.7536148,-0.02073712,2.344978,-1.399764,0.7672751,-0.6155502,0.1555791,0.7067762,0.3943324,-0.2951115,-1.784837,-0.4544365,-1.520744,2.41858,0.4988346,1.057624,0.7175296,1.17373,-1.247352,0.7622792,-0.2835651,0.6742543,0.4205671,1.510804,-0.4815507,0.4552788,0.5565844,-0.6028695,0.4372478,1.842392,-1.027367,1.494977,1.261728,-1.061531,-0.3185424,0.7381625,-0.5254235,-0.6856892,-0.5276567 +-1.419304,-1.441883,-0.8397088,-0.4089882,-0.2536973,1.552335,0.6923504,-0.1502326,0.6600241,-0.01364012,-0.556908,0.3212924,0.8263524,0.6074669,-0.4488631,-0.8436955,-0.07436546,0.1155812,-0.6843324,-0.7978171,-0.2120017,-1.174456,0.2787622,-0.427963,-1.010052,-0.973686,-0.1613464,0.06755671,-0.3307555,-0.197509,-0.8459581,-0.3128271,-0.2511036,0.5253947,1.448985,0.2834881,-1.754458,0.7025423,-0.2769558,-1.375579 +-2.665698,-0.2070746,-0.3144128,-0.7096886,0.2748361,0.02285501,-2.055745,-0.3790481,-1.323625,-0.4388562,0.5064243,-0.8307031,0.5918914,0.07914953,-0.4267601,-0.6471097,1.095401,0.6335576,-0.2715009,-0.5651283,2.11953,1.185346,0.3268922,-0.5795358,-1.334807,0.8236427,0.176961,1.909148,-0.05621743,0.337077,0.3046433,1.989954,-1.092401,0.4698598,0.2761031,0.2680207,1.116539,0.3473092,2.451202,0.4354407 +0.8351061,-0.2265637,-0.5500358,0.5254089,-1.686714,0.5463822,-0.1295666,0.1087769,-1.545146,-1.9331,-1.287387,-0.1382136,0.003042125,0.446403,-0.99249,0.8105896,0.3812107,1.629602,-0.1913319,0.8263365,1.152545,0.5220526,-0.5217807,0.8866819,-0.275044,0.3418095,-1.029914,-0.7658485,0.007930274,-1.773525,-0.3745617,-0.85331,-0.5158202,-1.670021,-0.2481574,0.7591083,1.702925,0.2181262,-0.3545402,-0.7169514 
+2.377447,0.3626464,2.592954,0.3444975,-0.3593319,-2.442908,-0.1389221,-0.4525193,-1.152062,-0.2696799,0.1139984,1.170407,-0.9074546,-0.4148933,-0.09745641,0.02489256,-0.3979773,-1.096177,0.5891391,-0.8350427,0.3367551,-0.8029031,-0.1319899,0.3403116,1.308193,0.985816,-1.568687,-2.941338,0.04044583,1.000628,-0.4870422,0.717164,0.4718989,0.00045877,0.8302846,-1.396352,0.2151646,-2.089334,0.4494719,1.324787 +0.01396971,0.1757659,-0.00993352,-0.625384,1.387302,-1.440094,0.1121809,1.486597,0.2938633,-1.136537,-0.2456887,-0.565212,-0.3337133,-0.2767179,1.830959,2.073272,0.4053975,-0.573106,-0.703652,0.09491742,-1.246552,-1.580672,0.7241476,1.512144,-0.04540821,0.3133872,0.6758654,0.4961334,1.283969,-1.879209,-0.5027299,-1.683723,2.298624,0.730744,-1.003711,-0.2858777,-0.5683827,-0.873149,-1.134104,-1.291878 +-1.633424,1.492315,1.758061,1.073164,-1.004381,0.9598972,-0.3835015,-1.433603,0.442631,-0.2151471,0.2660395,1.129763,-1.365121,-0.9706193,-1.192205,0.6785105,-0.8081778,-0.8618434,0.04984943,-0.2231913,0.7738207,-0.7829433,-0.146785,-0.5667236,-1.203858,-0.9783168,-0.660562,0.9404959,0.3390625,-0.4253111,1.212397,-1.261628,-0.4608053,0.1218737,-0.3192427,-1.589456,-0.3488899,0.9332776,-0.5784896,-1.4244 +0.4889325,0.1638138,1.673669,1.489762,0.4332542,0.3015147,0.49557,3.353906,0.1325057,-1.376426,0.5107028,1.046101,-1.508768,3.507131,0.1865222,-0.1504179,0.5894496,-1.259826,-0.7820208,1.064317,1.007785,-0.03590166,-0.1628096,-1.978219,1.865742,0.4052665,0.6619679,-1.028174,-2.016587,-0.354827,0.1411775,0.5267949,-0.4155684,1.206286,-1.881218,-0.2188343,-0.8445414,1.338147,-0.9466887,-0.03581918 +-1.019179,1.236595,-0.3237728,-2.092098,-0.258084,-0.1894557,0.3690983,0.9736414,0.2735069,-0.1030513,-0.6595627,-1.03545,1.549446,0.4624489,-0.5091066,0.7087224,-0.5457091,-0.7281792,-0.5513825,0.2289686,0.1701914,0.2422997,-1.94255,1.385263,-0.8911475,0.9570594,1.98809,1.299994,0.1100983,1.193456,1.243456,-0.03600735,0.04274715,-0.6106631,-0.8268251,0.9023621,0.08868926,0.1028901,-0.631718,-0.576633 +1.090553,-0.7807749,-0.2260029,0.8389208,-0.5177061,-0.1378549,1.059567,2.405648,0.4847764,0.2654198,-0.4893646,-0.4480106,0.5218067,0.02705138,-2.557498,-1.247514,-0.2255459,-0.3813146,-1.853267,1.36235,-0.08638112,-0.1500911,-2.01383,0.4179966,-0.5421564,0.06968532,-1.589659,-1.166069,-0.9499439,1.009511,0.009083953,-1.135297,-0.4214931,-0.9726972,1.249322,0.3426779,-0.4068152,-0.204037,-2.0668,-1.007721 +-1.13947,-0.5744266,0.1162977,0.840087,-0.622618,0.76743,-0.1188407,0.9029993,-0.783133,-0.6185604,1.509189,-0.08665656,0.6158918,-0.6258453,-0.7681961,0.3709443,-0.5974023,1.12335,-0.4706762,-0.5344769,-1.935869,-1.577943,-0.3424783,-0.989368,0.3610475,0.005681876,-0.7955761,1.658485,-0.04386233,-1.087305,0.2625998,2.627533,-1.515511,-0.4019492,0.2068717,-1.566505,1.627603,0.7519937,-0.5326398,-0.8724195 +-0.01572564,0.160616,-1.178648,-1.222731,-0.2646925,0.873646,-1.857603,-0.3049969,0.2016204,-1.283033,-0.837296,0.133201,-0.08172376,1.985892,0.6094479,-0.3247889,-0.5458161,-0.4069585,-1.563234,-1.063407,1.556035,1.307891,1.59827,0.9363397,0.3328112,0.83063,0.6822457,0.2545094,1.091192,1.114916,0.09768618,1.321908,-0.9002415,-1.484888,-0.5499883,-0.8587614,0.4774343,-0.3171444,0.05521413,-0.2431408 
+0.2974913,-0.1670844,-0.1399882,-0.2747392,-0.6102036,0.3316684,1.286012,-0.1079806,0.7081867,-0.1453275,0.06319956,1.14308,1.84336,-0.1279197,-1.978037,-0.7969212,0.3171394,0.8953895,-0.1486875,-1.055872,0.04331447,1.012442,-0.1040394,-0.6594973,-0.3786398,-0.08081936,0.3074162,-0.665584,0.5979791,-1.119613,0.5543375,1.620872,0.9073598,-1.818856,0.5571677,0.01248577,1.417111,-0.3298529,-0.4070961,2.048499 +3.20059,0.6793468,0.7102212,0.6836523,-0.3375595,1.572585,1.242157,-0.4065802,2.12146,0.2818588,-0.2661093,0.8596924,-0.2819262,-0.1694824,-0.6765305,0.8107951,1.615257,-0.7042722,0.5114472,0.9136212,-1.221293,-1.473662,0.1275306,0.7659445,-0.003822297,0.06287504,-0.7481946,0.3635103,-0.860709,-1.629035,1.083925,0.692552,2.212118,-0.5352276,-0.2066275,-0.6096603,-0.9866929,0.8677035,0.1121841,1.388767 +0.08924424,0.7169586,-0.976966,-0.5162119,0.009141821,-0.826931,-0.9646825,-1.215352,0.8742564,-1.358594,-1.234873,-1.381967,1.322527,0.009064067,0.9633526,-0.6582631,0.822025,1.1084,0.88839,0.09029281,-0.2104645,0.04807945,0.7631664,-0.02742383,2.136585,-0.04675393,1.452031,0.4245546,1.948397,-0.228105,-1.2816,-0.1752062,1.077166,2.372697,0.4667767,-0.883572,1.26494,-0.06319897,0.8240885,-0.2129203 +0.5709723,0.475414,0.7517876,-1.217441,-0.03640826,0.7261452,-0.1130985,-0.5855914,0.4769404,-0.3849093,-1.615373,0.2441963,-0.6644058,0.320371,0.7905018,0.2973014,-0.3149061,1.443206,-0.7856879,1.768375,0.3388851,0.683875,0.3703599,-0.8990326,0.1693851,1.711426,0.7507724,0.5952908,0.7130892,0.8162954,-0.3057537,-0.01043116,0.3947862,1.191177,0.4737704,1.812801,-0.6942872,0.7927894,-2.561776,-0.7911712 +0.5286856,1.734158,0.6219122,-0.7519543,-0.9473156,0.1275074,0.2254942,-0.4167952,0.9931781,-1.082167,1.273393,-0.1496657,-0.3287435,-0.4337254,0.7581183,-0.5058894,-0.3581207,-0.4945549,-0.3886407,0.4683664,-1.932343,-1.028585,-0.03115075,-0.6198045,-1.496466,-1.558598,-0.2039107,0.5774073,-0.6713544,-0.6603642,-1.90425,1.286605,-0.1952277,-3.348986,-2.262752,-0.6235567,-0.9293642,-0.4165837,0.3100115,0.4209295 +-0.4409048,2.069068,0.4029645,0.5675399,0.6318221,0.6185451,1.330959,-0.7982568,0.60536,-1.454406,-0.0440235,0.3003444,0.3310293,1.682318,-0.980773,0.304932,1.675868,-0.4181653,-0.9217308,-0.3280117,0.2722329,-0.1062965,-0.09895317,-0.697019,0.6407943,0.3898551,0.6174554,1.384168,-0.0513442,1.454355,0.0900496,-0.6899454,-0.6716846,-0.899943,-0.5726105,0.02389675,0.7143146,-0.6203548,1.083994,0.8774027 +-0.6727934,1.483307,-0.719034,-1.455376,-0.9603362,1.765453,0.04123446,0.1793009,-1.001382,-0.6058832,-0.6560346,0.8897218,1.670034,0.900592,-0.4460474,-1.075228,-0.3173917,0.2716305,-0.06959288,0.4988331,0.04612988,-0.2225332,-0.2242204,-1.585179,-0.1639277,-0.03327442,-0.7572943,-1.267146,-2.272148,-1.186076,0.1936348,-0.4473876,1.309205,-0.01660418,1.141863,-0.3078514,-0.6659358,1.466181,0.7063316,-0.3707551 +2.154313,1.103177,0.3096354,0.1214205,-1.037103,-0.01826594,-2.149821,-1.190831,-0.2519617,0.09611443,-1.163664,-1.180742,1.791487,-1.683105,-0.3367271,-1.630939,-1.44922,-0.2274292,-0.0785446,0.05641374,0.8331554,-1.301456,0.3574554,1.565503,-0.9237685,0.9442255,-1.281171,0.2594502,-1.273892,-0.476229,-1.361854,1.333524,-0.5610624,0.188396,0.395278,-0.665391,-0.5111264,-0.163171,-0.5609705,-0.08762041 
+0.5938527,-1.6248,1.494203,-1.060143,-1.476388,1.462766,0.9221133,0.5447954,0.3540614,1.576166,-1.23036,1.402498,1.236247,1.231103,0.7372871,0.4739064,0.1615265,1.509335,-2.005078,-1.281299,0.1161135,-0.4654092,-0.2720666,0.5709494,-0.04309762,0.1344124,1.530255,-0.2747178,0.543617,0.568744,2.114796,-0.4752841,0.8565955,-1.342745,0.6259594,0.769132,-0.1902926,0.60813,1.014293,0.5120712 +-0.3840638,-0.4959813,-0.261945,-0.1515672,-0.6292733,-0.677627,0.7088755,0.1843469,-2.258642,0.2095688,-0.04536958,1.272671,0.111347,1.379771,-1.760319,0.3725488,0.6062442,0.953417,-1.169372,-0.0650869,0.9663836,1.888659,-1.391886,0.5114416,-0.928763,-0.2723984,-1.411553,-0.456643,-1.159332,-0.573791,1.07349,-0.5355581,0.5163906,0.62236,0.2401064,0.433163,2.090255,-0.03145202,0.3488725,-0.4772467 +0.7197833,-0.3333905,-2.596132,0.1884617,1.083856,-0.529184,1.507734,0.04251325,-1.015037,1.185871,-0.03180232,-1.70435,-1.819205,-2.396196,-0.27861,-0.2025516,0.142382,1.331496,-1.059739,-1.538685,-1.519702,1.912031,-0.04422917,-0.6062652,0.5807744,1.23143,1.186314,-0.9901608,0.6097646,-1.69642,0.004599375,0.2314766,0.5812688,-1.138393,-0.6381468,-0.8491574,-0.5474557,-0.4714887,-0.1449912,0.1625922 +1.708173,-1.583797,-0.1126323,2.880451,0.2359088,-0.2161507,0.5818775,0.1552902,-0.2683037,2.19247,1.177718,1.422841,0.5071196,0.6631315,1.535305,0.7564498,1.523512,1.528906,0.4408511,-0.8561714,0.4716776,-2.13776,1.11178,1.51448,-0.03872081,-1.541106,-0.3478248,0.556881,-0.9799612,1.404526,0.1021298,-0.1362312,-0.1298727,0.2268033,-2.218551,0.07793551,1.637924,-0.6615175,-1.322337,0.5854125 +1.075216,0.3082376,-0.07422831,-0.6316663,0.3893247,-0.4824802,0.3941421,0.08440998,0.4304629,-1.041138,-0.561504,-0.7693495,0.97471,0.6245527,1.034047,-1.294069,0.7722218,0.1582053,-0.771859,1.909903,-1.305041,-1.760122,1.42502,-0.7579173,-0.2173297,-0.8831919,0.1947134,-0.3536037,0.3756026,-0.6674893,1.280123,0.379452,-0.2125742,-0.43897,-0.5018831,1.36615,0.5796587,-0.1470426,-0.8432213,-0.04183809 +0.7774204,0.7114921,-2.251428,-0.7328429,1.73156,-0.6870706,-2.425464,2.461884,0.5966041,-1.019446,-0.8713706,-1.221006,-0.1562348,-0.970801,-0.03257982,-0.7822696,-0.00557322,-1.571271,0.947809,0.7279478,0.8896154,0.9055952,0.909149,-1.426825,2.081725,0.5961062,-0.8257144,-0.3836101,0.654627,-0.4308107,-0.1393918,0.4460172,0.2116463,-0.960151,-0.6647011,-0.5491635,1.882792,-2.258277,0.5228931,-0.01924052 +0.05220005,1.697592,0.4055705,-1.232286,-0.4617642,-0.2341112,-1.736911,-0.5488994,-0.2357055,-0.6194897,1.154167,0.1956703,-1.035937,-0.3570547,0.548882,-1.101682,2.023802,-1.175511,0.7974498,0.4952485,-1.717434,-0.6465857,-0.7359788,-0.8575712,0.4434167,0.7798948,0.2291986,1.505577,-0.07045623,-1.481385,0.5476728,-2.202817,0.8719006,2.41436,-0.4473656,-0.3798042,0.8215314,1.192788,-0.1299489,-0.7817407 +-1.083698,-0.6260046,-0.4363993,0.7951947,-0.01895182,-1.262174,0.8281504,-0.5417266,0.1592747,0.3975892,1.752631,1.079758,-1.130171,0.925466,0.6812096,0.9684006,-1.016477,0.6591918,-0.4743128,1.618905,1.294283,0.1722504,0.04387525,-1.48251,1.026626,-1.268035,1.616387,-0.1296643,-1.693525,0.04962565,0.1254488,-0.6650204,1.509509,0.3386255,-0.9758623,0.04969675,-0.4847277,0.8811644,0.7083248,1.017065 
+-0.008856207,0.1486064,0.9384111,0.09803598,1.048168,1.557804,0.1997466,1.196413,-0.8409255,0.1530435,0.3459997,-0.295046,0.04571444,0.8384938,-0.2099178,-1.414648,1.713204,0.6073323,-1.411026,-1.336791,0.2381821,-0.05583553,0.05935106,-0.7108816,-0.5372827,-0.4897972,-1.41456,0.2191487,-0.6535925,-0.4978242,-0.5870484,0.8425212,-0.6046013,0.471673,0.3678759,0.6365216,-0.1790318,0.2087412,-0.4810614,0.1449822 +2.030643,-1.02684,-1.366376,-0.520277,-0.5497962,-0.8276935,2.218221,0.4555053,-0.1939965,0.0436248,0.5755248,0.3284068,-0.4977148,1.16828,-0.7661223,1.190737,0.8965667,1.071202,0.9105756,0.1361992,-0.1394184,1.358684,-1.312406,0.2199376,0.1900325,0.04089414,-1.509229,-1.043308,-0.1348238,0.7277455,1.206275,0.1172479,0.9677926,0.1653975,-0.6337534,-1.808532,1.804835,0.1166284,0.2991486,1.649375 +-1.113329,0.1013403,0.5044141,-0.2167409,0.9319698,-0.5685562,-0.3692752,-0.7034645,-0.05704762,-0.8858334,-0.822814,1.248689,-1.041855,0.5020032,1.276263,0.6741589,-0.297648,0.6403877,-0.3776231,0.03051035,-1.172299,2.489968,-2.525804,0.7510987,0.772013,-0.4504981,0.9167632,-0.9609024,-0.1814045,1.792321,1.121948,-0.2209681,-0.07816146,-1.636862,1.151201,0.7973123,1.293466,-0.792348,1.2808,-1.054378 +1.138272,-0.7889823,-0.1802634,-0.5235305,-1.706031,-0.213904,-0.2393662,-0.7599182,0.6408323,1.651392,1.31091,0.3920971,0.1020896,-1.423439,0.8876293,-2.246856,0.4016012,-1.593254,-1.219457,-1.02154,0.1170433,0.4082672,-0.0302343,0.469634,-0.538701,-1.414888,-2.037591,-0.08603915,-0.3560534,-0.8175053,0.793565,-1.029294,2.571706,1.472252,1.996459,-0.7033945,-0.05796919,-1.117187,-0.4113175,0.7233322 +-0.6148926,0.6882488,-0.77954,-0.08022832,0.5544268,0.6049456,1.346469,-1.816706,-0.6301488,0.3550138,-0.5113278,-0.7059849,-0.6460335,0.213195,0.9499252,1.358626,0.8270499,-0.1513796,-0.008200573,1.704089,-1.734358,0.128554,-0.2019294,-0.3062358,0.9270681,-0.02571055,-0.7069494,1.19493,1.542127,-0.9139907,-1.175989,0.2814059,-0.8520936,-1.06812,0.4318235,1.250042,0.5494302,-1.354866,1.171811,-0.3152844 +-2.193537,1.450136,0.4667678,0.1969645,-0.2672956,-0.2887226,0.2356258,0.7992002,-0.8301422,1.096115,-0.842947,-2.068285,0.814882,0.1241892,-1.107988,3.249307,0.5569337,0.1162674,-0.3249029,0.6473209,-0.7496869,0.2528537,-0.8892859,-0.2812886,0.6136071,-0.4962651,-0.09889544,0.2644632,-0.4488425,0.5022411,0.4937783,-0.2072025,0.4619624,0.4296975,-0.2140865,-0.112768,0.2518215,-1.201592,0.7625858,0.7376386 +-1.215686,-0.4752253,0.3681907,1.0103,-0.2980582,-0.791607,0.5037324,-0.9923234,-0.5489633,-1.196865,0.1850888,1.212256,-0.938888,1.141723,0.7145942,0.6761959,2.137037,1.307629,1.533558,0.1466481,0.5866458,-0.7481571,-0.01863444,-1.26576,-0.245524,-1.727157,-1.228149,0.132211,-0.7013453,-0.6518244,0.9400841,0.6960325,-0.2559799,-0.1963679,-1.365195,0.2945436,0.1934418,1.98496,-1.313381,0.7063828 +-0.04532636,1.11471,0.4623412,-1.420476,0.7077816,0.2102459,0.6840644,-1.643736,0.1882675,1.683752,0.04603592,-0.03558893,1.099913,1.093512,1.248487,-1.033026,0.01655521,-0.1095162,-1.954168,0.7505864,-1.227793,-1.380246,0.1151494,-0.6249599,-1.294378,0.5497277,0.4130139,1.523569,0.5600512,-0.2077147,0.08588303,1.177491,-1.121572,0.9578719,-1.392637,-0.07019269,0.680051,-0.6387021,0.06667982,-0.9174246 
+[Raw simulated data matrix omitted: roughly 170 rows of 40 comma-separated values were pasted here verbatim. The leading and trailing blocks of rows look approximately standard normal, while a middle block has its later columns shifted upward by roughly 2, consistent with simulated class data.]
+-0.2707608,-0.314924,1.816154,1.694685,-0.6468139,0.3892016,-0.7801426,1.067402,-0.5912866,-1.069286,-2.475704,-0.3249549,-0.8024044,1.229205,0.3833512,-0.4917509,-1.516319,-1.60438,-0.675628,1.535386,0.4818095,-1.266635,0.4978635,0.5924866,1.525281,-0.2533804,-0.7518494,0.1987739,-0.8584606,-0.4761665,-0.3646473,-0.2281105,1.092615,-1.998565,-1.867103,-0.2198416,-0.8907168,0.9160591,1.46698,-1.870654 +0.8715554,-0.6874732,-0.1337061,-0.222694,-0.5027498,0.1267491,0.3227322,-0.4326786,-2.070301,0.2405508,0.3605488,0.8605773,0.7239751,0.3428101,0.3031033,0.4210347,-0.3839236,-1.793898,-0.4926686,0.2608138,-0.9907336,0.4889798,0.8834736,1.258478,1.631716,-1.675949,0.998717,0.4775568,-0.3925449,-0.2954847,0.2106215,2.304902,1.196874,1.819561,-1.218998,-0.4179576,0.5316533,0.4236971,-0.4209269,-0.3541083 +-1.216294,0.8433805,-0.3492536,0.2359891,-0.6112727,0.684322,0.262337,2.092005,-0.1269612,1.375485,0.2637899,-1.300303,-0.1819238,0.2087964,-1.93922,-0.438638,0.3484531,0.692103,0.4494471,-0.7443409,0.7145612,-1.544293,0.5468173,-0.8976167,1.577752,2.369027,0.5489654,0.311327,-0.5791104,-0.536697,1.551878,-1.083358,-1.315695,0.6372857,1.001883,1.77926,0.6023737,-0.8350129,0.1068378,0.4192748 +-1.892739,-0.2975108,-0.4665746,0.9142837,1.172472,0.04458141,-0.3238838,1.144281,-0.5115451,0.3355118,0.4521848,1.981768,1.457582,-0.3607241,0.7577676,1.773012,0.3598021,-1.138702,-1.773112,-0.8192119,2.010592,0.008063516,-0.05978764,-0.2377284,1.112816,0.8041882,0.8081342,0.4966003,0.8053298,0.5222056,-0.1441031,0.3953131,0.4725748,-0.5554327,1.246034,-0.88449,0.8519637,-0.1668306,-0.438063,0.6992197 +-0.8223826,-1.109601,-0.8086896,0.5234243,-1.224338,-0.7144213,-0.08511602,-1.378556,-0.2894148,0.4617165,0.7447684,-0.1128952,-1.144095,-1.122337,0.2503817,-0.8092682,-0.7796048,0.1604922,-1.223164,0.5280187,-0.8291257,0.2897393,1.743601,-0.04467965,0.09840887,0.1149126,-1.706264,0.2252389,0.6164876,0.494588,-0.932191,-0.006850215,1.037906,-0.8527162,0.03938953,0.6140857,0.6500916,0.740759,0.187713,-0.932005 +0.751782,0.06158548,-0.218244,0.3781433,1.391325,0.2580242,1.457012,0.005679313,1.000425,0.4656612,-0.9208086,1.767622,-0.4584411,-0.29573,0.01940835,-1.18047,1.747931,0.1913225,0.2903921,1.667847,-0.6969488,-0.2066152,1.45567,0.2914063,1.69074,2.213017,0.5023814,-0.5976137,-0.1217422,0.4278989,0.0773409,-0.879454,-0.3565501,1.926119,0.48298,1.822674,0.917929,0.5277348,-1.05312,0.2086481 +0.7735745,0.3572745,-0.5933545,-1.796662,-0.1543615,-0.02407432,1.843891,-0.2174867,-0.681004,0.409822,-0.03117314,0.04960918,-2.149404,2.350811,0.2738792,0.9366432,-0.1056802,0.6724218,0.4560314,0.8461153,-1.192541,1.206101,0.3090558,-0.5302658,1.37269,-1.3491,-1.231532,0.08488687,0.4584264,-2.416724,-0.1578085,-0.8629123,0.4964178,0.06770313,0.9687114,-0.450754,-0.3358339,0.1497615,-1.623834,-0.08624798 +-1.787532,-0.9268928,-0.5772998,1.896168,-0.3081158,0.05033551,-0.7463286,0.37935,0.3605203,0.8077493,0.1172945,-1.021436,-1.525681,2.389447,-0.4838707,0.8020935,1.308537,0.4160353,0.3970067,0.1238296,0.51279,0.3273175,1.030279,-0.8648233,-2.123714,1.876299,0.6804817,-1.082053,1.293572,-1.261245,-1.189776,-0.09834272,0.3991112,-0.7362926,-0.7215085,1.45492,-0.979625,0.5566155,-1.03374,-0.8998379 
+0.08881747,0.4662813,0.3253024,0.8248897,-0.08608494,0.5768708,-0.5028213,-1.78987,-1.293544,1.199681,-0.5293988,0.6263181,0.4878648,-0.491599,-0.6358317,0.6897183,-0.07915782,1.232061,-0.5835077,2.183902,-2.034,-1.876506,0.720175,0.000182119,-1.669152,-1.146466,-0.4531479,-0.3216194,0.3564202,0.2251164,-0.2888241,1.084057,0.1206081,-1.544514,-0.1607902,0.3821871,0.2830542,-0.005445845,-0.400982,0.3348095 +-0.1800529,-1.039454,-0.9952738,-2.149644,-0.06808813,1.616819,-0.4152564,-0.5669422,-0.1440807,-0.9440913,0.8656913,0.256837,1.052066,0.8201026,0.9306526,-0.5014338,1.831927,0.8194543,0.6993491,1.212573,1.529005,-0.1059612,0.881851,-1.452176,0.275347,0.3703591,0.04259933,-0.3310892,-1.415064,-0.1446727,0.3561798,-2.083863,0.2718418,-0.9504309,-1.230524,-0.2157924,1.06146,0.3778185,-0.6132729,-0.6078118 +-1.471715,-1.421679,-1.47086,-1.920581,-0.5232705,-0.9520944,1.366265,-0.1203026,-1.125253,-0.8109396,1.093926,0.08699999,1.731531,-1.712375,-0.09016503,0.649044,1.495784,0.1839868,1.336574,1.135536,2.057974,-0.2783356,-0.4618278,-0.2458776,1.34242,0.6676007,0.9278645,0.08592563,1.713929,-0.6649299,-0.4686345,-0.9312994,0.569443,0.2433124,-0.5383845,0.2841436,-0.7792576,-1.238187,-2.711228,-0.02525759 +0.4900949,0.5420073,0.6803407,2.372806,-0.531343,0.3586832,0.03014277,-0.5742431,-0.726359,-1.4695,0.6334471,-0.3549981,1.378394,-0.76689,1.826896,-0.9727188,1.547722,-1.327237,-2.445487,1.182456,-0.2281719,1.623994,-0.2931314,-1.159684,0.940677,-0.6591262,1.271421,-1.288573,-0.08837951,0.1432553,-3.479825,-0.3014329,-0.3544237,1.061246,-0.683061,-0.939136,-0.1847287,-0.6310709,-0.2235627,2.545964 +-0.3978332,0.01225306,-0.5784628,-0.3023308,1.072259,0.7464048,0.3252437,-0.7518205,0.2015492,-1.30408,0.9036115,-1.914971,1.542955,-2.280939,-0.3998906,-0.9436018,0.9175122,0.573203,-0.009974,-1.205236,-0.3427017,0.6515506,0.1905839,0.5612271,-0.1358581,-0.6268294,-0.2858294,-0.2202527,0.9269216,-0.2845229,-0.4491025,-1.215803,-0.2977982,-1.266888,-0.800485,0.1647712,0.9979839,-1.114887,-0.0428258,-3.581637 +-0.1696794,-0.1656831,-0.5551334,-0.09584185,-0.7380956,0.4585266,1.658209,-0.07290578,1.079264,-0.5649708,-0.6211978,-1.282963,1.81532,0.1185465,0.6620172,0.1237561,0.294778,-1.213209,1.441977,-0.02310572,1.33492,0.03546823,1.324879,1.137873,-1.3293,0.2673109,0.7585945,-1.166286,-0.6512303,0.2710359,-0.3272931,1.988033,-0.4764517,0.002038168,-1.08593,0.3480224,0.8530655,1.540216,0.84626,-0.4981497 +-2.214983,-1.741093,0.3283349,0.5787717,-0.6976978,0.6312851,-1.349233,-0.7028015,-0.8665673,-0.7658685,-1.113791,0.4086542,-2.129689,0.699191,1.184714,-1.468093,0.4220199,1.203952,-0.3049927,-0.9059935,1.044847,0.6117947,0.3393693,0.5389335,1.30957,-0.6386303,0.8378245,-0.5649432,0.2398326,-1.059736,-0.2577502,0.3844247,-0.5759177,-0.4617481,-0.6967576,0.5592053,0.5567024,1.189942,-0.3294826,-0.2534072 +1.008145,0.3595513,-0.9500335,0.6690592,1.592502,0.5343223,-1.00067,0.9042402,-0.9220163,-1.016989,0.4919161,1.50849,-0.5384651,-0.4463117,-0.007095146,-0.6930148,-1.320504,-0.4905535,0.1095498,1.798759,0.6505743,-0.3954426,0.2474347,0.5484001,1.256569,-0.859642,-0.2212938,0.5966216,0.5852128,0.3119784,-0.3484447,0.642543,0.01985722,0.8713501,1.51592,-0.05326369,-0.00207102,-1.065743,-0.2451913,-0.3471057 
+0.5242607,0.9502757,1.901279,-1.214095,0.1343618,-1.333697,0.03159751,1.425625,0.8089411,-0.456769,0.2132328,0.05651368,-0.9569572,-1.334568,-0.6369183,0.772525,1.280911,-0.5407955,-0.3433701,-0.5322366,1.195401,-0.839747,-0.419963,2.58846,0.6700293,1.567704,0.9877257,-0.512514,0.4088709,0.765347,-0.9342308,-1.024618,-0.6069785,0.2275718,-0.07341274,1.603145,0.973115,-0.4190188,0.3969075,-0.2094916 +0.2648206,-0.6887496,0.7317588,0.8839132,1.090685,-1.399565,1.087889,0.3037503,1.072338,-0.2049236,-0.7045185,1.586645,-1.402029,-0.679705,-0.008159873,-0.5303862,1.281981,1.23368,0.7339917,1.86264,-0.3600759,1.27393,-0.2632546,0.4986897,0.02722221,-0.813682,-0.6275902,-1.545594,-0.3506393,0.1369018,0.6325429,-1.125693,-0.9894204,0.6698663,-0.1660289,-0.966268,0.7155861,-0.8705507,-0.616063,1.141447 +-0.3882564,-0.394503,-0.7427655,-0.8383758,-0.5324999,1.260549,0.8939918,-1.143069,0.3817241,-0.01439583,-0.6391116,1.844515,0.3583312,-2.11514,2.073535,-1.477467,0.7761985,-1.390494,1.603812,-0.9460123,0.8429812,2.409208,-1.585668,0.2873053,0.7529115,-0.8322558,-1.007014,-0.5527073,0.4868928,0.5145941,-1.04861,-0.0236192,-2.095852,0.8384089,-0.6643402,-0.1382743,-0.3752907,-0.9507792,0.7575054,-0.6860502 +-0.3994519,-0.587108,1.558543,-1.343214,1.722157,1.148743,-0.4464501,0.8557042,-1.333738,-1.039362,1.460452,-1.801539,-0.7371427,-0.7442128,-0.2375923,-0.151004,0.8265023,0.5373244,-0.5477163,0.08001357,0.4076333,0.419234,-0.9032192,1.04463,0.8715977,0.02680652,0.6352046,0.6478005,0.4106245,0.1638196,0.1534928,-0.86783,1.833931,1.103647,-0.2275345,0.2399725,-0.2245709,2.050212,1.788459,1.51199 +0.2271462,-0.9049526,-0.03967166,-0.9544842,0.5139724,1.304697,-1.694733,-0.1385947,0.4756813,0.8715315,-1.573645,0.7697311,1.003445,-0.4571671,-1.08857,0.4312732,-0.9612313,0.6671236,0.5018413,0.3255822,-1.876824,-0.710274,-0.6212835,-0.2906934,0.3371331,-0.8886188,-1.039242,0.2433013,0.1820139,-0.1242036,-0.6390605,0.1417925,-0.8937439,0.3279703,0.821913,0.5127782,-0.9509434,1.419877,1.846227,2.800684 +-0.611157,0.1849971,1.03148,0.2505964,0.2124272,0.7966427,0.7167514,1.177822,0.8480188,1.517302,-0.9816164,-1.668654,1.357535,-0.4891543,-0.9143187,-1.822108,-0.08559663,2.507545,0.9344972,-0.3734989,1.790999,-0.1117951,-0.797988,0.632353,-1.08984,-0.1005279,0.6891203,-1.084307,0.02915843,-1.023947,-1.815896,0.6650033,0.3866452,-1.427992,1.02016,1.280093,1.886738,0.7172988,1.248301,0.08895933 +0.5406505,0.6038968,0.9347172,-0.5997078,0.01457052,-0.4494523,1.232062,-0.3764804,-1.415197,-1.204177,-2.174494,1.492053,-0.4725049,-0.1920005,-0.226676,1.085123,-0.1745949,-1.104255,-0.4520817,2.726105,0.5294095,1.28002,-1.72767,0.6112904,-0.2851281,-0.3627582,1.151438,0.9445646,-0.8790167,0.390133,-1.086579,0.4378484,-0.07708298,-0.6472846,1.392951,1.380879,2.412566,-1.40355,-0.05435948,-0.8098117 +0.909955,0.8432076,-0.9139735,-0.1872277,0.2292307,1.233986,-1.059789,-1.167939,-0.3386399,-1.296884,-0.02855854,1.707446,0.6510365,0.581,-1.843035,-1.79001,-0.2106708,-0.02051765,0.4180364,-1.523489,0.5550635,0.5948754,1.379911,1.306698,0.7865574,1.756781,1.458481,1.471044,0.3321195,0.5462397,1.136583,2.297054,-0.9775344,-0.6993102,-0.6836364,-0.2777081,-0.07384135,-1.61304,0.9209182,-0.3959982 
+1.406622,-1.891784,-1.139463,-0.3380756,2.203875,-0.1122689,0.1638654,-0.06123867,1.112284,0.07983217,0.296043,1.062207,-1.160508,-2.961583,0.07912599,-1.981216,0.2003422,2.276176,-0.07744978,-0.4392397,0.170682,-0.4424453,-0.9615257,0.3114508,-0.3368715,-0.4959641,-0.3983138,-0.5300468,0.3479005,-1.669331,-0.1907561,-0.2392643,-0.639802,1.303794,0.9111505,1.635166,-0.6941328,-2.540694,1.085141,0.03048889 +-0.7555871,-0.4139746,-1.737831,1.717442,-0.8173428,-0.4333292,-0.4174071,-1.282945,-0.8552529,0.7359904,-1.05584,0.06697781,1.370689,0.6317189,0.7584578,-1.63135,-1.024898,-0.8406188,-0.911155,-1.713293,-0.2685121,-0.08807514,0.1468606,-0.8848247,1.443982,0.3989156,-0.3878695,0.653281,-0.6034227,-1.203211,0.189868,-0.8121995,-0.2732926,1.28009,0.6644262,-0.5469533,1.931752,-0.06554595,1.015996,0.9890246 +-0.02059243,-0.3432295,-0.2088816,-0.8169235,-1.019769,1.283682,-0.2990381,0.0193259,2.197745,-0.2671072,-0.5932357,-2.207454,-0.837391,-0.1701569,-0.4109833,0.8600789,-1.429786,-1.080064,0.3265684,-0.5319984,1.689007,-0.4262448,-0.1345559,0.628364,0.5642561,1.196189,0.003733393,-0.3111685,-0.7675118,0.202177,-0.6105828,2.390519,-0.1961291,2.072161,-1.856687,-0.8620721,1.145863,-2.356383,1.298575,0.869772 +0.1378585,-0.4240921,0.7555976,1.326437,0.5789412,-0.2295489,-1.936446,0.1984902,0.02578204,0.4189856,0.8217314,-1.273685,0.5701557,-2.531349,1.356897,-0.789957,-0.02347339,-2.660589,0.6081887,-0.9983314,-0.1204218,1.410582,-0.3826437,-1.684666,-0.1123269,-0.6681975,-0.06988989,-0.03349247,-0.3091754,-1.260045,0.04534606,0.1955737,-0.7551001,1.379169,1.019462,-0.09025623,-0.6666806,0.9257642,-0.8480185,-0.2374059 +-0.9215404,-1.346872,0.2781898,2.031318,0.06127333,0.3552192,-0.3393969,-0.6204147,-1.583657,0.8191514,1.376855,-0.4646583,-0.9237942,0.576942,0.6590051,1.394819,0.2145302,-0.7733944,-2.835041,-0.9673602,-1.267252,-1.469271,-1.233643,-0.6658312,-0.3469466,0.3316381,-1.34703,-0.7784676,-1.226765,0.5188028,-1.723461,-0.9391005,-1.590306,-0.9864027,-0.955196,-0.07613837,-0.1937701,-0.3030904,-0.2814333,-1.261413 +0.5765234,-0.9894068,0.4195737,-2.310679,-0.4341679,-0.07915782,-0.9540078,-0.2936501,0.9750917,-0.6811234,1.133732,0.3786953,-0.01629632,0.4471102,-0.4461427,0.001552186,0.9855956,0.1504323,-1.037283,-1.214276,0.3306909,-0.2806865,-0.3931501,-0.02794186,0.2240214,-1.506773,-0.6822704,1.073849,1.567597,1.748889,-0.1772614,-1.024996,-0.640446,-0.5874295,0.664307,-1.793478,1.417327,-2.104343,0.6920926,-0.4529594 +0.2988937,0.131146,-0.8584298,-0.6279129,-0.8796653,-0.1058318,-2.173706,2.346803,-0.3201366,0.932697,1.655392,-0.616683,-0.4432898,-0.6139409,0.3006993,-0.06100562,-1.121326,1.196484,-0.4051043,-0.4809252,1.410253,-0.8160448,0.08422356,0.2543402,-1.603908,2.009732,0.4695212,1.79098,0.13213,-0.5687666,-0.8899737,1.033969,0.08884866,-0.538214,-2.229564,0.2085559,0.6464108,-1.055222,-1.076827,-0.2646165 +-0.862301,2.206007,-2.156902,0.1400342,0.4797187,2.706423,-0.9759375,0.9450309,-0.4975935,-0.7563964,-0.4088207,0.8193055,-0.08761266,1.029468,0.4383418,0.3587557,-1.667304,-0.6789605,-1.423129,-2.388547,0.140309,-0.7056501,-0.04545164,-1.049717,-1.31206,-0.5343109,-0.01208168,1.020173,-0.608412,0.264993,0.4712208,-2.04989,-1.324961,0.0909471,0.580034,-0.08703079,1.047683,-0.4687554,0.4069257,-0.7202705 
+-0.8101381,-0.4880259,-0.02258244,0.5038334,-0.2222141,1.033715,-0.0550295,-0.949678,-0.2137424,0.8447172,-0.6968401,-0.09005683,1.09495,-0.02628488,0.6625812,0.3487502,-0.3812122,-0.3916378,-0.6146786,-1.60281,1.400711,-0.2022272,1.347394,-0.1198266,1.481384,-0.9048778,0.3755951,1.658489,-1.829733,-0.6508813,0.4225428,0.1311824,-0.2791665,-0.6634366,-0.1032911,-0.3203196,-0.264543,-1.076201,0.215278,-0.9357935 +0.4725559,-0.05859646,0.4445384,-1.104955,1.555178,-1.53381,1.032633,1.315148,0.1730461,-0.9266977,0.01148031,0.8937625,-0.695113,1.096289,-0.7793445,-0.3099569,-0.2379078,-0.4997116,0.1179185,-1.350029,-0.5359162,-0.491137,1.277915,-2.104365,-0.9868642,0.8765422,-1.089285,-0.8228665,0.4122369,0.3318079,-0.1412039,0.7345418,-0.5245307,1.466002,-0.4311032,0.1469879,1.872315,-1.384171,0.3183897,2.264798 +0.568499,0.3815074,0.4677388,-0.4048382,-1.346565,-0.08423313,0.2343253,-0.6222837,0.4041525,-0.1440535,-0.74232,-0.4715456,0.111487,-0.5811815,-0.4161647,0.2401459,-0.1005765,-0.591957,-0.2492095,-0.1837779,-0.322271,0.03505055,-0.3451709,-1.573429,-1.662857,-0.9252076,-0.1486393,-0.5920573,0.1055035,-1.309933,-1.50708,1.67925,1.16697,1.301049,0.9930027,-0.6962476,1.051854,0.2270802,-0.614903,0.90829 +1.413098,1.396094,0.2336009,1.371906,-2.727039,0.2091286,1.163983,-1.075961,-0.7769263,-1.045464,0.6161839,-0.05021547,1.944434,-0.7044978,0.08496505,0.06474985,-0.3806701,2.219755,1.632958,1.557029,-0.2155821,-1.199963,-0.647052,-0.3794608,-0.4683484,0.4143101,-1.139068,0.3210359,1.407712,-1.88081,-1.05152,0.105681,-1.470077,-0.2363645,0.2960777,0.8185218,1.060667,1.341146,0.6014673,2.731237 +0.05881738,0.1633836,-0.08672898,-1.106651,-0.2581098,0.6130016,0.6517205,-0.02489594,1.116307,-1.152351,-0.5025063,0.2801366,-1.530147,-0.8645239,-0.9005778,-0.3964228,-0.5741119,-0.4530036,0.2200627,0.169538,-0.9496582,-0.8216985,-1.413027,2.007974,0.4605372,-0.06039936,0.5745415,1.598692,0.5712546,-2.659713,-0.5068608,1.154605,2.583778,1.656085,-0.08278438,-1.072792,0.7072485,1.279435,-0.7787383,1.089582 +-0.5834479,-0.07448124,-1.115903,0.02760885,1.009487,1.156914,-0.4105086,0.9115326,0.869017,-0.4590516,1.003386,1.112832,-0.9616253,-0.7388038,0.5843482,-0.1179352,-0.5469783,0.3263092,0.308653,1.569749,-1.789911,1.400087,0.1520123,0.8572611,1.04345,-0.1563995,0.219218,1.128357,-0.0547567,0.5415095,-0.2572749,0.3225808,1.528879,-0.5198953,-1.534678,0.5201836,2.082536,-3.233977,0.08992024,-0.6226402 +0.2701231,-1.043623,1.489661,-0.2150529,0.9315255,0.7463084,-0.5487783,1.311679,1.152206,-0.5581729,0.6388511,-2.056672,1.783536,0.4034619,0.1576661,-1.604431,0.4752059,-0.6567207,-0.4142717,-0.1847941,-0.9605026,-1.506389,-0.05274114,0.6117821,0.7751493,-0.5047877,1.357511,-0.07840207,-0.3905191,-0.168555,-0.4137956,-0.1198986,1.565326,-0.2107891,-0.932112,0.6186226,-1.615898,-1.337254,-0.8600915,1.297102 +1.078046,0.01575521,0.4861249,0.1165349,-0.2754173,1.137744,-0.8538327,1.273841,-0.3319208,1.02287,1.237159,-1.022342,-1.478591,0.4391717,0.2977585,0.4108931,0.02997734,-0.5642255,-0.693872,0.2498415,-0.2041311,0.29778,-0.1485495,1.415268,0.3859203,0.4754382,-0.8337888,0.960313,-0.6962694,1.861162,-0.2423945,-0.1721416,0.6194001,0.05314962,-0.6453273,0.5131093,0.6508946,0.5998992,-0.3032439,0.757012 
+-0.1465796,-0.5686484,0.5126323,0.03169142,-0.1545222,-1.070446,0.3934666,0.1253511,0.2172285,0.1640811,0.39973,1.124659,-1.171505,1.717197,0.4984726,-1.791004,-0.8623121,1.085722,-0.6137886,-0.4004513,-0.5270464,-0.2657372,-0.2477783,-1.192792,0.2668348,0.6561872,-0.9379493,-0.4044395,-0.9095754,-1.431134,1.810561,-0.3319207,0.9440885,1.707345,0.3656806,-1.782357,0.2029573,1.269938,1.33762,-0.9324003 +-0.6352536,0.3361345,0.2547365,0.3481929,-0.5848437,0.6517613,0.5088642,0.8578557,0.7408979,0.3635919,0.1113037,-0.2889423,0.927813,0.5930912,-0.1666013,1.380175,-1.229242,-0.496288,0.8446761,-1.088473,0.3255175,-1.263276,0.4268831,-0.8105003,-1.085005,1.127836,-1.250197,0.7895727,-0.08399955,0.6849612,0.7288448,-0.7167858,-0.6449887,1.713295,-1.148787,-2.010084,-1.660754,-0.3389272,-1.681706,-0.03293852 +0.5240847,0.0725406,0.6165087,-1.247047,0.4722185,2.269369,-0.2636479,-0.5335241,-0.1401696,0.8959468,-0.1923944,-0.1734163,0.2960612,0.5584624,-0.8898843,-0.4660813,-0.1154959,0.97985,-0.2735789,1.353981,-0.3294121,0.765852,-0.8986376,1.183231,1.144596,1.066115,0.03225453,0.2864813,2.315503,-0.01883358,1.115497,0.1386072,-1.571452,1.175254,0.02328071,2.132364,0.6109857,-0.4493414,0.4719253,-0.1766699 +1.105146,-1.411528,-1.543019,-0.759479,-1.319903,0.1373407,0.2112947,-0.3188405,0.1692621,0.1403082,-0.5374475,-0.7066275,-0.02461981,-1.880076,-0.8012183,-1.859665,0.3712268,-0.5275359,1.677069,1.639276,-0.7875032,-0.3377572,0.6913626,1.19807,1.650522,1.443848,0.5065247,0.9693455,-1.063585,-1.157497,-0.9277302,0.1892092,0.791484,-0.2275854,-1.83886,1.392531,0.3594493,-0.7675819,1.430414,1.881706 +-0.5010119,-0.06088542,-0.2736444,-0.6155999,-0.1895159,1.306475,1.83268,-0.2097905,1.087991,0.1510832,-0.2351312,1.078499,0.2503442,0.8922755,1.214479,-1.766925,0.5240595,-1.072653,-1.145249,-0.3351149,-0.7743273,-0.1184084,0.4003212,1.623776,0.2857032,-0.8630109,0.2454722,-1.251914,1.3688,1.000707,1.977798,-0.7314074,-0.1835058,-0.4723842,-0.3263246,2.004582,0.09703421,-1.964636,1.338178,1.115453 +0.4241926,-0.4587048,-0.4209944,1.442377,-0.1742047,1.828591,-1.772957,-1.948657,0.9580416,0.2457761,-0.02056191,-0.6549559,-0.1697334,0.712861,-0.1909866,0.962907,0.6890235,2.222981,0.4234554,-1.554321,1.858121,0.9083904,-0.7444561,1.220734,1.212071,0.208017,0.3502088,0.2275264,0.8619287,-0.2499534,0.5067083,-0.2651063,-1.036375,-0.6136398,1.487275,-0.5789905,0.06287416,1.692822,-0.1797141,0.5251211 +-1.077358,0.7295546,-0.601976,0.4340126,0.3327995,-0.7353875,-2.04328,0.4670024,-1.03499,1.711248,-0.121219,1.627789,0.8580174,0.06890386,-0.06273455,-0.9953863,1.373742,0.7982451,-0.1185965,0.1930606,-1.06809,0.146414,-0.3709573,0.4832597,1.121614,-0.2557976,0.3560919,-0.869082,-1.229377,-0.9864441,-1.206641,1.035111,1.250699,1.26765,0.7596817,-1.108692,-2.280842,-2.042255,-1.000208,-0.201472 +-0.6258375,1.017146,0.4050523,-0.7593517,0.3740898,-0.1807326,0.4410814,0.06037562,0.2647726,0.9408074,-0.2063764,0.8486724,-0.02690883,-0.1650887,1.037881,-0.1097239,1.646319,-2.293052,0.503993,0.16722,0.2157754,-0.9075744,0.05972665,0.1933314,-0.461325,1.028308,-2.119902,-0.08582049,-0.8651413,-1.502705,0.4312394,-0.302692,1.230052,0.4273396,-0.1236817,0.4545562,1.248728,-1.444615,1.351471,-1.135989 
+0.8401206,0.9378846,0.7368737,0.996591,1.299237,-0.5415974,0.4458895,-1.329484,0.3469177,0.4187814,-1.696315,0.2460802,1.296981,-1.014661,1.158803,1.681777,1.02088,0.8664008,0.1531442,-0.2705563,-0.1439914,-1.408516,-1.55579,0.5722215,-1.021558,-0.1259895,0.7774772,-0.8370817,-1.49012,-0.5937779,0.4235354,-0.6472084,2.169007,0.228458,0.4713326,0.7197631,-0.02042737,-1.401861,0.7678565,-1.159345 +-0.05946661,-0.753608,0.9642627,0.6858057,0.2721049,0.646867,-0.07257217,-0.02494447,-0.2871019,-0.6351775,-0.546593,0.489292,-0.6259565,0.9315107,0.3241991,0.901187,-0.9501484,-0.3368059,-1.005672,0.06488199,-0.3344632,0.03611817,0.02567393,-1.121709,-0.7133258,0.7834712,0.07172693,-0.3273907,-0.6429158,0.2381504,0.668705,-0.8747388,-1.581268,1.170684,-0.8504382,0.4819537,0.2944233,-1.835914,0.250195,-1.453785 +-0.07721755,-1.258515,1.031056,1.094227,0.3196682,-2.500625,-0.425323,-1.172233,0.6085352,-0.1139382,0.3755707,-0.7843996,-1.668105,-0.5636865,0.8745212,0.2924656,-0.9333033,-0.08808866,-2.977596,0.7503745,-0.04198832,0.2212858,-0.9274097,0.4035951,-0.9985882,0.1854961,-0.5243524,-0.3596676,0.464056,-0.3410021,-0.0498745,-0.9778274,0.003042096,1.15839,0.5264799,0.7885993,0.6613638,-0.5557506,0.05381138,1.11899 +0.4167096,0.4116793,0.7074665,0.08717525,-0.09691663,-1.261255,1.581478,-0.6783469,0.7967944,0.9055292,-1.355674,-1.041108,1.091192,0.6080176,0.4727946,-1.705097,0.4922651,0.5593915,-0.6220739,0.2475402,0.6726739,-1.07242,-0.06349285,0.6738534,1.877059,0.929598,1.13348,-0.3688263,-0.6073801,0.5538865,0.4868661,-2.748326,2.424341,1.5306,1.59374,2.592256,1.305765,-0.4715845,-0.1435752,-1.101705 +-1.379914,-0.01983617,-0.09003496,-1.017548,0.1935063,1.759334,0.02336921,-1.082838,-1.038586,0.9893743,-0.6822653,-1.173802,-1.335414,0.3517991,0.8353139,0.9312781,-1.274486,1.222881,-0.672682,-0.1646388,1.370816,0.3463163,-0.6802726,0.3512115,0.2447735,-1.04568,0.1987068,-0.893395,-0.08010153,-0.4124045,-1.373251,0.8009274,-0.346889,-0.4036667,1.186602,0.762126,-0.2680129,0.4778575,0.7845125,0.8729515 +-1.48215,-1.40504,-0.3154756,-0.5148007,-0.9337977,-1.209993,-0.02093987,-0.8181531,0.3401547,0.3772012,-0.5659988,0.01161367,0.1299849,-1.188068,-0.8214922,0.969597,0.2779007,1.355281,0.5799689,1.042095,0.5057847,-0.2872966,0.1762483,0.3589809,-1.741297,-0.894869,-0.846841,0.6220394,-0.9237239,-0.7197763,-0.278412,1.95817,-0.3139408,0.2085647,1.642484,0.1121221,-0.008632948,0.5010003,-1.046782,-0.1362393 +0.1220714,0.2475681,0.4652751,-1.925225,-0.5655694,-0.7176182,1.768069,-0.9252517,-0.954024,-0.7261655,-1.378278,-0.7265041,-1.798579,0.3886727,-0.4732897,-1.544929,-2.003104,-0.9171946,-1.235946,-0.02772727,0.2847313,-0.434706,0.4939558,0.08367245,1.171677,-0.5747935,-0.01696712,1.060455,1.08278,-0.5477961,-0.7939191,1.357722,0.2463868,-1.438718,0.5152048,1.603285,0.5741579,-0.1879927,-0.01163986,-1.398614 +1.195887,-0.941905,-1.083341,-0.4533493,1.320404,-0.5418542,0.3809792,-1.467127,0.4850116,0.6366948,0.2994939,0.9982777,-0.1216447,1.080337,1.008556,-0.1795415,0.1750764,-0.2731479,0.672377,0.05122199,-0.9465164,-0.278979,-0.4692454,-0.3145799,-3.177395,0.8467914,-0.1069307,-1.348413,-0.324068,1.526057,-0.8823823,0.3348363,0.1136485,-1.515167,-1.78957,0.07109646,1.230344,-1.385777,0.4965637,0.8203851 
+-0.2540624,0.4729573,0.1091577,-0.8085183,-1.299647,1.650485,-1.758896,0.1217541,0.168382,0.1554137,1.907855,0.9212415,0.2676087,0.3791718,-1.451056,-0.727055,-0.1790879,0.6469336,0.611769,-0.5062204,0.1007255,1.556133,0.3907513,2.08839,0.7751195,-0.5754871,-1.531568,-0.004031519,0.01339385,-0.2784879,1.183919,0.3606403,0.3844021,1.076513,0.77839,0.2479942,0.2831005,-0.5421988,-0.9546793,-1.896217 +-0.3444073,1.638864,-1.962424,-0.009047911,0.9288942,-0.5906747,1.210393,0.2061113,-1.446194,0.1129079,-1.021397,-0.693908,-0.2117023,-0.1924082,-1.428175,0.183919,0.3040223,-0.8850081,1.207807,-0.06479523,0.06269892,-0.9466581,0.04648014,1.839927,1.02307,1.491165,0.5689359,-0.3591014,-1.480063,-1.81564,0.6877408,1.137929,0.1752574,-0.855445,0.02602467,-1.53832,0.2687638,-1.00692,0.2007477,-1.090587 +-1.809799,1.475059,2.154065,1.614185,0.291283,-0.3261695,-0.7141991,0.4013355,-0.7646623,0.2580183,-0.3061337,0.6315334,0.6286927,0.5621312,-0.4614822,0.9073541,-0.8825408,-1.755696,0.7837722,0.24823,-0.1797158,-0.9134267,1.307432,0.07303586,0.3735509,2.282123,2.445539,-0.970238,-1.687975,-0.8712183,1.444104,0.731562,-1.397655,0.6290671,0.1051703,0.8701399,-2.842453,-1.089539,-1.060095,1.225592 +-1.630895,1.566531,-0.7525802,1.909589,0.5629205,-1.332766,-0.03383592,0.01831932,1.98285,0.2609682,-0.6721052,0.5116921,1.727859,0.194369,0.526911,-0.4987763,1.030703,0.1657236,-0.9227092,-0.589804,-1.242192,0.3650022,1.362121,0.9959762,0.7433031,0.6069499,0.2075259,-0.689745,0.09364572,0.2435238,-0.1616642,-0.3409248,-1.68054,-0.107514,-0.03034482,-0.1011667,-2.094992,0.8408371,-0.8337535,-0.990752 +-0.2719659,-0.9187196,0.3595398,-0.3571213,1.102162,0.9690472,0.6232401,0.7309155,1.027565,1.077159,-0.3851255,0.7016191,-1.861356,-0.7070998,0.910213,-0.1653801,0.6436111,0.8022112,-1.138601,-1.163646,-0.3965631,1.080263,-0.4634507,-0.4939815,-2.798954,1.734034,0.3093281,-0.9028559,-0.482737,-1.150509,1.408027,1.298049,-1.196309,0.934463,0.173468,-0.5082792,0.4730252,-0.6223594,-0.1887386,-0.1065898 +-1.790901,-0.2551083,-0.1667216,-1.076975,1.028393,0.1350358,0.2201249,0.419445,-0.399833,-0.4073916,0.006043161,-0.2250456,-1.827989,0.7154949,0.6174726,-1.049155,-0.5104019,-0.6044482,0.3035058,0.1324813,1.101872,-0.4357056,-0.4116984,-1.01318,-1.297579,0.09337903,0.5830186,0.8248381,0.3982123,-0.8947602,0.1804668,-0.1906395,-1.949097,-1.285239,1.061874,1.640009,-0.2241341,0.3235198,0.08914161,-0.6910483 +-0.2284992,-2.31533,1.186129,0.02718465,-0.7187753,-0.4770396,0.05132662,0.2861972,-0.08793968,1.122064,-0.7058777,-0.1799075,0.5143394,-0.05504814,0.1570844,1.609285,-0.167172,2.496328,0.2606656,0.2702177,0.6880381,0.7678542,-0.4256688,0.724755,0.7731816,1.808089,0.5213056,0.9285499,-0.6180186,-0.5078685,-0.2309089,-0.9008524,1.206169,0.7886456,-0.9089336,-0.7051035,-0.698255,-0.02714871,0.7256314,-0.7355933 +1.784241,-1.780579,0.203823,-0.7835838,0.01163696,-2.508591,-1.359581,0.00864686,0.09737813,-0.05582824,-0.826509,1.761649,0.5796121,-0.2766362,0.02054147,-0.599227,-1.159331,-2.517065,0.31132,0.5624224,0.2304538,-0.6115064,0.5699885,1.245328,-1.194449,-0.961834,1.526517,0.3161797,1.064406,0.7514254,0.7219565,-0.3076737,1.097445,0.5316979,-0.3652523,-0.5341936,-1.102695,-0.3547037,-2.031348,1.21508 
+-0.1663136,2.011912,-0.4761804,-0.0250342,0.7609504,-0.1619376,-0.8797625,-1.044562,-0.7819292,-1.196361,0.5962612,0.4707073,0.7150178,1.438555,-0.1014545,-2.57621,-0.176813,0.3353415,-0.4246597,-0.3081515,0.1850437,0.2568398,-0.3560067,-0.2223577,0.2574739,-0.4839539,0.1603824,0.3693999,-0.07234621,0.156884,1.322124,0.4244364,0.2790802,1.303326,0.7867102,0.0249549,0.7755459,-1.400512,0.6381601,1.004852 +0.8090874,0.7423579,0.7903376,-0.2258267,-1.417117,-0.8318247,-0.9813837,-0.151499,1.94433,-0.4045717,1.366862,-0.1778932,0.07283573,-0.2936121,-1.332692,0.700632,-0.6855569,0.3263921,0.426697,-1.239288,1.369179,0.8556477,-1.458156,-0.1597908,0.1782215,-2.421758,0.5950463,0.6254386,-0.7374992,-0.5063005,1.504888,-0.4961062,-1.355668,-0.8165416,-1.158781,-0.4098486,0.670913,1.723382,0.04794611,-0.9993471 +-0.9723293,0.3025405,-0.8904054,-0.5865495,-1.449382,-0.9343191,1.096251,0.3055601,-0.8996204,-0.7617576,0.9239339,-0.05928208,-0.2150583,-0.06988385,1.617863,1.865176,-1.873562,-0.9153646,-0.6281153,0.4004081,0.9538178,-0.9943741,-0.1017763,1.021448,0.851345,0.3887967,-1.562773,-0.3361043,-1.045622,0.3413844,0.7571114,-0.3145171,-0.80968,-1.342506,-1.135986,-0.3540097,2.36E-05,-0.1200589,0.3817018,-0.9360328 +-1.952502,-0.9301512,1.901943,-0.9055543,2.153897,0.4242188,-1.941874,0.1051827,2.456962,-0.5567076,-0.7171749,-0.2275321,-0.8219456,0.1691793,-0.1105117,0.7292264,-0.09193224,0.05179873,-0.4684873,-0.2173969,0.4924469,-0.559284,0.4086895,1.339353,-0.6771875,-0.4809611,-0.5206794,-0.1606414,0.112728,-0.007366137,0.5968023,-0.0747559,1.400759,-0.02125983,1.566856,-0.3141645,0.9543917,0.3051924,-0.3445162,1.119007 +1.786423,-0.6237304,-0.09844711,1.257165,-0.5342907,-0.01222969,0.4874997,-0.7030065,-0.9145537,1.566156,-0.6701325,-1.164807,-0.9838523,0.2737348,-1.104857,0.8074601,-0.5874916,1.044978,-0.9758597,1.224409,0.9491986,-2.662905,-0.7264853,-1.482035,0.4315646,-0.6637764,1.402703,-1.031473,0.9774834,0.8918795,0.6938968,-0.5940963,2.58625,1.574864,-0.2647429,-0.5355857,-0.6931203,0.3185534,-0.351452,-1.66461 +1.312988,1.139786,1.422991,-0.8712608,-0.4080144,-1.766652,0.5063333,-1.854061,0.1737996,-1.639731,-0.8298095,1.323815,-0.5921358,1.257319,1.471927,0.4010245,-0.2055673,1.660844,1.0973,1.090141,1.495645,-0.8999745,-0.3789832,1.324437,-0.2036761,-0.2654647,0.2814492,0.2109689,-1.970793,1.830162,-3.349868,0.02385549,0.6782608,0.208131,1.248889,-1.103801,0.7599252,1.670892,0.03043532,-0.2190249 +1.108675,1.525068,0.9588439,0.6123897,-0.2410641,-0.8733687,0.3117959,0.174348,-0.2763855,-0.006000638,-0.7096279,0.5344725,0.06332292,-0.6425008,-1.182337,1.483236,-0.3882034,-0.1575238,0.1335983,-0.8025782,-1.642221,0.2847822,-0.8341297,0.4821308,0.630788,-1.090875,0.1349911,2.352891,0.1380662,1.392404,0.7701381,1.847616,-1.303578,0.6407466,-0.2962553,0.859954,-0.2236059,0.05711353,-0.8934904,0.4599481 +-0.9932007,-0.08140044,-1.191342,0.6151858,0.660348,1.154504,1.801399,1.184333,1.845117,-0.3263442,1.922458,0.5239917,1.373344,-1.820682,-0.3519679,-0.1378445,-0.08537618,-0.6801275,2.220008,0.2088782,-2.182412,0.3941256,-0.3197778,0.561551,-0.4757655,-0.7960678,-1.062878,-2.267791,-0.4532568,-0.5044202,0.8949987,-0.9139782,-0.5696397,-1.47729,0.5450914,2.334647,0.01613249,0.9835542,-0.6146236,1.068684 
+0.5260998,-0.4132552,0.9869504,0.2229047,-0.09367741,1.681824,-0.374344,-0.07010686,0.5239424,0.2922427,0.1822388,-1.263721,1.075125,-0.1263012,2.35984,-0.03861758,-1.645494,-1.365103,1.660071,0.08737285,-0.2145201,0.396678,-0.6652206,-0.9529181,-0.2950436,-0.5115965,-0.2658816,1.609768,1.41268,0.8461248,-0.4129728,-1.063458,0.273915,-0.9094626,0.640947,-0.9882497,-1.047994,-0.6366783,-0.338148,0.09805261 +-0.768071,0.1779261,-0.1504406,-0.6071368,1.022471,0.2677225,0.4462004,-1.863199,0.8348274,0.002529911,-0.516658,0.08059037,-0.23241,0.6363874,0.2537486,-0.628604,0.2930972,0.807236,0.7284626,-1.279231,-0.737313,-1.504269,0.999234,0.4176601,-0.6890307,0.28797,-0.830218,0.4294998,-0.7470129,-0.150496,-1.076974,1.764263,-0.2275386,-0.3998873,0.07521556,0.6296337,1.435082,0.8989122,0.6022315,1.966521 +0.529,-1.61176,0.05980348,0.6140137,1.704137,-0.8011553,-0.5617169,1.563706,-0.8877958,-1.951795,0.2681819,-0.04907659,1.825031,-0.05135975,1.108301,-0.4970883,1.04839,0.4032445,1.616118,0.8994005,-0.7014032,-0.7081412,0.2498809,-0.7673618,-2.7152,0.2212572,-1.500636,-1.232376,1.540586,0.2319765,-0.5668778,-0.2880716,0.652218,0.851744,-0.8991104,0.02720753,0.4615035,0.7100321,0.0670536,0.2372704 +0.142538,0.9343846,1.317076,0.50624,0.2394388,-2.626989,0.03580558,-2.992283,0.2031837,0.8711626,0.1423066,-0.2071118,-0.4436279,0.5898241,-0.727293,1.778006,-1.161734,1.505964,0.6823214,0.4447785,-0.9290533,-0.6181296,-0.9068672,0.8556668,-1.226044,-1.239333,-0.08985337,-0.6597288,-1.343023,-0.7425149,-1.375762,-1.553086,-0.5803377,-0.7028511,0.2138933,0.4852001,1.491173,-1.744033,0.7542505,0.513621 +-1.540493,0.7797665,0.03187052,0.8091063,-1.252922,-0.652121,0.5801869,-0.0345218,0.8080924,-0.9086785,-0.4220508,-0.8856277,0.1242793,0.3222499,0.8505659,-1.10584,-0.1856555,-1.692325,-1.142524,-0.2729821,0.2516733,-0.3861844,2.121851,0.550127,1.037375,-1.574204,0.3514674,0.714796,-0.8616913,-0.3446871,-0.1999238,0.000720387,1.227664,-0.948124,0.07356902,-0.7183782,-0.5968743,-0.04308738,1.266409,1.630112 +-0.7768091,0.9488591,-0.06273985,1.130287,2.282672,-0.0343595,0.04608607,-0.7559873,-0.3520376,1.219665,0.05176892,-1.459528,-0.1365461,-1.755384,-1.702637,0.1661987,-1.142117,0.3870769,-0.7483016,-0.580024,-0.0894807,-0.6118879,0.08855194,-0.7196668,0.3599979,-0.4988472,-0.3117821,0.7943778,1.258519,-1.744927,0.4809676,-0.1636729,-1.015233,-0.1475043,-0.1656796,-0.4942076,0.1260714,0.7480223,1.224649,1.065762 +-3.0533,-0.2789342,-2.026913,-1.007246,1.397478,1.017979,-0.1232643,-0.4313686,-1.156912,-0.7823728,-0.07955913,-0.005139809,0.1166111,-1.965275,-0.3680379,-0.455548,-0.2530965,-1.894909,-0.002621178,0.5363754,0.1690183,-0.3135185,0.6509252,0.5523993,-0.3964115,-1.215589,0.5596138,0.2349434,1.911014,-0.5589784,-1.176808,0.02062763,-0.5467488,-0.224879,0.06795704,0.2383508,-0.6692917,0.4924856,1.091878,-1.983717 +0.8714743,-1.453476,0.7009227,-0.4443174,2.182453,-1.303025,0.8768539,-1.987462,-0.3926273,-0.4266736,0.7421449,-0.1574262,0.861673,2.148282,1.192266,-0.168051,-0.2995161,0.1100348,-0.8976138,-1.265529,-0.2119371,-0.4475914,-1.985363,1.472548,1.431173,1.904673,-1.987689,0.7602488,-0.5708075,-1.177554,-0.3458852,-0.3953647,-0.8044152,-0.4410601,1.004628,0.1749539,2.913934,-0.1894373,-0.4739804,0.8575221 
+0.1909621,0.9851275,-1.741286,0.0190286,-0.232154,0.4336666,0.5544363,-0.2520063,-1.019331,1.206124,2.034623,1.10169,2.836365,-1.270891,-1.7232,1.515484,-1.267175,-0.3000803,-1.562338,0.4871898,-0.3292844,-1.868938,-0.0159186,-0.4460873,-1.530472,-1.04456,-1.277256,-0.6549965,-1.216142,1.474473,0.3992445,-2.175307,0.979121,0.6068101,0.5868657,-1.068554,0.06152889,-0.9750514,1.017692,0.9563766 +0.8495726,0.2641752,-0.9905284,-1.676665,0.289375,-1.235175,0.1663571,-0.1856698,0.9020847,-0.5304582,1.739278,-0.1690549,0.3051505,-1.048987,2.853079,0.8641292,0.1126223,-1.17895,0.4616486,0.3385137,1.395789,-0.3071949,-0.2665208,-0.3265169,-0.1464136,0.6498986,0.6088806,0.7253072,0.499189,-0.09476938,0.876569,1.074772,1.197576,0.3788167,-0.2071225,-1.258681,-0.5136455,-0.4799542,0.01126897,1.373386 +-0.5634817,0.1818199,0.3433529,-0.9478715,0.8156552,0.3965565,0.8436645,-1.05479,0.7720235,1.88302,-1.057869,1.619803,0.6294064,-0.4517871,-1.688318,-0.3140889,2.086026,0.4185572,1.145851,0.6769728,0.5533707,-0.4697154,0.0667619,0.7419082,-1.524104,-0.1141371,-0.9950814,1.657713,1.890392,-1.248155,-1.404972,2.01384,1.33003,1.652415,-0.6374017,0.6663757,0.271351,-0.575709,-0.5177895,0.007667642 +1.204417,-0.489429,0.2314262,0.7400705,-0.1813321,0.439539,-1.898284,-0.8597412,0.576075,-0.2420763,0.3587716,-1.76614,0.3157339,-0.1539138,1.061619,-0.3156986,-0.7308367,-0.629552,-0.6770103,0.1157615,1.243068,1.076259,0.4570385,2.530846,-0.5969971,-0.928413,-1.693465,-1.162242,1.330018,-0.408383,0.1583841,0.1406872,-1.561373,-0.1184658,0.3829936,0.6271289,1.706237,1.277988,0.7002502,0.4366448 +1.704493,-0.143931,-0.8259445,0.6973516,-0.8407041,-0.6411114,0.3273374,-0.03593679,0.926047,-0.1038535,0.01175694,-0.1144074,-1.34484,1.474188,0.3542219,0.07502897,-1.457645,0.4741405,0.6389281,-0.2900034,0.07277291,-0.4649185,-0.165252,-0.6120707,-2.659353,2.212246,-0.6345311,-0.941266,0.2427191,-0.6518689,0.5458369,0.9171127,-0.4941953,0.4426178,0.921846,1.515848,0.7513363,0.3719432,-0.5652805,-1.968469 +-0.05800861,1.354275,-0.239299,0.2713215,0.983899,-0.2172831,1.589416,-1.467549,-0.3976177,3.206543,0.9375365,-2.282999,0.8652155,-0.04751884,0.2082824,2.069485,1.664915,-0.834383,0.4672561,-0.2477015,0.4007323,1.606619,0.922489,0.607913,1.291011,-1.505332,-0.6249341,-0.9998165,3.000724,0.2155622,-0.0353026,-0.8347956,-1.612824,-1.827973,1.378563,-1.355399,-0.5039985,-1.334651,-1.292665,0.2450609 +0.7708767,0.3486849,2.250446,0.163804,-2.466631,0.5912959,1.920821,0.4818365,-0.4659968,-0.1347553,-0.01234349,-0.6808743,2.645808,-2.221406,-0.1398426,-0.6690354,0.6626977,-0.1575483,-0.6124289,-0.7772433,-1.383573,-1.032056,-1.218617,-0.5967369,-0.9456703,1.156155,-0.9339402,0.0799137,-0.4878132,0.9088552,0.6875919,2.227149,0.7386513,0.3956196,-0.2483388,-2.351745,-0.3314037,-1.447997,-1.389942,-0.4539381 +-0.02769032,0.8532997,-0.6206261,-0.4866618,-0.8754389,0.2621587,0.032868,-0.4777377,-0.678165,-1.543781,2.200218,1.014297,0.0567176,-1.760343,0.5646108,0.3743544,0.2296769,1.725532,-0.4840224,-0.05426112,-0.1347144,-1.095967,0.376729,0.9703791,-0.4838677,0.2604245,0.6826753,-1.029478,0.1932969,0.5151822,0.1441701,-0.833259,1.913823,0.3970998,0.09898187,0.1664698,0.1036858,-1.148269,-2.107749,1.668641 
+1.303131,1.012087,0.1297552,2.244177,-1.354803,0.009106597,-0.05054641,1.874409,-0.4885863,-0.05756357,0.4341598,0.4997426,-2.566349,1.164323,-0.4590187,-0.9594596,-0.145106,0.8976257,-1.040286,-0.6260146,0.03936069,0.2543808,0.3589277,-1.713925,1.808309,0.3925163,-1.446299,0.4617412,0.6405755,1.514073,0.02128457,1.549509,-0.440687,-0.1419045,-0.5333,-0.156119,-0.1726124,0.135493,1.672319,1.409899 +-0.7115859,-0.9288807,-0.04307628,-1.123998,0.4168903,-1.665252,-1.167713,-0.04938994,0.9475816,0.7274474,-0.3797749,-0.7611032,-0.2110975,-1.429211,-0.7858378,-0.1241325,-0.2513716,-1.130513,0.5399115,1.138836,-0.2464631,-0.2147262,1.184079,-0.740308,-0.001875773,-1.661465,0.8841305,-0.07875293,1.464768,-1.709754,-1.36996,0.07918199,0.3587374,0.5962312,0.02330376,0.7298669,-0.3420587,0.7258118,-1.26976,0.9431407 +-0.771192,0.7960816,0.4355247,0.9161138,1.141277,0.907529,1.442945,0.806831,1.722391,1.742833,0.4037926,1.864646,-0.046828,-0.6598338,0.7855016,-0.2671295,-0.768104,1.141629,-0.3189041,0.5884861,0.604175,0.04968804,0.6915691,0.2352636,0.2499088,-0.4089996,0.1769578,0.1076459,-1.080041,-0.8928274,0.09153104,-0.999597,1.258298,0.4283606,0.6401197,-0.1945373,0.8077568,0.06739109,-1.269331,-0.7551863 +-1.338908,-0.8027995,-0.8846403,1.067132,0.5517586,0.5139403,0.3741545,0.002512278,-0.172501,-0.8099412,0.810245,1.478321,-0.944018,-1.286487,-0.6017438,0.3131375,-1.404695,0.3194398,0.008676748,-2.601949,1.934455,1.902091,-0.1322258,0.2774981,0.3962827,-0.7476098,0.1298697,0.8931065,0.5559694,3.335541,1.115669,-1.118684,-0.457713,-0.271471,-0.6714897,0.4520548,0.9362398,0.2757168,-0.8511041,1.352182 +0.4081607,-0.4582456,1.068867,0.004417286,0.7210867,0.6556772,0.5786601,-1.768198,-0.3199396,-0.1636399,0.482088,-0.06034198,-0.1513313,-1.334129,-0.7686935,-0.1566465,-1.501425,-1.796912,0.06774226,2.112362,-1.662881,1.108096,-2.451736,-1.2476,-0.811376,1.104622,-0.9387797,-0.3451353,-1.377314,-0.9261767,-1.207056,-0.0872671,2.860868,-0.02748069,-1.881141,-1.324854,0.04444403,0.4259368,-0.3932071,0.8843148 +0.2065376,0.6862764,1.309281,1.780441,0.3683187,-1.104728,-0.6103361,-0.9215248,-0.2735535,-1.756968,-1.371118,2.532275,-1.907745,0.1264912,0.0958119,-1.637143,-1.063453,-1.802465,-1.652349,-0.4178317,0.1226236,0.710011,-0.5142702,-0.5810804,0.9056408,-0.9558416,0.2742739,-0.3678077,0.962721,-0.1884397,-0.4882988,0.04976419,0.4496915,-0.7240807,-0.03869689,-0.3268311,-0.4777362,0.937782,-0.3192127,-0.6128031 +-0.2244763,-0.5953003,-0.2035059,-0.5779668,0.3393237,0.4778255,0.1464572,-0.1866153,-0.7808768,-0.0241523,-2.231856,-1.221182,-0.161173,2.530827,0.359927,2.240675,1.313078,-0.1262299,1.363122,-1.43669,-0.3067058,-0.5482303,1.000501,-0.3677064,-1.116502,-1.002137,0.02440755,-1.429114,1.881219,1.664344,1.360784,1.172683,-0.7540478,-0.4332615,0.9210744,-0.5348103,-0.634734,-0.5003289,0.4064227,2.417015 +-1.504028,-0.4073832,1.137082,-1.360031,-0.7018681,0.7826775,1.63756,-0.3041854,-1.050337,-1.183702,-1.229847,0.1070859,1.302777,-0.5822529,0.7794241,-1.380514,1.904876,-1.550579,-0.1590209,0.06756641,-0.05480491,0.7167292,-0.3448292,0.4901765,-0.6521723,-1.196772,-0.09497319,0.716825,0.4824695,-1.073559,-0.4471944,-0.7677499,-0.4937433,-1.093991,-0.8042766,-0.8386794,0.3116309,-1.761492,1.75296,-0.1121505 
+0.2387984,-0.3809501,0.914253,0.7795765,0.4449621,0.5481967,0.1255538,-0.746418,-0.2724555,-0.9190377,-0.5766929,0.799646,1.741884,-0.3856321,0.1069362,0.3248277,-2.575962,-1.109181,0.791433,0.4297864,1.038114,0.5681225,0.8874413,-0.9706664,-0.5275962,-1.926547,-1.096123,-1.813656,0.410115,1.763321,0.2020745,0.1213078,-1.111832,0.8096451,0.3367681,-0.04633341,-0.9378546,0.5154623,1.165821,1.486392 +-0.4400945,-0.5792708,0.8364688,-1.289218,0.1894396,1.852491,0.7471357,0.6789266,-1.135156,-0.3559053,-0.7749858,0.464534,-0.5041882,-0.1320751,0.1393929,0.1193989,-1.12648,0.6759225,-0.2416547,0.4610984,-0.1851701,-0.4500192,1.228677,0.3642424,-0.6864427,-0.05068541,0.08717088,-1.506646,0.04814946,-1.053728,1.698337,0.8321063,-0.1089066,0.247128,-0.5715435,-0.01260449,1.53719,-2.138389,-1.552935,-1.089231 +1.085948,-1.355887,-0.1909188,0.08769516,-0.3127809,0.04527923,-0.2787614,0.7763111,0.6773808,-1.068041,-0.4100815,0.07315535,0.990975,1.471725,0.5365281,-0.9575866,-1.185806,0.09564944,0.7743521,-0.4393558,-1.52473,0.003916985,-0.01306802,0.927715,-1.258428,0.4748626,0.2789287,1.423146,0.07975428,1.715978,2.860077,0.3983013,0.2045095,-0.1008955,-0.4657762,0.2572242,-0.9522414,-0.6734471,0.6163599,-0.6536034 +0.2879373,-0.1130526,-0.059825,-0.3606093,-1.016342,-1.180394,-1.885399,-0.1601803,0.04785583,-0.2333788,1.018353,1.276264,0.2194723,-0.5460249,-1.29653,0.5619274,-1.004499,0.2115898,1.234024,0.7017173,0.2377058,0.8823748,0.6717558,0.5877186,-0.4174573,-0.6029146,0.6679442,-0.4148408,-0.8219626,0.3484924,-0.8453646,-0.8048374,-0.7456439,0.6090437,0.9143693,-1.812331,0.7370503,-0.444407,0.2180797,1.168993 +-0.5925822,1.012705,1.004685,0.4109785,0.5002854,0.2339004,0.9666269,0.7859245,-0.457107,1.18962,-0.429202,0.4517601,-0.2836733,-0.4271085,-0.6412598,-0.3396288,-0.3978598,-0.4547112,-1.608933,-0.4350217,-1.107532,-0.2527574,0.2899332,0.3309811,0.1292831,-0.06171664,-0.9533953,-0.3610514,0.6402895,-0.1216457,-0.1964592,-0.9941262,-0.04154541,-1.003803,0.2315899,-2.683701,2.173943,0.08460832,-0.1336795,-0.4996839 +-1.030482,-0.1814063,-0.06019712,0.1808341,-1.355358,0.6614733,-0.719781,1.148983,1.772662,1.270099,-1.64028,0.6594025,-0.6729969,0.4086266,1.876071,1.701429,0.795363,-0.6646771,0.896666,-1.420872,1.765216,0.3595857,-0.1009953,-0.3832685,-0.9797023,-1.463201,-0.6199874,-1.087524,1.530943,0.7253667,-0.7278351,0.1781232,-1.214357,-1.62779,1.170696,0.27308,-2.801042,0.1017067,1.888234,0.3435032 +0.7227389,-1.469275,0.9299748,0.01112966,-1.275725,-0.5448196,0.03390336,-0.4596177,1.116786,-1.48225,-0.1882313,0.864571,-0.01780542,-0.9236949,-1.405103,-0.5312181,0.9583228,1.511782,0.2922372,-0.4725685,-0.3673442,0.1425919,-0.1351894,1.433129,0.006349458,0.8381448,-1.373364,0.4054019,1.663267,-0.1699634,0.6392822,1.455212,0.1310615,0.9776026,1.580509,-0.6321934,-0.673863,-0.7403546,1.438167,-0.72356 +-0.8148326,-1.895418,-0.1652187,0.5144418,-0.3823098,-0.1692764,-0.06239582,0.8442851,-0.2581305,0.2124214,-1.854754,-0.4850005,0.4252365,-1.104447,-0.7087892,0.3074013,-0.3426765,-0.1639293,-1.429632,0.4865958,-1.291172,-0.408762,-1.45483,-0.6239267,-1.25473,0.3857341,0.3676147,-1.712471,-0.5806333,-2.884276,-1.22297,-0.8586406,-0.07852916,0.6405476,1.803196,0.8602678,-0.01245136,-0.08523926,-1.072038,0.5361672 
+0.4146352,-1.015997,2.136512,-0.7019133,0.362243,-1.311227,1.429916,-0.386313,0.333398,1.441169,0.08721387,-1.679743,-0.01267881,-0.2753774,-0.516217,-0.2075414,-1.746296,0.2921569,-0.4335475,0.1768164,-0.2077664,-0.8473804,1.250945,-0.8168546,-1.108809,-0.5663122,-1.57889,-1.038045,0.5674672,-0.4139889,0.3651443,-1.048829,-2.864301,0.3616172,0.1071516,-1.429696,-0.5406061,0.4290355,0.713027,-1.566905 +-0.1926365,-0.08470884,0.856721,1.095517,1.19049,-0.5357722,1.022167,0.5935541,1.284745,0.002177359,1.603611,-0.6171278,0.1762107,-0.7623301,-0.4173833,0.1283499,-0.2198367,0.1382507,0.07478995,-0.757277,1.064895,1.362428,0.7416075,0.3377493,0.08512672,0.3616607,-0.491868,-1.069673,-1.087231,0.7186361,1.113721,1.009754,0.0530835,1.699607,-0.5289925,-0.709445,0.01450559,-0.8882126,0.5878019,0.3019731 +-0.4101087,-0.78864,1.618141,1.382558,-1.299516,0.3319862,-0.3891712,-0.562214,-0.1443211,0.3316716,-0.05740049,-0.6218958,-0.2697696,1.484519,1.112788,-0.1401497,-0.5112382,-0.5582543,0.6077023,-0.622369,-0.1996082,-0.4311899,1.115584,1.561217,2.152851,-1.249738,-0.2172284,-0.7217665,1.175579,0.6828174,-0.9357453,0.123984,-1.523457,0.01055509,-0.5623578,1.771351,-1.673804,2.363786,-0.2774055,-0.7917333 +0.2063615,1.449019,-1.080855,-0.400339,-0.1057151,-0.3447741,0.3081227,-0.2325841,-0.5809499,0.3778185,1.210305,-0.4807387,1.410221,-0.6810058,0.8619079,-0.2107583,-0.2560076,0.1096738,3.112966,0.7237672,-2.047124,-2.197908,0.8179642,0.1679459,-2.798687,1.304388,-1.681183,-0.7622918,0.0476493,0.381722,-0.4037353,-1.082769,-0.80981,0.8367218,0.3438234,-1.813345,-0.2764277,0.3501717,-0.3188157,0.6429957 +0.2846302,0.4195211,-0.4055031,0.2120678,-0.5605734,1.024254,-0.9747495,0.8748885,0.3886965,-0.6958344,0.04787796,-0.40916,-1.16332,-1.199565,-0.7946219,1.790924,-1.053535,1.876369,-1.172832,-0.3002236,-1.303593,-1.281109,0.6400126,-0.07222218,0.6388764,-0.2801321,-0.6709576,1.683726,1.161185,0.5500223,0.6312135,-0.4748325,0.5783794,0.732107,-0.5004546,2.120948,-0.1698921,1.925596,0.2384512,-0.1468029 +-0.6048848,0.4069906,-1.259549,0.8360682,0.5172754,-1.841476,1.122471,-0.215902,-0.6209263,-0.06040037,-1.78911,-0.6434254,0.2812651,-0.7583667,-1.606504,-0.4996632,1.757582,-0.170642,-0.6487112,-0.7956535,-1.533208,-1.660696,-1.713689,0.8587803,-0.3092294,-0.05631039,-2.13844,-1.085487,0.1236986,1.08706,-2.180196,-1.66642,0.01572287,-0.04404123,0.2245653,0.3206553,2.363165,0.51655,0.1552137,-1.456547 +0.5832206,-0.4760313,0.4439115,0.8248393,-0.05326433,-1.241444,-1.649573,-0.2555377,-2.207208,0.2518643,1.316294,-0.1736766,-0.1016352,0.8952832,-0.5052517,-0.3457449,0.3526743,1.756344,0.2603508,-0.1749341,-2.956764,0.06651089,-0.7382564,0.4375127,-0.747996,0.1749421,0.6385306,-1.095028,0.5066359,0.4598596,-2.596974,-0.6185885,-0.7254273,-1.193216,-0.4571544,1.079017,1.149757,-0.417167,1.737587,0.1792404 +0.280145,0.5650942,1.344637,0.5397017,1.116446,-1.637306,-1.492851,-0.7283165,-0.4039185,-0.2772967,-0.8466088,-1.335145,0.9661007,-0.85542,-0.3345072,-0.006047873,-0.8862779,0.2636305,2.197402,0.8782036,-2.0043,-0.03616923,-1.757819,1.13036,1.186299,2.928176,1.013705,-0.94013,-1.014352,0.9873284,0.08553971,0.1076009,1.241577,-0.02882525,0.4359677,-0.4998007,0.5379277,0.7835849,-0.3239606,-0.03604479 
+[rows of a 40-column numeric CSV data matrix omitted]
+-1.470519,0.8390895,-0.8703584,1.201124,0.4940827,0.7780694,1.075702,0.9389752,1.175993,0.8540851,-1.754178,2.469458,0.1524638,-0.6406385,1.53723,0.565288,-0.85437,-1.001589,-0.4103852,-1.705322,-0.06153413,0.4436477,0.4617441,-0.7752879,-0.07638524,-0.2278434,1.777163,-1.061323,-0.05063684,0.8769144,0.8675449,1.048368,-0.1681094,0.195629,0.0140398,1.849923,-0.4057805,-1.570191,-0.3182346,0.2507957 +-1.03125,-1.234017,-0.2182988,-0.4289292,-0.8657282,0.733469,1.175247,-1.00815,0.7159635,-1.504966,-2.163145,-0.7063002,-1.711282,-1.048813,-0.8252904,-0.9206347,0.2037709,-0.04539004,-0.943517,0.8343938,1.253041,-1.366362,0.9573872,-0.3424231,1.524475,0.4565435,0.888041,-0.5906248,-0.1709903,-0.6202476,0.439012,-0.838827,0.9119951,0.2486786,1.096118,-1.832625,-0.3392626,-0.6587768,-0.4315764,-0.8253217 +0.1648135,2.064862,-0.5556146,0.66853,-0.1622478,-0.9633904,-0.4773941,0.8804666,0.5049112,1.888378,0.01530488,-0.6179588,0.8961308,-0.9953752,1.297062,0.8771796,0.05467559,-1.663756,-0.6493516,-0.7019877,1.626666,0.31064,-0.5773176,-0.3765934,-0.2019914,0.1012248,0.43421,0.2286504,-1.10102,1.8238,0.6630492,0.365097,1.490959,0.2810553,0.8499574,-0.3273274,0.1386214,-2.311149,-0.9387176,-0.5562547 +-1.243838,0.5036313,1.976301,0.2714859,0.6278645,-0.6428605,-0.9419118,-0.07539535,1.057831,-1.78759,-1.014971,-0.939879,-0.1657322,0.8367528,-0.9843362,-2.302834,0.6458495,-0.9251874,0.2011408,-1.477507,0.5641544,1.500839,0.440533,1.51938,0.6944335,-0.3787977,0.1978294,2.227215,-2.282573,0.2077602,1.11114,-0.3622092,0.3280933,0.7794994,1.203317,-0.1959438,-1.202538,-0.9285505,0.2382903,-0.7900252 +0.1163082,2.030007,-1.119302,-0.7613264,-1.498044,0.03295067,-1.184987,0.2724301,-1.700399,-0.5540536,0.8399307,-0.2348535,-1.335498,-0.5167946,0.4888628,-1.26556,0.862739,-0.3222667,0.3573804,0.9923687,-0.8063731,-0.5872621,-0.3607358,0.2403046,0.8697318,0.6667567,0.08000558,-0.396094,0.3993738,0.6812645,-0.8327262,-1.366495,-1.347054,-0.6631846,-0.6798093,0.563748,0.5704469,-0.09126346,0.7378995,0.3035539 +1.197547,-0.4386965,0.1764839,1.436773,-0.6122457,-0.163444,-0.1886606,-0.2083767,0.02542085,-1.20008,-0.2848676,-0.7984122,-1.517164,0.7555252,0.3688841,1.491679,0.6796921,-1.427856,1.348951,1.099267,-0.6997221,-2.27035,-0.3020144,-0.2609735,0.6174565,-1.25428,0.827507,0.1232284,0.7588392,0.6679103,-1.803585,-0.6426721,-2.86075,1.004793,-0.3665447,0.4790571,-0.2333595,0.2758528,-1.923261,-0.9331353 +-0.698131,2.453544,-1.396635,0.843609,1.310324,0.7794752,0.8358074,0.6476819,-2.2577,-0.7135654,-1.230275,0.479997,-0.4858049,-1.331223,1.00627,-0.7172182,1.074952,0.4724275,-0.1161275,-0.5852922,1.618818,-1.829123,0.1100215,-0.3124966,-0.9676753,0.1794216,-0.871254,-0.2891836,0.2897556,0.06613705,-0.6584158,-0.7534885,-0.2482147,-0.2678622,0.5721144,3.354593,-0.4621923,1.25766,-0.7685659,-1.315449 +0.2936049,-0.3853686,1.360084,-0.7570108,-1.865651,0.7832159,-0.5330827,0.4864162,-0.4308377,-0.03887812,0.104399,-0.3611608,-1.065812,-0.2042614,0.04796612,0.2337132,1.60797,-0.4615922,-0.4551642,-0.4266442,0.2955197,0.267011,-0.7257444,-1.166771,-0.5857194,0.9214529,-1.168045,-1.335339,-1.313032,1.237663,-0.2173716,-0.1212377,0.1339615,1.484252,1.91739,0.143394,1.861746,1.186234,0.4997838,-0.09100197 
+0.4372977,0.5893337,0.5897531,-0.9350238,-0.121774,0.5983993,1.047077,-0.7819572,-0.4418132,1.089831,-0.5459323,-0.8264996,-1.657669,-0.8562527,1.130882,1.75271,0.5656491,0.8500432,2.074582,-0.2610331,-1.481085,0.1094013,0.309322,1.465292,-0.0455304,1.300783,-1.82237,-0.4164526,-0.1511632,-0.9594887,-0.9287908,-1.607623,0.0206581,1.074504,-0.5454107,-0.3073143,-2.358884,0.1473649,-0.0324356,-1.734529 +0.6277626,0.06484014,-1.903018,-0.1185487,0.7147085,0.4677818,-0.618488,-2.059647,-0.1122544,-0.7228432,-1.893146,0.1914301,-0.1872154,-0.4085191,1.469499,1.329491,0.6095966,-0.1614454,0.9251304,-0.3883644,-0.7840294,-1.186282,-1.775118,-0.2334268,0.5891598,-1.527583,0.8511362,0.2190763,-2.186704,0.01587254,1.022368,1.857556,1.488423,-0.5487023,-0.5263618,1.224858,0.07291883,-1.021908,-0.4809067,1.111721 +-1.160544,0.3978536,1.293697,1.605596,-0.6779083,0.9507378,-1.610964,0.9023951,-0.8464205,0.1844654,0.9316583,-1.163303,0.5456451,-0.8967114,1.175951,-0.5276976,0.06201748,1.140271,0.2620628,-1.888187,-1.047333,-2.309614,-1.995654,1.04404,0.5853844,0.541112,0.4790423,-1.743157,1.591524,0.6356673,0.6138853,-0.2233158,0.8887691,0.07152849,0.2276933,0.3947295,0.152072,1.419056,-1.572249,0.7148719 +-0.02078842,0.5427111,-1.818537,0.2081777,0.8870819,-0.2749711,0.6654719,0.2994982,-1.482863,-0.5130542,-1.123845,0.2756603,-0.5467229,0.2595462,0.02040175,-1.283758,0.4649578,-0.3114766,-0.02088971,-0.00967436,1.250263,0.9392065,0.1469451,0.8557285,1.571193,-0.001701211,0.4892823,-0.4291674,-1.249788,-1.652589,1.339184,0.05690047,0.1280175,0.1885199,0.569053,0.872241,0.8436954,-0.2800882,-1.620958,-0.2293929 +1.044724,0.6596893,0.6966726,0.2341416,-0.8531567,-0.06128801,-0.06587713,-0.1127021,-0.08058004,0.8465573,-0.3932579,-1.35004,0.1920655,0.6448106,-0.1842819,-0.1805458,2.288045,-0.6021191,-0.7282392,-0.5831005,0.09244005,-0.1653063,-0.5842128,-1.06252,-0.01552308,-0.2974091,-0.5020182,0.591209,-0.8939695,-0.7126903,0.01382629,0.4331809,-0.4207278,-1.542264,-0.6964014,1.844829,1.798164,1.042945,0.8520049,-1.721873 +1.311595,-2.134584,1.647552,1.691224,-1.454029,1.541222,-0.5817308,1.390487,-0.06121712,-1.595311,-0.3277679,-0.6247446,1.196787,-1.824022,0.8340748,-0.3956005,0.7423795,-0.1231835,0.995995,2.150472,0.7613983,0.273938,-0.2370937,0.7494846,1.080794,-2.009121,-0.2433523,-0.02641442,-1.978315,-1.112,-0.8610949,-0.1135869,1.443149,0.6371356,-0.8312132,0.2965801,-0.2064274,1.046284,-0.440023,0.7100417 +-0.08243903,0.1073325,0.1121957,-1.311944,0.4878046,0.5953834,-0.750571,-2.322321,-0.2963984,0.282032,-0.2761149,1.005612,0.4664247,-1.203688,0.1655062,1.119314,1.350102,1.458706,-1.590762,-1.148638,-1.130491,0.1736687,-1.024221,-0.04026464,1.709731,-0.3901527,-0.2395617,-2.638432,-0.6583239,0.09489869,-0.4566372,-1.816817,1.960955,0.6477604,-0.3232178,-0.4062636,-0.562265,-0.9840575,0.3049788,-0.409945 +0.9192248,0.8540678,1.444052,-1.093098,1.515334,0.2219548,0.1846714,-0.5882634,1.241096,0.2505662,-2.066068,-0.02843547,0.8848689,-0.599708,0.03991691,-0.350666,2.172893,0.1479646,0.2338455,-0.1383494,-0.966886,-1.316723,-0.2455922,-0.110217,0.9941223,-1.004556,-0.3461451,1.284434,-1.08757,-1.705115,-0.1918276,0.864469,-0.827734,2.160508,-2.239425,-0.971726,0.4618132,0.3598704,-0.458971,1.44177 
+-0.2432699,-0.3339704,-0.1890999,0.2809294,0.1993837,-1.332606,0.3259816,0.1251037,-0.5011874,1.014383,-0.8042179,-0.8610017,1.43002,0.06910387,0.1289696,-0.07659328,-0.5040367,-0.826998,1.247691,-1.433292,-0.5496767,-0.6360297,-1.207506,0.4012302,-2.505206,-0.5692881,2.409142,-1.75993,0.14079,0.6731547,-0.9795079,1.242463,-0.07696994,0.4486091,0.1550739,-0.8204183,-0.2635246,0.5286668,2.072945,0.03069472 +0.3510944,0.4813534,0.2060802,-1.700883,-0.2069855,-1.03134,-1.728816,0.6066039,0.4040262,0.2795396,1.350247,-1.234508,-1.797437,0.4789101,-0.4309282,0.3208007,0.3910084,2.705051,-0.4244942,1.322783,-0.3969264,-1.395186,0.403594,0.5465173,2.608144,-0.283368,-0.872792,-2.371429,1.769814,-0.0286874,-1.026496,-1.191688,-2.470138,0.5293279,-1.273239,2.090392,0.6152808,-1.369787,-1.606465,-0.7374487 +-1.340384,0.3139555,-1.281535,-0.09318068,-2.991323,-0.8524653,-0.1736102,0.2487974,2.252967,0.2864047,-0.001895077,-1.907005,-0.2010789,-1.431186,-0.2831273,-0.6861266,-0.2596906,-0.3995178,0.2710246,0.01958423,1.378114,2.011322,-0.5729747,1.146199,2.031715,0.5787101,-0.3023101,0.3155953,0.1469174,-0.1840262,1.764561,0.6658524,0.6438631,0.2887819,1.167107,-0.5912703,0.9437901,0.4223869,-0.9492568,0.4147354 +0.4258461,-0.8146854,1.098108,-0.268505,-1.132631,1.224321,-1.054673,2.604325,0.01515233,-1.647439,-0.5334163,-0.855258,0.1003781,1.382486,-1.492787,-0.3822837,0.9255142,0.5210734,0.8937331,-0.1629473,1.858473,-0.2728187,-0.001404885,-0.2676169,-0.1349961,-1.591685,1.922558,-0.424935,-0.1755222,-0.4769539,-1.218981,-1.617592,0.03313869,-0.8522413,-0.7050798,0.696993,1.069855,-0.3056104,-0.9116465,2.010859 +-0.8143537,-0.7742154,-0.01392543,-0.9572512,0.2236049,1.543521,-0.1499608,-1.370776,0.3217391,2.134929,-1.059492,0.08836967,1.204874,1.074059,-0.1464178,0.4438299,-0.9095718,1.39889,-0.705963,-0.159917,0.0830038,0.4086942,0.491411,-0.4022193,-0.7749848,0.07441055,-0.3296101,-0.2817293,-0.803343,0.06585547,-0.8103295,-1.242288,0.870473,-0.07849906,-0.6312834,-1.852132,0.3237876,-1.775214,0.606651,-0.470889 +0.6933741,-0.2683207,-0.8238425,1.187016,0.556396,-0.1802022,0.6941389,-2.558442,0.5355966,-0.2742549,-1.526325,1.120254,-0.3646822,-0.5021338,0.09106595,1.251713,-2.146414,-0.4314268,-1.085074,0.4075111,-1.141332,0.3254143,0.5546033,-0.6812257,0.001696165,0.5138835,-0.5649054,0.6434208,-0.0469304,1.078632,1.408565,-0.3945717,1.864507,0.5698489,0.5295269,-0.3791418,-1.999063,0.01538955,-0.3728927,-1.414092 +0.7055446,-0.1464686,-1.770469,-0.6908792,0.381193,0.2650282,1.653707,-0.3035594,1.335931,0.4940149,-0.837455,-0.2380224,-0.76661,-0.2925178,1.274354,0.511156,-0.507728,-0.2309911,1.144095,-1.432753,-0.4524153,-1.270046,-0.991756,-1.106618,1.875696,-0.3856418,-2.342653,-0.7335017,0.671111,0.6423255,1.962682,0.9195766,1.273558,1.346956,-1.405145,0.4557044,0.4380583,0.7946343,-0.079367,2.102732 +0.2783505,-1.035011,-1.690698,1.566124,-0.5446115,0.4016218,-0.1179975,-0.6969288,0.9059976,-0.4747399,2.147687,-0.2554343,-1.242996,-0.2882802,-0.2766563,-2.549993,0.2533359,-1.437965,-0.4602521,-1.014543,-0.6372627,-0.2020763,1.673288,-1.546781,-0.07128208,2.212638,0.0528939,-1.062919,1.242324,-1.053067,0.9148895,-1.00451,-0.8596982,2.052162,-1.561467,-1.222705,0.3904741,-0.5867275,0.2432658,-0.5237142 
+-1.202692,-0.1772332,-0.4371229,0.1240588,-0.03007637,-1.35357,3.047876,1.537788,0.9606522,-0.2295829,0.5008843,-0.02879606,0.3407318,1.726529,0.6423659,1.04035,-0.9585709,-0.118509,-0.7352783,-0.5519026,-1.06687,-0.0497326,0.8655984,-0.2393853,-0.593669,0.07094694,-0.6532815,0.1316551,-0.1376175,-1.762628,0.4223868,0.4048169,0.1721182,-0.4777708,0.6648482,0.3240321,-0.3696345,0.4467254,0.8475154,0.4292335 +0.6427618,0.02169295,0.7324676,0.6895852,0.1119512,-1.208408,1.097341,-0.2232168,-0.4487541,-1.658203,1.213364,0.8732638,0.9296646,2.305172,0.4635306,-2.140003,0.578292,0.8190518,-0.2525907,-0.6509012,1.615659,1.059388,0.4477752,2.158283,0.03944699,0.1986332,0.08608324,0.2725911,0.3138032,0.9184929,-1.526403,0.8484376,-1.504527,-0.575291,1.401835,-1.5656,-1.164772,-0.238388,0.3519138,-0.03329793 +-0.605974,0.2259434,-0.7472986,-0.3168895,0.5936792,-1.166581,0.1806815,0.6438963,0.4227912,1.488468,-1.073324,-0.02509841,0.70236,0.6226806,-1.193267,-0.8922777,-0.2347612,0.129873,0.09486735,-1.834986,-0.6937676,-0.0768029,1.475057,-0.9274081,-1.330287,0.1721402,1.124798,-0.1780256,1.506301,1.432535,-0.2176948,1.317709,0.8021034,-1.207603,-0.255905,-0.7292937,0.07229622,1.116153,-0.5505506,-1.047865 +-1.377258,-0.1213654,1.510237,-0.4976278,-0.07755099,-0.6179222,-1.760412,0.8159218,-1.412954,-0.01153816,-0.7391988,-0.3019558,-0.19462,-0.3996488,0.6197173,0.2017745,0.2018008,2.026345,-0.8605134,-1.103219,0.3859028,0.3773579,-0.6276301,-1.18437,-1.030264,-1.115332,0.9857048,-0.4577118,0.1348365,1.132564,0.6079763,1.111083,-0.973112,0.8039375,-0.4351065,0.5855664,0.3997674,-1.614539,-0.9998652,-1.814197 +-0.9259927,-0.04637114,0.7236386,2.028784,-0.5500401,0.09041432,-0.6448775,-0.4001064,0.2488748,3.39001,0.4030343,-0.2141453,1.258071,0.5126712,-0.7178603,-0.269221,0.3908997,-0.4789467,0.91031,1.132245,1.244762,0.6666351,0.3861839,-0.8846689,0.3390846,0.6377301,-0.739123,-0.6294136,0.1310733,-0.8296051,-0.7580362,0.3335802,-0.1253659,-0.5762461,0.6095277,-0.8720968,-0.486177,-0.370696,0.1940672,0.2102202 +0.5672439,0.870366,1.113742,1.931292,0.2722655,0.003351645,0.6819314,2.439551,-0.3987777,-0.3356231,-0.2520231,-0.8377941,-2.6864,-1.807724,0.2284345,0.5806255,0.08888671,-0.7253065,-0.08481728,-1.729414,0.1837677,0.2259686,-0.4548604,-0.8923077,-0.5096517,-1.156083,-0.9898826,-0.4165654,-0.6956224,-1.624231,-0.3006703,-0.3341773,-0.4048721,0.4620089,1.261179,0.0130217,0.08795195,-1.310212,2.30951,0.790152 +2.595481,-0.9662976,0.3693113,0.7702376,-0.1293229,-0.649206,-1.169863,-0.3786913,-0.1133623,-0.05773844,-0.5675267,-0.720266,0.4999155,-0.8661897,0.6792793,1.028663,1.66841,-0.1248796,-0.5045021,-0.5065117,-1.257339,-0.4179653,1.082745,-0.1998143,1.950952,-0.845764,1.766103,-0.2126217,-0.4764593,2.169803,-0.4618778,0.5092396,1.479225,0.5338306,-0.2522587,-0.2414511,-1.605517,1.387149,0.832966,0.8917106 +-0.2399845,0.6839356,-1.320221,-1.815744,-2.276793,0.9771001,-1.057593,-0.03092764,-0.6587278,0.8637892,0.05517718,0.6625253,-0.7782538,0.2950432,-0.554375,1.466512,-1.280595,3.013818,0.7250924,-0.113713,0.353353,-0.04500513,-1.286403,-0.5359986,1.739065,-0.584181,-1.79788,-1.270346,-0.00127924,0.03966747,1.044363,2.136506,0.8611705,2.18343,-0.1520754,-0.5454713,-1.281653,-0.8879952,-1.09719,-0.6497584 
+0.6875364,0.1339006,0.3761377,-1.412319,-1.275488,-0.2946974,-2.055651,-0.5937857,0.8213186,-2.114199,2.405432,0.02201856,-1.604107,-0.6930919,0.6481324,0.935003,-2.473618,0.4890078,-0.2511018,-0.5603642,-0.6234918,0.04797439,-2.233947,-0.03822812,0.3786618,1.473975,1.370819,-1.506198,-2.340355,-1.295348,-1.154712,-0.3883824,0.05144078,1.658993,0.8788187,-1.082783,-1.048835,0.2855579,-0.04069522,-0.0726698 +-0.151789,0.3495957,0.4000536,-1.360738,0.2317062,-1.5602,1.140706,1.42376,1.018279,0.4534504,0.3878663,-1.244638,-0.110573,1.17328,2.374563,0.7511785,0.4187421,-0.910769,0.9639984,-2.20424,0.1813984,0.09224238,-1.240684,0.4625854,0.7595874,-0.5529995,0.1762621,-0.7700103,0.263297,-1.205116,-0.830661,-0.2550672,1.805191,0.600877,-0.9141542,0.8440583,-0.1117724,-0.2742214,0.9414581,-0.3237913 +0.2275443,-0.6581023,-0.6284581,-0.3307767,-1.9515,-0.516064,-0.2460474,0.8908396,-0.6944813,0.854125,2.112212,-1.315288,-0.3210697,0.3727206,0.1179887,1.182124,-0.8954884,-0.04215346,-0.2403383,-0.1746785,0.1552455,-2.50096,0.0876308,1.318511,-0.1409292,0.3964507,-1.007716,1.165179,1.512251,-0.4614519,0.2539194,-0.9663387,0.4479402,2.497611,-0.3675072,0.5571862,0.7310517,-0.8313689,1.217659,0.7830779 +-0.668648,-0.4044412,-1.338368,-0.06434358,-0.9193649,-2.767486,0.9706123,-0.93664,0.6277567,0.4480796,0.1902979,0.3914307,-2.045069,0.4246687,0.5301222,1.10953,-0.09854828,-0.5180997,-0.5003889,-1.404547,0.4822272,0.4283203,-1.6588,-0.4125634,-1.905093,-0.1669342,0.4123344,-0.1693779,0.3779916,-1.251165,1.26011,-0.0449862,0.1783588,-1.323045,-0.9495855,-1.111648,-0.4518973,2.328848,-0.2737686,0.4012714 +0.03082907,-0.436991,-0.4965212,-0.729967,0.1924269,0.1505789,-0.2399707,1.181175,-1.751284,-0.01956342,-1.081934,1.58282,-1.0878,0.2922354,1.555356,-0.4336346,0.4915547,-1.536556,-2.672201,0.3238976,0.1147279,-0.9523644,0.05130673,0.2905056,-1.578721,1.203175,1.131235,0.2234879,-0.1679749,0.7972274,0.8147851,0.9744281,1.317246,-0.7484976,0.1496167,-0.4230811,0.06478455,0.8795938,-0.8970431,0.905915 +0.02842956,-1.090097,-1.128967,0.425766,1.19815,-0.8167485,-2.020807,-1.378407,-1.200961,-0.411164,0.1422168,-1.271054,0.8800085,-0.5882142,0.07967019,-0.0698988,0.10517,2.191667,-1.700385,0.4486487,-0.275214,1.394278,-0.4773784,2.722772,0.03780541,2.200986,-0.1196252,0.983846,-0.2581505,-2.092555,1.344424,0.03387066,-1.278521,1.281206,0.2541052,-0.5045377,0.2020384,-0.7931132,0.2452265,-0.4410872 +-0.3654551,1.031791,0.5007854,-0.4546751,1.567731,-1.915574,-1.101273,-0.21659,0.7275913,-0.4346129,-0.4184496,-1.134872,-1.702189,-0.06370009,-0.4444766,2.103916,-1.220412,0.5155547,-0.9448037,-1.157909,-0.8367624,-0.8777072,3.082751,-0.6906891,-0.0760895,-0.2715407,0.6443005,-0.3679036,-1.496117,-1.164734,1.582988,-0.4320935,0.4498904,-0.4090114,-0.6694167,1.826563,0.1904951,0.1136774,-1.116313,0.03203225 +-2.208012,0.6070065,-2.030893,2.524892,0.3428824,-0.7654394,0.7489759,-1.656576,-1.845374,-0.9266423,1.525444,-0.8760431,-1.414985,-0.06018191,1.483822,1.19726,-0.4104147,-0.4761826,-0.06229392,1.543414,-1.935496,-0.3766416,-1.859022,1.891443,1.685058,0.01571311,-1.325545,-1.595483,-0.5201573,1.112941,1.000621,0.6948838,0.2446498,0.1889269,0.992443,0.3080378,-0.8598248,0.3112013,1.301151,-0.1958812 
+0.2970394,-0.9019485,1.248129,-1.673335,1.381582,1.402157,0.04595002,0.4996004,-0.1852921,0.2934187,-0.6258729,-0.07401805,1.071498,-0.7500067,-0.191765,0.4235094,1.300139,-1.020683,-2.654219,-2.060505,0.4506467,0.05904913,0.5270029,0.6025286,-0.4568734,-1.334508,0.07786589,0.3800816,1.145387,-0.132019,0.000105348,0.9435213,-0.610503,0.3574013,0.795338,2.259321,-0.2519503,-0.4114837,-1.110207,-0.2342255 +2.129701,0.1501161,0.4428051,0.6351316,0.7731632,-0.07170323,-0.990884,-0.7656925,-0.4520285,-1.509502,-0.4118268,-1.072398,-0.3285594,1.995366,0.6663898,0.447222,-1.311294,0.99457,0.09307493,1.540446,1.312444,-1.419603,-2.337718,-2.443389,0.5098248,-1.036724,0.4002378,0.7690027,0.6915579,-1.375258,0.1149836,0.1301809,0.4433502,-1.26772,0.9159167,0.1459508,0.890922,-0.05237304,-2.530259,0.4835652 +1.325041,0.7408377,-0.4355333,-3.065529,-2.378938,0.1016926,0.2630762,-0.6039312,-0.4272266,-0.2056017,-0.1898557,0.2100856,1.726559,0.425689,0.2308183,0.1469038,0.6827635,0.6152455,-0.71549,-0.1975617,1.644762,0.2479326,-0.6555025,0.4220537,0.06794629,0.01523415,0.1463287,0.6143776,-0.4015139,0.2162816,-0.01789902,-0.2317468,1.342449,-1.00863,-1.28125,0.2414654,0.5279062,0.2311242,0.4279883,-1.385103 +-0.1161714,-0.1623921,-0.2359119,1.597294,-0.08694592,1.0196,0.5836379,-0.5815026,-0.1276441,-0.1059597,-0.01648401,0.5291526,1.320573,0.6842546,0.6400301,0.2523005,-0.5582693,-0.09988047,0.2253327,-0.5865235,-0.3276349,1.448734,2.045453,1.369483,-0.2590277,-1.407117,0.8695499,1.089907,0.629989,-0.5877524,1.036959,-0.6673942,-0.4014564,-1.593596,1.388591,-0.876963,-2.420576,-0.1097108,1.768674,1.490871 +-1.470146,-0.6333746,1.44666,0.7374776,-0.122342,-0.5536003,1.779488,-0.3461341,1.196953,-0.1814678,0.1752456,0.2045995,1.407051,-0.7304805,-1.503089,-1.305663,-1.278102,1.145779,0.7715375,-0.1405917,-0.4336587,-0.4168831,0.5206407,-0.5454822,-0.2857581,0.498203,0.3730235,1.459461,1.637902,0.7350382,0.2462023,-1.361881,-0.3095315,-0.08704018,-1.698148,-0.4801125,-1.385459,0.2893876,0.7392771,-1.067035 +-0.3792718,-0.8955208,-1.127459,-0.6312482,1.418029,0.7313706,0.6365803,-0.08359964,1.078757,-0.5392688,0.0977141,-0.04269784,0.9567556,-0.8103076,0.3140158,0.3452025,1.403212,0.1783721,0.439871,-0.4131332,-1.447535,-0.5620699,1.774069,0.2199345,-0.7484262,0.01673518,0.01392829,0.3595336,0.4302833,-1.296073,0.9891317,-1.234378,-0.2271906,-0.2486205,0.1850186,0.2649093,-0.1559807,-1.31897,-0.4014781,0.1859376 +-1.465006,2.034465,0.4408494,-0.5304421,1.075337,0.2981279,2.515097,0.191902,0.3253761,0.7907912,-0.3074093,-1.445376,-0.4621784,0.2865637,-0.000340997,-0.2713901,0.9863574,0.493799,-0.1435576,1.882245,0.6516502,-1.494485,-0.8857061,0.826793,-0.07812493,1.152476,0.8536542,0.4022551,0.2391626,1.045731,-1.421523,-0.3110446,0.9141036,-1.549967,1.324033,1.027569,0.4646951,-1.259852,0.7379093,0.9574745 +1.075148,3.003267,-0.1234407,-1.03674,-1.270604,-1.277029,-0.2785037,1.249723,-0.7069937,-0.7046713,0.1374137,-0.545526,1.237348,-0.1151673,-0.1885763,-0.3354451,-1.039042,2.098071,1.189881,-0.4883022,-1.131076,0.5326998,1.715767,-0.5847613,-1.416804,-0.9343693,-0.825359,-0.09631337,-0.5033496,-0.2163598,0.8421608,-0.7621541,0.5468812,1.586981,-0.2420434,0.5071895,1.297424,0.3142901,-1.513097,-0.07470855 
[... remaining rows of the preceding 40-column numeric data file omitted ...]
diff --git a/ISLP_data/College.csv b/ISLP_data/College.csv
new file mode 100644
index 0000000..53222d2
--- /dev/null
+++ b/ISLP_data/College.csv
@@ -0,0 +1,778 @@
+,Private,Apps,Accept,Enroll,Top10perc,Top25perc,F.Undergrad,P.Undergrad,Outstate,Room.Board,Books,Personal,PhD,Terminal,S.F.Ratio,perc.alumni,Expend,Grad.Rate
+Abilene Christian University,Yes,1660,1232,721,23,52,2885,537,7440,3300,450,2200,70,78,18.1,12,7041,60
+Adelphi University,Yes,2186,1924,512,16,29,2683,1227,12280,6450,750,1500,29,30,12.2,16,10527,56
+Adrian College,Yes,1428,1097,336,22,50,1036,99,11250,3750,400,1165,53,66,12.9,30,8735,54
[... college rows from Agnes Scott College through Mount Saint Mary College omitted ...]
+Mount St. 
Mary's College,Yes,657,537,113,37,90,1039,466,12474,5678,630,1278,53,71,11.9,19,10613,72 +Mount Union College,Yes,1310,1086,458,26,61,1365,144,12250,3530,400,1150,85,87,16.7,35,7215,81 +Mount Vernon Nazarene College,Yes,510,485,334,18,36,1114,94,7400,3346,600,600,57,57,19.8,7,6869,58 +Muhlenberg College,Yes,2519,1836,462,30,61,1656,352,16975,4565,600,850,76,86,12.8,39,10888,83 +Murray State University,No,2225,1910,1190,29,55,5968,955,4738,3110,700,940,72,76,20.2,27,5972,52 +Muskingum College,Yes,1109,922,375,24,46,1115,70,13240,3914,600,800,73,85,13.4,27,9333,73 +National-Louis University,Yes,513,347,279,23,48,2508,505,9090,4500,650,500,62,65,18.3,2,7905,71 +Nazareth College of Rochester,Yes,947,798,266,36,68,1274,471,10850,5150,550,800,77,93,13.6,24,8797,61 +New Jersey Institute of Technology,No,1879,1216,483,27,62,3311,1646,8832,5376,700,1850,92,98,13.5,19,12529,72 +New Mexico Institute of Mining and Tech.,No,787,601,233,40,73,1017,411,5376,3214,600,1100,99,100,13.7,11,9241,34 +New York University,Yes,13594,7244,2505,70,86,12408,2814,17748,7262,450,1000,87,98,7.8,16,21227,71 +Newberry College,Yes,872,722,154,14,36,601,36,10194,2600,500,1500,57,63,11.4,32,5788,83 +Niagara University,Yes,2220,1796,467,65,99,1919,334,10320,4762,450,650,68,100,14.2,20,7788,65 +North Adams State College,No,1563,1005,240,1,19,1380,136,5542,4330,500,1000,65,71,14.2,17,6562,57 +North Carolina A. & T. State University,No,4809,3089,1429,12,33,6162,871,6806,1780,600,1651,72,72,16.7,9,7090,44 +North Carolina State University at Raleigh,No,10634,7064,3176,39,78,16505,5481,8400,6540,600,1300,92,98,17.5,21,9670,62 +North Carolina Wesleyan College,Yes,812,689,195,7,24,646,84,8242,4230,600,1295,77,77,12.7,11,10090,52 +North Central College,Yes,1127,884,308,30,64,1310,766,11718,7398,450,1800,73,87,16.4,33,8871,76 +North Dakota State University,No,2968,2297,1610,13,47,7368,1128,5834,2744,600,2000,79,83,17,24,6310,42 +North Park College,Yes,465,361,176,19,39,879,156,12580,4345,400,970,76,79,13.1,24,10889,74 +Northeast Missouri State University,No,6040,4577,1620,36,72,5640,266,4856,3416,400,1100,69,72,15.7,13,6601,76 +Northeastern University,Yes,11901,8492,2517,16,42,11160,10221,13380,7425,600,1750,73,82,12.9,17,9563,46 +Northern Arizona University,No,5891,4931,1973,23,48,11249,2682,6746,3728,620,2342,78,83,21.7,7,6157,41 +Northern Illinois University,No,10706,7219,2397,12,37,14826,1979,7799,3296,470,1750,73,78,17.3,11,6086,56 +Northwest Missouri State University,No,2729,2535,1257,8,29,4787,472,3735,3136,250,1630,62,65,21.7,23,5284,54 +Northwest Nazarene College,Yes,616,514,385,29,52,1115,60,9840,2820,450,822,59,59,14.8,20,6261,58 +Northwestern College,Yes,860,811,366,22,56,1040,52,9900,3075,300,1800,68,68,14.9,34,6357,68 +Northwestern University,Yes,12289,5200,1902,85,98,7450,45,16404,5520,759,1585,96,100,6.8,25,26385,92 +Norwich University,Yes,1743,1625,626,8,29,1862,382,14134,5270,500,800,71,74,13.1,22,9209,63 +Notre Dame College,Yes,379,324,107,15,37,500,311,9990,4900,400,600,44,47,12.1,26,4948,33 +Oakland University,No,3041,2581,1173,16,56,6441,3982,9114,4030,400,650,88,90,19.7,13,6637,53 +Oberlin College,Yes,4778,2767,678,50,89,2587,120,19670,5820,575,1119,77,96,10.1,47,16593,83 +Occidental College,Yes,2324,1319,370,52,81,1686,35,16560,5140,558,1152,91,93,10.5,30,16196,79 +Oglethorpe University,Yes,792,649,186,56,87,769,377,12900,4340,600,4110,91,95,13.1,27,8568,67 +Ohio Northern University,Yes,2936,2342,669,35,62,2502,66,15990,4080,600,825,73,78,14.5,31,9979,83 +Ohio 
University,No,11023,8298,3183,21,54,14861,1310,7629,4095,550,2300,79,87,20.4,13,8811,64 +Ohio Wesleyan University,Yes,2190,1700,458,36,65,1780,48,16732,5650,550,550,93,93,12.1,32,12011,75 +Oklahoma Baptist University,Yes,758,681,484,35,59,1707,705,5390,3140,515,1290,63,71,15.1,18,5511,50 +Oklahoma Christian University,Yes,776,765,351,22,44,1419,228,6400,3150,500,1900,58,64,16.2,8,6578,45 +Oklahoma State University,No,4522,3913,2181,29,57,12830,1658,5336,3344,800,3100,84,92,15.3,14,6433,48 +Otterbein College,Yes,1496,1205,428,26,57,1648,936,12888,4440,420,840,62,68,13.9,30,8802,87 +Ouachita Baptist University,Yes,910,773,450,31,73,1310,61,6530,2800,500,1500,63,67,13.3,10,6413,65 +Our Lady of the Lake University,Yes,2308,1336,295,22,46,1202,942,8530,3644,616,1576,56,64,14.9,25,7114,37 +Pace University,Yes,8256,3750,1522,37,70,5809,4379,11000,5160,660,1115,90,95,13.8,10,10059,62 +Pacific Lutheran University,Yes,1603,1392,504,31,68,2580,302,13312,4488,600,1516,78,78,11,23,9431,83 +Pacific Union College,Yes,940,668,385,20,48,1316,139,11925,3825,630,1926,48,87,12.3,12,9157,69 +Pacific University,Yes,943,849,288,41,71,1041,35,14210,3994,450,1100,76,76,10.9,22,11216,42 +Pembroke State University,No,944,774,440,14,34,2174,529,6360,2760,550,1498,77,77,15,5,6443,48 +Pennsylvania State Univ. Main Campus,No,19315,10344,3450,48,93,28938,2025,10645,4060,512,2394,77,96,18.1,19,8992,63 +Pepperdine University,Yes,3821,2037,680,86,96,2488,625,18200,6770,500,700,95,98,11.6,13,16185,66 +Peru State College,No,701,501,458,10,40,959,457,2580,2624,500,900,48,100,20.1,24,4870,44 +Pfeiffer College,Yes,838,651,159,11,25,654,162,8640,3700,400,1915,62,62,12.2,13,7634,48 +Philadelphia Coll. of Textiles and Sci.,Yes,1538,1259,468,19,42,1664,1042,11690,5062,600,1664,48,80,12.9,15,8028,68 +Phillips University,Yes,692,576,174,19,50,597,83,10500,3860,600,940,58,64,11.6,19,8990,39 +Piedmont College,Yes,663,562,127,20,40,641,63,5640,3620,600,750,89,89,13.2,17,7309,31 +Pikeville College,Yes,404,400,169,28,48,797,100,6000,3000,500,500,48,57,13.4,14,5557,61 +Pitzer College,Yes,1133,630,220,37,73,750,30,17688,5900,650,850,100,100,10.4,11,14820,73 +Point Loma Nazarene College,Yes,809,687,428,20,43,1889,217,10178,4190,800,750,71,71,16.1,19,7895,54 +Point Park College,Yes,875,744,207,7,38,1173,1402,9700,4830,400,1200,45,90,14.5,10,7652,66 +Polytechnic University,Yes,1132,847,302,58,89,1379,214,16200,4200,436,2486,90,90,10.4,14,14329,62 +Prairie View A. and M. 
University,No,2405,2234,1061,10,22,4564,448,4290,3500,598,1582,55,93,19.4,1,5967,35 +Presbyterian College,Yes,1082,832,302,34,63,1133,30,11859,3635,554,1429,80,85,13.4,42,8354,85 +Princeton University,Yes,13218,2042,1153,90,98,4540,146,19900,5910,675,1575,91,96,8.4,54,28320,99 +Providence College,Yes,5139,3346,973,20,55,3717,1358,14400,6200,450,1100,66,74,18.4,35,8135,96 +Purdue University at West Lafayette,No,21804,18744,5874,29,60,26213,4065,9556,3990,570,1060,86,86,18.2,15,8604,67 +Queens College,Yes,516,392,154,32,62,630,549,11020,4970,610,1900,73,75,14,36,9315,58 +Quincy University,Yes,1025,707,297,22,66,1070,72,10100,4140,450,1080,69,71,16.3,32,6880,80 +Quinnipiac College,Yes,3712,2153,806,17,45,2677,714,12030,6140,1000,500,63,73,12,33,8847,86 +Radford University,No,5702,4894,1742,15,37,8077,472,6684,4110,500,900,73,83,19.6,9,4519,62 +Ramapo College of New Jersey,No,2088,957,362,6,29,2745,1938,4449,4860,600,1655,74,95,17.8,8,7333,47 +Randolph-Macon College,Yes,1771,1325,306,21,46,1071,27,13840,3735,400,900,77,80,10.7,38,11080,74 +Randolph-Macon Woman's College,Yes,696,616,169,35,66,653,56,13970,6110,370,920,88,97,9.2,24,16358,68 +Reed College,Yes,1966,1436,327,47,80,1199,61,19960,5490,500,450,90,90,11.8,37,15886,68 +Regis College,Yes,427,385,143,18,38,581,533,12700,5800,450,700,81,85,10.3,37,11758,84 +Rensselaer Polytechnic Institute,Yes,4996,4165,936,53,82,4291,16,17475,5976,1230,1100,94,98,15.4,21,15605,70 +Rhodes College,Yes,2302,1831,391,58,82,1345,59,15200,4768,550,1500,90,96,10.8,47,13388,77 +Rider University,Yes,3586,2424,730,16,31,2748,1309,13250,5420,700,3100,84,92,12.3,23,11299,70 +Ripon College,Yes,587,501,211,28,52,735,28,15200,4100,350,650,87,90,9.4,49,12472,64 +Rivier College,Yes,484,386,141,6,28,590,1196,9870,4860,600,1100,59,59,12.2,19,6744,81 +Roanoke College,Yes,2227,1790,437,27,54,1460,239,13425,4425,450,1200,85,89,13,26,9405,72 +Rockhurst College,Yes,935,858,345,22,50,1127,754,9490,4100,500,1500,60,79,10.7,21,7519,79 +Rocky Mountain College,Yes,560,392,270,11,31,743,118,8734,3362,600,625,56,78,11.3,27,6422,68 +Roger Williams University,Yes,3304,2804,679,10,20,2111,1489,12520,6050,500,730,44,54,16.4,8,7957,61 +Rollins College,Yes,1777,1151,382,31,55,1668,1052,16425,5220,955,750,81,85,13.3,23,11561,90 +Rosary College,Yes,434,321,141,28,53,624,269,10950,4600,550,950,79,82,12.9,30,9264,81 +Rowan College of New Jersey,No,3820,1431,695,21,70,5303,3942,4356,4830,800,800,76,81,22.1,6,7252,51 +Rutgers at New Brunswick,No,48094,26330,4520,36,79,21401,3712,7410,4748,690,2009,90,95,19.5,19,10474,77 +Rutgers State University at Camden,No,3366,1752,232,27,79,2585,1300,7411,4748,690,2009,90,95,18.6,12,10134,57 +Rutgers State University at Newark,No,5785,2690,499,26,62,4005,1886,7410,4748,690,2009,90,95,17.4,16,11878,58 +Sacred Heart University,Yes,2307,1896,509,19,51,1707,1889,11070,5780,400,600,71,73,14.8,16,7120,82 +Saint Ambrose University,Yes,897,718,276,12,48,1345,390,10450,4020,500,1500,56,56,14.1,16,7444,70 +Saint Anselm College,Yes,2095,1553,514,15,40,1873,94,12950,5400,450,1120,70,82,14.5,29,6719,97 +Saint Cloud State University,No,3971,3306,1921,10,34,11493,2206,4259,2625,350,1884,70,75,18.9,10,4629,58 +Saint Francis College IN,Yes,213,166,85,13,36,513,247,8670,3820,450,1000,43,78,12.5,4,7440,48 +Saint Francis College,Yes,1046,824,284,21,45,1223,451,10880,5050,400,1235,64,64,19.3,24,7344,69 +Saint John's University,Yes,933,800,444,18,45,1691,72,12247,4081,500,600,76,85,12,38,9853,70 +Saint Joseph's College 
IN,Yes,920,684,225,24,42,815,222,11200,4250,600,950,55,60,14.8,19,7360,67 +Saint Joseph's College,Yes,833,682,217,12,33,716,2196,9985,5180,500,800,53,89,27.2,8,4322,85 +Saint Joseph's University,Yes,2519,2003,776,39,71,2473,1314,12750,6350,350,1690,84,90,17.4,13,8243,83 +Saint Joseph College,Yes,292,241,96,20,52,543,712,12200,4600,650,950,87,90,11.2,32,8680,76 +Saint Louis University,Yes,3294,2855,956,44,67,4576,1140,11690,4730,800,6800,84,94,4.6,19,18367,67 +Saint Mary's College,Yes,888,734,393,26,60,1433,27,12730,4514,500,1525,74,95,9.9,31,11165,98 +Saint Mary's College of Minnesota,Yes,876,802,367,14,35,1263,118,10800,3600,400,820,68,74,18.8,19,5081,78 +Saint Mary-of-the-Woods College,Yes,150,130,88,23,50,341,768,10300,4130,500,1700,44,58,10.2,37,9678,75 +Saint Michael's College,Yes,1910,1380,463,16,64,1715,106,13030,5860,500,750,79,88,14.5,34,10190,84 +Saint Olaf College,Yes,2248,1673,745,38,73,2888,105,14350,3750,550,550,82,88,10,31,12502,83 +Saint Peter's College,Yes,1606,1413,530,23,38,1921,1154,9408,5520,500,450,78,78,12.1,22,7669,53 +Saint Vincent College,Yes,700,595,278,19,35,1035,182,10850,3936,500,900,62,64,12.3,31,8534,88 +Saint Xavier University,Yes,785,647,295,15,65,1670,726,10860,4624,600,794,87,100,13.7,15,8953,55 +Salem-Teikyo University,Yes,489,384,120,23,52,700,45,10575,3952,400,620,46,24,13,9,8946,98 +Salem College,Yes,335,284,132,28,69,534,216,10475,6300,500,2000,68,68,11.2,46,9599,60 +Salisbury State University,No,4216,2290,736,20,52,4296,1027,5130,4690,600,1450,73,75,17.9,18,5125,56 +Samford University,Yes,1680,1395,691,34,76,2959,402,8236,3700,569,1650,74,75,14.7,17,9533,61 +San Diego State University,No,9402,7020,2151,20,70,16407,5550,8384,5110,612,2400,87,93,19.5,7,7930,41 +Santa Clara University,Yes,4019,2779,888,40,73,3891,128,13584,5928,630,1278,88,92,13.9,19,10872,100 +Sarah Lawrence College,Yes,1380,768,263,57,82,1000,105,19300,6694,600,700,89,93,6.1,18,14779,83 +Savannah Coll. 
of Art and Design,Yes,1109,688,386,20,65,1897,208,8325,5000,1200,1600,14,98,16.1,26,6874,55 +Schreiner College,Yes,584,413,131,19,51,521,99,8955,5900,500,1488,51,56,11.8,23,8545,52 +Scripps College,Yes,855,632,139,60,83,569,7,17238,7350,600,800,95,100,8.2,41,18372,73 +Seattle Pacific University,Yes,1183,1016,411,42,82,1922,704,12669,4875,600,1250,83,85,16.8,20,10368,66 +Seattle University,Yes,2115,1540,494,28,72,2993,347,12825,4375,500,1500,85,85,12.2,16,10175,89 +Seton Hall University,Yes,4576,3565,1000,16,36,4384,1530,12000,6484,650,1000,81,84,14.4,15,10080,64 +Seton Hill College,Yes,936,794,197,24,56,752,210,11240,4180,350,2000,71,71,11.2,37,10065,71 +Shippensburg University of Penn.,No,5818,3281,1116,14,53,5268,300,7844,3504,450,1700,80,83,18.8,13,6719,72 +Shorter College,Yes,540,445,165,23,70,1115,111,7210,3600,500,2000,62,65,13.2,18,7356,58 +Siena College,Yes,2961,1932,628,24,68,2669,616,10800,5100,575,1090,71,82,14.1,42,8189,100 +Siena Heights College,Yes,464,419,183,10,31,686,287,9240,3880,475,1090,29,49,7.2,17,9431,47 +Simmons College,Yes,1003,782,295,23,53,1144,160,16160,6950,500,1200,74,81,8.9,33,14086,79 +Simpson College,Yes,1016,872,300,27,57,1116,602,11250,4980,550,1400,66,73,15.8,36,7411,70 +Sioux Falls College,Yes,437,400,211,13,35,614,271,8990,3064,500,1700,73,73,14.8,7,7881,48 +Skidmore College,Yes,4293,2728,591,25,62,2322,263,18710,5970,500,700,87,92,12.7,29,14837,81 +Smith College,Yes,2925,1598,632,51,88,2479,95,18820,6390,500,1050,85,97,10.3,44,21199,90 +South Dakota State University,No,2807,2589,1701,13,37,7000,1103,3811,2190,500,1970,62,65,15,29,5084,67 +Southeast Missouri State University,No,2281,1870,1408,18,43,6553,1246,4680,3540,200,2150,75,76,17.1,8,5916,45 +Southeastern Oklahoma State Univ.,No,818,700,447,20,50,2962,651,3738,2619,450,1022,55,59,19.6,9,4444,53 +Southern California College,Yes,385,340,193,18,38,784,127,9520,4124,630,1818,63,65,18.6,11,8219,43 +Southern Illinois University at Edwardsville,No,2540,2195,994,13,40,6063,2550,5472,3598,221,2216,76,81,16.5,8,7498,43 +Southern Methodist University,Yes,4301,3455,1166,41,69,4892,387,12772,5078,576,1802,74,88,13.5,17,12726,72 +Southwest Baptist University,Yes,1093,1093,642,12,32,1770,967,7070,2500,400,1000,52,54,15.9,13,4718,71 +Southwest Missouri State University,No,6118,5254,3204,15,37,13131,3374,4740,2590,500,1360,70,75,19.9,11,4632,56 +Southwest State University,No,1047,938,511,13,33,2091,546,4285,2750,600,1800,58,75,16.5,31,6591,51 +Southwestern Adventist College,Yes,321,318,172,11,27,620,280,7536,3736,430,1651,44,77,13,12,5309,36 +Southwestern College,Yes,213,155,75,28,66,504,147,7200,3532,550,1500,56,56,11.8,12,7818,52 +Southwestern University,Yes,1244,912,352,44,77,1177,43,11850,4675,600,1050,83,89,11.3,35,12995,67 +Spalding University,Yes,283,201,97,10,45,589,263,8400,2800,600,900,50,56,10.6,40,6860,89 +Spelman College,Yes,3713,1237,443,47,83,1971,107,7000,5565,660,2400,73,80,12.5,18,9988,65 +Spring Arbor College,Yes,372,362,181,15,32,1501,353,8600,3550,385,665,48,48,15.4,9,10938,49 +St. Bonaventure University,Yes,1489,1313,375,13,45,1688,131,10456,4927,500,1050,91,91,17.7,32,9828,78 +St. John's College,Yes,323,278,122,31,51,393,4,16150,5450,275,800,63,72,7.2,26,15622,64 +St. John Fisher College,Yes,1368,1064,354,19,51,1687,677,10570,5600,400,800,86,81,14.5,29,7908,66 +St. Lawrence University,Yes,2753,1820,505,31,56,1801,45,18720,5730,650,825,90,94,11.5,38,14980,85 +St. Martin's College,Yes,191,165,63,5,25,494,574,11550,4270,300,500,43,77,14.5,8,9209,40 +St. 
Mary's College of California,Yes,2643,1611,465,36,80,2615,248,13332,6354,630,1584,88,89,16.1,17,9619,78 +St. Mary's College of Maryland,No,1340,695,285,42,73,1315,209,6800,4730,675,1250,84,89,11.6,23,10357,63 +St. Mary's University of San Antonio,Yes,1243,1020,414,33,60,2149,418,8678,3858,700,1736,82,83,16.2,7,7651,72 +St. Norbert College,Yes,1334,1243,568,30,56,1946,95,12140,4450,425,1100,74,78,15.1,36,8595,88 +St. Paul's College,Yes,651,581,243,8,17,617,34,5000,3650,600,600,45,45,14,8,8426,45 +St. Thomas Aquinas College,Yes,861,609,215,10,27,1117,815,8650,5700,500,1750,69,73,16.1,13,6534,67 +Stephens College,Yes,450,405,194,17,34,614,388,13900,5200,450,2150,46,63,10.9,17,9995,59 +Stetson University,Yes,1557,1227,489,37,69,1964,81,12315,4565,600,1365,85,90,12.5,24,10307,73 +Stevens Institute of Technology,Yes,1768,1249,380,51,93,1263,11,16900,5680,450,750,89,89,19,33,12837,79 +Stockton College of New Jersey,No,4019,1579,710,23,65,4365,765,3040,4351,711,1125,78,92,19.5,7,5599,64 +Stonehill College,Yes,3646,2300,585,25,69,2022,926,12170,6172,480,800,79,79,13,30,7495,97 +SUNY at Albany,No,13528,9198,1843,16,61,10168,1231,6550,4355,700,1560,93,96,17.4,16,9075,74 +SUNY at Binghamton,No,14463,6166,1757,60,94,8544,671,6550,4598,700,1000,83,100,18,15,8055,80 +SUNY at Buffalo,No,15039,9649,3087,36,100,13963,3124,6550,4731,708,957,90,97,13.6,15,11177,56 +SUNY at Stony Brook,No,12512,6969,1724,27,66,9744,1351,6550,4712,600,1200,91,96,10.5,7,13705,57 +SUNY College at Brockport,No,7294,3564,904,7,34,5758,1363,6550,4460,500,705,79,83,19,14,6632,49 +SUNY College at Oswego,No,8000,4556,1464,17,70,6943,869,6550,4810,500,1500,69,85,22,21,5280,63 +SUNY College at Buffalo,No,5318,3515,1025,8,29,7626,2091,6550,4040,550,1230,71,78,18.7,12,7511,42 +SUNY College at Cortland,No,7888,3519,1036,6,40,5011,346,6550,4680,630,1274,82,85,17.8,17,5563,53 +SUNY College at Fredonia,No,4877,2798,814,13,48,4123,298,6550,4420,620,1481,82,90,16.3,10,6442,66 +SUNY College at Geneseo,No,8598,4562,1143,56,93,5060,146,6550,4170,600,650,79,84,19.1,25,5716,76 +SUNY College at New Paltz,No,8399,3609,656,19,53,4658,1478,6550,4240,550,1500,85,93,15.3,8,6608,53 +SUNY College at Plattsburgh,No,5549,3583,853,9,40,5004,475,6550,4176,600,1380,80,90,17.9,16,6174,65 +SUNY College at Potsdam,No,3150,2289,650,16,51,3598,234,6840,4660,500,1000,71,75,15.1,17,6436,59 +SUNY College at Purchase,No,2119,1264,390,5,33,2478,1441,6550,4760,1125,1362,80,100,14.9,8,8170,46 +Susquehanna University,Yes,2096,1512,465,27,59,1442,166,16130,4710,400,800,83,86,13.9,37,10554,90 +Sweet Briar College,Yes,462,402,146,36,68,527,41,14500,6000,500,600,91,99,6.5,48,18953,61 +Syracuse University,Yes,10477,7260,2442,28,67,10142,117,15150,6870,635,960,73,84,11.3,13,14231,67 +Tabor College,Yes,257,183,109,19,41,396,38,7850,3410,400,1500,55,70,10,15,7233,53 +Talladega College,Yes,4414,1500,335,30,60,908,119,5666,2964,1000,1400,56,58,15.5,7,5970,46 +Taylor University,Yes,1769,1092,437,41,80,1757,81,10965,4000,450,1250,60,61,14.2,32,8294,98 +Tennessee Wesleyan College,Yes,232,182,99,7,29,402,237,7070,3640,400,3158,59,65,8.9,16,6286,36 +Texas A&M Univ. 
at College Station,No,14474,10519,6392,49,85,31643,2798,5130,3412,600,2144,89,91,23.1,29,8471,69 +Texas A&M University at Galveston,No,529,481,243,22,47,1206,134,4860,3122,600,650,103,88,17.4,16,6415,43 +Texas Christian University,Yes,4095,3079,1195,33,64,5064,660,8490,3320,650,2400,81,93,14.8,23,9158,64 +Texas Lutheran College,Yes,497,423,215,27,57,895,429,7850,3410,490,1700,54,58,13.8,24,7002,50 +Texas Southern University,No,4345,3245,2604,15,85,5584,3101,7860,3360,600,1700,65,75,18.2,21,3605,10 +Texas Wesleyan University,Yes,592,501,279,19,44,1204,392,6400,3484,600,1800,80,83,14.5,10,7936,43 +The Citadel,No,1500,1242,611,12,36,2024,292,7070,2439,400,779,95,94,17.1,17,7744,84 +Thiel College,Yes,1154,951,253,15,31,791,140,11172,4958,700,1350,68,76,11.6,16,9186,60 +Tiffin University,Yes,845,734,254,5,21,662,351,7600,3800,600,1200,59,74,19,40,5096,39 +Transylvania University,Yes,759,729,244,57,81,867,51,10900,4450,500,1000,81,91,12.1,41,10219,70 +Trenton State College,No,5042,2312,944,55,94,5167,902,5391,5411,700,1000,81,87,14.4,6,8504,81 +Tri-State University,Yes,1262,1102,276,14,40,978,98,9456,4350,468,1323,53,53,12.8,24,7603,65 +Trinity College CT,Yes,3058,1798,478,46,84,1737,244,18810,5690,500,680,91,96,10.4,48,18034,91 +Trinity College DC,Yes,247,189,100,19,49,309,639,11412,6430,500,900,89,93,8.3,37,11806,96 +Trinity College VT,Yes,222,185,91,16,41,484,541,11010,5208,550,500,58,78,10.4,26,9586,78 +Trinity University,Yes,2425,1818,601,62,93,2110,95,12240,5150,500,490,94,96,9.6,20,14703,93 +Tulane University,Yes,7033,5125,1223,47,75,4941,1534,19040,5950,350,800,98,98,9.1,21,16920,74 +Tusculum College,Yes,626,372,145,12,34,983,40,7700,3400,450,800,70,70,21.9,28,4933,52 +Tuskegee University,Yes,2267,1827,611,20,59,2825,144,6735,3395,600,1425,70,74,12.2,7,10872,65 +Union College KY,Yes,484,384,177,9,45,634,78,7800,2950,500,600,60,88,14.1,9,6864,64 +Union College NY,Yes,3495,1712,528,49,84,1915,123,18732,6204,450,1024,94,96,11.5,49,15411,88 +Univ. 
of Wisconsin at OshKosh,No,4800,2900,1515,14,48,7764,1472,6874,2394,518,1890,73,78,19.2,14,5901,56 +University of Alabama at Birmingham,No,1797,1260,938,24,35,6960,4698,4440,5175,750,2200,96,96,6.7,16,16352,33 +University of Arkansas at Fayetteville,No,3235,3108,2133,25,65,9978,1530,5028,3300,500,2000,73,89,14.8,10,6820,39 +University of California at Berkeley,No,19873,8252,3215,95,100,19532,2061,11648,6246,636,1933,93,97,15.8,10,13919,78 +University of California at Irvine,No,15698,10775,2478,85,100,12677,864,12024,5302,790,1818,96,96,16.1,11,15934,66 +University of Central Florida,No,6986,2959,1918,25,60,12330,7152,6618,4234,700,1600,80,98,22.2,9,6742,46 +University of Charleston,Yes,682,535,204,22,43,771,611,9500,3540,400,750,26,58,2.5,10,7683,57 +University of Chicago,Yes,6348,2999,922,68,94,3340,39,18930,6380,500,1254,99,99,5.3,36,36854,90 +University of Cincinnati,No,6855,5553,2408,26,57,11036,2011,8907,4697,556,1851,89,95,10.8,6,13889,54 +University of Connecticut at Storrs,No,9735,7187,2064,23,63,12478,1660,11656,5072,700,2300,89,95,16,16,10178,71 +University of Dallas,Yes,681,588,246,44,74,1058,73,10760,6230,500,1200,85,93,13.4,26,8731,63 +University of Dayton,Yes,6361,5293,1507,26,51,5889,665,11380,4220,500,900,81,85,14.8,25,8894,93 +University of Delaware,Yes,14446,10516,3252,22,57,14130,4522,10220,4230,530,1300,82,87,18.3,15,10650,75 +University of Denver,Yes,2974,2001,580,29,60,2666,554,15192,4695,400,1350,84,91,15.9,21,11762,67 +University of Detroit Mercy,Yes,927,731,415,24,50,2149,2217,11130,3996,600,2166,72,79,13.5,14,10891,51 +University of Dubuque,Yes,576,558,137,11,39,662,131,10430,3620,400,1500,85,98,16.5,18,8767,45 +University of Evansville,Yes,2096,1626,694,35,67,2551,407,11800,4340,700,960,60,81,15.8,26,7780,77 +University of Florida,No,12445,8836,3623,54,85,24470,3286,7090,4180,630,1530,88,97,13.4,20,14737,66 +University of Georgia,No,11220,7871,3320,43,79,19553,2748,5697,3600,525,1755,88,95,14.7,22,7881,63 +University of Hartford,Yes,5081,4040,1194,11,26,3768,1415,14220,6000,500,1440,61,76,10.7,9,10625,66 +University of Hawaii at Manoa,No,3580,2603,1627,36,69,11028,2411,4460,3038,687,1281,85,87,11.8,6,12833,54 +University of Illinois - Urbana,No,14939,11652,5705,52,88,25422,911,7560,4574,500,1982,87,90,17.4,13,8559,81 +University of Illinois at Chicago,No,8384,5727,2710,22,50,13518,2916,7230,5088,630,3228,82,84,10,6,13883,34 +University of Indianapolis,Yes,1487,1276,388,26,51,1417,1646,11120,4080,525,1405,55,56,11.1,23,6735,69 +University of Kansas,No,8579,5561,3681,25,50,17880,1673,6994,3384,700,2681,88,94,13.7,17,9657,57 +University of La Verne,Yes,1597,969,226,16,38,1431,1522,13540,5050,630,2298,66,68,14.1,23,10139,47 +University of Louisville,No,4777,3057,1823,16,33,9844,6198,6540,3600,530,2440,84,92,11.1,24,10207,31 +University of Maine at Farmington,No,1208,803,438,20,48,1906,344,6810,3970,450,1647,67,75,15.9,26,5712,59 +University of Maine at Machias,No,441,369,172,17,45,633,317,6600,3680,600,400,46,46,15.1,4,5935,64 +University of Maine at Presque Isle,No,461,381,235,10,40,974,503,6600,3630,400,1675,67,67,15.2,11,6408,35 +University of Maryland at Baltimore County,No,4269,2594,985,27,57,6476,2592,8594,4408,494,2768,82,88,18.4,6,7618,55 +University of Maryland at College Park,No,14292,10315,3409,22,53,19340,3991,8723,5146,550,1550,89,92,18.1,12,9021,63 +University of Massachusetts at Amherst,No,14438,12414,3816,12,39,16282,1940,8566,3897,500,1400,88,92,16.7,15,10276,68 +University of Massachusetts at 
Dartmouth,No,3347,2597,1006,10,37,4664,1630,6919,4500,500,1250,74,90,15,20,7462,56 +University of Miami,Yes,7122,5386,1643,42,69,7760,876,16500,6526,630,1985,82,94,5.9,17,17500,59 +University of Michigan at Ann Arbor,No,19152,12940,4893,66,92,22045,1339,15732,4659,476,1600,90,98,11.5,26,14847,87 +University of Minnesota at Duluth,No,4192,3126,1656,15,45,5887,1254,8828,3474,753,2610,79,91,19,11,6393,53 +University of Minnesota at Morris,No,1458,874,588,56,86,1846,154,9843,3180,600,1500,74,78,14.6,16,6716,51 +University of Minnesota Twin Cities,No,11054,6397,3524,26,55,16502,21836,8949,3744,714,2910,88,90,12.2,37,16122,45 +University of Mississippi,No,3844,3383,1669,26,47,7524,804,4916,3810,600,550,81,86,20.3,14,6971,53 +University of Missouri at Columbia,No,6574,4637,2940,32,62,14782,1583,9057,3485,600,1983,87,87,12.7,15,10145,58 +University of Missouri at Rolla,No,1877,1826,823,49,77,3926,561,9057,3600,700,1435,88,88,14.4,23,9699,49 +University of Missouri at Saint Louis,No,1618,1141,479,18,54,4793,4552,7246,3964,500,4288,71,73,13.4,15,6433,48 +University of Mobile,Yes,452,331,269,17,54,1417,301,6150,3680,550,1200,59,63,16.6,4,5412,52 +University of Montevallo,No,1351,892,570,18,78,2385,331,4440,3030,300,600,72,72,18.9,8,5883,51 +University of Nebraska at Lincoln,No,6277,6003,3526,33,63,16454,3171,5595,3145,500,2070,86,92,15.1,48,6813,53 +University of New England,Yes,1209,750,265,19,54,820,159,11450,5045,900,2500,72,75,11.4,13,9718,64 +University of New Hampshire,No,9750,7640,2529,24,62,10358,1338,11180,3862,650,2450,89,87,17.5,16,7855,75 +University of North Carolina at Asheville,No,1757,979,394,32,74,2033,1078,5972,3420,600,750,77,83,13,11,7011,37 +University of North Carolina at Chapel Hill,No,14596,5985,3331,75,92,14609,1100,8400,4200,550,1200,88,93,8.9,23,15893,83 +University of North Carolina at Charlotte,No,5803,4441,1730,19,62,10099,3255,7248,3109,600,1900,79,91,16.8,7,6227,62 +University of North Carolina at Greensboro,No,5191,4134,1500,15,44,7532,1847,8677,3505,600,1300,75,94,15.5,17,7392,53 +University of North Carolina at Wilmington,No,6071,3856,1449,15,67,6635,1145,7558,3680,500,1500,82,85,19.1,15,6005,55 +University of North Dakota,No,2777,2249,1652,20,54,8334,1435,5634,2703,450,1200,97,97,15.9,16,9424,49 +University of North Florida,No,1800,1253,560,44,85,3876,3588,6634,4360,600,2604,82,85,17.8,14,6104,47 +University of North Texas,No,4418,2737,2049,23,51,14047,5134,4104,3579,450,1700,86,94,22.6,6,5657,35 +University of Northern Colorado,No,5530,4007,1697,12,37,8463,1498,7731,4128,540,2286,75,75,21.5,8,6309,40 +University of Northern Iowa,No,4144,3379,1853,18,52,10135,1448,6197,2930,595,2380,78,82,16.3,26,6333,77 +University of Notre Dame,Yes,7700,3700,1906,79,96,7671,30,16850,4400,600,1350,96,92,13.1,46,13936,97 +University of Oklahoma,No,4743,3970,2233,32,63,13436,2582,5173,3526,765,3176,86,90,11.5,11,10244,44 +University of Oregon,No,8631,6732,2546,25,61,11669,1605,10602,3660,570,1530,79,87,19.7,13,8020,54 +University of Pennsylvania,Yes,12394,5232,2464,85,100,9205,531,17020,7270,500,1544,95,96,6.3,38,25765,93 +University of Pittsburgh-Main Campus,No,8586,6383,2503,25,59,13138,4289,10786,4560,400,900,93,93,7.8,10,13789,66 +University of Portland,Yes,1758,1485,419,27,58,2041,174,12040,4100,600,1100,92,96,13.2,17,9060,72 +University of Puget Sound,Yes,4044,2826,688,51,83,2738,138,16230,4500,630,1800,79,86,15,17,11217,63 +University of Rhode Island,No,9643,7751,1968,12,40,8894,2456,10330,5558,500,1250,84,89,16.6,7,9158,63 +University of 
Richmond,Yes,5892,2718,756,46,72,2854,594,14500,3285,700,1125,75,89,11.7,32,11984,100 +University of Rochester,Yes,8766,5498,1243,56,75,5071,438,17840,6582,500,882,93,99,5.9,23,26037,80 +University of San Diego,Yes,3934,2735,886,40,70,3698,217,13600,5940,630,1820,93,96,15.6,13,10813,66 +University of San Francisco,Yes,2306,1721,538,23,48,4309,549,13226,6452,750,2450,86,86,13.6,8,10074,62 +University of Sci. and Arts of Oklahoma,No,285,280,208,21,43,1140,473,3687,1920,600,1800,67,77,23.6,3,3864,43 +University of Scranton,Yes,4471,2942,910,29,60,3674,493,11584,5986,650,800,83,83,14.1,41,9131,92 +University of South Carolina at Aiken,No,848,560,377,14,24,1855,1412,5800,3066,500,1500,62,62,14.8,3,5035,48 +University of South Carolina at Columbia,No,7693,5815,2328,30,66,12594,3661,8074,3522,495,2941,84,88,16.9,18,8246,63 +University of South Florida,No,7589,4676,1876,29,63,14770,10962,6760,3776,500,2180,84,89,17,7,11020,47 +University of Southern California,Yes,12229,8498,2477,45,71,13259,1429,17230,6482,600,2210,90,94,11.4,10,17007,68 +University of Southern Colorado,No,1401,1239,605,10,34,3716,675,7100,4380,540,2948,63,88,19.4,0,5389,36 +University of Southern Indiana,No,2379,2133,1292,8,25,4283,2973,4973,3192,500,1425,56,65,22,21,4078,38 +University of Southern Mississippi,No,2850,2044,1046,20,50,9260,1387,4652,2470,500,500,78,99,18.7,23,5917,45 +University of St. Thomas MN,Yes,2057,1807,828,26,53,4106,982,11712,4037,500,800,80,80,13.8,13,8546,89 +University of St. Thomas TX,Yes,374,280,185,45,77,995,408,8550,4050,500,1344,75,75,12.6,17,7237,62 +University of Tennessee at Knoxville,No,7473,5372,3013,27,53,15749,3237,5764,3262,750,3300,86,92,16.5,22,8612,53 +University of Texas at Arlington,No,3281,2559,1448,19,43,10975,8431,4422,2780,500,2850,73,73,21,4,4696,29 +University of Texas at Austin,No,14752,9572,5329,48,85,30017,5189,5130,3309,650,3140,91,99,19.7,11,7837,65 +University of Texas at San Antonio,No,4217,3100,1686,17,46,9375,5457,4104,5376,452,1200,94,100,25.3,3,4329,50 +University of the Arts,Yes,974,704,290,5,22,1145,39,12520,3860,1300,700,16,59,7.5,9,11641,57 +University of the Pacific,Yes,2459,1997,582,36,66,2664,299,16320,5326,646,1171,87,94,11.2,14,13706,65 +University of the South,Yes,1445,966,326,46,83,1129,24,15350,4080,450,810,89,93,10.3,52,18784,82 +University of Tulsa,Yes,1712,1557,696,41,68,2936,433,11750,4160,1200,2350,94,96,11.5,10,11743,47 +University of Utah,No,5095,4491,2400,27,53,13894,8374,6857,3975,858,3093,89,93,12.8,9,9275,37 +University of Vermont,No,7663,6008,1735,18,51,7353,1674,15516,4928,500,990,87,90,9.9,10,12646,79 +University of Virginia,No,15849,5384,2678,74,95,11278,114,12212,3792,500,1000,90,92,9.5,22,13597,95 +University of Washington,No,12749,7025,3343,40,81,20356,4582,8199,4218,708,2172,96,94,9,10,16527,65 +University of West Florida,No,1558,1254,472,20,57,3754,2477,6172,3994,541,1387,83,87,23.4,12,8488,53 +University of Wisconsin-Stout,No,2593,1966,1030,9,32,6038,579,6704,2592,376,1750,78,78,21,17,6254,65 +University of Wisconsin-Superior,No,910,910,342,14,53,1434,417,7032,2780,550,1960,75,81,15.2,15,6490,36 +University of Wisconsin-Whitewater,No,4400,3719,1472,12,38,7804,1552,6950,2500,300,1200,90,95,23.1,16,5559,67 +University of Wisconsin at Green Bay,No,2409,1939,759,17,50,3819,1347,6900,2800,475,1200,81,89,22.2,1,5968,46 +University of Wisconsin at Madison,No,14901,10932,4631,36,80,23945,2200,9096,4290,535,1545,93,96,11.5,20,11006,72 +University of Wisconsin at 
Milwaukee,No,5244,3782,1930,12,37,11561,7443,8786,2964,570,1980,79,87,15.9,8,8094,38 +University of Wyoming,No,2029,1516,1073,23,46,7535,1488,5988,3422,600,1500,91,94,15.1,13,8745,45 +Upper Iowa University,Yes,663,452,192,10,35,1481,1160,8840,3060,500,1000,69,75,17.4,19,3733,78 +Ursinus College,Yes,1399,1026,308,44,77,1131,17,14900,5160,500,800,82,85,11.6,40,12082,79 +Ursuline College,Yes,325,260,86,21,47,699,717,9600,4202,450,750,39,69,10.5,15,7164,68 +Valley City State University,No,368,344,212,5,27,863,189,4286,2570,600,2000,39,41,14.9,25,4958,40 +Valparaiso University,Yes,2075,1727,520,49,81,2501,198,11800,3260,500,800,87,89,14.2,23,9681,95 +Vanderbilt University,Yes,7791,4690,1499,71,92,5500,90,17865,6525,630,952,93,98,5.8,26,23850,83 +Vassar College,Yes,3550,1877,653,53,87,2164,77,18920,5950,600,800,90,98,9.7,39,17089,90 +Villanova University,Yes,7759,5588,1477,30,68,6362,1292,15925,6507,400,300,89,90,13.4,24,10458,96 +Virginia Commonwealth University,No,4963,3497,1567,18,45,10262,5065,10217,4182,500,3630,81,87,8.7,11,11183,45 +Virginia State University,No,2996,2440,704,2,30,3006,338,5587,4845,500,600,61,63,16,11,5733,31 +Virginia Tech,No,15712,11719,4277,29,53,18511,604,10260,3176,740,2200,85,89,13.8,20,8944,73 +Virginia Union University,Yes,1847,1610,453,19,59,1298,67,7384,3494,500,1763,51,67,13.7,8,6757,30 +Virginia Wesleyan College,Yes,1470,900,287,20,49,1130,417,10900,5100,500,550,70,81,15.7,14,7804,68 +Viterbo College,Yes,647,518,271,17,43,1014,387,9140,3365,500,2245,51,65,10.7,31,8050,73 +Voorhees College,Yes,1465,1006,188,10,30,703,20,4450,2522,500,1200,43,43,22.9,3,5861,58 +Wabash College,Yes,800,623,256,41,76,801,5,12925,4195,500,635,78,85,9.9,55,14904,72 +Wagner College,Yes,1416,1015,417,10,44,1324,117,13500,5800,585,1700,67,78,13.2,23,9006,75 +Wake Forest University,Yes,5661,2392,903,75,88,3499,172,13850,4360,500,1250,95,97,4.3,37,41766,89 +Walsh University,Yes,1092,890,477,27,92,847,497,8670,4180,500,1450,42,58,11.3,33,5738,68 +Warren Wilson College,Yes,440,311,112,25,49,466,7,10000,3052,400,1100,65,75,11.4,20,9430,63 +Wartburg College,Yes,1231,1074,345,34,66,1295,105,11600,3610,400,850,66,91,12.4,37,7735,67 +Washington and Jefferson College,Yes,1305,1100,334,42,64,1098,151,16260,4005,300,500,91,91,12.1,40,10162,86 +Washington and Lee University,Yes,3315,1096,425,68,93,1584,3,13750,4619,680,1115,81,96,9.6,45,15736,90 +Washington College,Yes,1209,942,214,31,60,822,46,15276,5318,500,300,79,86,11.2,37,10830,65 +Washington State University,No,6540,5839,2440,31,70,14445,1344,8200,4210,800,2719,84,87,16.9,30,10912,56 +Washington University,Yes,7654,5259,1254,62,93,4879,1274,18350,5775,768,1512,91,98,3.9,31,45702,90 +Wayne State College,No,1373,1373,724,6,21,2754,474,2700,2660,540,1660,60,68,20.3,29,4550,52 +Waynesburg College,Yes,1190,978,324,12,30,1280,61,8840,3620,500,1200,57,58,16.2,26,6563,63 +Webber College,Yes,280,143,79,5,27,327,110,5590,2900,650,1952,53,63,15.1,4,4839,90 +Webster University,Yes,665,462,226,17,44,1739,1550,9160,4340,500,500,68,68,20.6,14,6951,48 +Wellesley College,Yes,2895,1249,579,80,96,2195,156,18345,5995,500,700,94,98,10.6,51,21409,91 +Wells College,Yes,318,240,130,40,85,416,19,14900,5550,600,500,93,98,8.3,42,13935,69 +Wentworth Institute of Technology,Yes,1480,1257,452,6,25,2961,572,9850,6050,850,920,10,68,15.4,8,17858,64 +Wesley College,Yes,980,807,350,10,25,872,448,9890,4674,500,1350,52,57,14.4,15,6243,84 +Wesleyan University,Yes,4772,1973,712,60,86,2714,27,19130,5600,1400,1400,90,94,12.1,39,16262,92 +West Chester University of 
Penn.,No,6502,3539,1372,11,51,7484,1904,7844,4108,400,2000,76,79,15.3,16,6773,52 +West Liberty State College,No,1164,1062,478,12,25,2138,227,4470,2890,600,1210,33,33,16.3,10,4249,60 +West Virginia Wesleyan College,Yes,1566,1400,483,28,55,1509,170,14200,3775,450,1100,58,81,16.4,42,8080,67 +Western Carolina University,No,3224,2519,1057,11,31,5000,706,6390,2380,110,1622,67,78,14.6,9,6554,55 +Western Maryland College,Yes,1205,984,278,31,50,1071,98,14510,5340,500,1400,84,91,12.5,39,10026,60 +Western Michigan University,No,9167,7191,2738,24,53,15739,4278,6940,4100,500,1700,80,84,24.7,11,5983,55 +Western New England College,Yes,1650,1471,409,7,21,1803,1116,8994,5500,498,2065,74,97,15.4,15,8409,59 +Western State College of Colorado,No,2702,1623,604,7,24,2315,146,5918,3755,500,2050,76,79,19.4,4,4599,52 +Western Washington University,No,5548,3563,1549,30,71,8909,506,8124,4144,639,2385,83,89,22.7,10,7203,61 +Westfield State College,No,3100,2150,825,3,20,3234,941,5542,3788,500,1300,75,79,15.7,20,4222,65 +Westminster College MO,Yes,662,553,184,20,43,665,37,10720,4050,600,1650,66,70,12.5,20,7925,62 +Westminster College,Yes,996,866,377,29,58,1411,72,12065,3615,430,685,62,78,12.5,41,8596,80 +Westminster College of Salt Lake City,Yes,917,720,213,21,60,979,743,8820,4050,600,2025,68,83,10.5,34,7170,50 +Westmont College,No,950,713,351,42,72,1276,9,14320,5304,490,1410,77,77,14.9,17,8837,87 +Wheaton College IL,Yes,1432,920,548,56,84,2200,56,11480,4200,530,1400,81,83,12.7,40,11916,85 +Westminster College PA,Yes,1738,1373,417,21,55,1335,30,18460,5970,700,850,92,96,13.2,41,22704,71 +Wheeling Jesuit College,Yes,903,755,213,15,49,971,305,10500,4545,600,600,66,71,14.1,27,7494,72 +Whitman College,Yes,1861,998,359,45,77,1220,46,16670,4900,750,800,80,83,10.5,51,13198,72 +Whittier College,Yes,1681,1069,344,35,63,1235,30,16249,5699,500,1998,84,92,13.6,29,11778,52 +Whitworth College,Yes,1121,926,372,43,70,1270,160,12660,4500,678,2424,80,80,16.9,20,8328,80 +Widener University,Yes,2139,1492,502,24,64,2186,2171,12350,5370,500,1350,88,86,12.6,19,9603,63 +Wilkes University,Yes,1631,1431,434,15,36,1803,603,11150,5130,550,1260,78,92,13.3,24,8543,67 +Willamette University,Yes,1658,1327,395,49,80,1595,159,14800,4620,400,790,91,94,13.3,37,10779,68 +William Jewell College,Yes,663,547,315,32,67,1279,75,10060,2970,500,2600,74,80,11.2,19,7885,59 +William Woods University,Yes,469,435,227,17,39,851,120,10535,4365,550,3700,39,66,12.9,16,7438,52 +Williams College,Yes,4186,1245,526,81,96,1988,29,19629,5790,500,1200,94,99,9,64,22014,99 +Wilson College,Yes,167,130,46,16,50,199,676,11428,5084,450,475,67,76,8.3,43,10291,67 +Wingate College,Yes,1239,1017,383,10,34,1207,157,7820,3400,550,1550,69,81,13.9,8,7264,91 +Winona State University,No,3325,2047,1301,20,45,5800,872,4200,2700,300,1200,53,60,20.2,18,5318,58 +Winthrop University,No,2320,1805,769,24,61,3395,670,6400,3392,580,2150,71,80,12.8,26,6729,59 +Wisconsin Lutheran College,Yes,152,128,75,17,41,282,22,9100,3700,500,1400,48,48,8.5,26,8960,50 +Wittenberg University,Yes,1979,1739,575,42,68,1980,144,15948,4404,400,800,82,95,12.8,29,10414,78 +Wofford College,Yes,1501,935,273,51,83,1059,34,12680,4150,605,1440,91,92,15.3,42,7875,75 +Worcester Polytechnic Institute,Yes,2768,2314,682,49,86,2802,86,15884,5370,530,730,92,94,15.2,34,10774,82 +Worcester State College,No,2197,1515,543,4,26,3089,2029,6797,3900,500,1200,60,60,21,14,4469,40 +Xavier University,Yes,1959,1805,695,24,47,2849,1107,11520,4960,600,1250,73,75,13.3,31,9189,83 +Xavier University of 
Louisiana,Yes,2097,1915,695,34,61,2793,166,6900,4200,617,781,67,75,14.4,20,8323,49 +Yale University,Yes,10705,2453,1317,95,99,5217,83,19840,6510,630,2115,96,96,5.8,49,40386,99 +York College of Pennsylvania,Yes,2989,1855,691,28,63,2988,1726,4990,3560,500,1250,75,75,18.1,28,4509,99 diff --git a/ISLP_data/Credit.csv b/ISLP_data/Credit.csv new file mode 100644 index 0000000..85c0e1e --- /dev/null +++ b/ISLP_data/Credit.csv @@ -0,0 +1,401 @@ +Income,Limit,Rating,Cards,Age,Education,Own,Student,Married,Region,Balance +14.891,3606,283,2,34,11,No,No,Yes,South,333 +106.025,6645,483,3,82,15,Yes,Yes,Yes,West,903 +104.593,7075,514,4,71,11,No,No,No,West,580 +148.924,9504,681,3,36,11,Yes,No,No,West,964 +55.882,4897,357,2,68,16,No,No,Yes,South,331 +80.18,8047,569,4,77,10,No,No,No,South,1151 +20.996,3388,259,2,37,12,Yes,No,No,East,203 +71.408,7114,512,2,87,9,No,No,No,West,872 +15.125,3300,266,5,66,13,Yes,No,No,South,279 +71.061,6819,491,3,41,19,Yes,Yes,Yes,East,1350 +63.095,8117,589,4,30,14,No,No,Yes,South,1407 +15.045,1311,138,3,64,16,No,No,No,South,0 +80.616,5308,394,1,57,7,Yes,No,Yes,West,204 +43.682,6922,511,1,49,9,No,No,Yes,South,1081 +19.144,3291,269,2,75,13,Yes,No,No,East,148 +20.089,2525,200,3,57,15,Yes,No,Yes,East,0 +53.598,3714,286,3,73,17,Yes,No,Yes,East,0 +36.496,4378,339,3,69,15,Yes,No,Yes,West,368 +49.57,6384,448,1,28,9,Yes,No,Yes,West,891 +42.079,6626,479,2,44,9,No,No,No,West,1048 +17.7,2860,235,4,63,16,Yes,No,No,West,89 +37.348,6378,458,1,72,17,Yes,No,No,South,968 +20.103,2631,213,3,61,10,No,No,Yes,East,0 +64.027,5179,398,5,48,8,No,No,Yes,East,411 +10.742,1757,156,3,57,15,Yes,No,No,South,0 +14.09,4323,326,5,25,16,Yes,No,Yes,East,671 +42.471,3625,289,6,44,12,Yes,Yes,No,South,654 +32.793,4534,333,2,44,16,No,No,No,East,467 +186.634,13414,949,2,41,14,Yes,No,Yes,East,1809 +26.813,5611,411,4,55,16,Yes,No,No,South,915 +34.142,5666,413,4,47,5,Yes,No,Yes,South,863 +28.941,2733,210,5,43,16,No,No,Yes,West,0 +134.181,7838,563,2,48,13,Yes,No,No,South,526 +31.367,1829,162,4,30,10,No,No,Yes,South,0 +20.15,2646,199,2,25,14,Yes,No,Yes,West,0 +23.35,2558,220,3,49,12,Yes,Yes,No,South,419 +62.413,6457,455,2,71,11,Yes,No,Yes,South,762 +30.007,6481,462,2,69,9,Yes,No,Yes,South,1093 +11.795,3899,300,4,25,10,Yes,No,No,South,531 +13.647,3461,264,4,47,14,No,No,Yes,South,344 +34.95,3327,253,3,54,14,Yes,No,No,East,50 +113.659,7659,538,2,66,15,No,Yes,Yes,East,1155 +44.158,4763,351,2,66,13,Yes,No,Yes,West,385 +36.929,6257,445,1,24,14,Yes,No,Yes,West,976 +31.861,6375,469,3,25,16,Yes,No,Yes,South,1120 +77.38,7569,564,3,50,12,Yes,No,Yes,South,997 +19.531,5043,376,2,64,16,Yes,Yes,Yes,West,1241 +44.646,4431,320,2,49,15,No,Yes,Yes,South,797 +44.522,2252,205,6,72,15,No,No,Yes,West,0 +43.479,4569,354,4,49,13,No,Yes,Yes,East,902 +36.362,5183,376,3,49,15,No,No,Yes,East,654 +39.705,3969,301,2,27,20,No,No,Yes,East,211 +44.205,5441,394,1,32,12,No,No,Yes,South,607 +16.304,5466,413,4,66,10,No,No,Yes,West,957 +15.333,1499,138,2,47,9,Yes,No,Yes,West,0 +32.916,1786,154,2,60,8,Yes,No,Yes,West,0 +57.1,4742,372,7,79,18,Yes,No,Yes,West,379 +76.273,4779,367,4,65,14,Yes,No,Yes,South,133 +10.354,3480,281,2,70,17,No,No,Yes,South,333 +51.872,5294,390,4,81,17,Yes,No,No,South,531 +35.51,5198,364,2,35,20,Yes,No,No,West,631 +21.238,3089,254,3,59,10,Yes,No,No,South,108 +30.682,1671,160,2,77,7,Yes,No,No,South,0 +14.132,2998,251,4,75,17,No,No,No,South,133 +32.164,2937,223,2,79,15,Yes,No,Yes,East,0 +12,4160,320,4,28,14,Yes,No,Yes,South,602 +113.829,9704,694,4,38,13,Yes,No,Yes,West,1388 +11.187,5099,380,4,69,16,Yes,No,No,East,889 
+27.847,5619,418,2,78,15,Yes,No,Yes,South,822 +49.502,6819,505,4,55,14,No,No,Yes,South,1084 +24.889,3954,318,4,75,12,No,No,Yes,South,357 +58.781,7402,538,2,81,12,Yes,No,Yes,West,1103 +22.939,4923,355,1,47,18,Yes,No,Yes,West,663 +23.989,4523,338,4,31,15,No,No,No,South,601 +16.103,5390,418,4,45,10,Yes,No,Yes,South,945 +33.017,3180,224,2,28,16,No,No,Yes,East,29 +30.622,3293,251,1,68,16,No,Yes,No,South,532 +20.936,3254,253,1,30,15,Yes,No,No,West,145 +110.968,6662,468,3,45,11,Yes,No,Yes,South,391 +15.354,2101,171,2,65,14,No,No,No,West,0 +27.369,3449,288,3,40,9,Yes,No,Yes,South,162 +53.48,4263,317,1,83,15,No,No,No,South,99 +23.672,4433,344,3,63,11,No,No,No,South,503 +19.225,1433,122,3,38,14,Yes,No,No,South,0 +43.54,2906,232,4,69,11,No,No,No,South,0 +152.298,12066,828,4,41,12,Yes,No,Yes,West,1779 +55.367,6340,448,1,33,15,No,No,Yes,South,815 +11.741,2271,182,4,59,12,Yes,No,No,West,0 +15.56,4307,352,4,57,8,No,No,Yes,East,579 +59.53,7518,543,3,52,9,Yes,No,No,East,1176 +20.191,5767,431,4,42,16,No,No,Yes,East,1023 +48.498,6040,456,3,47,16,No,No,Yes,South,812 +30.733,2832,249,4,51,13,No,No,No,South,0 +16.479,5435,388,2,26,16,No,No,No,East,937 +38.009,3075,245,3,45,15,Yes,No,No,East,0 +14.084,855,120,5,46,17,Yes,No,Yes,East,0 +14.312,5382,367,1,59,17,No,Yes,No,West,1380 +26.067,3388,266,4,74,17,Yes,No,Yes,East,155 +36.295,2963,241,2,68,14,Yes,Yes,No,East,375 +83.851,8494,607,5,47,18,No,No,No,South,1311 +21.153,3736,256,1,41,11,No,No,No,South,298 +17.976,2433,190,3,70,16,Yes,Yes,No,South,431 +68.713,7582,531,2,56,16,No,Yes,No,South,1587 +146.183,9540,682,6,66,15,No,No,No,South,1050 +15.846,4768,365,4,53,12,Yes,No,No,South,745 +12.031,3182,259,2,58,18,Yes,No,Yes,South,210 +16.819,1337,115,2,74,15,No,No,Yes,West,0 +39.11,3189,263,3,72,12,No,No,No,West,0 +107.986,6033,449,4,64,14,No,No,Yes,South,227 +13.561,3261,279,5,37,19,No,No,Yes,West,297 +34.537,3271,250,3,57,17,Yes,No,Yes,West,47 +28.575,2959,231,2,60,11,Yes,No,No,East,0 +46.007,6637,491,4,42,14,No,No,Yes,South,1046 +69.251,6386,474,4,30,12,Yes,No,Yes,West,768 +16.482,3326,268,4,41,15,No,No,No,South,271 +40.442,4828,369,5,81,8,Yes,No,No,East,510 +35.177,2117,186,3,62,16,Yes,No,No,South,0 +91.362,9113,626,1,47,17,No,No,Yes,West,1341 +27.039,2161,173,3,40,17,Yes,No,No,South,0 +23.012,1410,137,3,81,16,No,No,No,South,0 +27.241,1402,128,2,67,15,Yes,No,Yes,West,0 +148.08,8157,599,2,83,13,No,No,Yes,South,454 +62.602,7056,481,1,84,11,Yes,No,No,South,904 +11.808,1300,117,3,77,14,Yes,No,No,East,0 +29.564,2529,192,1,30,12,Yes,No,Yes,South,0 +27.578,2531,195,1,34,15,Yes,No,Yes,South,0 +26.427,5533,433,5,50,15,Yes,Yes,Yes,West,1404 +57.202,3411,259,3,72,11,Yes,No,No,South,0 +123.299,8376,610,2,89,17,No,Yes,No,East,1259 +18.145,3461,279,3,56,15,No,No,Yes,East,255 +23.793,3821,281,4,56,12,Yes,Yes,Yes,East,868 +10.726,1568,162,5,46,19,No,No,Yes,West,0 +23.283,5443,407,4,49,13,No,No,Yes,East,912 +21.455,5829,427,4,80,12,Yes,No,Yes,East,1018 +34.664,5835,452,3,77,15,Yes,No,Yes,East,835 +44.473,3500,257,3,81,16,Yes,No,No,East,8 +54.663,4116,314,2,70,8,Yes,No,No,East,75 +36.355,3613,278,4,35,9,No,No,Yes,West,187 +21.374,2073,175,2,74,11,Yes,No,Yes,South,0 +107.841,10384,728,3,87,7,No,No,No,East,1597 +39.831,6045,459,3,32,12,Yes,Yes,Yes,East,1425 +91.876,6754,483,2,33,10,No,No,Yes,South,605 +103.893,7416,549,3,84,17,No,No,No,West,669 +19.636,4896,387,3,64,10,Yes,No,No,East,710 +17.392,2748,228,3,32,14,No,No,Yes,South,68 +19.529,4673,341,2,51,14,No,No,No,West,642 +17.055,5110,371,3,55,15,Yes,No,Yes,South,805 +23.857,1501,150,3,56,16,No,No,Yes,South,0 
[... data rows continue; the remainder of this CSV's rows are omitted here for brevity ...]
diff --git a/ISLP_data/Heart.csv b/ISLP_data/Heart.csv
new file mode 100644
index 0000000..deed37c
--- /dev/null
+++ b/ISLP_data/Heart.csv
@@ -0,0 +1,304 @@
+"","Age","Sex","ChestPain","RestBP","Chol","Fbs","RestECG","MaxHR","ExAng","Oldpeak","Slope","Ca","Thal","AHD"
+"1",63,1,"typical",145,233,1,2,150,0,2.3,3,0,"fixed","No"
[... 302 further rows of Heart.csv omitted ...]
diff --git a/ISLP_data/Income1.csv b/ISLP_data/Income1.csv
new file mode 100644
index 0000000..10108b3
--- /dev/null
+++ b/ISLP_data/Income1.csv
@@ -0,0 +1,31 @@
+"","Education","Income"
+"1",10,26.6588387834389
+"2",10.4013377926421,27.3064353457772 +"3",10.8428093645485,22.1324101716143 +"4",11.2441471571906,21.1698405046065 +"5",11.6454849498328,15.1926335164307 +"6",12.0869565217391,26.3989510407284 +"7",12.4882943143813,17.435306578572 +"8",12.8896321070234,25.5078852305278 +"9",13.2909698996656,36.884594694235 +"10",13.7324414715719,39.666108747637 +"11",14.133779264214,34.3962805641312 +"12",14.5351170568562,41.4979935356871 +"13",14.9765886287625,44.9815748660704 +"14",15.3779264214047,47.039595257834 +"15",15.7792642140468,48.2525782901863 +"16",16.2207357859532,57.0342513373801 +"17",16.6220735785953,51.4909192102538 +"18",17.0234113712375,61.3366205527288 +"19",17.4648829431438,57.581988179306 +"20",17.866220735786,68.5537140185881 +"21",18.2675585284281,64.310925303692 +"22",18.7090301003344,68.9590086393083 +"23",19.1103678929766,74.6146392793647 +"24",19.5117056856187,71.8671953042483 +"25",19.9130434782609,76.098135379724 +"26",20.3545150501672,75.77521802986 +"27",20.7558528428094,72.4860553152424 +"28",21.1571906354515,77.3550205741877 +"29",21.5986622073579,72.1187904524136 +"30",22,80.2605705009016 diff --git a/ISLP_data/Income2.csv b/ISLP_data/Income2.csv new file mode 100644 index 0000000..235dea3 --- /dev/null +++ b/ISLP_data/Income2.csv @@ -0,0 +1,31 @@ +"","Education","Seniority","Income" +"1",21.5862068965517,113.103448275862,99.9171726114381 +"2",18.2758620689655,119.310344827586,92.579134855529 +"3",12.0689655172414,100.689655172414,34.6787271520874 +"4",17.0344827586207,187.586206896552,78.7028062353695 +"5",19.9310344827586,20,68.0099216471551 +"6",18.2758620689655,26.2068965517241,71.5044853814318 +"7",19.9310344827586,150.344827586207,87.9704669939115 +"8",21.1724137931034,82.0689655172414,79.8110298331255 +"9",20.3448275862069,88.2758620689655,90.00632710858 +"10",10,113.103448275862,45.6555294997364 +"11",13.7241379310345,51.0344827586207,31.9138079371295 +"12",18.6896551724138,144.137931034483,96.2829968022869 +"13",11.6551724137931,20,27.9825049000603 +"14",16.6206896551724,94.4827586206897,66.601792415137 +"15",10,187.586206896552,41.5319924201478 +"16",20.3448275862069,94.4827586206897,89.00070081522 +"17",14.1379310344828,20,28.8163007592387 +"18",16.6206896551724,44.8275862068966,57.6816942573605 +"19",16.6206896551724,175.172413793103,70.1050960424457 +"20",20.3448275862069,187.586206896552,98.8340115435447 +"21",18.2758620689655,100.689655172414,74.7046991976891 +"22",14.551724137931,137.931034482759,53.5321056283034 +"23",17.448275862069,94.4827586206897,72.0789236655191 +"24",10.4137931034483,32.4137931034483,18.5706650327685 +"25",21.5862068965517,20,78.8057842852386 +"26",11.2413793103448,44.8275862068966,21.388561306174 +"27",19.9310344827586,168.965517241379,90.8140351180409 +"28",11.6551724137931,57.2413793103448,22.6361626208955 +"29",12.0689655172414,32.4137931034483,17.613593041445 +"30",17.0344827586207,106.896551724138,74.6109601985289 diff --git a/ISLP_data/book_images.zip b/ISLP_data/book_images.zip new file mode 100644 index 0000000..769af5f Binary files /dev/null and b/ISLP_data/book_images.zip differ diff --git a/ISLP_data/imagenet_class_index.json b/ISLP_data/imagenet_class_index.json new file mode 100644 index 0000000..5fe0dfe --- /dev/null +++ b/ISLP_data/imagenet_class_index.json @@ -0,0 +1 @@ +{"0": ["n01440764", "tench"], "1": ["n01443537", "goldfish"], "2": ["n01484850", "great_white_shark"], "3": ["n01491361", "tiger_shark"], "4": ["n01494475", "hammerhead"], "5": ["n01496331", "electric_ray"], "6": ["n01498041", "stingray"], "7": 
["n01514668", "cock"], "8": ["n01514859", "hen"], "9": ["n01518878", "ostrich"], "10": ["n01530575", "brambling"], "11": ["n01531178", "goldfinch"], "12": ["n01532829", "house_finch"], "13": ["n01534433", "junco"], "14": ["n01537544", "indigo_bunting"], "15": ["n01558993", "robin"], "16": ["n01560419", "bulbul"], "17": ["n01580077", "jay"], "18": ["n01582220", "magpie"], "19": ["n01592084", "chickadee"], "20": ["n01601694", "water_ouzel"], "21": ["n01608432", "kite"], "22": ["n01614925", "bald_eagle"], "23": ["n01616318", "vulture"], "24": ["n01622779", "great_grey_owl"], "25": ["n01629819", "European_fire_salamander"], "26": ["n01630670", "common_newt"], "27": ["n01631663", "eft"], "28": ["n01632458", "spotted_salamander"], "29": ["n01632777", "axolotl"], "30": ["n01641577", "bullfrog"], "31": ["n01644373", "tree_frog"], "32": ["n01644900", "tailed_frog"], "33": ["n01664065", "loggerhead"], "34": ["n01665541", "leatherback_turtle"], "35": ["n01667114", "mud_turtle"], "36": ["n01667778", "terrapin"], "37": ["n01669191", "box_turtle"], "38": ["n01675722", "banded_gecko"], "39": ["n01677366", "common_iguana"], "40": ["n01682714", "American_chameleon"], "41": ["n01685808", "whiptail"], "42": ["n01687978", "agama"], "43": ["n01688243", "frilled_lizard"], "44": ["n01689811", "alligator_lizard"], "45": ["n01692333", "Gila_monster"], "46": ["n01693334", "green_lizard"], "47": ["n01694178", "African_chameleon"], "48": ["n01695060", "Komodo_dragon"], "49": ["n01697457", "African_crocodile"], "50": ["n01698640", "American_alligator"], "51": ["n01704323", "triceratops"], "52": ["n01728572", "thunder_snake"], "53": ["n01728920", "ringneck_snake"], "54": ["n01729322", "hognose_snake"], "55": ["n01729977", "green_snake"], "56": ["n01734418", "king_snake"], "57": ["n01735189", "garter_snake"], "58": ["n01737021", "water_snake"], "59": ["n01739381", "vine_snake"], "60": ["n01740131", "night_snake"], "61": ["n01742172", "boa_constrictor"], "62": ["n01744401", "rock_python"], "63": ["n01748264", "Indian_cobra"], "64": ["n01749939", "green_mamba"], "65": ["n01751748", "sea_snake"], "66": ["n01753488", "horned_viper"], "67": ["n01755581", "diamondback"], "68": ["n01756291", "sidewinder"], "69": ["n01768244", "trilobite"], "70": ["n01770081", "harvestman"], "71": ["n01770393", "scorpion"], "72": ["n01773157", "black_and_gold_garden_spider"], "73": ["n01773549", "barn_spider"], "74": ["n01773797", "garden_spider"], "75": ["n01774384", "black_widow"], "76": ["n01774750", "tarantula"], "77": ["n01775062", "wolf_spider"], "78": ["n01776313", "tick"], "79": ["n01784675", "centipede"], "80": ["n01795545", "black_grouse"], "81": ["n01796340", "ptarmigan"], "82": ["n01797886", "ruffed_grouse"], "83": ["n01798484", "prairie_chicken"], "84": ["n01806143", "peacock"], "85": ["n01806567", "quail"], "86": ["n01807496", "partridge"], "87": ["n01817953", "African_grey"], "88": ["n01818515", "macaw"], "89": ["n01819313", "sulphur-crested_cockatoo"], "90": ["n01820546", "lorikeet"], "91": ["n01824575", "coucal"], "92": ["n01828970", "bee_eater"], "93": ["n01829413", "hornbill"], "94": ["n01833805", "hummingbird"], "95": ["n01843065", "jacamar"], "96": ["n01843383", "toucan"], "97": ["n01847000", "drake"], "98": ["n01855032", "red-breasted_merganser"], "99": ["n01855672", "goose"], "100": ["n01860187", "black_swan"], "101": ["n01871265", "tusker"], "102": ["n01872401", "echidna"], "103": ["n01873310", "platypus"], "104": ["n01877812", "wallaby"], "105": ["n01882714", "koala"], "106": ["n01883070", "wombat"], "107": 
["n01910747", "jellyfish"], "108": ["n01914609", "sea_anemone"], "109": ["n01917289", "brain_coral"], "110": ["n01924916", "flatworm"], "111": ["n01930112", "nematode"], "112": ["n01943899", "conch"], "113": ["n01944390", "snail"], "114": ["n01945685", "slug"], "115": ["n01950731", "sea_slug"], "116": ["n01955084", "chiton"], "117": ["n01968897", "chambered_nautilus"], "118": ["n01978287", "Dungeness_crab"], "119": ["n01978455", "rock_crab"], "120": ["n01980166", "fiddler_crab"], "121": ["n01981276", "king_crab"], "122": ["n01983481", "American_lobster"], "123": ["n01984695", "spiny_lobster"], "124": ["n01985128", "crayfish"], "125": ["n01986214", "hermit_crab"], "126": ["n01990800", "isopod"], "127": ["n02002556", "white_stork"], "128": ["n02002724", "black_stork"], "129": ["n02006656", "spoonbill"], "130": ["n02007558", "flamingo"], "131": ["n02009229", "little_blue_heron"], "132": ["n02009912", "American_egret"], "133": ["n02011460", "bittern"], "134": ["n02012849", "crane"], "135": ["n02013706", "limpkin"], "136": ["n02017213", "European_gallinule"], "137": ["n02018207", "American_coot"], "138": ["n02018795", "bustard"], "139": ["n02025239", "ruddy_turnstone"], "140": ["n02027492", "red-backed_sandpiper"], "141": ["n02028035", "redshank"], "142": ["n02033041", "dowitcher"], "143": ["n02037110", "oystercatcher"], "144": ["n02051845", "pelican"], "145": ["n02056570", "king_penguin"], "146": ["n02058221", "albatross"], "147": ["n02066245", "grey_whale"], "148": ["n02071294", "killer_whale"], "149": ["n02074367", "dugong"], "150": ["n02077923", "sea_lion"], "151": ["n02085620", "Chihuahua"], "152": ["n02085782", "Japanese_spaniel"], "153": ["n02085936", "Maltese_dog"], "154": ["n02086079", "Pekinese"], "155": ["n02086240", "Shih-Tzu"], "156": ["n02086646", "Blenheim_spaniel"], "157": ["n02086910", "papillon"], "158": ["n02087046", "toy_terrier"], "159": ["n02087394", "Rhodesian_ridgeback"], "160": ["n02088094", "Afghan_hound"], "161": ["n02088238", "basset"], "162": ["n02088364", "beagle"], "163": ["n02088466", "bloodhound"], "164": ["n02088632", "bluetick"], "165": ["n02089078", "black-and-tan_coonhound"], "166": ["n02089867", "Walker_hound"], "167": ["n02089973", "English_foxhound"], "168": ["n02090379", "redbone"], "169": ["n02090622", "borzoi"], "170": ["n02090721", "Irish_wolfhound"], "171": ["n02091032", "Italian_greyhound"], "172": ["n02091134", "whippet"], "173": ["n02091244", "Ibizan_hound"], "174": ["n02091467", "Norwegian_elkhound"], "175": ["n02091635", "otterhound"], "176": ["n02091831", "Saluki"], "177": ["n02092002", "Scottish_deerhound"], "178": ["n02092339", "Weimaraner"], "179": ["n02093256", "Staffordshire_bullterrier"], "180": ["n02093428", "American_Staffordshire_terrier"], "181": ["n02093647", "Bedlington_terrier"], "182": ["n02093754", "Border_terrier"], "183": ["n02093859", "Kerry_blue_terrier"], "184": ["n02093991", "Irish_terrier"], "185": ["n02094114", "Norfolk_terrier"], "186": ["n02094258", "Norwich_terrier"], "187": ["n02094433", "Yorkshire_terrier"], "188": ["n02095314", "wire-haired_fox_terrier"], "189": ["n02095570", "Lakeland_terrier"], "190": ["n02095889", "Sealyham_terrier"], "191": ["n02096051", "Airedale"], "192": ["n02096177", "cairn"], "193": ["n02096294", "Australian_terrier"], "194": ["n02096437", "Dandie_Dinmont"], "195": ["n02096585", "Boston_bull"], "196": ["n02097047", "miniature_schnauzer"], "197": ["n02097130", "giant_schnauzer"], "198": ["n02097209", "standard_schnauzer"], "199": ["n02097298", "Scotch_terrier"], "200": ["n02097474", 
"Tibetan_terrier"], "201": ["n02097658", "silky_terrier"], "202": ["n02098105", "soft-coated_wheaten_terrier"], "203": ["n02098286", "West_Highland_white_terrier"], "204": ["n02098413", "Lhasa"], "205": ["n02099267", "flat-coated_retriever"], "206": ["n02099429", "curly-coated_retriever"], "207": ["n02099601", "golden_retriever"], "208": ["n02099712", "Labrador_retriever"], "209": ["n02099849", "Chesapeake_Bay_retriever"], "210": ["n02100236", "German_short-haired_pointer"], "211": ["n02100583", "vizsla"], "212": ["n02100735", "English_setter"], "213": ["n02100877", "Irish_setter"], "214": ["n02101006", "Gordon_setter"], "215": ["n02101388", "Brittany_spaniel"], "216": ["n02101556", "clumber"], "217": ["n02102040", "English_springer"], "218": ["n02102177", "Welsh_springer_spaniel"], "219": ["n02102318", "cocker_spaniel"], "220": ["n02102480", "Sussex_spaniel"], "221": ["n02102973", "Irish_water_spaniel"], "222": ["n02104029", "kuvasz"], "223": ["n02104365", "schipperke"], "224": ["n02105056", "groenendael"], "225": ["n02105162", "malinois"], "226": ["n02105251", "briard"], "227": ["n02105412", "kelpie"], "228": ["n02105505", "komondor"], "229": ["n02105641", "Old_English_sheepdog"], "230": ["n02105855", "Shetland_sheepdog"], "231": ["n02106030", "collie"], "232": ["n02106166", "Border_collie"], "233": ["n02106382", "Bouvier_des_Flandres"], "234": ["n02106550", "Rottweiler"], "235": ["n02106662", "German_shepherd"], "236": ["n02107142", "Doberman"], "237": ["n02107312", "miniature_pinscher"], "238": ["n02107574", "Greater_Swiss_Mountain_dog"], "239": ["n02107683", "Bernese_mountain_dog"], "240": ["n02107908", "Appenzeller"], "241": ["n02108000", "EntleBucher"], "242": ["n02108089", "boxer"], "243": ["n02108422", "bull_mastiff"], "244": ["n02108551", "Tibetan_mastiff"], "245": ["n02108915", "French_bulldog"], "246": ["n02109047", "Great_Dane"], "247": ["n02109525", "Saint_Bernard"], "248": ["n02109961", "Eskimo_dog"], "249": ["n02110063", "malamute"], "250": ["n02110185", "Siberian_husky"], "251": ["n02110341", "dalmatian"], "252": ["n02110627", "affenpinscher"], "253": ["n02110806", "basenji"], "254": ["n02110958", "pug"], "255": ["n02111129", "Leonberg"], "256": ["n02111277", "Newfoundland"], "257": ["n02111500", "Great_Pyrenees"], "258": ["n02111889", "Samoyed"], "259": ["n02112018", "Pomeranian"], "260": ["n02112137", "chow"], "261": ["n02112350", "keeshond"], "262": ["n02112706", "Brabancon_griffon"], "263": ["n02113023", "Pembroke"], "264": ["n02113186", "Cardigan"], "265": ["n02113624", "toy_poodle"], "266": ["n02113712", "miniature_poodle"], "267": ["n02113799", "standard_poodle"], "268": ["n02113978", "Mexican_hairless"], "269": ["n02114367", "timber_wolf"], "270": ["n02114548", "white_wolf"], "271": ["n02114712", "red_wolf"], "272": ["n02114855", "coyote"], "273": ["n02115641", "dingo"], "274": ["n02115913", "dhole"], "275": ["n02116738", "African_hunting_dog"], "276": ["n02117135", "hyena"], "277": ["n02119022", "red_fox"], "278": ["n02119789", "kit_fox"], "279": ["n02120079", "Arctic_fox"], "280": ["n02120505", "grey_fox"], "281": ["n02123045", "tabby"], "282": ["n02123159", "tiger_cat"], "283": ["n02123394", "Persian_cat"], "284": ["n02123597", "Siamese_cat"], "285": ["n02124075", "Egyptian_cat"], "286": ["n02125311", "cougar"], "287": ["n02127052", "lynx"], "288": ["n02128385", "leopard"], "289": ["n02128757", "snow_leopard"], "290": ["n02128925", "jaguar"], "291": ["n02129165", "lion"], "292": ["n02129604", "tiger"], "293": ["n02130308", "cheetah"], "294": ["n02132136", 
"brown_bear"], "295": ["n02133161", "American_black_bear"], "296": ["n02134084", "ice_bear"], "297": ["n02134418", "sloth_bear"], "298": ["n02137549", "mongoose"], "299": ["n02138441", "meerkat"], "300": ["n02165105", "tiger_beetle"], "301": ["n02165456", "ladybug"], "302": ["n02167151", "ground_beetle"], "303": ["n02168699", "long-horned_beetle"], "304": ["n02169497", "leaf_beetle"], "305": ["n02172182", "dung_beetle"], "306": ["n02174001", "rhinoceros_beetle"], "307": ["n02177972", "weevil"], "308": ["n02190166", "fly"], "309": ["n02206856", "bee"], "310": ["n02219486", "ant"], "311": ["n02226429", "grasshopper"], "312": ["n02229544", "cricket"], "313": ["n02231487", "walking_stick"], "314": ["n02233338", "cockroach"], "315": ["n02236044", "mantis"], "316": ["n02256656", "cicada"], "317": ["n02259212", "leafhopper"], "318": ["n02264363", "lacewing"], "319": ["n02268443", "dragonfly"], "320": ["n02268853", "damselfly"], "321": ["n02276258", "admiral"], "322": ["n02277742", "ringlet"], "323": ["n02279972", "monarch"], "324": ["n02280649", "cabbage_butterfly"], "325": ["n02281406", "sulphur_butterfly"], "326": ["n02281787", "lycaenid"], "327": ["n02317335", "starfish"], "328": ["n02319095", "sea_urchin"], "329": ["n02321529", "sea_cucumber"], "330": ["n02325366", "wood_rabbit"], "331": ["n02326432", "hare"], "332": ["n02328150", "Angora"], "333": ["n02342885", "hamster"], "334": ["n02346627", "porcupine"], "335": ["n02356798", "fox_squirrel"], "336": ["n02361337", "marmot"], "337": ["n02363005", "beaver"], "338": ["n02364673", "guinea_pig"], "339": ["n02389026", "sorrel"], "340": ["n02391049", "zebra"], "341": ["n02395406", "hog"], "342": ["n02396427", "wild_boar"], "343": ["n02397096", "warthog"], "344": ["n02398521", "hippopotamus"], "345": ["n02403003", "ox"], "346": ["n02408429", "water_buffalo"], "347": ["n02410509", "bison"], "348": ["n02412080", "ram"], "349": ["n02415577", "bighorn"], "350": ["n02417914", "ibex"], "351": ["n02422106", "hartebeest"], "352": ["n02422699", "impala"], "353": ["n02423022", "gazelle"], "354": ["n02437312", "Arabian_camel"], "355": ["n02437616", "llama"], "356": ["n02441942", "weasel"], "357": ["n02442845", "mink"], "358": ["n02443114", "polecat"], "359": ["n02443484", "black-footed_ferret"], "360": ["n02444819", "otter"], "361": ["n02445715", "skunk"], "362": ["n02447366", "badger"], "363": ["n02454379", "armadillo"], "364": ["n02457408", "three-toed_sloth"], "365": ["n02480495", "orangutan"], "366": ["n02480855", "gorilla"], "367": ["n02481823", "chimpanzee"], "368": ["n02483362", "gibbon"], "369": ["n02483708", "siamang"], "370": ["n02484975", "guenon"], "371": ["n02486261", "patas"], "372": ["n02486410", "baboon"], "373": ["n02487347", "macaque"], "374": ["n02488291", "langur"], "375": ["n02488702", "colobus"], "376": ["n02489166", "proboscis_monkey"], "377": ["n02490219", "marmoset"], "378": ["n02492035", "capuchin"], "379": ["n02492660", "howler_monkey"], "380": ["n02493509", "titi"], "381": ["n02493793", "spider_monkey"], "382": ["n02494079", "squirrel_monkey"], "383": ["n02497673", "Madagascar_cat"], "384": ["n02500267", "indri"], "385": ["n02504013", "Indian_elephant"], "386": ["n02504458", "African_elephant"], "387": ["n02509815", "lesser_panda"], "388": ["n02510455", "giant_panda"], "389": ["n02514041", "barracouta"], "390": ["n02526121", "eel"], "391": ["n02536864", "coho"], "392": ["n02606052", "rock_beauty"], "393": ["n02607072", "anemone_fish"], "394": ["n02640242", "sturgeon"], "395": ["n02641379", "gar"], "396": ["n02643566", "lionfish"], 
"397": ["n02655020", "puffer"], "398": ["n02666196", "abacus"], "399": ["n02667093", "abaya"], "400": ["n02669723", "academic_gown"], "401": ["n02672831", "accordion"], "402": ["n02676566", "acoustic_guitar"], "403": ["n02687172", "aircraft_carrier"], "404": ["n02690373", "airliner"], "405": ["n02692877", "airship"], "406": ["n02699494", "altar"], "407": ["n02701002", "ambulance"], "408": ["n02704792", "amphibian"], "409": ["n02708093", "analog_clock"], "410": ["n02727426", "apiary"], "411": ["n02730930", "apron"], "412": ["n02747177", "ashcan"], "413": ["n02749479", "assault_rifle"], "414": ["n02769748", "backpack"], "415": ["n02776631", "bakery"], "416": ["n02777292", "balance_beam"], "417": ["n02782093", "balloon"], "418": ["n02783161", "ballpoint"], "419": ["n02786058", "Band_Aid"], "420": ["n02787622", "banjo"], "421": ["n02788148", "bannister"], "422": ["n02790996", "barbell"], "423": ["n02791124", "barber_chair"], "424": ["n02791270", "barbershop"], "425": ["n02793495", "barn"], "426": ["n02794156", "barometer"], "427": ["n02795169", "barrel"], "428": ["n02797295", "barrow"], "429": ["n02799071", "baseball"], "430": ["n02802426", "basketball"], "431": ["n02804414", "bassinet"], "432": ["n02804610", "bassoon"], "433": ["n02807133", "bathing_cap"], "434": ["n02808304", "bath_towel"], "435": ["n02808440", "bathtub"], "436": ["n02814533", "beach_wagon"], "437": ["n02814860", "beacon"], "438": ["n02815834", "beaker"], "439": ["n02817516", "bearskin"], "440": ["n02823428", "beer_bottle"], "441": ["n02823750", "beer_glass"], "442": ["n02825657", "bell_cote"], "443": ["n02834397", "bib"], "444": ["n02835271", "bicycle-built-for-two"], "445": ["n02837789", "bikini"], "446": ["n02840245", "binder"], "447": ["n02841315", "binoculars"], "448": ["n02843684", "birdhouse"], "449": ["n02859443", "boathouse"], "450": ["n02860847", "bobsled"], "451": ["n02865351", "bolo_tie"], "452": ["n02869837", "bonnet"], "453": ["n02870880", "bookcase"], "454": ["n02871525", "bookshop"], "455": ["n02877765", "bottlecap"], "456": ["n02879718", "bow"], "457": ["n02883205", "bow_tie"], "458": ["n02892201", "brass"], "459": ["n02892767", "brassiere"], "460": ["n02894605", "breakwater"], "461": ["n02895154", "breastplate"], "462": ["n02906734", "broom"], "463": ["n02909870", "bucket"], "464": ["n02910353", "buckle"], "465": ["n02916936", "bulletproof_vest"], "466": ["n02917067", "bullet_train"], "467": ["n02927161", "butcher_shop"], "468": ["n02930766", "cab"], "469": ["n02939185", "caldron"], "470": ["n02948072", "candle"], "471": ["n02950826", "cannon"], "472": ["n02951358", "canoe"], "473": ["n02951585", "can_opener"], "474": ["n02963159", "cardigan"], "475": ["n02965783", "car_mirror"], "476": ["n02966193", "carousel"], "477": ["n02966687", "carpenter's_kit"], "478": ["n02971356", "carton"], "479": ["n02974003", "car_wheel"], "480": ["n02977058", "cash_machine"], "481": ["n02978881", "cassette"], "482": ["n02979186", "cassette_player"], "483": ["n02980441", "castle"], "484": ["n02981792", "catamaran"], "485": ["n02988304", "CD_player"], "486": ["n02992211", "cello"], "487": ["n02992529", "cellular_telephone"], "488": ["n02999410", "chain"], "489": ["n03000134", "chainlink_fence"], "490": ["n03000247", "chain_mail"], "491": ["n03000684", "chain_saw"], "492": ["n03014705", "chest"], "493": ["n03016953", "chiffonier"], "494": ["n03017168", "chime"], "495": ["n03018349", "china_cabinet"], "496": ["n03026506", "Christmas_stocking"], "497": ["n03028079", "church"], "498": ["n03032252", "cinema"], "499": ["n03041632", 
"cleaver"], "500": ["n03042490", "cliff_dwelling"], "501": ["n03045698", "cloak"], "502": ["n03047690", "clog"], "503": ["n03062245", "cocktail_shaker"], "504": ["n03063599", "coffee_mug"], "505": ["n03063689", "coffeepot"], "506": ["n03065424", "coil"], "507": ["n03075370", "combination_lock"], "508": ["n03085013", "computer_keyboard"], "509": ["n03089624", "confectionery"], "510": ["n03095699", "container_ship"], "511": ["n03100240", "convertible"], "512": ["n03109150", "corkscrew"], "513": ["n03110669", "cornet"], "514": ["n03124043", "cowboy_boot"], "515": ["n03124170", "cowboy_hat"], "516": ["n03125729", "cradle"], "517": ["n03126707", "crane"], "518": ["n03127747", "crash_helmet"], "519": ["n03127925", "crate"], "520": ["n03131574", "crib"], "521": ["n03133878", "Crock_Pot"], "522": ["n03134739", "croquet_ball"], "523": ["n03141823", "crutch"], "524": ["n03146219", "cuirass"], "525": ["n03160309", "dam"], "526": ["n03179701", "desk"], "527": ["n03180011", "desktop_computer"], "528": ["n03187595", "dial_telephone"], "529": ["n03188531", "diaper"], "530": ["n03196217", "digital_clock"], "531": ["n03197337", "digital_watch"], "532": ["n03201208", "dining_table"], "533": ["n03207743", "dishrag"], "534": ["n03207941", "dishwasher"], "535": ["n03208938", "disk_brake"], "536": ["n03216828", "dock"], "537": ["n03218198", "dogsled"], "538": ["n03220513", "dome"], "539": ["n03223299", "doormat"], "540": ["n03240683", "drilling_platform"], "541": ["n03249569", "drum"], "542": ["n03250847", "drumstick"], "543": ["n03255030", "dumbbell"], "544": ["n03259280", "Dutch_oven"], "545": ["n03271574", "electric_fan"], "546": ["n03272010", "electric_guitar"], "547": ["n03272562", "electric_locomotive"], "548": ["n03290653", "entertainment_center"], "549": ["n03291819", "envelope"], "550": ["n03297495", "espresso_maker"], "551": ["n03314780", "face_powder"], "552": ["n03325584", "feather_boa"], "553": ["n03337140", "file"], "554": ["n03344393", "fireboat"], "555": ["n03345487", "fire_engine"], "556": ["n03347037", "fire_screen"], "557": ["n03355925", "flagpole"], "558": ["n03372029", "flute"], "559": ["n03376595", "folding_chair"], "560": ["n03379051", "football_helmet"], "561": ["n03384352", "forklift"], "562": ["n03388043", "fountain"], "563": ["n03388183", "fountain_pen"], "564": ["n03388549", "four-poster"], "565": ["n03393912", "freight_car"], "566": ["n03394916", "French_horn"], "567": ["n03400231", "frying_pan"], "568": ["n03404251", "fur_coat"], "569": ["n03417042", "garbage_truck"], "570": ["n03424325", "gasmask"], "571": ["n03425413", "gas_pump"], "572": ["n03443371", "goblet"], "573": ["n03444034", "go-kart"], "574": ["n03445777", "golf_ball"], "575": ["n03445924", "golfcart"], "576": ["n03447447", "gondola"], "577": ["n03447721", "gong"], "578": ["n03450230", "gown"], "579": ["n03452741", "grand_piano"], "580": ["n03457902", "greenhouse"], "581": ["n03459775", "grille"], "582": ["n03461385", "grocery_store"], "583": ["n03467068", "guillotine"], "584": ["n03476684", "hair_slide"], "585": ["n03476991", "hair_spray"], "586": ["n03478589", "half_track"], "587": ["n03481172", "hammer"], "588": ["n03482405", "hamper"], "589": ["n03483316", "hand_blower"], "590": ["n03485407", "hand-held_computer"], "591": ["n03485794", "handkerchief"], "592": ["n03492542", "hard_disc"], "593": ["n03494278", "harmonica"], "594": ["n03495258", "harp"], "595": ["n03496892", "harvester"], "596": ["n03498962", "hatchet"], "597": ["n03527444", "holster"], "598": ["n03529860", "home_theater"], "599": ["n03530642", 
"honeycomb"], "600": ["n03532672", "hook"], "601": ["n03534580", "hoopskirt"], "602": ["n03535780", "horizontal_bar"], "603": ["n03538406", "horse_cart"], "604": ["n03544143", "hourglass"], "605": ["n03584254", "iPod"], "606": ["n03584829", "iron"], "607": ["n03590841", "jack-o'-lantern"], "608": ["n03594734", "jean"], "609": ["n03594945", "jeep"], "610": ["n03595614", "jersey"], "611": ["n03598930", "jigsaw_puzzle"], "612": ["n03599486", "jinrikisha"], "613": ["n03602883", "joystick"], "614": ["n03617480", "kimono"], "615": ["n03623198", "knee_pad"], "616": ["n03627232", "knot"], "617": ["n03630383", "lab_coat"], "618": ["n03633091", "ladle"], "619": ["n03637318", "lampshade"], "620": ["n03642806", "laptop"], "621": ["n03649909", "lawn_mower"], "622": ["n03657121", "lens_cap"], "623": ["n03658185", "letter_opener"], "624": ["n03661043", "library"], "625": ["n03662601", "lifeboat"], "626": ["n03666591", "lighter"], "627": ["n03670208", "limousine"], "628": ["n03673027", "liner"], "629": ["n03676483", "lipstick"], "630": ["n03680355", "Loafer"], "631": ["n03690938", "lotion"], "632": ["n03691459", "loudspeaker"], "633": ["n03692522", "loupe"], "634": ["n03697007", "lumbermill"], "635": ["n03706229", "magnetic_compass"], "636": ["n03709823", "mailbag"], "637": ["n03710193", "mailbox"], "638": ["n03710637", "maillot"], "639": ["n03710721", "maillot"], "640": ["n03717622", "manhole_cover"], "641": ["n03720891", "maraca"], "642": ["n03721384", "marimba"], "643": ["n03724870", "mask"], "644": ["n03729826", "matchstick"], "645": ["n03733131", "maypole"], "646": ["n03733281", "maze"], "647": ["n03733805", "measuring_cup"], "648": ["n03742115", "medicine_chest"], "649": ["n03743016", "megalith"], "650": ["n03759954", "microphone"], "651": ["n03761084", "microwave"], "652": ["n03763968", "military_uniform"], "653": ["n03764736", "milk_can"], "654": ["n03769881", "minibus"], "655": ["n03770439", "miniskirt"], "656": ["n03770679", "minivan"], "657": ["n03773504", "missile"], "658": ["n03775071", "mitten"], "659": ["n03775546", "mixing_bowl"], "660": ["n03776460", "mobile_home"], "661": ["n03777568", "Model_T"], "662": ["n03777754", "modem"], "663": ["n03781244", "monastery"], "664": ["n03782006", "monitor"], "665": ["n03785016", "moped"], "666": ["n03786901", "mortar"], "667": ["n03787032", "mortarboard"], "668": ["n03788195", "mosque"], "669": ["n03788365", "mosquito_net"], "670": ["n03791053", "motor_scooter"], "671": ["n03792782", "mountain_bike"], "672": ["n03792972", "mountain_tent"], "673": ["n03793489", "mouse"], "674": ["n03794056", "mousetrap"], "675": ["n03796401", "moving_van"], "676": ["n03803284", "muzzle"], "677": ["n03804744", "nail"], "678": ["n03814639", "neck_brace"], "679": ["n03814906", "necklace"], "680": ["n03825788", "nipple"], "681": ["n03832673", "notebook"], "682": ["n03837869", "obelisk"], "683": ["n03838899", "oboe"], "684": ["n03840681", "ocarina"], "685": ["n03841143", "odometer"], "686": ["n03843555", "oil_filter"], "687": ["n03854065", "organ"], "688": ["n03857828", "oscilloscope"], "689": ["n03866082", "overskirt"], "690": ["n03868242", "oxcart"], "691": ["n03868863", "oxygen_mask"], "692": ["n03871628", "packet"], "693": ["n03873416", "paddle"], "694": ["n03874293", "paddlewheel"], "695": ["n03874599", "padlock"], "696": ["n03876231", "paintbrush"], "697": ["n03877472", "pajama"], "698": ["n03877845", "palace"], "699": ["n03884397", "panpipe"], "700": ["n03887697", "paper_towel"], "701": ["n03888257", "parachute"], "702": ["n03888605", "parallel_bars"], "703": 
["n03891251", "park_bench"], "704": ["n03891332", "parking_meter"], "705": ["n03895866", "passenger_car"], "706": ["n03899768", "patio"], "707": ["n03902125", "pay-phone"], "708": ["n03903868", "pedestal"], "709": ["n03908618", "pencil_box"], "710": ["n03908714", "pencil_sharpener"], "711": ["n03916031", "perfume"], "712": ["n03920288", "Petri_dish"], "713": ["n03924679", "photocopier"], "714": ["n03929660", "pick"], "715": ["n03929855", "pickelhaube"], "716": ["n03930313", "picket_fence"], "717": ["n03930630", "pickup"], "718": ["n03933933", "pier"], "719": ["n03935335", "piggy_bank"], "720": ["n03937543", "pill_bottle"], "721": ["n03938244", "pillow"], "722": ["n03942813", "ping-pong_ball"], "723": ["n03944341", "pinwheel"], "724": ["n03947888", "pirate"], "725": ["n03950228", "pitcher"], "726": ["n03954731", "plane"], "727": ["n03956157", "planetarium"], "728": ["n03958227", "plastic_bag"], "729": ["n03961711", "plate_rack"], "730": ["n03967562", "plow"], "731": ["n03970156", "plunger"], "732": ["n03976467", "Polaroid_camera"], "733": ["n03976657", "pole"], "734": ["n03977966", "police_van"], "735": ["n03980874", "poncho"], "736": ["n03982430", "pool_table"], "737": ["n03983396", "pop_bottle"], "738": ["n03991062", "pot"], "739": ["n03992509", "potter's_wheel"], "740": ["n03995372", "power_drill"], "741": ["n03998194", "prayer_rug"], "742": ["n04004767", "printer"], "743": ["n04005630", "prison"], "744": ["n04008634", "projectile"], "745": ["n04009552", "projector"], "746": ["n04019541", "puck"], "747": ["n04023962", "punching_bag"], "748": ["n04026417", "purse"], "749": ["n04033901", "quill"], "750": ["n04033995", "quilt"], "751": ["n04037443", "racer"], "752": ["n04039381", "racket"], "753": ["n04040759", "radiator"], "754": ["n04041544", "radio"], "755": ["n04044716", "radio_telescope"], "756": ["n04049303", "rain_barrel"], "757": ["n04065272", "recreational_vehicle"], "758": ["n04067472", "reel"], "759": ["n04069434", "reflex_camera"], "760": ["n04070727", "refrigerator"], "761": ["n04074963", "remote_control"], "762": ["n04081281", "restaurant"], "763": ["n04086273", "revolver"], "764": ["n04090263", "rifle"], "765": ["n04099969", "rocking_chair"], "766": ["n04111531", "rotisserie"], "767": ["n04116512", "rubber_eraser"], "768": ["n04118538", "rugby_ball"], "769": ["n04118776", "rule"], "770": ["n04120489", "running_shoe"], "771": ["n04125021", "safe"], "772": ["n04127249", "safety_pin"], "773": ["n04131690", "saltshaker"], "774": ["n04133789", "sandal"], "775": ["n04136333", "sarong"], "776": ["n04141076", "sax"], "777": ["n04141327", "scabbard"], "778": ["n04141975", "scale"], "779": ["n04146614", "school_bus"], "780": ["n04147183", "schooner"], "781": ["n04149813", "scoreboard"], "782": ["n04152593", "screen"], "783": ["n04153751", "screw"], "784": ["n04154565", "screwdriver"], "785": ["n04162706", "seat_belt"], "786": ["n04179913", "sewing_machine"], "787": ["n04192698", "shield"], "788": ["n04200800", "shoe_shop"], "789": ["n04201297", "shoji"], "790": ["n04204238", "shopping_basket"], "791": ["n04204347", "shopping_cart"], "792": ["n04208210", "shovel"], "793": ["n04209133", "shower_cap"], "794": ["n04209239", "shower_curtain"], "795": ["n04228054", "ski"], "796": ["n04229816", "ski_mask"], "797": ["n04235860", "sleeping_bag"], "798": ["n04238763", "slide_rule"], "799": ["n04239074", "sliding_door"], "800": ["n04243546", "slot"], "801": ["n04251144", "snorkel"], "802": ["n04252077", "snowmobile"], "803": ["n04252225", "snowplow"], "804": ["n04254120", "soap_dispenser"], 
"805": ["n04254680", "soccer_ball"], "806": ["n04254777", "sock"], "807": ["n04258138", "solar_dish"], "808": ["n04259630", "sombrero"], "809": ["n04263257", "soup_bowl"], "810": ["n04264628", "space_bar"], "811": ["n04265275", "space_heater"], "812": ["n04266014", "space_shuttle"], "813": ["n04270147", "spatula"], "814": ["n04273569", "speedboat"], "815": ["n04275548", "spider_web"], "816": ["n04277352", "spindle"], "817": ["n04285008", "sports_car"], "818": ["n04286575", "spotlight"], "819": ["n04296562", "stage"], "820": ["n04310018", "steam_locomotive"], "821": ["n04311004", "steel_arch_bridge"], "822": ["n04311174", "steel_drum"], "823": ["n04317175", "stethoscope"], "824": ["n04325704", "stole"], "825": ["n04326547", "stone_wall"], "826": ["n04328186", "stopwatch"], "827": ["n04330267", "stove"], "828": ["n04332243", "strainer"], "829": ["n04335435", "streetcar"], "830": ["n04336792", "stretcher"], "831": ["n04344873", "studio_couch"], "832": ["n04346328", "stupa"], "833": ["n04347754", "submarine"], "834": ["n04350905", "suit"], "835": ["n04355338", "sundial"], "836": ["n04355933", "sunglass"], "837": ["n04356056", "sunglasses"], "838": ["n04357314", "sunscreen"], "839": ["n04366367", "suspension_bridge"], "840": ["n04367480", "swab"], "841": ["n04370456", "sweatshirt"], "842": ["n04371430", "swimming_trunks"], "843": ["n04371774", "swing"], "844": ["n04372370", "switch"], "845": ["n04376876", "syringe"], "846": ["n04380533", "table_lamp"], "847": ["n04389033", "tank"], "848": ["n04392985", "tape_player"], "849": ["n04398044", "teapot"], "850": ["n04399382", "teddy"], "851": ["n04404412", "television"], "852": ["n04409515", "tennis_ball"], "853": ["n04417672", "thatch"], "854": ["n04418357", "theater_curtain"], "855": ["n04423845", "thimble"], "856": ["n04428191", "thresher"], "857": ["n04429376", "throne"], "858": ["n04435653", "tile_roof"], "859": ["n04442312", "toaster"], "860": ["n04443257", "tobacco_shop"], "861": ["n04447861", "toilet_seat"], "862": ["n04456115", "torch"], "863": ["n04458633", "totem_pole"], "864": ["n04461696", "tow_truck"], "865": ["n04462240", "toyshop"], "866": ["n04465501", "tractor"], "867": ["n04467665", "trailer_truck"], "868": ["n04476259", "tray"], "869": ["n04479046", "trench_coat"], "870": ["n04482393", "tricycle"], "871": ["n04483307", "trimaran"], "872": ["n04485082", "tripod"], "873": ["n04486054", "triumphal_arch"], "874": ["n04487081", "trolleybus"], "875": ["n04487394", "trombone"], "876": ["n04493381", "tub"], "877": ["n04501370", "turnstile"], "878": ["n04505470", "typewriter_keyboard"], "879": ["n04507155", "umbrella"], "880": ["n04509417", "unicycle"], "881": ["n04515003", "upright"], "882": ["n04517823", "vacuum"], "883": ["n04522168", "vase"], "884": ["n04523525", "vault"], "885": ["n04525038", "velvet"], "886": ["n04525305", "vending_machine"], "887": ["n04532106", "vestment"], "888": ["n04532670", "viaduct"], "889": ["n04536866", "violin"], "890": ["n04540053", "volleyball"], "891": ["n04542943", "waffle_iron"], "892": ["n04548280", "wall_clock"], "893": ["n04548362", "wallet"], "894": ["n04550184", "wardrobe"], "895": ["n04552348", "warplane"], "896": ["n04553703", "washbasin"], "897": ["n04554684", "washer"], "898": ["n04557648", "water_bottle"], "899": ["n04560804", "water_jug"], "900": ["n04562935", "water_tower"], "901": ["n04579145", "whiskey_jug"], "902": ["n04579432", "whistle"], "903": ["n04584207", "wig"], "904": ["n04589890", "window_screen"], "905": ["n04590129", "window_shade"], "906": ["n04591157", "Windsor_tie"], "907": 
["n04591713", "wine_bottle"], "908": ["n04592741", "wing"], "909": ["n04596742", "wok"], "910": ["n04597913", "wooden_spoon"], "911": ["n04599235", "wool"], "912": ["n04604644", "worm_fence"], "913": ["n04606251", "wreck"], "914": ["n04612504", "yawl"], "915": ["n04613696", "yurt"], "916": ["n06359193", "web_site"], "917": ["n06596364", "comic_book"], "918": ["n06785654", "crossword_puzzle"], "919": ["n06794110", "street_sign"], "920": ["n06874185", "traffic_light"], "921": ["n07248320", "book_jacket"], "922": ["n07565083", "menu"], "923": ["n07579787", "plate"], "924": ["n07583066", "guacamole"], "925": ["n07584110", "consomme"], "926": ["n07590611", "hot_pot"], "927": ["n07613480", "trifle"], "928": ["n07614500", "ice_cream"], "929": ["n07615774", "ice_lolly"], "930": ["n07684084", "French_loaf"], "931": ["n07693725", "bagel"], "932": ["n07695742", "pretzel"], "933": ["n07697313", "cheeseburger"], "934": ["n07697537", "hotdog"], "935": ["n07711569", "mashed_potato"], "936": ["n07714571", "head_cabbage"], "937": ["n07714990", "broccoli"], "938": ["n07715103", "cauliflower"], "939": ["n07716358", "zucchini"], "940": ["n07716906", "spaghetti_squash"], "941": ["n07717410", "acorn_squash"], "942": ["n07717556", "butternut_squash"], "943": ["n07718472", "cucumber"], "944": ["n07718747", "artichoke"], "945": ["n07720875", "bell_pepper"], "946": ["n07730033", "cardoon"], "947": ["n07734744", "mushroom"], "948": ["n07742313", "Granny_Smith"], "949": ["n07745940", "strawberry"], "950": ["n07747607", "orange"], "951": ["n07749582", "lemon"], "952": ["n07753113", "fig"], "953": ["n07753275", "pineapple"], "954": ["n07753592", "banana"], "955": ["n07754684", "jackfruit"], "956": ["n07760859", "custard_apple"], "957": ["n07768694", "pomegranate"], "958": ["n07802026", "hay"], "959": ["n07831146", "carbonara"], "960": ["n07836838", "chocolate_sauce"], "961": ["n07860988", "dough"], "962": ["n07871810", "meat_loaf"], "963": ["n07873807", "pizza"], "964": ["n07875152", "potpie"], "965": ["n07880968", "burrito"], "966": ["n07892512", "red_wine"], "967": ["n07920052", "espresso"], "968": ["n07930864", "cup"], "969": ["n07932039", "eggnog"], "970": ["n09193705", "alp"], "971": ["n09229709", "bubble"], "972": ["n09246464", "cliff"], "973": ["n09256479", "coral_reef"], "974": ["n09288635", "geyser"], "975": ["n09332890", "lakeside"], "976": ["n09399592", "promontory"], "977": ["n09421951", "sandbar"], "978": ["n09428293", "seashore"], "979": ["n09468604", "valley"], "980": ["n09472597", "volcano"], "981": ["n09835506", "ballplayer"], "982": ["n10148035", "groom"], "983": ["n10565667", "scuba_diver"], "984": ["n11879895", "rapeseed"], "985": ["n11939491", "daisy"], "986": ["n12057211", "yellow_lady's_slipper"], "987": ["n12144580", "corn"], "988": ["n12267677", "acorn"], "989": ["n12620546", "hip"], "990": ["n12768682", "buckeye"], "991": ["n12985857", "coral_fungus"], "992": ["n12998815", "agaric"], "993": ["n13037406", "gyromitra"], "994": ["n13040303", "stinkhorn"], "995": ["n13044778", "earthstar"], "996": ["n13052670", "hen-of-the-woods"], "997": ["n13054560", "bolete"], "998": ["n13133613", "ear"], "999": ["n15075141", "toilet_tissue"]} \ No newline at end of file diff --git a/ISLP_labs_R4DS/Auto copy.csv b/ISLP_labs_R4DS/Auto copy.csv new file mode 100644 index 0000000..78a4921 --- /dev/null +++ b/ISLP_labs_R4DS/Auto copy.csv @@ -0,0 +1,393 @@ +mpg,cylinders,displacement,horsepower,weight,acceleration,year,origin,name +18.0,8,307.0,130,3504,12.0,70,1,chevrolet chevelle malibu 
+15.0,8,350.0,165,3693,11.5,70,1,buick skylark 320 +18.0,8,318.0,150,3436,11.0,70,1,plymouth satellite +16.0,8,304.0,150,3433,12.0,70,1,amc rebel sst +17.0,8,302.0,140,3449,10.5,70,1,ford torino +15.0,8,429.0,198,4341,10.0,70,1,ford galaxie 500 +14.0,8,454.0,220,4354,9.0,70,1,chevrolet impala +14.0,8,440.0,215,4312,8.5,70,1,plymouth fury iii +14.0,8,455.0,225,4425,10.0,70,1,pontiac catalina +15.0,8,390.0,190,3850,8.5,70,1,amc ambassador dpl +15.0,8,383.0,170,3563,10.0,70,1,dodge challenger se +14.0,8,340.0,160,3609,8.0,70,1,plymouth 'cuda 340 +15.0,8,400.0,150,3761,9.5,70,1,chevrolet monte carlo +14.0,8,455.0,225,3086,10.0,70,1,buick estate wagon (sw) +24.0,4,113.0,95,2372,15.0,70,3,toyota corona mark ii +22.0,6,198.0,95,2833,15.5,70,1,plymouth duster +18.0,6,199.0,97,2774,15.5,70,1,amc hornet +21.0,6,200.0,85,2587,16.0,70,1,ford maverick +27.0,4,97.0,88,2130,14.5,70,3,datsun pl510 +26.0,4,97.0,46,1835,20.5,70,2,volkswagen 1131 deluxe sedan +25.0,4,110.0,87,2672,17.5,70,2,peugeot 504 +24.0,4,107.0,90,2430,14.5,70,2,audi 100 ls +25.0,4,104.0,95,2375,17.5,70,2,saab 99e +26.0,4,121.0,113,2234,12.5,70,2,bmw 2002 +21.0,6,199.0,90,2648,15.0,70,1,amc gremlin +10.0,8,360.0,215,4615,14.0,70,1,ford f250 +10.0,8,307.0,200,4376,15.0,70,1,chevy c20 +11.0,8,318.0,210,4382,13.5,70,1,dodge d200 +9.0,8,304.0,193,4732,18.5,70,1,hi 1200d +27.0,4,97.0,88,2130,14.5,71,3,datsun pl510 +28.0,4,140.0,90,2264,15.5,71,1,chevrolet vega 2300 +25.0,4,113.0,95,2228,14.0,71,3,toyota corona +19.0,6,232.0,100,2634,13.0,71,1,amc gremlin +16.0,6,225.0,105,3439,15.5,71,1,plymouth satellite custom +17.0,6,250.0,100,3329,15.5,71,1,chevrolet chevelle malibu +19.0,6,250.0,88,3302,15.5,71,1,ford torino 500 +18.0,6,232.0,100,3288,15.5,71,1,amc matador +14.0,8,350.0,165,4209,12.0,71,1,chevrolet impala +14.0,8,400.0,175,4464,11.5,71,1,pontiac catalina brougham +14.0,8,351.0,153,4154,13.5,71,1,ford galaxie 500 +14.0,8,318.0,150,4096,13.0,71,1,plymouth fury iii +12.0,8,383.0,180,4955,11.5,71,1,dodge monaco (sw) +13.0,8,400.0,170,4746,12.0,71,1,ford country squire (sw) +13.0,8,400.0,175,5140,12.0,71,1,pontiac safari (sw) +18.0,6,258.0,110,2962,13.5,71,1,amc hornet sportabout (sw) +22.0,4,140.0,72,2408,19.0,71,1,chevrolet vega (sw) +19.0,6,250.0,100,3282,15.0,71,1,pontiac firebird +18.0,6,250.0,88,3139,14.5,71,1,ford mustang +23.0,4,122.0,86,2220,14.0,71,1,mercury capri 2000 +28.0,4,116.0,90,2123,14.0,71,2,opel 1900 +30.0,4,79.0,70,2074,19.5,71,2,peugeot 304 +30.0,4,88.0,76,2065,14.5,71,2,fiat 124b +31.0,4,71.0,65,1773,19.0,71,3,toyota corolla 1200 +35.0,4,72.0,69,1613,18.0,71,3,datsun 1200 +27.0,4,97.0,60,1834,19.0,71,2,volkswagen model 111 +26.0,4,91.0,70,1955,20.5,71,1,plymouth cricket +24.0,4,113.0,95,2278,15.5,72,3,toyota corona hardtop +25.0,4,97.5,80,2126,17.0,72,1,dodge colt hardtop +23.0,4,97.0,54,2254,23.5,72,2,volkswagen type 3 +20.0,4,140.0,90,2408,19.5,72,1,chevrolet vega +21.0,4,122.0,86,2226,16.5,72,1,ford pinto runabout +13.0,8,350.0,165,4274,12.0,72,1,chevrolet impala +14.0,8,400.0,175,4385,12.0,72,1,pontiac catalina +15.0,8,318.0,150,4135,13.5,72,1,plymouth fury iii +14.0,8,351.0,153,4129,13.0,72,1,ford galaxie 500 +17.0,8,304.0,150,3672,11.5,72,1,amc ambassador sst +11.0,8,429.0,208,4633,11.0,72,1,mercury marquis +13.0,8,350.0,155,4502,13.5,72,1,buick lesabre custom +12.0,8,350.0,160,4456,13.5,72,1,oldsmobile delta 88 royale +13.0,8,400.0,190,4422,12.5,72,1,chrysler newport royal +19.0,3,70.0,97,2330,13.5,72,3,mazda rx2 coupe +15.0,8,304.0,150,3892,12.5,72,1,amc matador (sw) +13.0,8,307.0,130,4098,14.0,72,1,chevrolet 
chevelle concours (sw) +13.0,8,302.0,140,4294,16.0,72,1,ford gran torino (sw) +14.0,8,318.0,150,4077,14.0,72,1,plymouth satellite custom (sw) +18.0,4,121.0,112,2933,14.5,72,2,volvo 145e (sw) +22.0,4,121.0,76,2511,18.0,72,2,volkswagen 411 (sw) +21.0,4,120.0,87,2979,19.5,72,2,peugeot 504 (sw) +26.0,4,96.0,69,2189,18.0,72,2,renault 12 (sw) +22.0,4,122.0,86,2395,16.0,72,1,ford pinto (sw) +28.0,4,97.0,92,2288,17.0,72,3,datsun 510 (sw) +23.0,4,120.0,97,2506,14.5,72,3,toyouta corona mark ii (sw) +28.0,4,98.0,80,2164,15.0,72,1,dodge colt (sw) +27.0,4,97.0,88,2100,16.5,72,3,toyota corolla 1600 (sw) +13.0,8,350.0,175,4100,13.0,73,1,buick century 350 +14.0,8,304.0,150,3672,11.5,73,1,amc matador +13.0,8,350.0,145,3988,13.0,73,1,chevrolet malibu +14.0,8,302.0,137,4042,14.5,73,1,ford gran torino +15.0,8,318.0,150,3777,12.5,73,1,dodge coronet custom +12.0,8,429.0,198,4952,11.5,73,1,mercury marquis brougham +13.0,8,400.0,150,4464,12.0,73,1,chevrolet caprice classic +13.0,8,351.0,158,4363,13.0,73,1,ford ltd +14.0,8,318.0,150,4237,14.5,73,1,plymouth fury gran sedan +13.0,8,440.0,215,4735,11.0,73,1,chrysler new yorker brougham +12.0,8,455.0,225,4951,11.0,73,1,buick electra 225 custom +13.0,8,360.0,175,3821,11.0,73,1,amc ambassador brougham +18.0,6,225.0,105,3121,16.5,73,1,plymouth valiant +16.0,6,250.0,100,3278,18.0,73,1,chevrolet nova custom +18.0,6,232.0,100,2945,16.0,73,1,amc hornet +18.0,6,250.0,88,3021,16.5,73,1,ford maverick +23.0,6,198.0,95,2904,16.0,73,1,plymouth duster +26.0,4,97.0,46,1950,21.0,73,2,volkswagen super beetle +11.0,8,400.0,150,4997,14.0,73,1,chevrolet impala +12.0,8,400.0,167,4906,12.5,73,1,ford country +13.0,8,360.0,170,4654,13.0,73,1,plymouth custom suburb +12.0,8,350.0,180,4499,12.5,73,1,oldsmobile vista cruiser +18.0,6,232.0,100,2789,15.0,73,1,amc gremlin +20.0,4,97.0,88,2279,19.0,73,3,toyota carina +21.0,4,140.0,72,2401,19.5,73,1,chevrolet vega +22.0,4,108.0,94,2379,16.5,73,3,datsun 610 +18.0,3,70.0,90,2124,13.5,73,3,maxda rx3 +19.0,4,122.0,85,2310,18.5,73,1,ford pinto +21.0,6,155.0,107,2472,14.0,73,1,mercury capri v6 +26.0,4,98.0,90,2265,15.5,73,2,fiat 124 sport coupe +15.0,8,350.0,145,4082,13.0,73,1,chevrolet monte carlo s +16.0,8,400.0,230,4278,9.5,73,1,pontiac grand prix +29.0,4,68.0,49,1867,19.5,73,2,fiat 128 +24.0,4,116.0,75,2158,15.5,73,2,opel manta +20.0,4,114.0,91,2582,14.0,73,2,audi 100ls +19.0,4,121.0,112,2868,15.5,73,2,volvo 144ea +15.0,8,318.0,150,3399,11.0,73,1,dodge dart custom +24.0,4,121.0,110,2660,14.0,73,2,saab 99le +20.0,6,156.0,122,2807,13.5,73,3,toyota mark ii +11.0,8,350.0,180,3664,11.0,73,1,oldsmobile omega +20.0,6,198.0,95,3102,16.5,74,1,plymouth duster +19.0,6,232.0,100,2901,16.0,74,1,amc hornet +15.0,6,250.0,100,3336,17.0,74,1,chevrolet nova +31.0,4,79.0,67,1950,19.0,74,3,datsun b210 +26.0,4,122.0,80,2451,16.5,74,1,ford pinto +32.0,4,71.0,65,1836,21.0,74,3,toyota corolla 1200 +25.0,4,140.0,75,2542,17.0,74,1,chevrolet vega +16.0,6,250.0,100,3781,17.0,74,1,chevrolet chevelle malibu classic +16.0,6,258.0,110,3632,18.0,74,1,amc matador +18.0,6,225.0,105,3613,16.5,74,1,plymouth satellite sebring +16.0,8,302.0,140,4141,14.0,74,1,ford gran torino +13.0,8,350.0,150,4699,14.5,74,1,buick century luxus (sw) +14.0,8,318.0,150,4457,13.5,74,1,dodge coronet custom (sw) +14.0,8,302.0,140,4638,16.0,74,1,ford gran torino (sw) +14.0,8,304.0,150,4257,15.5,74,1,amc matador (sw) +29.0,4,98.0,83,2219,16.5,74,2,audi fox +26.0,4,79.0,67,1963,15.5,74,2,volkswagen dasher +26.0,4,97.0,78,2300,14.5,74,2,opel manta +31.0,4,76.0,52,1649,16.5,74,3,toyota corona 
+32.0,4,83.0,61,2003,19.0,74,3,datsun 710 +28.0,4,90.0,75,2125,14.5,74,1,dodge colt +24.0,4,90.0,75,2108,15.5,74,2,fiat 128 +26.0,4,116.0,75,2246,14.0,74,2,fiat 124 tc +24.0,4,120.0,97,2489,15.0,74,3,honda civic +26.0,4,108.0,93,2391,15.5,74,3,subaru +31.0,4,79.0,67,2000,16.0,74,2,fiat x1.9 +19.0,6,225.0,95,3264,16.0,75,1,plymouth valiant custom +18.0,6,250.0,105,3459,16.0,75,1,chevrolet nova +15.0,6,250.0,72,3432,21.0,75,1,mercury monarch +15.0,6,250.0,72,3158,19.5,75,1,ford maverick +16.0,8,400.0,170,4668,11.5,75,1,pontiac catalina +15.0,8,350.0,145,4440,14.0,75,1,chevrolet bel air +16.0,8,318.0,150,4498,14.5,75,1,plymouth grand fury +14.0,8,351.0,148,4657,13.5,75,1,ford ltd +17.0,6,231.0,110,3907,21.0,75,1,buick century +16.0,6,250.0,105,3897,18.5,75,1,chevroelt chevelle malibu +15.0,6,258.0,110,3730,19.0,75,1,amc matador +18.0,6,225.0,95,3785,19.0,75,1,plymouth fury +21.0,6,231.0,110,3039,15.0,75,1,buick skyhawk +20.0,8,262.0,110,3221,13.5,75,1,chevrolet monza 2+2 +13.0,8,302.0,129,3169,12.0,75,1,ford mustang ii +29.0,4,97.0,75,2171,16.0,75,3,toyota corolla +23.0,4,140.0,83,2639,17.0,75,1,ford pinto +20.0,6,232.0,100,2914,16.0,75,1,amc gremlin +23.0,4,140.0,78,2592,18.5,75,1,pontiac astro +24.0,4,134.0,96,2702,13.5,75,3,toyota corona +25.0,4,90.0,71,2223,16.5,75,2,volkswagen dasher +24.0,4,119.0,97,2545,17.0,75,3,datsun 710 +18.0,6,171.0,97,2984,14.5,75,1,ford pinto +29.0,4,90.0,70,1937,14.0,75,2,volkswagen rabbit +19.0,6,232.0,90,3211,17.0,75,1,amc pacer +23.0,4,115.0,95,2694,15.0,75,2,audi 100ls +23.0,4,120.0,88,2957,17.0,75,2,peugeot 504 +22.0,4,121.0,98,2945,14.5,75,2,volvo 244dl +25.0,4,121.0,115,2671,13.5,75,2,saab 99le +33.0,4,91.0,53,1795,17.5,75,3,honda civic cvcc +28.0,4,107.0,86,2464,15.5,76,2,fiat 131 +25.0,4,116.0,81,2220,16.9,76,2,opel 1900 +25.0,4,140.0,92,2572,14.9,76,1,capri ii +26.0,4,98.0,79,2255,17.7,76,1,dodge colt +27.0,4,101.0,83,2202,15.3,76,2,renault 12tl +17.5,8,305.0,140,4215,13.0,76,1,chevrolet chevelle malibu classic +16.0,8,318.0,150,4190,13.0,76,1,dodge coronet brougham +15.5,8,304.0,120,3962,13.9,76,1,amc matador +14.5,8,351.0,152,4215,12.8,76,1,ford gran torino +22.0,6,225.0,100,3233,15.4,76,1,plymouth valiant +22.0,6,250.0,105,3353,14.5,76,1,chevrolet nova +24.0,6,200.0,81,3012,17.6,76,1,ford maverick +22.5,6,232.0,90,3085,17.6,76,1,amc hornet +29.0,4,85.0,52,2035,22.2,76,1,chevrolet chevette +24.5,4,98.0,60,2164,22.1,76,1,chevrolet woody +29.0,4,90.0,70,1937,14.2,76,2,vw rabbit +33.0,4,91.0,53,1795,17.4,76,3,honda civic +20.0,6,225.0,100,3651,17.7,76,1,dodge aspen se +18.0,6,250.0,78,3574,21.0,76,1,ford granada ghia +18.5,6,250.0,110,3645,16.2,76,1,pontiac ventura sj +17.5,6,258.0,95,3193,17.8,76,1,amc pacer d/l +29.5,4,97.0,71,1825,12.2,76,2,volkswagen rabbit +32.0,4,85.0,70,1990,17.0,76,3,datsun b-210 +28.0,4,97.0,75,2155,16.4,76,3,toyota corolla +26.5,4,140.0,72,2565,13.6,76,1,ford pinto +20.0,4,130.0,102,3150,15.7,76,2,volvo 245 +13.0,8,318.0,150,3940,13.2,76,1,plymouth volare premier v8 +19.0,4,120.0,88,3270,21.9,76,2,peugeot 504 +19.0,6,156.0,108,2930,15.5,76,3,toyota mark ii +16.5,6,168.0,120,3820,16.7,76,2,mercedes-benz 280s +16.5,8,350.0,180,4380,12.1,76,1,cadillac seville +13.0,8,350.0,145,4055,12.0,76,1,chevy c10 +13.0,8,302.0,130,3870,15.0,76,1,ford f108 +13.0,8,318.0,150,3755,14.0,76,1,dodge d100 +31.5,4,98.0,68,2045,18.5,77,3,honda accord cvcc +30.0,4,111.0,80,2155,14.8,77,1,buick opel isuzu deluxe +36.0,4,79.0,58,1825,18.6,77,2,renault 5 gtl +25.5,4,122.0,96,2300,15.5,77,1,plymouth arrow gs +33.5,4,85.0,70,1945,16.8,77,3,datsun f-10 
hatchback +17.5,8,305.0,145,3880,12.5,77,1,chevrolet caprice classic +17.0,8,260.0,110,4060,19.0,77,1,oldsmobile cutlass supreme +15.5,8,318.0,145,4140,13.7,77,1,dodge monaco brougham +15.0,8,302.0,130,4295,14.9,77,1,mercury cougar brougham +17.5,6,250.0,110,3520,16.4,77,1,chevrolet concours +20.5,6,231.0,105,3425,16.9,77,1,buick skylark +19.0,6,225.0,100,3630,17.7,77,1,plymouth volare custom +18.5,6,250.0,98,3525,19.0,77,1,ford granada +16.0,8,400.0,180,4220,11.1,77,1,pontiac grand prix lj +15.5,8,350.0,170,4165,11.4,77,1,chevrolet monte carlo landau +15.5,8,400.0,190,4325,12.2,77,1,chrysler cordoba +16.0,8,351.0,149,4335,14.5,77,1,ford thunderbird +29.0,4,97.0,78,1940,14.5,77,2,volkswagen rabbit custom +24.5,4,151.0,88,2740,16.0,77,1,pontiac sunbird coupe +26.0,4,97.0,75,2265,18.2,77,3,toyota corolla liftback +25.5,4,140.0,89,2755,15.8,77,1,ford mustang ii 2+2 +30.5,4,98.0,63,2051,17.0,77,1,chevrolet chevette +33.5,4,98.0,83,2075,15.9,77,1,dodge colt m/m +30.0,4,97.0,67,1985,16.4,77,3,subaru dl +30.5,4,97.0,78,2190,14.1,77,2,volkswagen dasher +22.0,6,146.0,97,2815,14.5,77,3,datsun 810 +21.5,4,121.0,110,2600,12.8,77,2,bmw 320i +21.5,3,80.0,110,2720,13.5,77,3,mazda rx-4 +43.1,4,90.0,48,1985,21.5,78,2,volkswagen rabbit custom diesel +36.1,4,98.0,66,1800,14.4,78,1,ford fiesta +32.8,4,78.0,52,1985,19.4,78,3,mazda glc deluxe +39.4,4,85.0,70,2070,18.6,78,3,datsun b210 gx +36.1,4,91.0,60,1800,16.4,78,3,honda civic cvcc +19.9,8,260.0,110,3365,15.5,78,1,oldsmobile cutlass salon brougham +19.4,8,318.0,140,3735,13.2,78,1,dodge diplomat +20.2,8,302.0,139,3570,12.8,78,1,mercury monarch ghia +19.2,6,231.0,105,3535,19.2,78,1,pontiac phoenix lj +20.5,6,200.0,95,3155,18.2,78,1,chevrolet malibu +20.2,6,200.0,85,2965,15.8,78,1,ford fairmont (auto) +25.1,4,140.0,88,2720,15.4,78,1,ford fairmont (man) +20.5,6,225.0,100,3430,17.2,78,1,plymouth volare +19.4,6,232.0,90,3210,17.2,78,1,amc concord +20.6,6,231.0,105,3380,15.8,78,1,buick century special +20.8,6,200.0,85,3070,16.7,78,1,mercury zephyr +18.6,6,225.0,110,3620,18.7,78,1,dodge aspen +18.1,6,258.0,120,3410,15.1,78,1,amc concord d/l +19.2,8,305.0,145,3425,13.2,78,1,chevrolet monte carlo landau +17.7,6,231.0,165,3445,13.4,78,1,buick regal sport coupe (turbo) +18.1,8,302.0,139,3205,11.2,78,1,ford futura +17.5,8,318.0,140,4080,13.7,78,1,dodge magnum xe +30.0,4,98.0,68,2155,16.5,78,1,chevrolet chevette +27.5,4,134.0,95,2560,14.2,78,3,toyota corona +27.2,4,119.0,97,2300,14.7,78,3,datsun 510 +30.9,4,105.0,75,2230,14.5,78,1,dodge omni +21.1,4,134.0,95,2515,14.8,78,3,toyota celica gt liftback +23.2,4,156.0,105,2745,16.7,78,1,plymouth sapporo +23.8,4,151.0,85,2855,17.6,78,1,oldsmobile starfire sx +23.9,4,119.0,97,2405,14.9,78,3,datsun 200-sx +20.3,5,131.0,103,2830,15.9,78,2,audi 5000 +17.0,6,163.0,125,3140,13.6,78,2,volvo 264gl +21.6,4,121.0,115,2795,15.7,78,2,saab 99gle +16.2,6,163.0,133,3410,15.8,78,2,peugeot 604sl +31.5,4,89.0,71,1990,14.9,78,2,volkswagen scirocco +29.5,4,98.0,68,2135,16.6,78,3,honda accord lx +21.5,6,231.0,115,3245,15.4,79,1,pontiac lemans v6 +19.8,6,200.0,85,2990,18.2,79,1,mercury zephyr 6 +22.3,4,140.0,88,2890,17.3,79,1,ford fairmont 4 +20.2,6,232.0,90,3265,18.2,79,1,amc concord dl 6 +20.6,6,225.0,110,3360,16.6,79,1,dodge aspen 6 +17.0,8,305.0,130,3840,15.4,79,1,chevrolet caprice classic +17.6,8,302.0,129,3725,13.4,79,1,ford ltd landau +16.5,8,351.0,138,3955,13.2,79,1,mercury grand marquis +18.2,8,318.0,135,3830,15.2,79,1,dodge st. 
regis +16.9,8,350.0,155,4360,14.9,79,1,buick estate wagon (sw) +15.5,8,351.0,142,4054,14.3,79,1,ford country squire (sw) +19.2,8,267.0,125,3605,15.0,79,1,chevrolet malibu classic (sw) +18.5,8,360.0,150,3940,13.0,79,1,chrysler lebaron town @ country (sw) +31.9,4,89.0,71,1925,14.0,79,2,vw rabbit custom +34.1,4,86.0,65,1975,15.2,79,3,maxda glc deluxe +35.7,4,98.0,80,1915,14.4,79,1,dodge colt hatchback custom +27.4,4,121.0,80,2670,15.0,79,1,amc spirit dl +25.4,5,183.0,77,3530,20.1,79,2,mercedes benz 300d +23.0,8,350.0,125,3900,17.4,79,1,cadillac eldorado +27.2,4,141.0,71,3190,24.8,79,2,peugeot 504 +23.9,8,260.0,90,3420,22.2,79,1,oldsmobile cutlass salon brougham +34.2,4,105.0,70,2200,13.2,79,1,plymouth horizon +34.5,4,105.0,70,2150,14.9,79,1,plymouth horizon tc3 +31.8,4,85.0,65,2020,19.2,79,3,datsun 210 +37.3,4,91.0,69,2130,14.7,79,2,fiat strada custom +28.4,4,151.0,90,2670,16.0,79,1,buick skylark limited +28.8,6,173.0,115,2595,11.3,79,1,chevrolet citation +26.8,6,173.0,115,2700,12.9,79,1,oldsmobile omega brougham +33.5,4,151.0,90,2556,13.2,79,1,pontiac phoenix +41.5,4,98.0,76,2144,14.7,80,2,vw rabbit +38.1,4,89.0,60,1968,18.8,80,3,toyota corolla tercel +32.1,4,98.0,70,2120,15.5,80,1,chevrolet chevette +37.2,4,86.0,65,2019,16.4,80,3,datsun 310 +28.0,4,151.0,90,2678,16.5,80,1,chevrolet citation +26.4,4,140.0,88,2870,18.1,80,1,ford fairmont +24.3,4,151.0,90,3003,20.1,80,1,amc concord +19.1,6,225.0,90,3381,18.7,80,1,dodge aspen +34.3,4,97.0,78,2188,15.8,80,2,audi 4000 +29.8,4,134.0,90,2711,15.5,80,3,toyota corona liftback +31.3,4,120.0,75,2542,17.5,80,3,mazda 626 +37.0,4,119.0,92,2434,15.0,80,3,datsun 510 hatchback +32.2,4,108.0,75,2265,15.2,80,3,toyota corolla +46.6,4,86.0,65,2110,17.9,80,3,mazda glc +27.9,4,156.0,105,2800,14.4,80,1,dodge colt +40.8,4,85.0,65,2110,19.2,80,3,datsun 210 +44.3,4,90.0,48,2085,21.7,80,2,vw rabbit c (diesel) +43.4,4,90.0,48,2335,23.7,80,2,vw dasher (diesel) +36.4,5,121.0,67,2950,19.9,80,2,audi 5000s (diesel) +30.0,4,146.0,67,3250,21.8,80,2,mercedes-benz 240d +44.6,4,91.0,67,1850,13.8,80,3,honda civic 1500 gl +33.8,4,97.0,67,2145,18.0,80,3,subaru dl +29.8,4,89.0,62,1845,15.3,80,2,vokswagen rabbit +32.7,6,168.0,132,2910,11.4,80,3,datsun 280-zx +23.7,3,70.0,100,2420,12.5,80,3,mazda rx-7 gs +35.0,4,122.0,88,2500,15.1,80,2,triumph tr7 coupe +32.4,4,107.0,72,2290,17.0,80,3,honda accord +27.2,4,135.0,84,2490,15.7,81,1,plymouth reliant +26.6,4,151.0,84,2635,16.4,81,1,buick skylark +25.8,4,156.0,92,2620,14.4,81,1,dodge aries wagon (sw) +23.5,6,173.0,110,2725,12.6,81,1,chevrolet citation +30.0,4,135.0,84,2385,12.9,81,1,plymouth reliant +39.1,4,79.0,58,1755,16.9,81,3,toyota starlet +39.0,4,86.0,64,1875,16.4,81,1,plymouth champ +35.1,4,81.0,60,1760,16.1,81,3,honda civic 1300 +32.3,4,97.0,67,2065,17.8,81,3,subaru +37.0,4,85.0,65,1975,19.4,81,3,datsun 210 mpg +37.7,4,89.0,62,2050,17.3,81,3,toyota tercel +34.1,4,91.0,68,1985,16.0,81,3,mazda glc 4 +34.7,4,105.0,63,2215,14.9,81,1,plymouth horizon 4 +34.4,4,98.0,65,2045,16.2,81,1,ford escort 4w +29.9,4,98.0,65,2380,20.7,81,1,ford escort 2h +33.0,4,105.0,74,2190,14.2,81,2,volkswagen jetta +33.7,4,107.0,75,2210,14.4,81,3,honda prelude +32.4,4,108.0,75,2350,16.8,81,3,toyota corolla +32.9,4,119.0,100,2615,14.8,81,3,datsun 200sx +31.6,4,120.0,74,2635,18.3,81,3,mazda 626 +28.1,4,141.0,80,3230,20.4,81,2,peugeot 505s turbo diesel +30.7,6,145.0,76,3160,19.6,81,2,volvo diesel +25.4,6,168.0,116,2900,12.6,81,3,toyota cressida +24.2,6,146.0,120,2930,13.8,81,3,datsun 810 maxima +22.4,6,231.0,110,3415,15.8,81,1,buick century 
+26.6,8,350.0,105,3725,19.0,81,1,oldsmobile cutlass ls +20.2,6,200.0,88,3060,17.1,81,1,ford granada gl +17.6,6,225.0,85,3465,16.6,81,1,chrysler lebaron salon +28.0,4,112.0,88,2605,19.6,82,1,chevrolet cavalier +27.0,4,112.0,88,2640,18.6,82,1,chevrolet cavalier wagon +34.0,4,112.0,88,2395,18.0,82,1,chevrolet cavalier 2-door +31.0,4,112.0,85,2575,16.2,82,1,pontiac j2000 se hatchback +29.0,4,135.0,84,2525,16.0,82,1,dodge aries se +27.0,4,151.0,90,2735,18.0,82,1,pontiac phoenix +24.0,4,140.0,92,2865,16.4,82,1,ford fairmont futura +36.0,4,105.0,74,1980,15.3,82,2,volkswagen rabbit l +37.0,4,91.0,68,2025,18.2,82,3,mazda glc custom l +31.0,4,91.0,68,1970,17.6,82,3,mazda glc custom +38.0,4,105.0,63,2125,14.7,82,1,plymouth horizon miser +36.0,4,98.0,70,2125,17.3,82,1,mercury lynx l +36.0,4,120.0,88,2160,14.5,82,3,nissan stanza xe +36.0,4,107.0,75,2205,14.5,82,3,honda accord +34.0,4,108.0,70,2245,16.9,82,3,toyota corolla +38.0,4,91.0,67,1965,15.0,82,3,honda civic +32.0,4,91.0,67,1965,15.7,82,3,honda civic (auto) +38.0,4,91.0,67,1995,16.2,82,3,datsun 310 gx +25.0,6,181.0,110,2945,16.4,82,1,buick century limited +38.0,6,262.0,85,3015,17.0,82,1,oldsmobile cutlass ciera (diesel) +26.0,4,156.0,92,2585,14.5,82,1,chrysler lebaron medallion +22.0,6,232.0,112,2835,14.7,82,1,ford granada l +32.0,4,144.0,96,2665,13.9,82,3,toyota celica gt +36.0,4,135.0,84,2370,13.0,82,1,dodge charger 2.2 +27.0,4,151.0,90,2950,17.3,82,1,chevrolet camaro +27.0,4,140.0,86,2790,15.6,82,1,ford mustang gl +44.0,4,97.0,52,2130,24.6,82,2,vw pickup +32.0,4,135.0,84,2295,11.6,82,1,dodge rampage +28.0,4,120.0,79,2625,18.6,82,1,ford ranger +31.0,4,119.0,82,2720,19.4,82,1,chevy s-10 diff --git a/ISLP_labs_R4DS/Ch02-statlearn-lab copy.Rmd b/ISLP_labs_R4DS/Ch02-statlearn-lab copy.Rmd new file mode 100644 index 0000000..65fb279 --- /dev/null +++ b/ISLP_labs_R4DS/Ch02-statlearn-lab copy.Rmd @@ -0,0 +1,1402 @@ +--- +jupyter: + jupytext: + cell_metadata_filter: -all + formats: Rmd,ipynb + text_representation: + extension: .Rmd + format_name: rmarkdown + format_version: '1.2' + jupytext_version: 1.14.7 +--- + + +# Chapter 2 + +# Lab: Introduction to Python + + + + +## Getting Started + + +To run the labs in this book, you will need two things: + +* An installation of `Python3`, which is the specific version of `Python` used in the labs. +* Access to `Jupyter`, a very popular `Python` interface that runs code through a file called a *notebook*. + + +You can download and install `Python3` by following the instructions available at [anaconda.com](http://anaconda.com). + + + There are a number of ways to get access to `Jupyter`. Here are just a few: + + * Using Google's `Colaboratory` service: [colab.research.google.com/](https://colab.research.google.com/). + * Using `JupyterHub`, available at [jupyter.org/hub](https://jupyter.org/hub). + * Using your own `jupyter` installation. Installation instructions are available at [jupyter.org/install](https://jupyter.org/install). + +Please see the `Python` resources page on the book website [statlearning.com](https://www.statlearning.com) for up-to-date information about getting `Python` and `Jupyter` working on your computer. + +You will need to install the `ISLP` package, which provides access to the datasets and custom-built functions that we provide. +Inside a macOS or Linux terminal type `pip install ISLP`; this also installs most other packages needed in the labs. The `Python` resources page has a link to the `ISLP` documentation website. 
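Once `pip install ISLP` has completed, it is worth confirming that `Python` can actually find the package before starting the lab. The following check is a minimal sketch; any import error here means the installation did not succeed.

```{python}
# Sanity check that the ISLP package is importable
import ISLP
print(ISLP.__name__)
```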
+ +To run this lab, download the file `Ch2-statlearn-lab.ipynb` from the `Python` resources page. +Now run the following code at the command line: `jupyter lab Ch2-statlearn-lab.ipynb`. + +If you're using Windows, you can use the `start menu` to access `anaconda`, and follow the links. For example, to install `ISLP` and run this lab, you can run the same code above in an `anaconda` shell. + + + +## Basic Commands + + + +In this lab, we will introduce some simple `Python` commands. + For more resources about `Python` in general, readers may want to consult the tutorial at [docs.python.org/3/tutorial/](https://docs.python.org/3/tutorial/). + + + + + + +Like most programming languages, `Python` uses *functions* +to perform operations. To run a +function called `fun`, we type +`fun(input1,input2)`, where the inputs (or *arguments*) +`input1` and `input2` tell +`Python` how to run the function. A function can have any number of +inputs. For example, the +`print()` function outputs a text representation of all of its arguments to the console. + +```{python} +print('fit a model with', 11, 'variables') + +``` + + The following command will provide information about the `print()` function. + +```{python} +# print? + +``` + +Adding two integers in `Python` is pretty intuitive. + +```{python} +3 + 5 + +``` + +In `Python`, textual data is handled using +*strings*. For instance, `"hello"` and +`'hello'` +are strings. +We can concatenate them using the addition `+` symbol. + +```{python} +"hello" + " " + "world" + +``` + + A string is actually a type of *sequence*: this is a generic term for an ordered list. + The three most important types of sequences are lists, tuples, and strings. +We introduce lists now. + + +The following command instructs `Python` to join together +the numbers 3, 4, and 5, and to save them as a +*list* named `x`. When we +type `x`, it gives us back the list. + +```{python} +x = [3, 4, 5] +x + +``` + +Note that we used the brackets +`[]` to construct this list. + +We will often want to add two sets of numbers together. It is reasonable to try the following code, +though it will not produce the desired results. + +```{python} +y = [4, 9, 7] +x + y + +``` + +The result may appear slightly counterintuitive: why did `Python` not add the entries of the lists +element-by-element? + In `Python`, lists hold *arbitrary* objects, and are added using *concatenation*. + In fact, concatenation is the behavior that we saw earlier when we entered `"hello" + " " + "world"`. + + + +This example reflects the fact that + `Python` is a general-purpose programming language. Much of `Python`'s data-specific +functionality comes from other packages, notably `numpy` +and `pandas`. +In the next section, we will introduce the `numpy` package. +See [docs.scipy.org/doc/numpy/user/quickstart.html](https://docs.scipy.org/doc/numpy/user/quickstart.html) for more information about `numpy`. + + + +## Introduction to Numerical Python + +As mentioned earlier, this book makes use of functionality that is contained in the `numpy` + *library*, or *package*. A package is a collection of modules that are not necessarily included in + the base `Python` distribution. The name `numpy` is an abbreviation for *numerical Python*. + + + To access `numpy`, we must first `import` it. + +```{python} +import numpy as np +``` +In the previous line, we named the `numpy` *module* `np`; an abbreviation for easier referencing. + + +In `numpy`, an *array* is a generic term for a multidimensional +set of numbers. 
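Before turning to arrays, note that adding the plain lists from the previous section element by element would require an explicit comprehension or loop. The following aside is a quick sketch using the same `x` and `y` as before:

```{python}
# Element-wise addition of two plain lists, without numpy
x = [3, 4, 5]
y = [4, 9, 7]
[a + b for a, b in zip(x, y)]
```

As we are about to see, `numpy` arrays make this element-wise behavior the default.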
+We use the `np.array()` function to define `x` and `y`, which are one-dimensional arrays, i.e. vectors. + +```{python} +x = np.array([3, 4, 5]) +y = np.array([4, 9, 7]) +``` +Note that if you forgot to run the `import numpy as np` command earlier, then +you will encounter an error in calling the `np.array()` function in the previous line. + The syntax `np.array()` indicates that the function being called +is part of the `numpy` package, which we have abbreviated as `np`. + + +Since `x` and `y` have been defined using `np.array()`, we get a sensible result when we add them together. Compare this to our results in the previous section, + when we tried to add two lists without using `numpy`. + +```{python} +x + y +``` + + + + + +In `numpy`, matrices are typically represented as two-dimensional arrays, and vectors as one-dimensional arrays. {While it is also possible to create matrices using `np.matrix()`, we will use `np.array()` throughout the labs in this book.} +We can create a two-dimensional array as follows. + +```{python} +x = np.array([[1, 2], [3, 4]]) +x +``` + + + + + +The object `x` has several +*attributes*, or associated objects. To access an attribute of `x`, we type `x.attribute`, where we replace `attribute` +with the name of the attribute. +For instance, we can access the `ndim` attribute of `x` as follows. + +```{python} +x.ndim +``` + +The output indicates that `x` is a two-dimensional array. +Similarly, `x.dtype` is the *data type* attribute of the object `x`. This indicates that `x` is +comprised of 64-bit integers: + +```{python} +x.dtype +``` +Why is `x` comprised of integers? This is because we created `x` by passing in exclusively integers to the `np.array()` function. + If +we had passed in any decimals, then we would have obtained an array of +*floating point numbers* (i.e. real-valued numbers). + +```{python} +np.array([[1, 2], [3.0, 4]]).dtype + +``` + + +Typing `fun?` will cause `Python` to display +documentation associated with the function `fun`, if it exists. +We can try this for `np.array()`. + +```{python} +# np.array? + +``` +This documentation indicates that we could create a floating point array by passing a `dtype` argument into `np.array()`. + +```{python} +np.array([[1, 2], [3, 4]], float).dtype + +``` + + +The array `x` is two-dimensional. We can find out the number of rows and columns by looking +at its `shape` attribute. + +```{python} +x.shape + +``` + + +A *method* is a function that is associated with an +object. +For instance, given an array `x`, the expression +`x.sum()` sums all of its elements, using the `sum()` +method for arrays. +The call `x.sum()` automatically provides `x` as the +first argument to its `sum()` method. + +```{python} +x = np.array([1, 2, 3, 4]) +x.sum() +``` +We could also sum the elements of `x` by passing in `x` as an argument to the `np.sum()` function. + +```{python} +x = np.array([1, 2, 3, 4]) +np.sum(x) +``` + As another example, the +`reshape()` method returns a new array with the same elements as +`x`, but a different shape. + We do this by passing in a `tuple` in our call to + `reshape()`, in this case `(2, 3)`. This tuple specifies that we would like to create a two-dimensional array with +$2$ rows and $3$ columns. {Like lists, tuples represent a sequence of objects. Why do we need more than one way to create a sequence? 
There are a few differences between tuples and lists, but perhaps the most important is that elements of a tuple cannot be modified, whereas elements of a list can be.} + +In what follows, the +`\n` character creates a *new line*. + +```{python} +x = np.array([1, 2, 3, 4, 5, 6]) +print('beginning x:\n', x) +x_reshape = x.reshape((2, 3)) +print('reshaped x:\n', x_reshape) + +``` + +The previous output reveals that `numpy` arrays are specified as a sequence +of *rows*. This is called *row-major ordering*, as opposed to *column-major ordering*. + + +`Python` (and hence `numpy`) uses 0-based +indexing. This means that to access the top left element of `x_reshape`, +we type in `x_reshape[0,0]`. + +```{python} +x_reshape[0, 0] +``` +Similarly, `x_reshape[1,2]` yields the element in the second row and the third column +of `x_reshape`. + +```{python} +x_reshape[1, 2] +``` +Similarly, `x[2]` yields the +third entry of `x`. + +Now, let's modify the top left element of `x_reshape`. To our surprise, we discover that the first element of `x` has been modified as well! + + + +```{python} +print('x before we modify x_reshape:\n', x) +print('x_reshape before we modify x_reshape:\n', x_reshape) +x_reshape[0, 0] = 5 +print('x_reshape after we modify its top left element:\n', x_reshape) +print('x after we modify top left element of x_reshape:\n', x) + +``` + +Modifying `x_reshape` also modified `x` because the two objects occupy the same space in memory. + + + + + +We just saw that we can modify an element of an array. Can we also modify a tuple? It turns out that we cannot --- and trying to do so introduces +an *exception*, or error. + +```{python} +my_tuple = (3, 4, 5) +my_tuple[0] = 2 + +``` + + +We now briefly mention some attributes of arrays that will come in handy. An array's `shape` attribute contains its dimension; this is always a tuple. +The `ndim` attribute yields the number of dimensions, and `T` provides its transpose. + +```{python} +x_reshape.shape, x_reshape.ndim, x_reshape.T + +``` + +Notice that the three individual outputs `(2,3)`, `2`, and `array([[5, 4],[2, 5], [3,6]])` are themselves output as a tuple. + +We will often want to apply functions to arrays. +For instance, we can compute the +square root of the entries using the `np.sqrt()` function: + +```{python} +np.sqrt(x) + +``` + +We can also square the elements: + +```{python} +x**2 + +``` + +We can compute the square roots using the same notation, raising to the power of $1/2$ instead of 2. + +```{python} +x**0.5 + +``` + + +Throughout this book, we will often want to generate random data. +The `np.random.normal()` function generates a vector of random +normal variables. We can learn more about this function by looking at the help page, via a call to `np.random.normal?`. +The first line of the help page reads `normal(loc=0.0, scale=1.0, size=None)`. + This *signature* line tells us that the function's arguments are `loc`, `scale`, and `size`. These are *keyword* arguments, which means that when they are passed into + the function, they can be referred to by name (in any order). {`Python` also uses *positional* arguments. Positional arguments do not need to use a keyword. To see an example, type in `np.sum?`. We see that `a` is a positional argument, i.e. this function assumes that the first unnamed argument that it receives is the array to be summed. 
By contrast, `axis` and `dtype` are keyword arguments: the position in which these arguments are entered into `np.sum()` does not matter.} + By default, this function will generate random normal variable(s) with mean (`loc`) $0$ and standard deviation (`scale`) $1$; furthermore, + a single random variable will be generated unless the argument to `size` is changed. + +We now generate 50 independent random variables from a $N(0,1)$ distribution. + +```{python} +x = np.random.normal(size=50) +x + +``` + +We create an array `y` by adding an independent $N(50,1)$ random variable to each element of `x`. + +```{python} +y = x + np.random.normal(loc=50, scale=1, size=50) +``` +The `np.corrcoef()` function computes the correlation matrix between `x` and `y`. The off-diagonal elements give the +correlation between `x` and `y`. + +```{python} +np.corrcoef(x, y) +``` + +If you're following along in your own `Jupyter` notebook, then you probably noticed that you got a different set of results when you ran the past few +commands. In particular, + each +time we call `np.random.normal()`, we will get a different answer, as shown in the following example. + +```{python} +print(np.random.normal(scale=5, size=2)) +print(np.random.normal(scale=5, size=2)) + +``` + + + +In order to ensure that our code provides exactly the same results +each time it is run, we can set a *random seed* +using the +`np.random.default_rng()` function. +This function takes an arbitrary, user-specified integer argument. If we set a random seed before +generating random data, then re-running our code will yield the same results. The +object `rng` has essentially all the random number generating methods found in `np.random`. Hence, to +generate normal data we use `rng.normal()`. + +```{python} +rng = np.random.default_rng(1303) +print(rng.normal(scale=5, size=2)) +rng2 = np.random.default_rng(1303) +print(rng2.normal(scale=5, size=2)) +``` + +Throughout the labs in this book, we use `np.random.default_rng()` whenever we +perform calculations involving random quantities within `numpy`. In principle, this +should enable the reader to exactly reproduce the stated results. However, as new versions of `numpy` become available, it is possible +that some small discrepancies may occur between the output +in the labs and the output +from `numpy`. + +The `np.mean()`, `np.var()`, and `np.std()` functions can be used +to compute the mean, variance, and standard deviation of arrays. These functions are also +available as methods on the arrays. + +```{python} +rng = np.random.default_rng(3) +y = rng.standard_normal(10) +np.mean(y), y.mean() +``` + + + +```{python} +np.var(y), y.var(), np.mean((y - y.mean())**2) +``` + + +Notice that by default `np.var()` divides by the sample size $n$ rather +than $n-1$; see the `ddof` argument in `np.var?`. + + +```{python} +np.sqrt(np.var(y)), np.std(y) +``` + +The `np.mean()`, `np.var()`, and `np.std()` functions can also be applied to the rows and columns of a matrix. +To see this, we construct a $10 \times 3$ matrix of $N(0,1)$ random variables, and consider computing its row sums. + +```{python} +X = rng.standard_normal((10, 3)) +X +``` + +Since arrays are row-major ordered, the first axis, i.e. `axis=0`, refers to its rows. We pass this argument into the `mean()` method for the object `X`. + +```{python} +X.mean(axis=0) +``` + +The following yields the same result. + +```{python} +X.mean(0) +``` + + + +## Graphics +In `Python`, common practice is to use the library +`matplotlib` for graphics. 
+However, since `Python` was not written with data analysis in mind, + the notion of plotting is not intrinsic to the language. +We will use the `subplots()` function +from `matplotlib.pyplot` to create a figure and the +axes onto which we plot our data. +For many more examples of how to make plots in `Python`, +readers are encouraged to visit [matplotlib.org/stable/gallery/](https://matplotlib.org/stable/gallery/index.html). + +In `matplotlib`, a plot consists of a *figure* and one or more *axes*. You can think of the figure as the blank canvas upon which +one or more plots will be displayed: it is the entire plotting window. +The *axes* contain important information about each plot, such as its $x$- and $y$-axis labels, +title, and more. (Note that in `matplotlib`, the word *axes* is not the plural of *axis*: a plot's *axes* contains much more information +than just the $x$-axis and the $y$-axis.) + +We begin by importing the `subplots()` function +from `matplotlib`. We use this function +throughout when creating figures. +The function returns a tuple of length two: a figure +object as well as the relevant axes object. We will typically +pass `figsize` as a keyword argument. +Having created our axes, we attempt our first plot using its `plot()` method. +To learn more about it, +type `ax.plot?`. + +```{python} +from matplotlib.pyplot import subplots +fig, ax = subplots(figsize=(8, 8)) +x = rng.standard_normal(100) +y = rng.standard_normal(100) +ax.plot(x, y); + +``` + +We pause here to note that we have *unpacked* the tuple of length two returned by `subplots()` into the two distinct +variables `fig` and `ax`. Unpacking +is typically preferred to the following equivalent but slightly more verbose code: + +```{python} +output = subplots(figsize=(8, 8)) +fig = output[0] +ax = output[1] +``` + +We see that our earlier cell produced a line plot, which is the default. To create a scatterplot, we provide an additional argument to `ax.plot()`, indicating that circles should be displayed. + +```{python} +fig, ax = subplots(figsize=(8, 8)) +ax.plot(x, y, 'o'); +``` +Different values +of this additional argument can be used to produce different colored lines +as well as different linestyles. + + + +As an alternative, we could use the `ax.scatter()` function to create a scatterplot. + +```{python} +fig, ax = subplots(figsize=(8, 8)) +ax.scatter(x, y, marker='o'); +``` + +Notice that in the code blocks above, we have ended +the last line with a semicolon. This prevents `ax.plot(x, y)` from printing +text to the notebook. However, it does not prevent a plot from being produced. + If we omit the trailing semi-colon, then we obtain the following output: + +```{python} +fig, ax = subplots(figsize=(8, 8)) +ax.scatter(x, y, marker='o') + +``` +In what follows, we will use + trailing semicolons whenever the text that would be output is not +germane to the discussion at hand. + + + + + + +To label our plot, we make use of the `set_xlabel()`, `set_ylabel()`, and `set_title()` methods +of `ax`. + + +```{python} +fig, ax = subplots(figsize=(8, 8)) +ax.scatter(x, y, marker='o') +ax.set_xlabel("this is the x-axis") +ax.set_ylabel("this is the y-axis") +ax.set_title("Plot of X vs Y"); +``` + + Having access to the figure object `fig` itself means that we can go in and change some aspects and then redisplay it. Here, we change + the size from `(8, 8)` to `(12, 3)`. + + +```{python} +fig.set_size_inches(12,3) +fig +``` + + + +Occasionally we will want to create several plots within a figure. 
This can be +achieved by passing additional arguments to `subplots()`. +Below, we create a $2 \times 3$ grid of plots +in a figure of size determined by the `figsize` argument. In such +situations, there is often a relationship between the axes in the plots. For example, +all plots may have a common $x$-axis. The `subplots()` function can automatically handle +this situation when passed the keyword argument `sharex=True`. +The `axes` object below is an array pointing to different plots in the figure. + +```{python} +fig, axes = subplots(nrows=2, + ncols=3, + figsize=(15, 5)) +``` +We now produce a scatter plot with `'o'` in the second column of the first row and +a scatter plot with `'+'` in the third column of the second row. + +```{python} +axes[0,1].plot(x, y, 'o') +axes[1,2].scatter(x, y, marker='+') +fig +``` +Type `subplots?` to learn more about +`subplots()`. + + + + + +To save the output of `fig`, we call its `savefig()` +method. The argument `dpi` is the dots per inch, used +to determine how large the figure will be in pixels. + +```{python} +fig.savefig("Figure.png", dpi=400) +fig.savefig("Figure.pdf", dpi=200); + +``` + + +We can continue to modify `fig` using step-by-step updates; for example, we can modify the range of the $x$-axis, re-save the figure, and even re-display it. + +```{python} +axes[0,1].set_xlim([-1,1]) +fig.savefig("Figure_updated.jpg") +fig +``` + +We now create some more sophisticated plots. The +`ax.contour()` method produces a *contour plot* +in order to represent three-dimensional data, similar to a +topographical map. It takes three arguments: + +* A vector of `x` values (the first dimension), +* A vector of `y` values (the second dimension), and +* A matrix whose elements correspond to the `z` value (the third +dimension) for each pair of `(x,y)` coordinates. + +To create `x` and `y`, we’ll use the command `np.linspace(a, b, n)`, +which returns a vector of `n` numbers starting at `a` and ending at `b`. + +```{python} +fig, ax = subplots(figsize=(8, 8)) +x = np.linspace(-np.pi, np.pi, 50) +y = x +f = np.multiply.outer(np.cos(y), 1 / (1 + x**2)) +ax.contour(x, y, f); + +``` +We can increase the resolution by adding more levels to the image. + +```{python} +fig, ax = subplots(figsize=(8, 8)) +ax.contour(x, y, f, levels=45); +``` +To fine-tune the output of the +`ax.contour()` function, take a +look at the help file by typing `?plt.contour`. + +The `ax.imshow()` method is similar to +`ax.contour()`, except that it produces a color-coded plot +whose colors depend on the `z` value. This is known as a +*heatmap*, and is sometimes used to plot temperature in +weather forecasts. + +```{python} +fig, ax = subplots(figsize=(8, 8)) +ax.imshow(f); + +``` + + +## Sequences and Slice Notation + + +As seen above, the +function `np.linspace()` can be used to create a sequence +of numbers. + +```{python} +seq1 = np.linspace(0, 10, 11) +seq1 + +``` + + +The function `np.arange()` + returns a sequence of numbers spaced out by `step`. If `step` is not specified, then a default value of $1$ is used. Let's create a sequence + that starts at $0$ and ends at $10$. + +```{python} +seq2 = np.arange(0, 10) +seq2 + +``` + +Why isn't $10$ output above? This has to do with *slice* notation in `Python`. +Slice notation +is used to index sequences such as lists, tuples and arrays. +Suppose we want to retrieve the fourth through sixth (inclusive) entries +of a string. We obtain a slice of the string using the indexing notation `[3:6]`. 
+ +```{python} +"hello world"[3:6] +``` +In the code block above, the notation `3:6` is shorthand for `slice(3,6)` when used inside +`[]`. + +```{python} +"hello world"[slice(3,6)] + +``` + +You might have expected `slice(3,6)` to output the fourth through seventh characters in the text string (recalling that `Python` begins its indexing at zero), but instead it output the fourth through sixth. + This also explains why the earlier `np.arange(0, 10)` command output only the integers from $0$ to $9$. +See the documentation `slice?` for useful options in creating slices. + + + + + + + + + + + + + + + + + + + + + + + +## Indexing Data +To begin, we create a two-dimensional `numpy` array. + +```{python} +A = np.array(np.arange(16)).reshape((4, 4)) +A + +``` + +Typing `A[1,2]` retrieves the element corresponding to the second row and third +column. (As usual, `Python` indexes from $0.$) + +```{python} +A[1,2] + +``` + +The first number after the open-bracket symbol `[` + refers to the row, and the second number refers to the column. + +### Indexing Rows, Columns, and Submatrices + To select multiple rows at a time, we can pass in a list + specifying our selection. For instance, `[1,3]` will retrieve the second and fourth rows: + +```{python} +A[[1,3]] + +``` + +To select the first and third columns, we pass in `[0,2]` as the second argument in the square brackets. +In this case we need to supply the first argument `:` +which selects all rows. + +```{python} +A[:,[0,2]] + +``` + +Now, suppose that we want to select the submatrix made up of the second and fourth +rows as well as the first and third columns. This is where +indexing gets slightly tricky. It is natural to try to use lists to retrieve the rows and columns: + +```{python} +A[[1,3],[0,2]] + +``` + + Oops --- what happened? We got a one-dimensional array of length two identical to + +```{python} +np.array([A[1,0],A[3,2]]) + +``` + + Similarly, the following code fails to extract the submatrix comprised of the second and fourth rows and the first, third, and fourth columns: + +```{python} +A[[1,3],[0,2,3]] + +``` + +We can see what has gone wrong here. When supplied with two indexing lists, the `numpy` interpretation is that these provide pairs of $i,j$ indices for a series of entries. That is why the pair of lists must have the same length. However, that was not our intent, since we are looking for a submatrix. + +One easy way to do this is as follows. We first create a submatrix by subsetting the rows of `A`, and then on the fly we make a further submatrix by subsetting its columns. + + +```{python} +A[[1,3]][:,[0,2]] + +``` + + + +There are more efficient ways of achieving the same result. + +The *convenience function* `np.ix_()` allows us to extract a submatrix +using lists, by creating an intermediate *mesh* object. + +```{python} +idx = np.ix_([1,3],[0,2,3]) +A[idx] + +``` + + +Alternatively, we can subset matrices efficiently using slices. + +The slice +`1:4:2` captures the second and fourth items of a sequence, while the slice `0:3:2` captures +the first and third items (the third element in a slice sequence is the step size). + +```{python} +A[1:4:2,0:3:2] + +``` + + + +Why are we able to retrieve a submatrix directly using slices but not using lists? +It's because they are different `Python` types, and +are treated differently by `numpy`. +Slices can be used to extract objects from arbitrary sequences, such as strings, lists, and tuples, while the use of lists for indexing is more limited.
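Since we now have several ways to extract the same submatrix, it can be reassuring to confirm that they agree. The following check is a small sketch that reuses the array `A` defined above:

```{python}
# The slice-based and mesh-based selections yield the same submatrix
sub_slice = A[1:4:2, 0:3:2]
sub_mesh = A[np.ix_([1, 3], [0, 2])]
np.array_equal(sub_slice, sub_mesh)
```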
+ + + + + + + + + + + + + +### Boolean Indexing +In `numpy`, a *Boolean* is a type that equals either `True` or `False` (also represented as $1$ and $0$, respectively). +The next line creates a vector of $0$'s, represented as Booleans, of length equal to the first dimension of `A`. + +```{python} +keep_rows = np.zeros(A.shape[0], bool) +keep_rows +``` +We now set two of the elements to `True`. + +```{python} +keep_rows[[1,3]] = True +keep_rows + +``` + +Note that the elements of `keep_rows`, when viewed as integers, are the same as the +values of `np.array([0,1,0,1])`. Below, we use `==` to verify their equality. When +applied to two arrays, the `==` operation is applied elementwise. + +```{python} +np.all(keep_rows == np.array([0,1,0,1])) + +``` + +(Here, the function `np.all()` has checked whether +all entries of an array are `True`. A similar function, `np.any()`, can be used to check whether any entries of an array are `True`.) + + + However, even though `np.array([0,1,0,1])` and `keep_rows` are equal according to `==`, they index different sets of rows! +The former retrieves the first, second, first, and second rows of `A`. + +```{python} +A[np.array([0,1,0,1])] + +``` + + By contrast, `keep_rows` retrieves only the second and fourth rows of `A` --- i.e. the rows for which the Boolean equals `True`. + +```{python} +A[keep_rows] + +``` + +This example shows that Booleans and integers are treated differently by `numpy`. + + +We again make use of the `np.ix_()` function + to create a mesh containing the second and fourth rows, and the first, third, and fourth columns. This time, we apply the function to Booleans, + rather than lists. + +```{python} +keep_cols = np.zeros(A.shape[1], bool) +keep_cols[[0, 2, 3]] = True +idx_bool = np.ix_(keep_rows, keep_cols) +A[idx_bool] + +``` + +We can also mix a list with an array of Booleans in the arguments to `np.ix_()`: + +```{python} +idx_mixed = np.ix_([1,3], keep_cols) +A[idx_mixed] + +``` + + + +For more details on indexing in `numpy`, readers are referred +to the `numpy` tutorial mentioned earlier. + + + +## Loading Data + +Data sets often contain different types of data, and may have names associated with the rows or columns. +For these reasons, they typically are best accommodated using a + *data frame*. + We can think of a data frame as a sequence +of arrays of identical length; these are the columns. Entries in the +different arrays can be combined to form a row. + The `pandas` +library can be used to create and work with data frame objects. + + +### Reading in a Data Set + +The first step of most analyses involves importing a data set into +`Python`. + Before attempting to load +a data set, we must make sure that `Python` knows where to find the file containing it. +If the +file is in the same location +as this notebook file, then we are all set. +Otherwise, +the command +`os.chdir()` can be used to *change directory*. (You will need to call `import os` before calling `os.chdir()`.) + + +We will begin by reading in `Auto.csv`, available on the book website. This is a comma-separated file, and can be read in using `pd.read_csv()`: + +```{python} +import pandas as pd +Auto = pd.read_csv('Auto.csv') +Auto + +``` + +The book website also has a whitespace-delimited version of this data, called `Auto.data`. This can be read in as follows: + +```{python} +Auto = pd.read_csv('Auto.data', delim_whitespace=True) + +``` + Both `Auto.csv` and `Auto.data` are simply text +files. Before loading data into `Python`, it is a good idea to view it using +a text editor or other software, such as Microsoft Excel.
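We can also take a quick look at the raw contents of such a file from within `Python` itself. The snippet below is a minimal sketch; it assumes that `Auto.csv` sits in the current working directory:

```{python}
# Print the first three raw lines of the CSV file
with open('Auto.csv') as f:
    for _ in range(3):
        print(f.readline(), end='')
```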
+ + + + +We now take a look at the column of `Auto` corresponding to the variable `horsepower`: + +```{python} +Auto['horsepower'] + +``` +We see that the `dtype` of this column is `object`. +It turns out that all values of the `horsepower` column were interpreted as strings when reading +in the data. +We can find out why by looking at the unique values. + +```{python} +np.unique(Auto['horsepower']) + +``` +We see the culprit is the value `?`, which is being used to encode missing values. + + + + +To fix the problem, we must provide `pd.read_csv()` with an argument called `na_values`. +Now, each instance of `?` in the file is replaced with the +value `np.nan`, which means *not a number*: + +```{python} +Auto = pd.read_csv('Auto.data', + na_values=['?'], + delim_whitespace=True) +Auto['horsepower'].sum() + +``` + + +The `Auto.shape` attribute tells us that the data has 397 +observations, or rows, and nine variables, or columns. + +```{python} +Auto.shape + +``` + +There are +various ways to deal with missing data. +In this case, since only five of the rows contain missing +observations, we choose to use the `Auto.dropna()` method to simply remove these rows. + +```{python} +Auto_new = Auto.dropna() +Auto_new.shape + +``` + + +### Basics of Selecting Rows and Columns + +We can use `Auto.columns` to check the variable names. + +```{python} +Auto = Auto_new # overwrite the previous value +Auto.columns + +``` + + +Accessing the rows and columns of a data frame is similar, but not identical, to accessing the rows and columns of an array. +Recall that the first argument to the `[]` method +is always applied to the rows of the array. +Similarly, +passing in a slice to the `[]` method creates a data frame whose *rows* are determined by the slice: + +```{python} +Auto[:3] + +``` +Similarly, an array of Booleans can be used to subset the rows: + +```{python} +idx_80 = Auto['year'] > 80 +Auto[idx_80] + +``` +However, if we pass in a list of strings to the `[]` method, then we obtain a data frame containing the corresponding set of *columns*. + +```{python} +Auto[['mpg', 'horsepower']] + +``` +Since we did not specify an *index* column when we loaded our data frame, the rows are labeled using integers +0 to 396. + +```{python} +Auto.index + +``` +We can use the +`set_index()` method to rename the rows using the contents of `Auto['name']`. + +```{python} +Auto_re = Auto.set_index('name') +Auto_re + +``` + +```{python} +Auto_re.columns + +``` +We see that the column `'name'` is no longer there. + +Now that the index has been set to `name`, we can access rows of the data +frame by `name` using the `loc[]` method of +`Auto`: + +```{python} +rows = ['amc rebel sst', 'ford torino'] +Auto_re.loc[rows] + +``` +As an alternative to using the index name, we could retrieve the 4th and 5th rows of `Auto` using the `iloc[]` method: + +```{python} +Auto_re.iloc[[3,4]] + +``` +We can also use it to retrieve the 1st, 3rd, and 4th columns of `Auto_re`: + +```{python} +Auto_re.iloc[:,[0,2,3]] + +``` +We can extract the 4th and 5th rows, as well as the 1st, 3rd and 4th columns, using +a single call to `iloc[]`: + +```{python} +Auto_re.iloc[[3,4],[0,2,3]] + +``` +Index entries need not be unique: there are several cars in the data frame named `ford galaxie 500`. + +```{python} +Auto_re.loc['ford galaxie 500', ['mpg', 'origin']] + +```
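Since `'ford galaxie 500'` labels more than one row, it can be handy to see how often each index entry repeats. One way to check, sketched here with the `value_counts()` method of the index:

```{python}
# Count how many rows share each name; repeated labels have counts > 1
Auto_re.index.value_counts().head()
```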
+### More on Selecting Rows and Columns +Suppose now that we want to create a data frame consisting of the `weight` and `origin` of the subset of cars with +`year` greater than 80 --- i.e. those built after 1980. +To do this, we first create a Boolean array that indexes the rows. +The `loc[]` method allows for Boolean entries as well as strings: + +```{python} +idx_80 = Auto_re['year'] > 80 +Auto_re.loc[idx_80, ['weight', 'origin']] + +``` + + +To do this more concisely, we can use an anonymous function called a `lambda`: + +```{python} +Auto_re.loc[lambda df: df['year'] > 80, ['weight', 'origin']] + +``` +The `lambda` call creates a function that takes a single +argument, here `df`, and returns `df['year']>80`. +Since it is created inside the `loc[]` method for the +dataframe `Auto_re`, that dataframe will be the argument supplied. +As another example of using a `lambda`, suppose that +we want all cars built after 1980 that achieve greater than 30 miles per gallon: + +```{python} +Auto_re.loc[lambda df: (df['year'] > 80) & (df['mpg'] > 30), + ['weight', 'origin'] + ] + +``` +The symbol `&` computes an element-wise *and* operation. +As another example, suppose that we want to retrieve all `Ford` and `Datsun` +cars with `displacement` less than 300. We check whether each `name` entry contains either the string `ford` or `datsun` using the `str.contains()` method of the `index` attribute +of the dataframe: + +```{python} +Auto_re.loc[lambda df: (df['displacement'] < 300) + & (df.index.str.contains('ford') + | df.index.str.contains('datsun')), + ['weight', 'origin'] + ] + +``` +Here, the symbol `|` computes an element-wise *or* operation. + +In summary, a powerful set of operations is available to index the rows and columns of data frames. For integer-based queries, use the `iloc[]` method. For string and Boolean +selections, use the `loc[]` method. For functional queries that filter rows, use the `loc[]` method +with a function (typically a `lambda`) in the rows argument. + +## For Loops +A `for` loop is a standard tool in many languages that +repeatedly evaluates some chunk of code while +varying different values inside the code. +For example, suppose we loop over elements of a list and compute their sum. + +```{python} +total = 0 +for value in [3,2,19]: + total += value +print('Total is: {0}'.format(total)) + +``` +The indented code beneath the line with the `for` statement is run +for each value in the sequence +specified in the `for` statement. The loop ends either +when the cell ends or when code is indented at the same level +as the original `for` statement. +We see that the final line above, which prints the total, is executed +only once after the for loop has terminated. Loops +can be nested by additional indentation. + +```{python} +total = 0 +for value in [2,3,19]: + for weight in [3, 2, 1]: + total += value * weight +print('Total is: {0}'.format(total)) +``` +Above, we summed over each combination of `value` and `weight`. +We also took advantage of the *increment* notation +in `Python`: the expression `a += b` is equivalent +to `a = a + b`. Besides +being a convenient notation, this can save time in computationally +heavy tasks in which the intermediate value of `a+b` need not +be explicitly created. + +Perhaps a more +common task would be to sum over `(value, weight)` pairs. For instance, +to compute the average value of a random variable that takes on +possible values 2, 3 or 19 with probability 0.2, 0.3, 0.5 respectively, +we would compute the weighted sum. Tasks such as this +can often be accomplished using the `zip()` function that +loops over a sequence of tuples. + +```{python} +total = 0 +for value, weight in zip([2,3,19], + [0.2,0.3,0.5]): + total += weight * value +print('Weighted average is: {0}'.format(total)) + +```
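A related looping helper is `enumerate()`, which pairs each item of a sequence with its position. The brief sketch below uses it while accumulating a total, so that we can report both each item and the final sum:

```{python}
# enumerate() yields (index, value) pairs as we loop
total = 0
for i, value in enumerate([2, 3, 19]):
    total += value
    print('item {0} is {1}'.format(i, value))
print('Total is: {0}'.format(total))
```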
For instance,
+to compute the average value of a random variable that takes on
+possible values 2, 3 or 19 with probability 0.2, 0.3, 0.5 respectively,
+we would compute the weighted sum. Tasks such as this
+can often be accomplished using the `zip()` function that
+loops over a sequence of tuples.
+
+```{python}
+total = 0
+for value, weight in zip([2,3,19],
+                         [0.2,0.3,0.5]):
+    total += weight * value
+print('Weighted average is: {0}'.format(total))
+
+```
+
+### String Formatting
+In the code chunk above we also printed a string
+displaying the total. However, the object `total`
+is an integer and not a string.
+Inserting the value of something into
+a string is a common task, made
+simple using
+some of the powerful string formatting
+tools in `Python`.
+Many data cleaning tasks involve
+manipulating and programmatically
+producing strings.
+
+For example, we may want to loop over the columns of a data frame and
+print the percent missing in each column.
+Let’s create a data frame `D` with columns in which 20% of the entries are missing, i.e. set
+to `np.nan`. We’ll create the
+values in `D` from a normal distribution with mean 0 and variance 1 using `rng.standard_normal()`
+and then overwrite some random entries using `rng.choice()`.
+
+```{python}
+rng = np.random.default_rng(1)
+A = rng.standard_normal((127, 5))
+M = rng.choice([0, np.nan], p=[0.8,0.2], size=A.shape)
+A += M
+D = pd.DataFrame(A, columns=['food',
+                             'bar',
+                             'pickle',
+                             'snack',
+                             'popcorn'])
+D[:3]
+
+```
+
+```{python}
+for col in D.columns:
+    template = 'Column "{0}" has {1:.2%} missing values'
+    print(template.format(col,
+                          np.isnan(D[col]).mean()))
+
+```
+We see that the `template.format()` method expects two arguments `{0}`
+and `{1:.2%}`, and the latter includes some formatting
+information. In particular, it specifies that the second argument should be expressed as a percent with two decimal digits.
+
+The reference
+[docs.python.org/3/library/string.html](https://docs.python.org/3/library/string.html)
+includes many helpful and more complex examples.
+
+
+## Additional Graphical and Numerical Summaries
+We can use the `ax.plot()` or `ax.scatter()` functions to display the quantitative variables. However, simply typing the variable names will produce an error message,
+because `Python` does not know to look in the `Auto` data set for those variables.
+
+```{python}
+fig, ax = subplots(figsize=(8, 8))
+ax.plot(horsepower, mpg, 'o');
+```
+We can address this by accessing the columns directly:
+
+```{python}
+fig, ax = subplots(figsize=(8, 8))
+ax.plot(Auto['horsepower'], Auto['mpg'], 'o');
+
+```
+Alternatively, we can use the `plot()` method with the call `Auto.plot()`.
+Using this method,
+the variables can be accessed by name.
+The plot methods of a data frame return a familiar object:
+an axes. We can use it to update the plot as we did previously:
+
+```{python}
+ax = Auto.plot.scatter('horsepower', 'mpg')
+ax.set_title('Horsepower vs. MPG');
+```
+If we want to save
+the figure that contains a given axes, we can find the relevant figure
+by accessing the `figure` attribute:
+
+```{python}
+fig = ax.figure
+fig.savefig('horsepower_mpg.png');
+```
+
+We can further instruct the data frame to plot to a particular axes object. In this
+case the corresponding `plot()` method will return the
+modified axes we passed in as an argument. Note that
+when we request a one-dimensional grid of plots, the object `axes` is similarly
+one-dimensional.
We place our scatter plot in the middle plot of a row of three plots +within a figure. + +```{python} +fig, axes = subplots(ncols=3, figsize=(15, 5)) +Auto.plot.scatter('horsepower', 'mpg', ax=axes[1]); + +``` + +Note also that the columns of a data frame can be accessed as attributes: try typing in `Auto.horsepower`. + + +We now consider the `cylinders` variable. Typing in `Auto.cylinders.dtype` reveals that it is being treated as a quantitative variable. +However, since there is only a small number of possible values for this variable, we may wish to treat it as + qualitative. Below, we replace +the `cylinders` column with a categorical version of `Auto.cylinders`. The function `pd.Series()` owes its name to the fact that `pandas` is often used in time series applications. + +```{python} +Auto.cylinders = pd.Series(Auto.cylinders, dtype='category') +Auto.cylinders.dtype + +``` + Now that `cylinders` is qualitative, we can display it using + the `boxplot()` method. + +```{python} +fig, ax = subplots(figsize=(8, 8)) +Auto.boxplot('mpg', by='cylinders', ax=ax); + +``` + +The `hist()` method can be used to plot a *histogram*. + +```{python} +fig, ax = subplots(figsize=(8, 8)) +Auto.hist('mpg', ax=ax); + +``` +The color of the bars and the number of bins can be changed: + +```{python} +fig, ax = subplots(figsize=(8, 8)) +Auto.hist('mpg', color='red', bins=12, ax=ax); + +``` + See `Auto.hist?` for more plotting +options. + +We can use the `pd.plotting.scatter_matrix()` function to create a *scatterplot matrix* to visualize all of the pairwise relationships between the columns in +a data frame. + +```{python} +pd.plotting.scatter_matrix(Auto); + +``` + We can also produce scatterplots +for a subset of the variables. + +```{python} +pd.plotting.scatter_matrix(Auto[['mpg', + 'displacement', + 'weight']]); + +``` +The `describe()` method produces a numerical summary of each column in a data frame. + +```{python} +Auto[['mpg', 'weight']].describe() + +``` +We can also produce a summary of just a single column. + +```{python} +Auto['cylinders'].describe() +Auto['mpg'].describe() + +``` +To exit `Jupyter`, select `File / Close and Halt`. + + + + diff --git a/ISLP_labs_R4DS/Ch03-linreg-lab copy.Rmd b/ISLP_labs_R4DS/Ch03-linreg-lab copy.Rmd new file mode 100644 index 0000000..afb1570 --- /dev/null +++ b/ISLP_labs_R4DS/Ch03-linreg-lab copy.Rmd @@ -0,0 +1,617 @@ +--- +jupyter: + jupytext: + cell_metadata_filter: -all + formats: Rmd,ipynb + main_language: python + text_representation: + extension: .Rmd + format_name: rmarkdown + format_version: '1.2' + jupytext_version: 1.14.7 +--- + + +# Chapter 3 + + +# Lab: Linear Regression + +## Importing packages +We import our standard libraries at this top +level. + +```{python} +import numpy as np +import pandas as pd +from matplotlib.pyplot import subplots + +``` + + +### New imports +Throughout this lab we will introduce new functions and libraries. However, +we will import them here to emphasize these are the new +code objects in this lab. Keeping imports near the top +of a notebook makes the code more readable, since scanning the first few +lines tells us what libraries are used. + +```{python} +import statsmodels.api as sm + +``` + We will provide relevant details about the +functions below as they are needed. + +Besides importing whole modules, it is also possible +to import only a few items from a given module. This +will help keep the *namespace* clean. +We will use a few specific objects from the `statsmodels` package +which we import here. 
+
+```{python}
+from statsmodels.stats.outliers_influence \
+     import variance_inflation_factor as VIF
+from statsmodels.stats.anova import anova_lm
+
+```
+
+As one of the import statements above is quite a long line, we inserted a line break `\` to
+ease readability.
+
+We will also use some functions written for the labs in this book in the `ISLP`
+package.
+
+```{python}
+from ISLP import load_data
+from ISLP.models import (ModelSpec as MS,
+                         summarize,
+                         poly)
+
+```
+
+### Inspecting Objects and Namespaces
+The
+function `dir()`
+provides a list of
+objects in a namespace.
+
+```{python}
+dir()
+
+```
+This shows you everything that `Python` can find at the top level.
+There are certain objects like `__builtins__` that contain references to built-in
+functions like `print()`.
+
+Every `Python` object has its own notion of
+namespace, also accessible with `dir()`. This will include
+both the attributes of the object
+as well as any methods associated with it. For instance, we see `'sum'` in the listing for an
+array.
+
+```{python}
+A = np.array([3,5,11])
+dir(A)
+
+```
+This indicates that the object `A.sum` exists. In this case it is a method
+that can be used to compute the sum of the array `A`, as can be seen by typing `A.sum?`.
+
+```{python}
+A.sum()
+
+```
+
+
+## Simple Linear Regression
+In this section we will construct model
+matrices (also called design matrices) using the `ModelSpec()` transform from `ISLP.models`.
+
+We will use the `Boston` housing data set, which is contained in the `ISLP` package. The `Boston` dataset records `medv` (median house value) for $506$ neighborhoods
+around Boston. We will build a regression model to predict `medv` using $12$
+predictors such as `rm` (average number of rooms per house),
+`age` (proportion of owner-occupied units built prior to 1940), and `lstat` (percent of
+households with low socioeconomic status). We will use `statsmodels` for this
+task, a `Python` package that implements several commonly used
+regression methods.
+
+We have included a simple loading function `load_data()` in the
+`ISLP` package:
+
+```{python}
+Boston = load_data("Boston")
+Boston.columns
+
+```
+
+Type `Boston?` to find out more about these data.
+
+We start by using the `sm.OLS()` function to fit a
+simple linear regression model. Our response will be
+`medv` and `lstat` will be the single predictor.
+For this model, we can create the model matrix by hand.
+
+```{python}
+X = pd.DataFrame({'intercept': np.ones(Boston.shape[0]),
+                  'lstat': Boston['lstat']})
+X[:4]
+
+```
+
+We extract the response, and fit the model.
+
+```{python}
+y = Boston['medv']
+model = sm.OLS(y, X)
+results = model.fit()
+
+```
+Note that `sm.OLS()` does
+not fit the model; it specifies the model, and then `model.fit()` does the actual fitting.
+
+Our `ISLP` function `summarize()` produces a simple table of the parameter estimates,
+their standard errors, t-statistics and p-values.
+The function takes a single argument, such as the object `results`
+returned here by the `fit`
+method, and returns such a summary.
+
+```{python}
+summarize(results)
+
+```
+
+Before we describe other methods for working with fitted models, we outline a more useful and general framework for constructing a model matrix `X`.
+
+### Using Transformations: Fit and Transform
+Our model above has a single predictor, and constructing `X` was straightforward.
+In practice we often fit models with more than one predictor, typically selected from an array or data frame.
+We may wish to introduce transformations to the variables before fitting the model, specify interactions between variables, and expand some particular variables into sets of variables (e.g. polynomials). +The `sklearn` package has a particular notion +for this type of task: a *transform*. A transform is an object +that is created with some parameters as arguments. The +object has two main methods: `fit()` and `transform()`. + +We provide a general approach for specifying models and constructing +the model matrix through the transform `ModelSpec()` in the `ISLP` library. +`ModelSpec()` +(renamed `MS()` in the preamble) creates a +transform object, and then a pair of methods +`transform()` and `fit()` are used to construct a +corresponding model matrix. + +We first describe this process for our simple regression model using a single predictor `lstat` in +the `Boston` data frame, but will use it repeatedly in more +complex tasks in this and other labs in this book. +In our case the transform is created by the expression +`design = MS(['lstat'])`. + +The `fit()` method takes the original array and may do some +initial computations on it, as specified in the transform object. +For example, it may compute means and standard deviations for centering and scaling. +The `transform()` +method applies the fitted transformation to the array of data, and produces the model matrix. + + +```{python} +design = MS(['lstat']) +design = design.fit(Boston) +X = design.transform(Boston) +X[:4] +``` +In this simple case, the `fit()` method does very little; it simply checks that the variable `'lstat'` specified in `design` exists in `Boston`. Then `transform()` constructs the model matrix with two columns: an `intercept` and the variable `lstat`. + +These two operations can be combined with the +`fit_transform()` method. + +```{python} +design = MS(['lstat']) +X = design.fit_transform(Boston) +X[:4] +``` +Note that, as in the previous code chunk when the two steps were done separately, the `design` object is changed as a result of the `fit()` operation. The power of this pipeline will become clearer when we fit more complex models that involve interactions and transformations. + + +Let's return to our fitted regression model. +The object +`results` has several methods that can be used for inference. +We already presented a function `summarize()` for showing the essentials of the fit. +For a full and somewhat exhaustive summary of the fit, we can use the `summary()` +method. + +```{python} +results.summary() + +``` + +The fitted coefficients can also be retrieved as the +`params` attribute of `results`. + +```{python} +results.params + +``` + + +The `get_prediction()` method can be used to obtain predictions, and produce confidence intervals and +prediction intervals for the prediction of `medv` for given values of `lstat`. + +We first create a new data frame, in this case containing only the variable `lstat`, with the values for this variable at which we wish to make predictions. +We then use the `transform()` method of `design` to create the corresponding model matrix. + +```{python} +new_df = pd.DataFrame({'lstat':[5, 10, 15]}) +newX = design.transform(new_df) +newX + +``` + +Next we compute the predictions at `newX`, and view them by extracting the `predicted_mean` attribute. + +```{python} +new_predictions = results.get_prediction(newX); +new_predictions.predicted_mean + +``` +We can produce confidence intervals for the predicted values. 
+
+```{python}
+new_predictions.conf_int(alpha=0.05)
+
+```
+Prediction intervals are computed by setting `obs=True`:
+
+```{python}
+new_predictions.conf_int(obs=True, alpha=0.05)
+
+```
+For instance, the 95% confidence interval associated with an
+`lstat` value of 10 is (24.47, 25.63), and the 95% prediction
+interval is (12.82, 37.28). As expected, the confidence and
+prediction intervals are centered around the same point (a predicted
+value of 25.05 for `medv` when `lstat` equals
+10), but the latter are substantially wider.
+
+Next we will plot `medv` and `lstat`
+using `DataFrame.plot.scatter()`,
+and wish to
+add the regression line to the resulting plot.
+
+
+### Defining Functions
+While there is a function
+within the `ISLP` package that adds a line to an existing plot, we take this opportunity
+to define our first function to do so.
+
+```{python}
+def abline(ax, b, m):
+    "Add a line with slope m and intercept b to ax"
+    xlim = ax.get_xlim()
+    ylim = [m * xlim[0] + b, m * xlim[1] + b]
+    ax.plot(xlim, ylim)
+
+```
+A few things are illustrated above. First we see the syntax for defining a function:
+`def funcname(...)`. The function has arguments `ax, b, m`
+where `ax` is an axis object for an existing plot, `b` is the intercept and
+`m` is the slope of the desired line. Other plotting options can be passed on to
+`ax.plot` by including additional optional arguments as follows:
+
+```{python}
+def abline(ax, b, m, *args, **kwargs):
+    "Add a line with slope m and intercept b to ax"
+    xlim = ax.get_xlim()
+    ylim = [m * xlim[0] + b, m * xlim[1] + b]
+    ax.plot(xlim, ylim, *args, **kwargs)
+
+```
+The addition of `*args` allows any number of
+non-named arguments to `abline`, while `**kwargs` allows any
+number of named arguments (such as `linewidth=3`) to `abline`.
+In our function, we pass
+these arguments verbatim to `ax.plot` above. Readers
+interested in learning more about
+functions are referred to the section on
+defining functions in [docs.python.org/tutorial](https://docs.python.org/3/tutorial/controlflow.html#defining-functions).
+
+Let’s use our new function to add this regression line to a plot of
+`medv` vs. `lstat`.
+
+```{python}
+ax = Boston.plot.scatter('lstat', 'medv')
+abline(ax,
+       results.params.iloc[0],
+       results.params.iloc[1],
+       'r--',
+       linewidth=3)
+
+```
+Thus, the final call to `ax.plot()` is `ax.plot(xlim, ylim, 'r--', linewidth=3)`.
+We have used the argument `'r--'` to produce a red dashed line, and added
+an argument to make it of width 3.
+There is some evidence for non-linearity in the relationship between `lstat` and `medv`. We will explore this issue later in this lab.
+
+As mentioned above, there is an existing function to add a line to a plot --- `ax.axline()` --- but knowing how to write such functions empowers us to create more expressive displays.
+
+
+Next we examine some diagnostic plots, several of which were discussed
+in Section 3.3.3.
+We can find the fitted values and residuals
+of the fit as attributes of the `results` object.
+Various influence measures describing the regression model
+are computed with the `get_influence()` method.
+As we will not use the `fig` component returned
+as the first value from `subplots()`, we simply
+capture the second returned value in `ax` below.
+
+```{python}
+ax = subplots(figsize=(8,8))[1]
+ax.scatter(results.fittedvalues, results.resid)
+ax.set_xlabel('Fitted value')
+ax.set_ylabel('Residual')
+ax.axhline(0, c='k', ls='--');
+
+```
+We add a horizontal line at 0 for reference using the
+`ax.axhline()` method, indicating
+it should be black (`c='k'`) and have a dashed linestyle (`ls='--'`).
+
+On the basis of the residual plot, there is some evidence of non-linearity.
+Leverage statistics can be computed for any number of predictors using the
+`hat_matrix_diag` attribute of the value returned by the
+`get_influence()` method.
+
+```{python}
+infl = results.get_influence()
+ax = subplots(figsize=(8,8))[1]
+ax.scatter(np.arange(X.shape[0]), infl.hat_matrix_diag)
+ax.set_xlabel('Index')
+ax.set_ylabel('Leverage')
+np.argmax(infl.hat_matrix_diag)
+
+```
+The `np.argmax()` function identifies the index of the largest element of an array, optionally computed over an axis of the array.
+In this case, we maximized over the entire array
+to determine which observation has the largest leverage statistic.
+
+
+## Multiple Linear Regression
+In order to fit a multiple linear regression model using least squares, we again use
+the `ModelSpec()` transform to construct the required
+model matrix and response. The arguments
+to `ModelSpec()` can be quite general, but in this case
+a list of column names suffices. We consider a fit here with
+the two variables `lstat` and `age`.
+
+```{python}
+X = MS(['lstat', 'age']).fit_transform(Boston)
+model1 = sm.OLS(y, X)
+results1 = model1.fit()
+summarize(results1)
+```
+Notice how we have compacted the first line into a succinct expression describing the construction of `X`.
+
+The `Boston` data set contains 12 variables, and so it would be cumbersome
+to have to type all of these in order to perform a regression using all of the predictors.
+Instead, we can use the following short-hand:
+
+```{python}
+terms = Boston.columns.drop('medv')
+terms
+
+```
+
+We can now fit the model with all the variables in `terms` using
+the same model matrix builder.
+
+```{python}
+X = MS(terms).fit_transform(Boston)
+model = sm.OLS(y, X)
+results = model.fit()
+summarize(results)
+
+```
+
+What if we would like to perform a regression using all of the variables but one? For
+example, in the above regression output, `age` has a high $p$-value.
+So we may wish to run a regression excluding this predictor.
+The following syntax results in a regression using all predictors except `age`.
+
+```{python}
+minus_age = Boston.columns.drop(['medv', 'age'])
+Xma = MS(minus_age).fit_transform(Boston)
+model1 = sm.OLS(y, Xma)
+summarize(model1.fit())
+
+```
+
+## Multivariate Goodness of Fit
+We can access the individual components of `results` by name
+(`dir(results)` shows us what is available). Hence
+`results.rsquared` gives us the $R^2$,
+and
+`np.sqrt(results.scale)` gives us the RSE.
+
+Variance inflation factors (section 3.3.3) are sometimes useful
+to assess the effect of collinearity in the model matrix of a regression model.
+We will compute the VIFs in our multiple regression fit, and use the opportunity to introduce the idea of *list comprehension*.
+
+### List Comprehension
+Often we encounter a sequence of objects which we would like to transform
+for some other task. List comprehensions are simple and powerful ways to form
+lists of `Python` objects. The language also supports
+dictionary and *generator* comprehension, though these are
+beyond our scope here.
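+As a minimal illustration, a single comprehension expression builds the list
+of squares of a few numbers:
+
+```{python}
+# the squares of each element of the list, formed in one expression
+[x**2 for x in [3, 2, 19]]
+
+```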
+The notion of list comprehension can often make such repetitive tasks easier.
+Let's look at a more substantial example: we compute the VIF for each
+feature in our model matrix `X`, using the function
+`variance_inflation_factor()`, and produce a data frame whose index agrees
+with the columns of `X`.
+
+```{python}
+vals = [VIF(X, i)
+        for i in range(1, X.shape[1])]
+vif = pd.DataFrame({'vif':vals},
+                   index=X.columns[1:])
+vif
+
+```
+The function `VIF()` takes two arguments: a dataframe or array,
+and a variable column index. In the code above we call `VIF()` on the fly for all columns in `X`.
+We have excluded column 0 above (the intercept), which is not of interest. In this case the VIFs are not that exciting.
+
+The object `vals` above could have been constructed with the following for loop:
+
+```{python}
+vals = []
+for i in range(1, X.values.shape[1]):
+    vals.append(VIF(X.values, i))
+
+```
+List comprehension allows us to perform such repetitive operations in a more straightforward way.
+
+## Interaction Terms
+It is easy to include interaction terms in a linear model using `ModelSpec()`.
+Including a tuple `("lstat","age")` tells the model
+matrix builder to include an interaction term between
+`lstat` and `age`.
+
+```{python}
+X = MS(['lstat',
+        'age',
+        ('lstat', 'age')]).fit_transform(Boston)
+model2 = sm.OLS(y, X)
+summarize(model2.fit())
+
+```
+
+
+## Non-linear Transformations of the Predictors
+The model matrix builder can include terms beyond
+just column names and interactions. For instance,
+the `poly()` function supplied in `ISLP` specifies that
+columns representing polynomial functions
+of its first argument are added to the model matrix.
+
+```{python}
+X = MS([poly('lstat', degree=2), 'age']).fit_transform(Boston)
+model3 = sm.OLS(y, X)
+results3 = model3.fit()
+summarize(results3)
+
+```
+The effectively zero *p*-value associated with the quadratic term
+(i.e. the third row above) suggests that it leads to an improved model.
+
+By default, `poly()` creates a basis matrix for inclusion in the
+model matrix whose
+columns are *orthogonal polynomials*, which are designed for stable
+least squares computations. (Actually, `poly()` is a wrapper for the workhorse and standalone function `Poly()` that does the work in building the model matrix.)
+Alternatively, had we included an argument
+`raw=True` in the above call to `poly()`, the basis matrix would consist simply of
+`lstat` and `lstat**2`. Since either of these bases
+represent quadratic polynomials, the fitted values would not
+change in this case, just the polynomial coefficients. Also by default, the columns
+created by `poly()` do not include an intercept column as
+that is automatically added by `MS()`.
+
+We use the `anova_lm()` function to further quantify the extent to which the quadratic fit is
+superior to the linear fit.
+
+```{python}
+anova_lm(results1, results3)
+
+```
+Here `results1` represents the linear submodel containing
+predictors `lstat` and `age`,
+while `results3` corresponds to the larger model above with a quadratic
+term in `lstat`.
+The `anova_lm()` function performs a hypothesis test
+comparing the two models. The null hypothesis is that the quadratic
+term in the bigger model is not needed, and the alternative hypothesis is that the
+bigger model is superior. Here the *F*-statistic is 177.28 and
+the associated *p*-value is zero.
+In this case the *F*-statistic is the square of the
+*t*-statistic for the quadratic term in the linear model summary
+for `results3` --- a consequence of the fact that these nested
+models differ by one degree of freedom.
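+As a quick numerical check, we can square the *t*-statistic for the quadratic
+term (the third entry of `results3.tvalues`, per the summary above) and
+compare it with the *F*-statistic:
+
+```{python}
+# squared t-statistic for the quadratic term; compare with the anova F-statistic
+results3.tvalues.iloc[2]**2
+
+```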
+This provides very clear evidence that the quadratic polynomial in
+`lstat` improves the linear model.
+This is not surprising, since earlier we saw evidence for non-linearity in the relationship between `medv`
+and `lstat`.
+
+The function `anova_lm()` can take more than two nested models
+as input, in which case it compares every successive pair of models.
+That also explains why there are `NaN`s in the first row above, since
+there is no previous model with which to compare the first.
+
+```{python}
+ax = subplots(figsize=(8,8))[1]
+ax.scatter(results3.fittedvalues, results3.resid)
+ax.set_xlabel('Fitted value')
+ax.set_ylabel('Residual')
+ax.axhline(0, c='k', ls='--')
+
+```
+We see that when the quadratic term is included in the model,
+there is little discernible pattern in the residuals.
+In order to create a cubic or higher-degree polynomial fit, we can simply change the degree argument
+to `poly()`.
+
+
+## Qualitative Predictors
+Here we use the `Carseats` data, which is included in the
+`ISLP` package. We will attempt to predict `Sales`
+(child car seat sales) in 400 locations based on a number of
+predictors.
+
+```{python}
+Carseats = load_data('Carseats')
+Carseats.columns
+
+```
+The `Carseats`
+data includes qualitative predictors such as
+`ShelveLoc`, an indicator of the quality of the shelving
+location --- that is,
+the space within a store in which the car seat is displayed. The predictor
+`ShelveLoc` takes on three possible values, `Bad`, `Medium`, and `Good`.
+Given a qualitative variable such as `ShelveLoc`, `ModelSpec()` generates dummy
+variables automatically.
+These variables are often referred to as a *one-hot encoding* of the categorical
+feature. Their columns sum to one, so to avoid collinearity with an intercept, the first column is dropped. Below we see
+that the column `ShelveLoc[Bad]` has been dropped, since `Bad` is the first level of `ShelveLoc`.
+Below we fit a multiple regression model that includes some interaction terms.
+
+```{python}
+allvars = list(Carseats.columns.drop('Sales'))
+y = Carseats['Sales']
+final = allvars + [('Income', 'Advertising'),
+                   ('Price', 'Age')]
+X = MS(final).fit_transform(Carseats)
+model = sm.OLS(y, X)
+summarize(model.fit())
+
+```
+In the first line above, we made `allvars` a list, so that we
+could add the interaction terms two lines down.
+Our model-matrix builder has created a `ShelveLoc[Good]`
+dummy variable that takes on a value of 1 if the
+shelving location is good, and 0 otherwise. It has also created a `ShelveLoc[Medium]`
+dummy variable that equals 1 if the shelving location is medium, and 0 otherwise.
+A bad shelving location corresponds to a zero for each of the two dummy variables.
+The fact that the coefficient for `ShelveLoc[Good]` in the regression output is
+positive indicates that a good shelving location is associated with high sales (relative to a bad location).
+And `ShelveLoc[Medium]` has a smaller positive coefficient,
+indicating that a medium shelving location leads to higher sales than a bad
+shelving location, but lower sales than a good shelving location.
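+To see this coding directly, we can compare the first few values of
+`ShelveLoc` with the corresponding dummy columns of the model matrix. This is
+a quick check; `filter(like=...)` simply selects the columns of `X` whose
+names contain the string `ShelveLoc`.
+
+```{python}
+# the original factor alongside its dummy-variable encoding
+pd.concat([Carseats['ShelveLoc'][:4],
+           X.filter(like='ShelveLoc')[:4]],
+          axis=1)
+
+```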
+ + diff --git a/ISLP_labs_R4DS/Ch04-classification-lab copy.Rmd b/ISLP_labs_R4DS/Ch04-classification-lab copy.Rmd new file mode 100644 index 0000000..de81f30 --- /dev/null +++ b/ISLP_labs_R4DS/Ch04-classification-lab copy.Rmd @@ -0,0 +1,1176 @@ +--- +jupyter: + jupytext: + cell_metadata_filter: -all + formats: Rmd,ipynb + main_language: python + text_representation: + extension: .Rmd + format_name: rmarkdown + format_version: '1.2' + jupytext_version: 1.14.7 +--- + + +# Chapter 4 + + + + + +# Lab: Logistic Regression, LDA, QDA, and KNN + + + +## The Stock Market Data + +In this lab we will examine the `Smarket` +data, which is part of the `ISLP` +library. This data set consists of percentage returns for the S&P 500 +stock index over 1,250 days, from the beginning of 2001 until the end +of 2005. For each date, we have recorded the percentage returns for +each of the five previous trading days, `Lag1` through + `Lag5`. We have also recorded `Volume` (the number of +shares traded on the previous day, in billions), `Today` (the +percentage return on the date in question) and `Direction` +(whether the market was `Up` or `Down` on this date). + +We start by importing our libraries at this top level; these are all imports we have seen in previous labs. + +```{python} +import numpy as np +import pandas as pd +from matplotlib.pyplot import subplots +import statsmodels.api as sm +from ISLP import load_data +from ISLP.models import (ModelSpec as MS, + summarize) +``` + +We also collect together the new imports needed for this lab. + +```{python} +from ISLP import confusion_table +from ISLP.models import contrast +from sklearn.discriminant_analysis import \ + (LinearDiscriminantAnalysis as LDA, + QuadraticDiscriminantAnalysis as QDA) +from sklearn.naive_bayes import GaussianNB +from sklearn.neighbors import KNeighborsClassifier +from sklearn.preprocessing import StandardScaler +from sklearn.model_selection import train_test_split +from sklearn.linear_model import LogisticRegression + +``` + + +Now we are ready to load the `Smarket` data. + +```{python} +Smarket = load_data('Smarket') +Smarket +``` +This gives a truncated listing of the data. +We can see what the variable names are. + +```{python} +Smarket.columns +``` +We compute the correlation matrix using the `corr()` method +for data frames, which produces a matrix that contains all of +the pairwise correlations among the variables. + +By instructing `pandas` to use only numeric variables, the `corr()` method does not report a correlation for the `Direction` variable because it is + qualitative. + +```{python} +Smarket.corr(numeric_only=True) + +``` +As one would expect, the correlations between the lagged return variables and +today’s return are close to zero. The only substantial correlation is between `Year` and + `Volume`. By plotting the data we see that `Volume` +is increasing over time. In other words, the average number of shares traded +daily increased from 2001 to 2005. + + +```{python} +Smarket.plot(y='Volume'); + +``` + + +## Logistic Regression +Next, we will fit a logistic regression model in order to predict + `Direction` using `Lag1` through `Lag5` and + `Volume`. The `sm.GLM()` function fits *generalized linear models*, a class of +models that includes logistic regression. Alternatively, +the function `sm.Logit()` fits a logistic regression +model directly. 
The syntax of +`sm.GLM()` is similar to that of `sm.OLS()`, except +that we must pass in the argument `family=sm.families.Binomial()` +in order to tell `statsmodels` to run a logistic regression rather than some other +type of generalized linear model. + +```{python} +allvars = Smarket.columns.drop(['Today', 'Direction', 'Year']) +design = MS(allvars) +X = design.fit_transform(Smarket) +y = Smarket.Direction == 'Up' +glm = sm.GLM(y, + X, + family=sm.families.Binomial()) +results = glm.fit() +summarize(results) + +``` +The smallest *p*-value here is associated with `Lag1`. The +negative coefficient for this predictor suggests that if the market +had a positive return yesterday, then it is less likely to go up +today. However, at a value of 0.15, the *p*-value is still +relatively large, and so there is no clear evidence of a real +association between `Lag1` and `Direction`. + +We use the `params` attribute of `results` +in order to access just the +coefficients for this fitted model. + +```{python} +results.params + +``` +Likewise we can use the +`pvalues` attribute to access the *p*-values for the coefficients. + +```{python} +results.pvalues + +``` + +The `predict()` method of `results` can be used to predict the +probability that the market will go up, given values of the +predictors. This method returns predictions +on the probability scale. If no data set is supplied to the `predict()` +function, then the probabilities are computed for the training data +that was used to fit the logistic regression model. +As with linear regression, one can pass an optional `exog` argument consistent +with a design matrix if desired. Here we have +printed only the first ten probabilities. + +```{python} +probs = results.predict() +probs[:10] + +``` +In order to make a prediction as to whether the market will go up or +down on a particular day, we must convert these predicted +probabilities into class labels, `Up` or `Down`. The +following two commands create a vector of class predictions based on +whether the predicted probability of a market increase is greater than +or less than 0.5. + +```{python} +labels = np.array(['Down']*1250) +labels[probs>0.5] = "Up" + +``` +The `confusion_table()` +function from the `ISLP` package summarizes these predictions, showing how +many observations were correctly or incorrectly classified. Our function, which is adapted from a similar function +in the module `sklearn.metrics`, transposes the resulting +matrix and includes row and column labels. +The `confusion_table()` function takes as first argument the +predicted labels, and second argument the true labels. + +```{python} +confusion_table(labels, Smarket.Direction) + +``` + +The diagonal elements of the confusion matrix indicate correct +predictions, while the off-diagonals represent incorrect +predictions. Hence our model correctly predicted that the market would +go up on 507 days and that it would go down on 145 days, for a +total of 507 + 145 = 652 correct predictions. The `np.mean()` +function can be used to compute the fraction of days for which the +prediction was correct. In this case, logistic regression correctly +predicted the movement of the market 52.2% of the time. + + +```{python} +(507+145)/1250, np.mean(labels == Smarket.Direction) + +``` + + + + +At first glance, it appears that the logistic regression model is +working a little better than random guessing. However, this result is +misleading because we trained and tested the model on the same set of +1,250 observations. 
In other words, $100-52.2=47.8\%$ is the
+*training* error rate. As we have seen
+previously, the training error rate is often overly optimistic --- it
+tends to underestimate the test error rate. In
+order to better assess the accuracy of the logistic regression model
+in this setting, we can fit the model using part of the data, and
+then examine how well it predicts the *held out* data. This
+will yield a more realistic error rate, in the sense that in practice
+we will be interested in our model’s performance not on the data that
+we used to fit the model, but rather on days in the future for which
+the market’s movements are unknown.
+
+To implement this strategy, we first create a Boolean vector
+corresponding to the observations from 2001 through 2004. We then
+use this vector to create a held out data set of observations from
+2005.
+
+```{python}
+train = (Smarket.Year < 2005)
+Smarket_train = Smarket.loc[train]
+Smarket_test = Smarket.loc[~train]
+Smarket_test.shape
+
+```
+
+The object `train` is a vector of 1,250 elements, corresponding
+to the observations in our data set. The elements of the vector that
+correspond to observations that occurred before 2005 are set to
+`True`, whereas those that correspond to observations in 2005 are
+set to `False`. Hence `train` is a
+*boolean* array, since its
+elements are `True` and `False`. Boolean arrays can be used
+to obtain a subset of the rows or columns of a data frame
+using the `loc` method. For instance,
+the command `Smarket.loc[train]` would pick out a submatrix of the
+stock market data set, corresponding only to the dates before 2005,
+since those are the ones for which the elements of `train` are
+`True`. The `~` symbol can be used to negate all of the
+elements of a Boolean vector. That is, `~train` is a vector
+similar to `train`, except that the elements that are `True`
+in `train` get swapped to `False` in `~train`, and vice versa.
+Therefore, `Smarket.loc[~train]` yields a
+subset of the rows of the data frame
+of the stock market data containing only the observations for which
+`train` is `False`.
+The output above indicates that there are 252 such
+observations.
+
+We now fit a logistic regression model using only the subset of the
+observations that correspond to dates before 2005. We then obtain predicted probabilities of the
+stock market going up for each of the days in our test set --- that is,
+for the days in 2005.
+
+```{python}
+X_train, X_test = X.loc[train], X.loc[~train]
+y_train, y_test = y.loc[train], y.loc[~train]
+glm_train = sm.GLM(y_train,
+                   X_train,
+                   family=sm.families.Binomial())
+results = glm_train.fit()
+probs = results.predict(exog=X_test)
+
+```
+
+Notice that we have trained and tested our model on two completely
+separate data sets: training was performed using only the dates before
+2005, and testing was performed using only the dates in 2005.
+
+Finally, we compare the predictions for 2005 to the
+actual movements of the market over that time period.
+We will first store the test and training labels (recall `y_test` is binary).
+
+```{python}
+D = Smarket.Direction
+L_train, L_test = D.loc[train], D.loc[~train]
+
+```
+Now we threshold the
+fitted probability at 50% to form
+our predicted labels.
+
+```{python}
+labels = np.array(['Down']*252)
+labels[probs>0.5] = 'Up'
+confusion_table(labels, L_test)
+
+```
+The test accuracy is about 48% while the error rate is about 52%.
+
+```{python}
+np.mean(labels == L_test), np.mean(labels != L_test)
+
+```
+
+
+The `!=` notation means *not equal to*, and so the last command
+computes the test set error rate. The results are rather
+disappointing: the test error rate is 52%, which is worse than
+random guessing! Of course this result is not all that surprising,
+given that one would not generally expect to be able to use previous
+days’ returns to predict future market performance. (After all, if it
+were possible to do so, then the authors of this book would be out
+striking it rich rather than writing a statistics textbook.)
+
+We recall that the logistic regression model had very underwhelming
+*p*-values associated with all of the predictors, and that the
+smallest *p*-value, though not very small, corresponded to
+`Lag1`. Perhaps by removing the variables that appear not to be
+helpful in predicting `Direction`, we can obtain a more
+effective model. After all, using predictors that have no relationship
+with the response tends to cause a deterioration in the test error
+rate (since such predictors cause an increase in variance without a
+corresponding decrease in bias), and so removing such predictors may
+in turn yield an improvement. Below we refit the logistic
+regression using just `Lag1` and `Lag2`, which seemed to
+have the highest predictive power in the original logistic regression
+model.
+
+```{python}
+model = MS(['Lag1', 'Lag2']).fit(Smarket)
+X = model.transform(Smarket)
+X_train, X_test = X.loc[train], X.loc[~train]
+glm_train = sm.GLM(y_train,
+                   X_train,
+                   family=sm.families.Binomial())
+results = glm_train.fit()
+probs = results.predict(exog=X_test)
+labels = np.array(['Down']*252)
+labels[probs>0.5] = 'Up'
+confusion_table(labels, L_test)
+
+```
+
+
+Let’s evaluate the overall accuracy as well as the accuracy within the days when
+logistic regression predicts an increase.
+
+```{python}
+(35+106)/252, 106/(106+76)
+
+```
+
+Now the results appear to be a little better: 56% of the daily
+movements have been correctly predicted. It is worth noting that in
+this case, a much simpler strategy of predicting that the market will
+increase every day will also be correct 56% of the time! Hence, in
+terms of overall error rate, the logistic regression method is no
+better than the naive approach. However, the confusion matrix
+shows that on days when logistic regression predicts an increase in
+the market, it has a 58% accuracy rate. This suggests a possible
+trading strategy of buying on days when the model predicts an
+increasing market, and avoiding trades on days when a decrease is
+predicted. Of course one would need to investigate more carefully
+whether this small improvement was real or just due to random chance.
+
+Suppose that we want to predict the returns associated with particular
+values of `Lag1` and `Lag2`. In particular, we want to
+predict `Direction` on a day when `Lag1` and
+`Lag2` equal $1.2$ and $1.1$, respectively, and on a day when they
+equal $1.5$ and $-0.8$. We do this using the `predict()`
+function.
+
+```{python}
+newdata = pd.DataFrame({'Lag1':[1.2, 1.5],
+                        'Lag2':[1.1, -0.8]});
+newX = model.transform(newdata)
+results.predict(newX)
+
+```
+
+
+## Linear Discriminant Analysis
+
+We begin by performing LDA on the `Smarket` data, using the function
+`LinearDiscriminantAnalysis()`, which we have abbreviated `LDA()`. We
+fit the model using only the observations before 2005.
+
+```{python}
+lda = LDA(store_covariance=True)
+
+```
+
+Since the `LDA` estimator automatically
+adds an intercept, we should remove the column corresponding to the
+intercept in both `X_train` and `X_test`. We can also directly
+use the labels rather than the Boolean vectors `y_train`.
+
+```{python}
+X_train, X_test = [M.drop(columns=['intercept'])
+                   for M in [X_train, X_test]]
+lda.fit(X_train, L_train)
+
+```
+Here we have used the list comprehensions introduced
+in Section 3.6.4. Looking at our first line above, we see that the right-hand side is a list
+of length two. This is because the code `for M in [X_train, X_test]` iterates over a list
+of length two. While here we loop over a list,
+the list comprehension method works when looping over any iterable object.
+We then apply the `drop()` method to each element in the iteration, collecting
+the result in a list. The left-hand side tells `Python` to unpack this list
+of length two, assigning its elements to the variables `X_train` and `X_test`. Of course,
+this overwrites the previous values of `X_train` and `X_test`.
+
+Having fit the model, we can extract the means in the two classes with the `means_` attribute. These are the average of each predictor within each class, and
+are used by LDA as estimates of $\mu_k$. These suggest that there is
+a tendency for the previous 2 days’ returns to be negative on days
+when the market increases, and a tendency for the previous days’
+returns to be positive on days when the market declines.
+
+```{python}
+lda.means_
+
+```
+
+The estimated prior probabilities are stored in the `priors_` attribute.
+The package `sklearn` typically uses this trailing `_` to denote
+a quantity estimated when using the `fit()` method. We can be sure of which
+entry corresponds to which label by looking at the `classes_` attribute.
+
+```{python}
+lda.classes_
+
+```
+
+The LDA output indicates that $\hat\pi_{Down}=0.492$ and
+$\hat\pi_{Up}=0.508$.
+
+```{python}
+lda.priors_
+
+```
+
+The linear discriminant vectors can be found in the `scalings_` attribute:
+
+```{python}
+lda.scalings_
+
+```
+
+These values provide the linear combination of `Lag1` and `Lag2` that are used to form the LDA decision rule. In other words, these are the multipliers of the elements of $X=x$ in (4.24).
+If $-0.64 \times \text{Lag1} - 0.51 \times \text{Lag2}$ is large, then the LDA classifier will predict a market increase, and if it is small, then the LDA classifier will predict a market decline.
+
+```{python}
+lda_pred = lda.predict(X_test)
+
+```
+
+As we observed in our comparison of classification methods
+(Section 4.5), the LDA and logistic
+regression predictions are almost identical.
+
+```{python}
+confusion_table(lda_pred, L_test)
+
+```
+
+We can also estimate the
+probability of each class for
+each point in a training set. Applying a 50% threshold to the posterior probabilities of
+being in class one allows us to
+recreate the predictions contained in `lda_pred`.
+ +```{python} +lda_prob = lda.predict_proba(X_test) +np.all( + np.where(lda_prob[:,1] >= 0.5, 'Up','Down') == lda_pred + ) + +``` + + +Above, we used the `np.where()` function that +creates an array with value `'Up'` for indices where +the second column of `lda_prob` (the estimated +posterior probability of `'Up'`) is greater than 0.5. +For problems with more than two classes the labels are chosen as the class whose posterior probability is highest: + +```{python} +np.all( + [lda.classes_[i] for i in np.argmax(lda_prob, 1)] == lda_pred + ) + +``` + + +If we wanted to use a posterior probability threshold other than +50% in order to make predictions, then we could easily do so. For +instance, suppose that we wish to predict a market decrease only if we +are very certain that the market will indeed decrease on that +day --- say, if the posterior probability is at least 90%. +We know that the first column of `lda_prob` corresponds to the +label `Down` after having checked the `classes_` attribute, hence we use +the column index 0 rather than 1 as we did above. + +```{python} +np.sum(lda_prob[:,0] > 0.9) + +``` + +No days in 2005 meet that threshold! In fact, the greatest posterior +probability of decrease in all of 2005 was 52.02%. + +The LDA classifier above is the first classifier from the +`sklearn` library. We will use several other objects +from this library. The objects +follow a common structure that simplifies tasks such as cross-validation, +which we will see in Chapter 5. Specifically, +the methods first create a generic classifier without +referring to any data. This classifier is then fit +to data with the `fit()` method and predictions are +always produced with the `predict()` method. This pattern +of first instantiating the classifier, followed by fitting it, and +then producing predictions is an explicit design choice of `sklearn`. This uniformity +makes it possible to cleanly copy the classifier so that it can be fit +on different data; e.g. different training sets arising in cross-validation. +This standard pattern also allows for a predictable formation of workflows. + + + +## Quadratic Discriminant Analysis +We will now fit a QDA model to the `Smarket` data. QDA is +implemented via +`QuadraticDiscriminantAnalysis()` +in the `sklearn` package, which we abbreviate to `QDA()`. +The syntax is very similar to `LDA()`. + +```{python} +qda = QDA(store_covariance=True) +qda.fit(X_train, L_train) + +``` + +The `QDA()` function will again compute `means_` and `priors_`. + +```{python} +qda.means_, qda.priors_ + +``` + +The `QDA()` classifier will estimate one covariance per class. Here is the +estimated covariance in the first class: + +```{python} +qda.covariance_[0] + +``` +The output contains the group means. But it does not contain the +coefficients of the linear discriminants, because the QDA classifier +involves a quadratic, rather than a linear, function of the +predictors. The `predict()` function works in exactly the +same fashion as for LDA. + +```{python} +qda_pred = qda.predict(X_test) +confusion_table(qda_pred, L_test) + +``` +Interestingly, the QDA predictions are accurate almost 60% of the +time, even though the 2005 data was not used to fit the model. + +```{python} +np.mean(qda_pred == L_test) + +``` + + +This level of accuracy is quite impressive for stock market data, which is +known to be quite hard to model accurately. 
This suggests that the
+quadratic form assumed by QDA may capture the true relationship more
+accurately than the linear forms assumed by LDA and logistic
+regression. However, we recommend evaluating this method’s
+performance on a larger test set before betting that this approach
+will consistently beat the market!
+
+
+## Naive Bayes
+Next, we fit a naive Bayes model to the `Smarket` data. The syntax is
+similar to that of `LDA()` and `QDA()`. By
+default, this implementation `GaussianNB()` of the naive Bayes classifier models each
+quantitative feature using a Gaussian distribution. However, a kernel
+density method can also be used to estimate the distributions.
+
+```{python}
+NB = GaussianNB()
+NB.fit(X_train, L_train)
+
+```
+
+The classes are stored as `classes_`.
+
+```{python}
+NB.classes_
+
+```
+
+The class prior probabilities are stored in the `class_prior_` attribute.
+
+```{python}
+NB.class_prior_
+
+```
+
+The parameters of the features can be found in the `theta_` and `var_` attributes. The number of rows
+is equal to the number of classes, while the number of columns is equal to the number of features.
+We see below that the mean for feature `Lag1` in the `Down` class is 0.043.
+
+```{python}
+NB.theta_
+
+```
+
+Its variance is 1.503.
+
+```{python}
+NB.var_
+
+```
+How do we know the names of these attributes? We use `NB?` (or `?NB`).
+
+We can easily verify the mean computation:
+
+```{python}
+X_train[L_train == 'Down'].mean()
+
+```
+
+Similarly for the variance:
+
+```{python}
+X_train[L_train == 'Down'].var(ddof=0)
+
+```
+Since `NB()` is a classifier in the `sklearn` library, making predictions
+uses the same syntax as for `LDA()` and `QDA()` above.
+
+```{python}
+nb_labels = NB.predict(X_test)
+confusion_table(nb_labels, L_test)
+
+```
+
+Naive Bayes performs well on these data, with accurate predictions over 59% of the time. This is slightly worse than QDA, but much better than LDA.
+
+As for `LDA`, the `predict_proba()` method estimates the probability that each observation belongs to a particular class.
+
+```{python}
+NB.predict_proba(X_test)[:5]
+
+```
+
+## K-Nearest Neighbors
+We will now perform KNN using the `KNeighborsClassifier()` function. This function works similarly
+to the other model-fitting functions that we have
+encountered thus far.
+
+As is the
+case for LDA and QDA, we fit the classifier
+using the `fit` method. New
+predictions are formed using the `predict` method
+of the object returned by `fit()`.
+
+```{python}
+knn1 = KNeighborsClassifier(n_neighbors=1)
+X_train, X_test = [np.asarray(X) for X in [X_train, X_test]]
+knn1.fit(X_train, L_train)
+knn1_pred = knn1.predict(X_test)
+confusion_table(knn1_pred, L_test)
+
+```
+
+The results using $K=1$ are not very good, since only $50\%$ of the
+observations are correctly predicted. Of course, it may be that $K=1$
+results in an overly flexible fit to the data.
+
+```{python}
+(83+43)/252, np.mean(knn1_pred == L_test)
+
+```
+
+We repeat the
+analysis below using $K=3$.
+
+```{python}
+knn3 = KNeighborsClassifier(n_neighbors=3)
+knn3_pred = knn3.fit(X_train, L_train).predict(X_test)
+np.mean(knn3_pred == L_test)
+```
+The results have improved slightly. But increasing *K* further
+provides no further improvements. It appears that for these data, and this train/test split,
+QDA gives the best results of the methods that we have examined so
+far.
+
+KNN does not perform well on the `Smarket` data, but it often does provide impressive results.
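+As a quick check of the claim above that increasing $K$ provides no further
+improvement here, we can loop over a few more values of $K$ and report the
+test accuracy for each. This is a brief sketch reusing the training and test
+objects defined above:
+
+```{python}
+# test accuracy of KNN on Smarket for several values of K
+for K in range(1, 8):
+    knn = KNeighborsClassifier(n_neighbors=K)
+    knn_pred = knn.fit(X_train, L_train).predict(X_test)
+    print('K={0:d}: accuracy {1:.3f}'.format(K, np.mean(knn_pred == L_test)))
+
+```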
As an example we will apply the KNN approach to the `Caravan` data set, which is part of the `ISLP` library. This data set includes 85 +predictors that measure demographic characteristics for 5,822 +individuals. The response variable is `Purchase`, which +indicates whether or not a given individual purchases a caravan +insurance policy. In this data set, only 6% of people purchased +caravan insurance. + +```{python} +Caravan = load_data('Caravan') +Purchase = Caravan.Purchase +Purchase.value_counts() + +``` + + +The method `value_counts()` takes a `pd.Series` or `pd.DataFrame` and returns +a `pd.Series` with the corresponding counts +for each unique element. In this case `Purchase` has only `Yes` and `No` values +and the method returns how many values of each there are. + +```{python} +348 / 5822 + +``` + + +Our features will include all columns except `Purchase`. + +```{python} +feature_df = Caravan.drop(columns=['Purchase']) + +``` + +Because the KNN classifier predicts the class of a given test +observation by identifying the observations that are nearest to it, +the scale of the variables matters. Any variables that are on a large +scale will have a much larger effect on the *distance* between +the observations, and hence on the KNN classifier, than variables that +are on a small scale. For instance, imagine a data set that contains +two variables, `salary` and `age` (measured in dollars +and years, respectively). As far as KNN is concerned, a difference of +1,000 USD in salary is enormous compared to a difference of 50 years in +age. Consequently, `salary` will drive the KNN classification +results, and `age` will have almost no effect. This is contrary +to our intuition that a salary difference of 1,000 USD is quite small +compared to an age difference of 50 years. Furthermore, the +importance of scale to the KNN classifier leads to another issue: if +we measured `salary` in Japanese yen, or if we measured + `age` in minutes, then we’d get quite different classification +results from what we get if these two variables are measured in +dollars and years. + +A good way to handle this problem is to *standardize* the data so that all variables are +given a mean of zero and a standard deviation of one. Then all +variables will be on a comparable scale. This is accomplished +using +the `StandardScaler()` +transformation. + +```{python} +scaler = StandardScaler(with_mean=True, + with_std=True, + copy=True) + +``` +The argument `with_mean` indicates whether or not +we should subtract the mean, while `with_std` indicates +whether or not we should scale the columns to have standard +deviation of 1 or not. Finally, the argument `copy=True` +indicates that we will always copy data, rather than +trying to do calculations in place where possible. + +This transformation can be fit +and then applied to arbitrary data. In the first line +below, the parameters for the scaling are computed and +stored in `scaler`, while the second line actually +constructs the standardized set of features. + +```{python} +scaler.fit(feature_df) +X_std = scaler.transform(feature_df) + +``` +Now every column of `feature_std` below has a standard deviation of +one and a mean of zero. + +```{python} +feature_std = pd.DataFrame( + X_std, + columns=feature_df.columns); +feature_std.std() + +``` + +Notice that the standard deviations are not quite $1$ here; this is again due to some procedures using the $1/n$ convention for variances (in this case `scaler()`), while others use $1/(n-1)$ (the `std()` method). 
See the footnote on page 200.
+In this case it does not matter, as long as the variables are all on the same scale.
+
+Using the function `train_test_split()` we now split the observations into a test set,
+containing 1000 observations, and a training set containing the remaining
+observations. The argument `random_state=0` ensures that we get
+the same split each time we rerun the code.
+
+```{python}
+(X_train,
+ X_test,
+ y_train,
+ y_test) = train_test_split(np.asarray(feature_std),
+                            Purchase,
+                            test_size=1000,
+                            random_state=0)
+
+```
+`?train_test_split` reveals that the non-keyword arguments can be `lists`, `arrays`, `pandas dataframes` etc. that all have the same length (`shape[0]`) and hence are *indexable*. In this case they are the dataframe `feature_std` and the response variable `Purchase`.
+(Note that we have converted `feature_std` to an `ndarray` to address a bug in `sklearn`.)
+We fit a KNN model on the training data using $K=1$,
+and evaluate its performance on the test data.
+
+```{python}
+knn1 = KNeighborsClassifier(n_neighbors=1)
+knn1_pred = knn1.fit(X_train, y_train).predict(X_test)
+np.mean(y_test != knn1_pred), np.mean(y_test != "No")
+
+```
+
+The KNN error rate on the 1,000 test observations is about $11\%$.
+At first glance, this may appear to be fairly good. However, since
+just over 6% of customers purchased insurance, we could get the error
+rate down to almost 6% by always predicting `No` regardless of the
+values of the predictors! This is known as the *null rate*.
+
+Suppose that there is some non-trivial cost to trying to sell
+insurance to a given individual. For instance, perhaps a salesperson
+must visit each potential customer. If the company tries to sell
+insurance to a random selection of customers, then the success rate
+will be only 6%, which may be far too low given the costs
+involved. Instead, the company would like to try to sell insurance
+only to customers who are likely to buy it. So the overall error rate
+is not of interest. Instead, the fraction of individuals that are
+correctly predicted to buy insurance is of interest.
+
+```{python}
+confusion_table(knn1_pred, y_test)
+
+```
+
+It turns out that KNN with $K=1$ does far better than random guessing
+among the customers that are predicted to buy insurance. Among 62
+such customers, 9, or 14.5%, actually do purchase insurance.
+This is double the rate that one would obtain from random guessing.
+
+```{python}
+9/(53+9)
+```
+
+### Tuning Parameters
+
+The number of neighbors in KNN is referred to as a *tuning parameter*, also referred to as a *hyperparameter*.
+We do not know *a priori* what value to use. It is therefore of interest
+to see how the classifier performs on test data as we vary these
+parameters. This can be achieved with a `for` loop, described in Section 2.3.8.
+Here we use a for loop to look at the accuracy of our classifier in the group predicted to purchase
+insurance as we vary the number of neighbors from 1 to 5:
+
+```{python}
+for K in range(1,6):
+    knn = KNeighborsClassifier(n_neighbors=K)
+    knn_pred = knn.fit(X_train, y_train).predict(X_test)
+    C = confusion_table(knn_pred, y_test)
+    templ = ('K={0:d}: # predicted to purchase: {1:>2},' +
+             ' # who did purchase {2:d}, accuracy {3:.1%}')
+    pred = C.loc['Yes'].sum()
+    did_purchase = C.loc['Yes','Yes']
+    print(templ.format(
+          K,
+          pred,
+          did_purchase,
+          did_purchase / pred))
+
+```
+We see some variability --- the numbers for `K=4` are very different from the rest.
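+Rather than judging each value of $K$ on the test set, a more systematic
+approach is to choose the tuning parameter by cross-validation on the
+training data, a topic covered in Chapter 5. As a brief preview, here is a
+minimal sketch using `sklearn`'s `cross_val_score()`, an approach not used
+elsewhere in this lab:
+
+```{python}
+from sklearn.model_selection import cross_val_score
+
+# five-fold cross-validated accuracy on the training data for each K
+for K in range(1, 6):
+    knn = KNeighborsClassifier(n_neighbors=K)
+    cv_acc = cross_val_score(knn, X_train, y_train, cv=5).mean()
+    print('K={0:d}: CV accuracy {1:.3f}'.format(K, cv_acc))
+
+```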
+ +### Comparison to Logistic Regression +As a comparison, we can also fit a logistic regression model to the +data. This can also be done +with `sklearn`, though by default it fits +something like the *ridge regression* version +of logistic regression, which we introduce in Chapter 6. This can +be modified by appropriately setting the argument `C` below. Its default +value is 1 but by setting it to a very large number, the algorithm converges to the same solution as the usual (unregularized) +logistic regression estimator discussed above. + +Unlike the +`statsmodels` package, `sklearn` focuses less on +inference and more on classification. Hence, +the `summary` methods seen in `statsmodels` +and our simplified version seen with `summarize` are not +generally available for the classifiers in `sklearn`. + +```{python} +logit = LogisticRegression(C=1e10, solver='liblinear') +logit.fit(X_train, y_train) +logit_pred = logit.predict_proba(X_test) +logit_labels = np.where(logit_pred[:,1] > .5, 'Yes', 'No') +confusion_table(logit_labels, y_test) + +``` + +We used the argument `solver='liblinear'` above to +avoid a warning with the default solver which would indicate that +the algorithm does not converge. + +If we use $0.5$ as the predicted probability cut-off for the +classifier, then we have a problem: only two of the test observations +are predicted to purchase insurance. However, we are not required to use a +cut-off of $0.5$. If we instead predict a purchase any time the +predicted probability of purchase exceeds $0.25$, we get much better +results: we predict that 29 people will purchase insurance, and we are +correct for about 31% of these people. This is almost five times +better than random guessing! + +```{python} +logit_labels = np.where(logit_pred[:,1]>0.25, 'Yes', 'No') +confusion_table(logit_labels, y_test) +``` + +```{python} +9/(20+9) + +``` +## Linear and Poisson Regression on the Bikeshare Data +Here we fit linear and Poisson regression models to the `Bikeshare` data, as described in Section 4.6. +The response `bikers` measures the number of bike rentals per hour +in Washington, DC in the period 2010--2012. + +```{python} +Bike = load_data('Bikeshare') + +``` +Let's have a peek at the dimensions and names of the variables in this dataframe. + +```{python} +Bike.shape, Bike.columns + +``` + +### Linear Regression + +We begin by fitting a linear regression model to the data. + +```{python} +X = MS(['mnth', + 'hr', + 'workingday', + 'temp', + 'weathersit']).fit_transform(Bike) +Y = Bike['bikers'] +M_lm = sm.OLS(Y, X).fit() +summarize(M_lm) + +``` + +There are 24 levels in `hr` and 40 rows in all. +In `M_lm`, the first levels `hr[0]` and `mnth[Jan]` are treated +as the baseline values, and so no coefficient estimates are provided +for them: implicitly, their coefficient estimates are zero, and all +other levels are measured relative to these baselines. For example, +the Feb coefficient of $6.845$ signifies that, holding all other +variables constant, there are on average about 7 more riders in +February than in January. Similarly there are about 16.5 more riders +in March than in January. 
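+
+These claims can be read off the coefficient table directly. A small sketch (the level names `mnth[Feb]` and `mnth[March]` are taken from the `summarize()` output above; adjust if your printed labels differ):
+
+```{python}
+S = summarize(M_lm)
+# coefficients are measured relative to the Jan baseline
+S.loc[['mnth[Feb]', 'mnth[March]'], 'coef']
+
+```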
+ +The results seen in Section 4.6.1 +used a slightly different coding of the variables `hr` and `mnth`, as follows: + +```{python} +hr_encode = contrast('hr', 'sum') +mnth_encode = contrast('mnth', 'sum') + +``` +Refitting again: + +```{python} +X2 = MS([mnth_encode, + hr_encode, + 'workingday', + 'temp', + 'weathersit']).fit_transform(Bike) +M2_lm = sm.OLS(Y, X2).fit() +S2 = summarize(M2_lm) +S2 +``` + +What is the difference between the two codings? In `M2_lm`, a +coefficient estimate is reported for all but level `23` of `hr` +and level `Dec` of `mnth`. Importantly, in `M2_lm`, the (unreported) coefficient estimate +for the last level of `mnth` is not zero: instead, it equals the +negative of the sum of the coefficient estimates for all of the +other levels. Similarly, in `M2_lm`, the coefficient estimate +for the last level of `hr` is the negative of the sum of the +coefficient estimates for all of the other levels. This means that the +coefficients of `hr` and `mnth` in `M2_lm` will always sum +to zero, and can be interpreted as the difference from the mean +level. For example, the coefficient for January of $-46.087$ indicates +that, holding all other variables constant, there are typically 46 +fewer riders in January relative to the yearly average. + +It is important to realize that the choice of coding really does not +matter, provided that we interpret the model output correctly in light +of the coding used. For example, we see that the predictions from the +linear model are the same regardless of coding: + +```{python} +np.sum((M_lm.fittedvalues - M2_lm.fittedvalues)**2) + +``` + +The sum of squared differences is zero. We can also see this using the +`np.allclose()` function: + +```{python} +np.allclose(M_lm.fittedvalues, M2_lm.fittedvalues) + +``` + + +To reproduce the left-hand side of Figure 4.13 +we must first obtain the coefficient estimates associated with +`mnth`. The coefficients for January through November can be obtained +directly from the `M2_lm` object. The coefficient for December +must be explicitly computed as the negative sum of all the other +months. We first extract all the coefficients for month from +the coefficients of `M2_lm`. + +```{python} +coef_month = S2[S2.index.str.contains('mnth')]['coef'] +coef_month + +``` +Next, we append `Dec` as the negative of the sum of all other months. + +```{python} +months = Bike['mnth'].dtype.categories +coef_month = pd.concat([ + coef_month, + pd.Series([-coef_month.sum()], + index=['mnth[Dec]' + ]) + ]) +coef_month + +``` +Finally, to make the plot neater, we’ll just use the first letter of each month, which is the $6$th entry of each of +the labels in the index. + +```{python} +fig_month, ax_month = subplots(figsize=(8,8)) +x_month = np.arange(coef_month.shape[0]) +ax_month.plot(x_month, coef_month, marker='o', ms=10) +ax_month.set_xticks(x_month) +ax_month.set_xticklabels([l[5] for l in coef_month.index], fontsize=20) +ax_month.set_xlabel('Month', fontsize=20) +ax_month.set_ylabel('Coefficient', fontsize=20); + +``` + +Reproducing the right-hand plot in Figure 4.13 follows a similar process. + +```{python} +coef_hr = S2[S2.index.str.contains('hr')]['coef'] +coef_hr = coef_hr.reindex(['hr[{0}]'.format(h) for h in range(23)]) +coef_hr = pd.concat([coef_hr, + pd.Series([-coef_hr.sum()], index=['hr[23]']) + ]) + +``` + +We now make the hour plot. 
+
+```{python}
+fig_hr, ax_hr = subplots(figsize=(8,8))
+x_hr = np.arange(coef_hr.shape[0])
+ax_hr.plot(x_hr, coef_hr, marker='o', ms=10)
+ax_hr.set_xticks(x_hr[::2])
+ax_hr.set_xticklabels(range(24)[::2], fontsize=20)
+ax_hr.set_xlabel('Hour', fontsize=20)
+ax_hr.set_ylabel('Coefficient', fontsize=20);
+
+```
+
+### Poisson Regression
+
+Now we instead fit a Poisson regression model to the
+`Bikeshare` data. Very little changes, except that we now use the
+function `sm.GLM()` with the Poisson family specified:
+
+```{python}
+M_pois = sm.GLM(Y, X2, family=sm.families.Poisson()).fit()
+
+```
+
+We can plot the coefficients associated with `mnth` and `hr`, in order to reproduce Figure 4.15. We first complete these coefficients as before.
+
+```{python}
+S_pois = summarize(M_pois)
+coef_month = S_pois[S_pois.index.str.contains('mnth')]['coef']
+coef_month = pd.concat([coef_month,
+                        pd.Series([-coef_month.sum()],
+                                  index=['mnth[Dec]'])])
+coef_hr = S_pois[S_pois.index.str.contains('hr')]['coef']
+coef_hr = pd.concat([coef_hr,
+                     pd.Series([-coef_hr.sum()],
+                               index=['hr[23]'])])
+```
+The plotting is as before.
+
+```{python}
+fig_pois, (ax_month, ax_hr) = subplots(1, 2, figsize=(16,8))
+ax_month.plot(x_month, coef_month, marker='o', ms=10)
+ax_month.set_xticks(x_month)
+ax_month.set_xticklabels([l[5] for l in coef_month.index], fontsize=20)
+ax_month.set_xlabel('Month', fontsize=20)
+ax_month.set_ylabel('Coefficient', fontsize=20)
+ax_hr.plot(x_hr, coef_hr, marker='o', ms=10)
+ax_hr.set_xticks(x_hr[::2])  # needed so the labels below match the tick positions
+ax_hr.set_xticklabels(range(24)[::2], fontsize=20)
+ax_hr.set_xlabel('Hour', fontsize=20)
+ax_hr.set_ylabel('Coefficient', fontsize=20);
+
+```
+We compare the fitted values of the two models.
+The fitted values are stored in the `fittedvalues` attribute
+returned by the `fit()` method for both the linear regression and the Poisson
+fits. The linear predictors are stored as the attribute `lin_pred`.
+
+```{python}
+fig, ax = subplots(figsize=(8, 8))
+ax.scatter(M2_lm.fittedvalues,
+           M_pois.fittedvalues,
+           s=20)
+ax.set_xlabel('Linear Regression Fit', fontsize=20)
+ax.set_ylabel('Poisson Regression Fit', fontsize=20)
+ax.axline([0,0], c='black', linewidth=3,
+          linestyle='--', slope=1);
+
+```
+
+The predictions from the Poisson regression model are correlated with
+those from the linear model; however, the former are non-negative. As
+a result the Poisson regression predictions tend to be larger than
+those from the linear model for either very low or very high levels of
+ridership.
+
+In this section, we fit Poisson regression models using the `sm.GLM()` function with the argument
+`family=sm.families.Poisson()`. Earlier in this lab we used the `sm.GLM()` function
+with `family=sm.families.Binomial()` to perform logistic regression. Other
+choices for the `family` argument can be used to fit other types
+of GLMs. For instance, `family=sm.families.Gamma()` fits a Gamma regression
+model.
+
+
diff --git a/ISLP_labs_R4DS/Ch05-resample-lab copy.Rmd b/ISLP_labs_R4DS/Ch05-resample-lab copy.Rmd
new file mode 100644
index 0000000..1733c87
--- /dev/null
+++ b/ISLP_labs_R4DS/Ch05-resample-lab copy.Rmd
@@ -0,0 +1,564 @@
+---
+jupyter:
+  jupytext:
+    cell_metadata_filter: -all
+    formats: Rmd,ipynb
+    main_language: python
+    text_representation:
+      extension: .Rmd
+      format_name: rmarkdown
+      format_version: '1.2'
+    jupytext_version: 1.14.7
+---
+
+
+# Chapter 5
+
+
+
+
+# Lab: Cross-Validation and the Bootstrap
+In this lab, we explore the resampling techniques covered in this
+chapter.
Some of the commands in this lab may take a while to run on +your computer. + +We again begin by placing most of our imports at this top level. + +```{python} +import numpy as np +import statsmodels.api as sm +from ISLP import load_data +from ISLP.models import (ModelSpec as MS, + summarize, + poly) +from sklearn.model_selection import train_test_split + +``` + + +There are several new imports needed for this lab. + +```{python} +from functools import partial +from sklearn.model_selection import \ + (cross_validate, + KFold, + ShuffleSplit) +from sklearn.base import clone +from ISLP.models import sklearn_sm + +``` + + +## The Validation Set Approach +We explore the use of the validation set approach in order to estimate +the test error rates that result from fitting various linear models on +the `Auto` data set. + +We use the function `train_test_split()` to split +the data into training and validation sets. As there are 392 observations, +we split into two equal sets of size 196 using the +argument `test_size=196`. It is generally a good idea to set a random seed +when performing operations like this that contain an +element of randomness, so that the results obtained can be reproduced +precisely at a later time. We set the random seed of the splitter +with the argument `random_state=0`. + +```{python} +Auto = load_data('Auto') +Auto_train, Auto_valid = train_test_split(Auto, + test_size=196, + random_state=0) + +``` + +Now we can fit a linear regression using only the observations corresponding to the training set `Auto_train`. + +```{python} +hp_mm = MS(['horsepower']) +X_train = hp_mm.fit_transform(Auto_train) +y_train = Auto_train['mpg'] +model = sm.OLS(y_train, X_train) +results = model.fit() + +``` + +We now use the `predict()` method of `results` evaluated on the model matrix for this model +created using the validation data set. We also calculate the validation MSE of our model. + +```{python} +X_valid = hp_mm.transform(Auto_valid) +y_valid = Auto_valid['mpg'] +valid_pred = results.predict(X_valid) +np.mean((y_valid - valid_pred)**2) + +``` + +Hence our estimate for the validation MSE of the linear regression +fit is $23.62$. + +We can also estimate the validation error for +higher-degree polynomial regressions. We first provide a function `evalMSE()` that takes a model string as well +as a training and test set and returns the MSE on the test set. + +```{python} +def evalMSE(terms, + response, + train, + test): + + mm = MS(terms) + X_train = mm.fit_transform(train) + y_train = train[response] + + X_test = mm.transform(test) + y_test = test[response] + + results = sm.OLS(y_train, X_train).fit() + test_pred = results.predict(X_test) + + return np.mean((y_test - test_pred)**2) + +``` + +Let’s use this function to estimate the validation MSE +using linear, quadratic and cubic fits. We use the `enumerate()` function +here, which gives both the values and indices of objects as one iterates +over a for loop. + +```{python} +MSE = np.zeros(3) +for idx, degree in enumerate(range(1, 4)): + MSE[idx] = evalMSE([poly('horsepower', degree)], + 'mpg', + Auto_train, + Auto_valid) +MSE + +``` + +These error rates are $23.62, 18.76$, and $18.80$, respectively. If we +choose a different training/validation split instead, then we +can expect somewhat different errors on the validation set. 
+
+```{python}
+Auto_train, Auto_valid = train_test_split(Auto,
+                                          test_size=196,
+                                          random_state=3)
+MSE = np.zeros(3)
+for idx, degree in enumerate(range(1, 4)):
+    MSE[idx] = evalMSE([poly('horsepower', degree)],
+                       'mpg',
+                       Auto_train,
+                       Auto_valid)
+MSE
+```
+
+Using this split of the observations into a training set and a validation set,
+we find that the validation set error rates for the models with linear, quadratic, and cubic terms are $20.76$, $16.95$, and $16.97$, respectively.
+
+These results are consistent with our previous findings: a model that
+predicts `mpg` using a quadratic function of `horsepower`
+performs better than a model that involves only a linear function of
+`horsepower`, and there is no evidence of an improvement in using a cubic function of `horsepower`.
+
+
+## Cross-Validation
+In theory, the cross-validation estimate can be computed for any generalized
+linear model.
+In practice, however, the simplest way to cross-validate in
+Python is to use `sklearn`, which has a different interface, or API,
+than `statsmodels`, the code we have been using to fit GLMs.
+
+This is a problem which often confronts data scientists: "I have a function to do task $A$, and need to feed it into something that performs task $B$, so that I can compute $B(A(D))$, where $D$ is my data." When $A$ and $B$ don't naturally speak to each other, this
+requires the use of a *wrapper*.
+In the `ISLP` package,
+we provide
+a wrapper, `sklearn_sm()`, that enables us to easily use the cross-validation tools of `sklearn` with
+models fit by `statsmodels`.
+
+The class `sklearn_sm()`
+has as its first argument
+a model from `statsmodels`. It can take two additional
+optional arguments: `model_str`, which can be
+used to specify a formula, and `model_args`, which should
+be a dictionary of additional arguments used when fitting
+the model. For example, to fit a logistic regression model
+we have to specify a `family` argument. This
+is passed as `model_args={'family':sm.families.Binomial()}`.
+
+Here is our wrapper in action:
+
+```{python}
+hp_model = sklearn_sm(sm.OLS,
+                      MS(['horsepower']))
+X, Y = Auto.drop(columns=['mpg']), Auto['mpg']
+cv_results = cross_validate(hp_model,
+                            X,
+                            Y,
+                            cv=Auto.shape[0])
+cv_err = np.mean(cv_results['test_score'])
+cv_err
+
+```
+The arguments to `cross_validate()` are as follows: an
+object with the appropriate `fit()`, `predict()`,
+and `score()` methods, an
+array of features `X` and a response `Y`.
+We also included an additional argument `cv` to `cross_validate()`; specifying an integer
+$K$ results in $K$-fold cross-validation. We have provided a value
+corresponding to the total number of observations, which results in
+leave-one-out cross-validation (LOOCV). The `cross_validate()` function produces a dictionary with several components;
+we simply want the cross-validated test score here (MSE), which is estimated to be 24.23.
+
+
+We can repeat this procedure for increasingly complex polynomial fits.
+To automate the process, we again
+use a for loop which iteratively fits polynomial
+regressions of degree 1 to 5, computes the
+associated cross-validation error, and stores it in the $i$th element
+of the vector `cv_error`. The variable `d` in the for loop
+corresponds to the degree of the polynomial. We begin by initializing the
+vector. This command may take a couple of seconds to run.
+
+```{python}
+cv_error = np.zeros(5)
+H = np.array(Auto['horsepower'])
+M = sklearn_sm(sm.OLS)
+for i, d in enumerate(range(1,6)):
+    X = np.power.outer(H, np.arange(d+1))
+    M_CV = cross_validate(M,
+                          X,
+                          Y,
+                          cv=Auto.shape[0])
+    cv_error[i] = np.mean(M_CV['test_score'])
+cv_error
+
+```
+As in Figure 5.4, we see a sharp drop in the estimated test MSE between the linear and
+quadratic fits, but then no clear improvement from using higher-degree polynomials.
+
+Above we introduced the `outer()` method of the `np.power()`
+function. The `outer()` method is applied to an operation
+that has two arguments, such as `add()`, `min()`, or
+`power()`.
+It has two arrays as
+arguments, and then forms a larger
+array where the operation is applied to each pair of elements of the
+two arrays.
+
+```{python}
+A = np.array([3, 5, 9])
+B = np.array([2, 4])
+np.add.outer(A, B)
+
+```
+
+In the CV example above, we used $K=n$, but of course we can also use $K<n$.
+
+Here we consider the task of predicting whether an individual earns more than \$250,000 per year, using the degree-4 polynomial basis `poly_age` in `age` constructed earlier.
+
+```{python}
+X = poly_age.transform(Wage)              # polynomial basis in age
+high_earn = Wage['high_earn'] = y > 250   # shorthand
+glm = sm.GLM(y > 250,
+             X,
+             family=sm.families.Binomial())
+B = glm.fit()
+summarize(B)
+
+```
+
+
+Once again, we make predictions using the `get_prediction()` method.
+
+```{python}
+newX = poly_age.transform(age_df)
+preds = B.get_prediction(newX)
+bands = preds.conf_int(alpha=0.05)
+
+```
+
+We now plot the estimated relationship.
+
+```{python}
+fig, ax = subplots(figsize=(8,8))
+rng = np.random.default_rng(0)
+ax.scatter(age +
+           0.2 * rng.uniform(size=y.shape[0]),
+           np.where(high_earn, 0.198, 0.002),
+           fc='gray',
+           marker='|')
+for val, ls in zip([preds.predicted_mean,
+                    bands[:,0],
+                    bands[:,1]],
+                   ['b','r--','r--']):
+    ax.plot(age_df.values, val, ls, linewidth=3)
+ax.set_title('Degree-4 Polynomial', fontsize=20)
+ax.set_xlabel('Age', fontsize=20)
+ax.set_ylim([0,0.2])
+ax.set_ylabel('P(Wage > 250)', fontsize=20);
+
+```
+We have drawn the `age` values corresponding to the observations with
+`wage` values above 250 as gray marks on the top of the plot, and
+those with `wage` values below 250 are shown as gray marks on the
+bottom of the plot. We added a small amount of noise to jitter
+the `age` values a bit so that observations with the same `age`
+value do not cover each other up. This type of plot is often called a
+*rug plot*.
+
+In order to fit a step function, as discussed in
+Section 7.2, we first use the `pd.qcut()`
+function to discretize `age` based on quantiles. Then we use `pd.get_dummies()` to create the
+columns of the model matrix for this categorical variable. Note that this function will
+include *all* columns for a given categorical, rather than the usual approach which drops one
+of the levels.
+
+```{python}
+cut_age = pd.qcut(age, 4)
+summarize(sm.OLS(y, pd.get_dummies(cut_age)).fit())
+
+```
+
+
+Here `pd.qcut()` automatically picked the cutpoints based on the quantiles 25%, 50% and 75%, which results in four regions. We could also have specified our own
+quantiles directly instead of the argument `4`. For cuts not based
+on quantiles we would use the `pd.cut()` function.
+The function `pd.qcut()` (and `pd.cut()`) returns an ordered categorical variable.
+The regression model then creates a set of
+dummy variables for use in the regression. Since `age` is the only variable in the model, the value \$94,158.40 is the average salary for those under 33.75 years of
+age, and the other coefficients are the average
+salary for those in the other age groups. We can produce
+predictions and plots just as we did in the case of the polynomial
+fit.
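+
+As a sketch of that last step (not in the original lab, and assuming the `age_grid` array constructed earlier in this lab), we can predict on a grid by cutting it with the same bin edges that `pd.qcut()` chose on the training data:
+
+```{python}
+step_fit = sm.OLS(y, pd.get_dummies(cut_age)).fit()
+# reuse the training bins so the grid gets the same four dummy columns
+grid_cut = pd.cut(age_grid, pd.IntervalIndex(cut_age.cat.categories))
+step_preds = step_fit.predict(pd.get_dummies(grid_cut))
+
+```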
+
+
+## Splines
+In order to fit regression splines, we use transforms
+from the `ISLP` package. The actual spline
+evaluation functions are in the `scipy.interpolate` package;
+we have simply wrapped them as transforms
+similar to `Poly()` and `PCA()`.
+
+In Section 7.4, we saw
+that regression splines can be fit by constructing an appropriate
+matrix of basis functions. The `BSpline()` function generates the
+entire matrix of basis functions for splines with the specified set of
+knots. By default, the B-splines produced are cubic. To change the degree, use
+the argument `degree`.
+
+```{python}
+bs_ = BSpline(internal_knots=[25,40,60], intercept=True).fit(age)
+bs_age = bs_.transform(age)
+bs_age.shape
+
+```
+This results in a seven-column matrix, which is what is expected for a cubic-spline basis with 3 interior knots.
+We can form this same matrix using the `bs()` object,
+which facilitates adding this to a model-matrix builder (as in `poly()` versus its workhorse `Poly()`), as described in Section 7.8.1.
+
+We now fit a cubic spline model to the `Wage` data.
+
+```{python}
+bs_age = MS([bs('age', internal_knots=[25,40,60])])
+Xbs = bs_age.fit_transform(Wage)
+M = sm.OLS(y, Xbs).fit()
+summarize(M)
+
+```
+The column names are a little cumbersome, and have caused us to truncate the printed summary. They can be set on construction using the `name` argument as follows.
+
+```{python}
+bs_age = MS([bs('age',
+                internal_knots=[25,40,60],
+                name='bs(age)')])
+Xbs = bs_age.fit_transform(Wage)
+M = sm.OLS(y, Xbs).fit()
+summarize(M)
+
+```
+
+Notice that there are 6 spline coefficients rather than 7. This is because, by default,
+`bs()` assumes `intercept=False`, since we typically have an overall intercept in the model.
+So it generates the spline basis with the given knots, and then discards one of the basis functions to account for the intercept.
+
+We could also use the `df` (degrees of freedom) option to
+specify the complexity of the spline. We see above that with 3 knots,
+the spline basis has 6 columns or degrees of freedom. When we specify
+`df=6` rather than the actual knots, `bs()` will produce a
+spline with 3 knots chosen at uniform quantiles of the training data.
+We can see these chosen knots most easily using `BSpline()` directly:
+
+```{python}
+BSpline(df=6).fit(age).internal_knots_
+
+```
+When asking for six degrees of freedom,
+the transform chooses knots at ages 33.75, 42.0, and 51.0,
+which correspond to the 25th, 50th, and 75th percentiles of
+`age`.
+
+When using B-splines we need not limit ourselves to cubic polynomials
+(i.e. `degree=3`). For instance, using `degree=0` results
+in piecewise constant functions, as in our example with
+`pd.qcut()` above.
+
+
+```{python}
+bs_age0 = MS([bs('age',
+                 df=3,
+                 degree=0)]).fit(Wage)
+Xbs0 = bs_age0.transform(Wage)
+summarize(sm.OLS(y, Xbs0).fit())
+
+```
+This fit should be compared with cell [15] above, where we used `qcut()`
+to create four bins by cutting at the 25%, 50% and 75% quantiles of
+`age`. Since we specified `df=3` for degree-zero splines here, there will also be
+knots at the same three quantiles. Although the coefficients appear different, we
+see that this is a result of the different coding. For example, the
+first coefficient is identical in both cases, and is the mean response
+in the first bin. For the second coefficient, we have
+$94.158 + 22.349 = 116.507 \approx 116.611$, the latter being the mean
+in the second bin in cell [15].
Here the intercept is coded by a column +of ones, so the second, third and fourth coefficients are increments +for those bins. Why is the sum not exactly the same? It turns out that +the `qcut()` uses $\leq$, while `bs()` uses $<$ when +deciding bin membership. + + + + + + + + + + + +In order to fit a natural spline, we use the `NaturalSpline()` +transform with the corresponding helper `ns()`. Here we fit a natural spline with five +degrees of freedom (excluding the intercept) and plot the results. + +```{python} +ns_age = MS([ns('age', df=5)]).fit(Wage) +M_ns = sm.OLS(y, ns_age.transform(Wage)).fit() +summarize(M_ns) +``` +We now plot the natural spline using our plotting function. + +```{python} +plot_wage_fit(age_df, + ns_age, + 'Natural spline, df=5'); + +``` + +## Smoothing Splines and GAMs +A smoothing spline is a special case of a GAM with squared-error loss +and a single feature. To fit GAMs in `Python` we will use the +`pygam` package which can be installed via `pip install pygam`. The +estimator `LinearGAM()` uses squared-error loss. +The GAM is specified by associating each column +of a model matrix with a particular smoothing operation: +`s` for smoothing spline; `l` for linear, and `f` for factor or categorical variables. +The argument `0` passed to `s` below indicates that this smoother will +apply to the first column of a feature matrix. Below, we pass it a +matrix with a single column: `X_age`. The argument `lam` is the penalty parameter $\lambda$ as discussed in Section 7.5.2. + +```{python} +X_age = np.asarray(age).reshape((-1,1)) +gam = LinearGAM(s_gam(0, lam=0.6)) +gam.fit(X_age, y) + +``` + +The `pygam` library generally expects a matrix of features so we reshape `age` to be a matrix (a two-dimensional array) instead +of a vector (i.e. a one-dimensional array). The `-1` in the call to the `reshape()` method tells `numpy` to impute the +size of that dimension based on the remaining entries of the shape tuple. + +Let’s investigate how the fit changes with the smoothing parameter `lam`. +The function `np.logspace()` is similar to `np.linspace()` but spaces points +evenly on the log-scale. Below we vary `lam` from $10^{-2}$ to $10^6$. + +```{python} +fig, ax = subplots(figsize=(8,8)) +ax.scatter(age, y, facecolor='gray', alpha=0.5) +for lam in np.logspace(-2, 6, 5): + gam = LinearGAM(s_gam(0, lam=lam)).fit(X_age, y) + ax.plot(age_grid, + gam.predict(age_grid), + label='{:.1e}'.format(lam), + linewidth=3) +ax.set_xlabel('Age', fontsize=20) +ax.set_ylabel('Wage', fontsize=20); +ax.legend(title='$\lambda$'); + +``` + +The `pygam` package can perform a search for an optimal smoothing parameter. + +```{python} +gam_opt = gam.gridsearch(X_age, y) +ax.plot(age_grid, + gam_opt.predict(age_grid), + label='Grid search', + linewidth=4) +ax.legend() +fig + +``` + +Alternatively, we can fix the degrees of freedom of the smoothing +spline using a function included in the `ISLP.pygam` package. Below we +find a value of $\lambda$ that gives us roughly four degrees of +freedom. We note here that these degrees of freedom include the +unpenalized intercept and linear term of the smoothing spline, hence there are at least two +degrees of freedom. + +```{python} +age_term = gam.terms[0] +lam_4 = approx_lam(X_age, age_term, 4) +age_term.lam = lam_4 +degrees_of_freedom(X_age, age_term) + +``` + + +Let’s vary the degrees of freedom in a similar plot to above. 
We choose the degrees of freedom +as the desired degrees of freedom plus one to account for the fact that these smoothing +splines always have an intercept term. Hence, a value of one for `df` is just a linear fit. + +```{python} +fig, ax = subplots(figsize=(8,8)) +ax.scatter(X_age, + y, + facecolor='gray', + alpha=0.3) +for df in [1,3,4,8,15]: + lam = approx_lam(X_age, age_term, df+1) + age_term.lam = lam + gam.fit(X_age, y) + ax.plot(age_grid, + gam.predict(age_grid), + label='{:d}'.format(df), + linewidth=4) +ax.set_xlabel('Age', fontsize=20) +ax.set_ylabel('Wage', fontsize=20); +ax.legend(title='Degrees of freedom'); + +``` + +### Additive Models with Several Terms +The strength of generalized additive models lies in their ability to fit multivariate regression models with more flexibility than linear models. We demonstrate two approaches: the first in a more manual fashion using natural splines and piecewise constant functions, and the second using the `pygam` package and smoothing splines. + +We now fit a GAM by hand to predict +`wage` using natural spline functions of `year` and `age`, +treating `education` as a qualitative predictor, as in (7.16). +Since this is just a big linear regression model +using an appropriate choice of basis functions, we can simply do this +using the `sm.OLS()` function. + +We will build the model matrix in a more manual fashion here, since we wish +to access the pieces separately when constructing partial dependence plots. + +```{python} +ns_age = NaturalSpline(df=4).fit(age) +ns_year = NaturalSpline(df=5).fit(Wage['year']) +Xs = [ns_age.transform(age), + ns_year.transform(Wage['year']), + pd.get_dummies(Wage['education']).values] +X_bh = np.hstack(Xs) +gam_bh = sm.OLS(y, X_bh).fit() + +``` +Here the function `NaturalSpline()` is the workhorse supporting +the `ns()` helper function. We chose to use all columns of the +indicator matrix for the categorical variable `education`, making an intercept redundant. +Finally, we stacked the three component matrices horizontally to form the model matrix `X_bh`. + +We now show how to construct partial dependence plots for each of the terms in our rudimentary GAM. We can do this by hand, +given grids for `age` and `year`. + We simply predict with new $X$ matrices, fixing all but one of the features at a time. + +```{python} +age_grid = np.linspace(age.min(), + age.max(), + 100) +X_age_bh = X_bh.copy()[:100] +X_age_bh[:] = X_bh[:].mean(0)[None,:] +X_age_bh[:,:4] = ns_age.transform(age_grid) +preds = gam_bh.get_prediction(X_age_bh) +bounds_age = preds.conf_int(alpha=0.05) +partial_age = preds.predicted_mean +center = partial_age.mean() +partial_age -= center +bounds_age -= center +fig, ax = subplots(figsize=(8,8)) +ax.plot(age_grid, partial_age, 'b', linewidth=3) +ax.plot(age_grid, bounds_age[:,0], 'r--', linewidth=3) +ax.plot(age_grid, bounds_age[:,1], 'r--', linewidth=3) +ax.set_xlabel('Age') +ax.set_ylabel('Effect on wage') +ax.set_title('Partial dependence of age on wage', fontsize=20); + +``` +Let's explain in some detail what we did above. The idea is to create a new prediction matrix, where all but the columns belonging to `age` are constant (and set to their training-data means). The four columns for `age` are filled in with the natural spline basis evaluated at the 100 values in `age_grid`. + +* We made a grid of length 100 in `age`, and created a matrix `X_age_bh` with 100 rows and the same number of columns as `X_bh`. +* We replaced every row of this matrix with the column means of the original. 
+* We then replace just the first four columns representing `age` with the natural spline basis computed at the values in `age_grid`. + +The remaining steps should by now be familiar. + +We also look at the effect of `year` on `wage`; the process is the same. + +```{python} +year_grid = np.linspace(2003, 2009, 100) +year_grid = np.linspace(Wage['year'].min(), + Wage['year'].max(), + 100) +X_year_bh = X_bh.copy()[:100] +X_year_bh[:] = X_bh[:].mean(0)[None,:] +X_year_bh[:,4:9] = ns_year.transform(year_grid) +preds = gam_bh.get_prediction(X_year_bh) +bounds_year = preds.conf_int(alpha=0.05) +partial_year = preds.predicted_mean +center = partial_year.mean() +partial_year -= center +bounds_year -= center +fig, ax = subplots(figsize=(8,8)) +ax.plot(year_grid, partial_year, 'b', linewidth=3) +ax.plot(year_grid, bounds_year[:,0], 'r--', linewidth=3) +ax.plot(year_grid, bounds_year[:,1], 'r--', linewidth=3) +ax.set_xlabel('Year') +ax.set_ylabel('Effect on wage') +ax.set_title('Partial dependence of year on wage', fontsize=20); + +``` + +We now fit the model (7.16) using smoothing splines rather +than natural splines. All of the +terms in (7.16) are fit simultaneously, taking each other +into account to explain the response. The `pygam` package only works with matrices, so we must convert +the categorical series `education` to its array representation, which can be found +with the `cat.codes` attribute of `education`. As `year` only has 7 unique values, we +use only seven basis functions for it. + +```{python} +gam_full = LinearGAM(s_gam(0) + + s_gam(1, n_splines=7) + + f_gam(2, lam=0)) +Xgam = np.column_stack([age, + Wage['year'], + Wage['education'].cat.codes]) +gam_full = gam_full.fit(Xgam, y) + +``` +The two `s_gam()` terms result in smoothing spline fits, and use a default value for $\lambda$ (`lam=0.6`), which is somewhat arbitrary. For the categorical term `education`, specified using a `f_gam()` term, we specify `lam=0` to avoid any shrinkage. +We produce the partial dependence plot in `age` to see the effect of these choices. + +The values for the plot +are generated by the `pygam` package. We provide a `plot_gam()` +function for partial-dependence plots in `ISLP.pygam`, which makes this job easier than in our last example with natural splines. + +```{python} +fig, ax = subplots(figsize=(8,8)) +plot_gam(gam_full, 0, ax=ax) +ax.set_xlabel('Age') +ax.set_ylabel('Effect on wage') +ax.set_title('Partial dependence of age on wage - default lam=0.6', fontsize=20); + +``` + +We see that the function is somewhat wiggly. It is more natural to specify the `df` than a value for `lam`. +We refit a GAM using four degrees of freedom each for +`age` and `year`. Recall that the addition of one below takes into account the intercept +of the smoothing spline. + +```{python} +age_term = gam_full.terms[0] +age_term.lam = approx_lam(Xgam, age_term, df=4+1) +year_term = gam_full.terms[1] +year_term.lam = approx_lam(Xgam, year_term, df=4+1) +gam_full = gam_full.fit(Xgam, y) + +``` +Note that updating `age_term.lam` above updates it in `gam_full.terms[0]` as well! Likewise for `year_term.lam`. + +Repeating the plot for `age`, we see that it is much smoother. +We also produce the plot for `year`. + +```{python} +fig, ax = subplots(figsize=(8,8)) +plot_gam(gam_full, + 1, + ax=ax) +ax.set_xlabel('Year') +ax.set_ylabel('Effect on wage') +ax.set_title('Partial dependence of year on wage', fontsize=20) + +``` +Finally we plot `education`, which is categorical. 
The partial dependence plot is different, and more suitable for the set of fitted constants for each level of this variable. + +```{python} +fig, ax = subplots(figsize=(8, 8)) +ax = plot_gam(gam_full, 2) +ax.set_xlabel('Education') +ax.set_ylabel('Effect on wage') +ax.set_title('Partial dependence of wage on education', + fontsize=20); +ax.set_xticklabels(Wage['education'].cat.categories, fontsize=8); + +``` + +### ANOVA Tests for Additive Models +In all of our models, the function of `year` looks rather linear. We can +perform a series of ANOVA tests in order to determine which of these +three models is best: a GAM that excludes `year` ($\mathcal{M}_1$), +a GAM that uses a linear function of `year` ($\mathcal{M}_2$), or a +GAM that uses a spline function of `year` ($\mathcal{M}_3$). + +```{python} +gam_0 = LinearGAM(age_term + f_gam(2, lam=0)) +gam_0.fit(Xgam, y) +gam_linear = LinearGAM(age_term + + l_gam(1, lam=0) + + f_gam(2, lam=0)) +gam_linear.fit(Xgam, y) + +``` + +Notice our use of `age_term` in the expressions above. We do this because +earlier we set the value for `lam` in this term to achieve four degrees of freedom. + +To directly assess the effect of `year` we run an ANOVA on the +three models fit above. + +```{python} +anova_gam(gam_0, gam_linear, gam_full) + +``` + We find that there is compelling evidence that a GAM with a linear +function in `year` is better than a GAM that does not include +`year` at all ($p$-value= 0.002). However, there is no +evidence that a non-linear function of `year` is needed +($p$-value=0.435). In other words, based on the results of this +ANOVA, $\mathcal{M}_2$ is preferred. + +We can repeat the same process for `age` as well. We see there is very clear evidence that +a non-linear term is required for `age`. + +```{python} +gam_0 = LinearGAM(year_term + + f_gam(2, lam=0)) +gam_linear = LinearGAM(l_gam(0, lam=0) + + year_term + + f_gam(2, lam=0)) +gam_0.fit(Xgam, y) +gam_linear.fit(Xgam, y) +anova_gam(gam_0, gam_linear, gam_full) +``` + +There is a (verbose) `summary()` method for the GAM fit. + +```{python} +gam_full.summary() + +``` + +We can make predictions from `gam` objects, just like from +`lm` objects, using the `predict()` method for the class +`gam`. Here we make predictions on the training set. + +```{python} +Yhat = gam_full.predict(Xgam) + +``` + +In order to fit a logistic regression GAM, we use `LogisticGAM()` +from `pygam`. + +```{python} +gam_logit = LogisticGAM(age_term + + l_gam(1, lam=0) + + f_gam(2, lam=0)) +gam_logit.fit(Xgam, high_earn) + +``` + + +```{python} +fig, ax = subplots(figsize=(8, 8)) +ax = plot_gam(gam_logit, 2) +ax.set_xlabel('Education') +ax.set_ylabel('Effect on wage') +ax.set_title('Partial dependence of wage on education', + fontsize=20); +ax.set_xticklabels(Wage['education'].cat.categories, fontsize=8); + +``` +The model seems to be very flat, with especially high error bars for the first category. +Let's look at the data a bit more closely. + +```{python} +pd.crosstab(Wage['high_earn'], Wage['education']) + +``` +We see that there are no high earners in the first category of +education, meaning that the model will have a hard time fitting. We +will fit a logistic regression GAM excluding all observations falling into this +category. This provides more sensible results. + +To do so, +we could subset the model matrix, though this will not remove the +column from `Xgam`. 
While we can deduce which column corresponds +to this feature, for reproducibility’s sake we reform the model matrix +on this smaller subset. + + +```{python} +only_hs = Wage['education'] == '1. < HS Grad' +Wage_ = Wage.loc[~only_hs] +Xgam_ = np.column_stack([Wage_['age'], + Wage_['year'], + Wage_['education'].cat.codes-1]) +high_earn_ = Wage_['high_earn'] + +``` +In the second-to-last line above, we subtract one from the codes of the category, due to a bug in `pygam`. It just relabels +the education values and hence has no effect on the fit. + +We now fit the model. + +```{python} +gam_logit_ = LogisticGAM(age_term + + year_term + + f_gam(2, lam=0)) +gam_logit_.fit(Xgam_, high_earn_) + +``` + + +Let’s look at the effect of `education`, `year` and `age` on high earner status now that we’ve +removed those observations. + +```{python} +fig, ax = subplots(figsize=(8, 8)) +ax = plot_gam(gam_logit_, 2) +ax.set_xlabel('Education') +ax.set_ylabel('Effect on wage') +ax.set_title('Partial dependence of high earner status on education', fontsize=20); +ax.set_xticklabels(Wage['education'].cat.categories[1:], + fontsize=8); + +``` + +```{python} +fig, ax = subplots(figsize=(8, 8)) +ax = plot_gam(gam_logit_, 1) +ax.set_xlabel('Year') +ax.set_ylabel('Effect on wage') +ax.set_title('Partial dependence of high earner status on year', + fontsize=20); + +``` + +```{python} +fig, ax = subplots(figsize=(8, 8)) +ax = plot_gam(gam_logit_, 0) +ax.set_xlabel('Age') +ax.set_ylabel('Effect on wage') +ax.set_title('Partial dependence of high earner status on age', fontsize=20); + +``` + + +## Local Regression +We illustrate the use of local regression using the `lowess()` +function from `sm.nonparametric`. Some implementations of +GAMs allow terms to be local regression operators; this is not the case in `pygam`. + +Here we fit local linear regression models using spans of 0.2 +and 0.5; that is, each neighborhood consists of 20% or 50% of +the observations. As expected, using a span of 0.5 is smoother than 0.2. + +```{python} +lowess = sm.nonparametric.lowess +fig, ax = subplots(figsize=(8,8)) +ax.scatter(age, y, facecolor='gray', alpha=0.5) +for span in [0.2, 0.5]: + fitted = lowess(y, + age, + frac=span, + xvals=age_grid) + ax.plot(age_grid, + fitted, + label='{:.1f}'.format(span), + linewidth=4) +ax.set_xlabel('Age', fontsize=20) +ax.set_ylabel('Wage', fontsize=20); +ax.legend(title='span', fontsize=15); + +``` + + + diff --git a/ISLP_labs_R4DS/Ch08-baggboost-lab copy.Rmd b/ISLP_labs_R4DS/Ch08-baggboost-lab copy.Rmd new file mode 100644 index 0000000..67a23d6 --- /dev/null +++ b/ISLP_labs_R4DS/Ch08-baggboost-lab copy.Rmd @@ -0,0 +1,567 @@ +--- +jupyter: + jupytext: + cell_metadata_filter: -all + formats: Rmd,ipynb + main_language: python + text_representation: + extension: .Rmd + format_name: rmarkdown + format_version: '1.2' + jupytext_version: 1.14.7 +--- + + +# Chapter 8 + + + + +# Lab: Tree-Based Methods +We import some of our usual libraries at this top +level. + +```{python} +import numpy as np +import pandas as pd +from matplotlib.pyplot import subplots +import sklearn.model_selection as skm +from ISLP import load_data, confusion_table +from ISLP.models import ModelSpec as MS + +``` +We also collect the new imports +needed for this lab. 
+
+```{python}
+from sklearn.tree import (DecisionTreeClassifier as DTC,
+                          DecisionTreeRegressor as DTR,
+                          plot_tree,
+                          export_text)
+from sklearn.metrics import (accuracy_score,
+                             log_loss)
+from sklearn.ensemble import \
+     (RandomForestRegressor as RF,
+      GradientBoostingRegressor as GBR)
+from ISLP.bart import BART
+
+```
+
+
+## Fitting Classification Trees
+
+
+We first use classification trees to analyze the `Carseats` data set.
+In these data, `Sales` is a continuous variable, and so we begin
+by recoding it as a binary variable. We use the `where()`
+function to create a variable, called `High`, which takes on a
+value of `Yes` if the `Sales` variable exceeds 8, and takes
+on a value of `No` otherwise.
+
+```{python}
+Carseats = load_data('Carseats')
+High = np.where(Carseats.Sales > 8,
+                "Yes",
+                "No")
+
+```
+
+We now use `DecisionTreeClassifier()` to fit a classification tree in
+order to predict `High` using all variables but `Sales`.
+To do so, we must form a model matrix as we did when fitting regression
+models.
+
+```{python}
+model = MS(Carseats.columns.drop('Sales'), intercept=False)
+D = model.fit_transform(Carseats)
+feature_names = list(D.columns)
+X = np.asarray(D)
+
+```
+We have converted `D` from a data frame to an array `X`, which is needed in some of the analysis below. We also need the `feature_names` for annotating our plots later.
+
+There are several options needed to specify the classifier,
+such as `max_depth` (how deep to grow the tree), `min_samples_split`
+(minimum number of observations in a node to be eligible for splitting)
+and `criterion` (whether to use Gini or cross-entropy as the split criterion).
+We also set `random_state` for reproducibility; ties in the split criterion are broken at random.
+
+```{python}
+clf = DTC(criterion='entropy',
+          max_depth=3,
+          random_state=0)
+clf.fit(X, High)
+
+```
+
+
+In our discussion of qualitative features in Section 3.3,
+we noted that for a linear regression model such a feature could be
+represented by including a matrix of dummy variables (one-hot-encoding) in the model
+matrix, using the formula notation of `statsmodels`.
+As mentioned in Section 8.1, there is a more
+natural way to handle qualitative features when building a decision
+tree, that does not require such dummy variables; each split amounts to partitioning the levels into two groups.
+However,
+the `sklearn` implementation of decision trees does not take
+advantage of this approach; instead it simply treats the one-hot-encoded levels as separate variables.
+
+```{python}
+accuracy_score(High, clf.predict(X))
+
+```
+
+
+With only the default arguments, the training error rate is
+21%.
+For classification trees, we can
+access the value of the deviance using `log_loss()`,
+\begin{equation*}
+\begin{split}
+-2 \sum_m \sum_k n_{mk} \log \hat{p}_{mk},
+\end{split}
+\end{equation*}
+where $n_{mk}$ is the number of observations in the $m$th terminal
+node that belong to the $k$th class. (Note that `log_loss()` returns the *average* negative log-likelihood, so the deviance above is $2n$ times this value; either version can be used to compare fits on the same data.)
+
+```{python}
+resid_dev = np.sum(log_loss(High, clf.predict_proba(X)))
+resid_dev
+
+```
+
+This is closely related to the *entropy*, defined in (8.7).
+A small deviance indicates a
+tree that provides a good fit to the (training) data.
+
+One of the most attractive properties of trees is that they can
+be graphically displayed. Here we use the `plot_tree()` function
+to display the tree structure.
+
+```{python}
+ax = subplots(figsize=(12,12))[1]
+plot_tree(clf,
+          feature_names=feature_names,
+          ax=ax);
+
+```
+The most important indicator of `Sales` appears to be `ShelveLoc`.
+
+We can see a text representation of the tree using
+`export_text()`, which displays the split
+criterion (e.g. `Price <= 92.5`) for each branch.
+For leaf nodes it shows the overall prediction
+(`Yes` or `No`).
+We can also see the number of observations in that
+leaf that take on values of `Yes` and `No` by specifying `show_weights=True`.
+
+```{python}
+print(export_text(clf,
+                  feature_names=feature_names,
+                  show_weights=True))
+
+```
+
+In order to properly evaluate the performance of a classification tree
+on these data, we must estimate the test error rather than simply
+computing the training error. We split the observations into a
+training set and a test set, build the tree using the training set,
+and evaluate its performance on the test data. This pattern is
+similar to that in Chapter 6, with the linear models
+replaced here by decision trees --- the code for validation
+is almost identical. This approach leads to correct predictions
+for 68.5% of the locations in the test data set.
+
+```{python}
+validation = skm.ShuffleSplit(n_splits=1,
+                              test_size=200,
+                              random_state=0)
+results = skm.cross_validate(clf,
+                             D,
+                             High,
+                             cv=validation)
+results['test_score']
+
+```
+
+
+
+Next, we consider whether pruning the tree might lead to improved
+classification performance. We first split the data into a training and
+test set. We will use cross-validation to prune the tree on the training
+set, and then evaluate the performance of the pruned tree on the test
+set.
+
+```{python}
+(X_train,
+ X_test,
+ High_train,
+ High_test) = skm.train_test_split(X,
+                                   High,
+                                   test_size=0.5,
+                                   random_state=0)
+
+```
+We first refit the full tree on the training set; here we do not set a `max_depth` parameter, since we will learn that through cross-validation.
+
+
+```{python}
+clf = DTC(criterion='entropy', random_state=0)
+clf.fit(X_train, High_train)
+accuracy_score(High_test, clf.predict(X_test))
+
+```
+Next we use the `cost_complexity_pruning_path()` method of
+`clf` to extract cost-complexity values.
+
+```{python}
+ccp_path = clf.cost_complexity_pruning_path(X_train, High_train)
+kfold = skm.KFold(10,
+                  random_state=1,
+                  shuffle=True)
+
+```
+This yields a set of impurities and $\alpha$ values
+from which we can extract an optimal one by cross-validation.
+
+```{python}
+grid = skm.GridSearchCV(clf,
+                        {'ccp_alpha': ccp_path.ccp_alphas},
+                        refit=True,
+                        cv=kfold,
+                        scoring='accuracy')
+grid.fit(X_train, High_train)
+grid.best_score_
+
+```
+Let's take a look at the pruned tree.
+
+```{python}
+ax = subplots(figsize=(12, 12))[1]
+best_ = grid.best_estimator_
+plot_tree(best_,
+          feature_names=feature_names,
+          ax=ax);
+
+```
+This is quite a bushy tree. We could count the leaves, or query
+`best_` instead.
+
+```{python}
+best_.tree_.n_leaves
+
+```
+The tree with 30 terminal
+nodes results in the lowest cross-validation error rate, with an accuracy of
+68.5%. How well does this pruned tree perform on the test data set? Once
+again, we apply the `predict()` function.
+
+```{python}
+print(accuracy_score(High_test,
+                     best_.predict(X_test)))
+confusion = confusion_table(best_.predict(X_test),
+                            High_test)
+confusion
+
+```
+
+
+Now 72.0% of the test observations are correctly classified, which is slightly worse than the accuracy of the full tree (with 35 leaves).
So cross-validation has not helped us much here; it only pruned off 5 leaves, at a cost of a slightly worse error. These results would change if we were to change the random number seeds above; even though cross-validation gives an unbiased approach to model selection, it does have variance. + + + + +## Fitting Regression Trees +Here we fit a regression tree to the `Boston` data set. The +steps are similar to those for classification trees. + +```{python} +Boston = load_data("Boston") +model = MS(Boston.columns.drop('medv'), intercept=False) +D = model.fit_transform(Boston) +feature_names = list(D.columns) +X = np.asarray(D) + +``` + +First, we split the data into training and test sets, and fit the tree +to the training data. Here we use 30% of the data for the test set. + + +```{python} +(X_train, + X_test, + y_train, + y_test) = skm.train_test_split(X, + Boston['medv'], + test_size=0.3, + random_state=0) + +``` + +Having formed our training and test data sets, we fit the regression tree. + +```{python} +reg = DTR(max_depth=3) +reg.fit(X_train, y_train) +ax = subplots(figsize=(12,12))[1] +plot_tree(reg, + feature_names=feature_names, + ax=ax); + +``` + +The variable `lstat` measures the percentage of individuals with +lower socioeconomic status. The tree indicates that lower +values of `lstat` correspond to more expensive houses. +The tree predicts a median house price of $12,042 for small-sized homes (`rm < 6.8`), in +suburbs in which residents have low socioeconomic status (`lstat > 14.4`) and the crime-rate is moderate (`crim > 5.8`). + + +Now we use the cross-validation function to see whether pruning +the tree will improve performance. + +```{python} +ccp_path = reg.cost_complexity_pruning_path(X_train, y_train) +kfold = skm.KFold(5, + shuffle=True, + random_state=10) +grid = skm.GridSearchCV(reg, + {'ccp_alpha': ccp_path.ccp_alphas}, + refit=True, + cv=kfold, + scoring='neg_mean_squared_error') +G = grid.fit(X_train, y_train) + +``` + +In keeping with the cross-validation results, we use the pruned tree +to make predictions on the test set. + +```{python} +best_ = grid.best_estimator_ +np.mean((y_test - best_.predict(X_test))**2) + +``` + + +In other words, the test set MSE associated with the regression tree +is 28.07. The square root of +the MSE is therefore around +5.30, +indicating that this model leads to test predictions that are within around +$5300 +of the true median home value for the suburb. + +Let’s plot the best tree to see how interpretable it is. + +```{python} +ax = subplots(figsize=(12,12))[1] +plot_tree(G.best_estimator_, + feature_names=feature_names, + ax=ax); + +``` + + + + +## Bagging and Random Forests + + +Here we apply bagging and random forests to the `Boston` data, using +the `RandomForestRegressor()` from the `sklearn.ensemble` package. Recall +that bagging is simply a special case of a random forest with +$m=p$. Therefore, the `RandomForestRegressor()` function can be used to +perform both bagging and random forests. We start with bagging. + +```{python} +bag_boston = RF(max_features=X_train.shape[1], random_state=0) +bag_boston.fit(X_train, y_train) + +``` + + +The argument `max_features` indicates that all 12 predictors should +be considered for each split of the tree --- in other words, that +bagging should be done. How well does this bagged model perform on +the test set? 
+ +```{python} +ax = subplots(figsize=(8,8))[1] +y_hat_bag = bag_boston.predict(X_test) +ax.scatter(y_hat_bag, y_test) +np.mean((y_test - y_hat_bag)**2) + +``` + +The test set MSE associated with the bagged regression tree is +14.63, about half that obtained using an optimally-pruned single +tree. We could change the number of trees grown from the default of +100 by +using the `n_estimators` argument: + +```{python} +bag_boston = RF(max_features=X_train.shape[1], + n_estimators=500, + random_state=0).fit(X_train, y_train) +y_hat_bag = bag_boston.predict(X_test) +np.mean((y_test - y_hat_bag)**2) +``` +There is not much change. Bagging and random forests cannot overfit by +increasing the number of trees, but can underfit if the number is too small. + +Growing a random forest proceeds in exactly the same way, except that +we use a smaller value of the `max_features` argument. By default, +`RandomForestRegressor()` uses $p$ variables when building a random +forest of regression trees (i.e. it defaults to bagging), and `RandomForestClassifier()` uses +$\sqrt{p}$ variables when building a +random forest of classification trees. Here we use `max_features=6`. + +```{python} +RF_boston = RF(max_features=6, + random_state=0).fit(X_train, y_train) +y_hat_RF = RF_boston.predict(X_test) +np.mean((y_test - y_hat_RF)**2) + +``` + + +The test set MSE is 20.04; +this indicates that random forests did somewhat worse than bagging +in this case. Extracting the `feature_importances_` values from the fitted model, we can view the +importance of each variable. + +```{python} +feature_imp = pd.DataFrame( + {'importance':RF_boston.feature_importances_}, + index=feature_names) +feature_imp.sort_values(by='importance', ascending=False) +``` + This +is a relative measure of the total decrease in node impurity that results from +splits over that variable, averaged over all trees (this was plotted in Figure 8.9 for a model fit to the `Heart` data). + +The results indicate that across all of the trees considered in the +random forest, the wealth level of the community (`lstat`) and the +house size (`rm`) are by far the two most important variables. + + + + +## Boosting + + +Here we use `GradientBoostingRegressor()` from `sklearn.ensemble` +to fit boosted regression trees to the `Boston` data +set. For classification we would use `GradientBoostingClassifier()`. +The argument `n_estimators=5000` +indicates that we want 5000 trees, and the option +`max_depth=3` limits the depth of each tree. The +argument `learning_rate` is the $\lambda$ +mentioned earlier in the description of boosting. + +```{python} +boost_boston = GBR(n_estimators=5000, + learning_rate=0.001, + max_depth=3, + random_state=0) +boost_boston.fit(X_train, y_train) + +``` + +We can see how the training error decreases with the `train_score_` attribute. +To get an idea of how the test error decreases we can use the +`staged_predict()` method to get the predicted values along the path. 
+
+```{python}
+test_error = np.zeros_like(boost_boston.train_score_)
+for idx, y_ in enumerate(boost_boston.staged_predict(X_test)):
+   test_error[idx] = np.mean((y_test - y_)**2)
+
+plot_idx = np.arange(boost_boston.train_score_.shape[0])
+ax = subplots(figsize=(8,8))[1]
+ax.plot(plot_idx,
+        boost_boston.train_score_,
+        'b',
+        label='Training')
+ax.plot(plot_idx,
+        test_error,
+        'r',
+        label='Test')
+ax.legend();
+
+```
+
+We now use the boosted model to predict `medv` on the test set:
+
+```{python}
+y_hat_boost = boost_boston.predict(X_test);
+np.mean((y_test - y_hat_boost)**2)
+
+```
+
+The test MSE obtained is 14.48,
+similar to the test MSE for bagging. If we want to, we can
+perform boosting with a different value of the shrinkage parameter
+$\lambda$ in (8.10). The default value is 0.001, but
+this is easily modified. Here we take $\lambda=0.2$.
+
+```{python}
+boost_boston = GBR(n_estimators=5000,
+                   learning_rate=0.2,
+                   max_depth=3,
+                   random_state=0)
+boost_boston.fit(X_train,
+                 y_train)
+y_hat_boost = boost_boston.predict(X_test);
+np.mean((y_test - y_hat_boost)**2)
+
+```
+
+
+In this case, using $\lambda=0.2$ leads to almost the same test MSE
+as when using $\lambda=0.001$.
+
+
+
+
+## Bayesian Additive Regression Trees
+
+
+In this section we demonstrate a `Python` implementation of BART found in the
+`ISLP.bart` package. We fit a model
+to the `Boston` housing data set. This `BART()` estimator is
+designed for quantitative outcome variables, though other implementations are available for
+fitting logistic and probit models to categorical outcomes.
+
+```{python}
+bart_boston = BART(random_state=0, burnin=5, ndraw=15)
+bart_boston.fit(X_train, y_train)
+
+```
+
+
+On this data set, with this split into test and training, we see that the test error of BART is similar to that of a random forest.
+
+```{python}
+yhat_test = bart_boston.predict(X_test.astype(np.float32))
+np.mean((y_test - yhat_test)**2)
+
+```
+
+
+We can check how many times each variable appeared in the collection of trees.
+This gives a summary similar to the variable importance plot for boosting and random forests.
+
+```{python}
+var_inclusion = pd.Series(bart_boston.variable_inclusion_.mean(0),
+                          index=D.columns)
+var_inclusion
+
+```
+
+
+
+
diff --git a/ISLP_labs_R4DS/Ch09-svm-lab copy.Rmd b/ISLP_labs_R4DS/Ch09-svm-lab copy.Rmd
new file mode 100644
index 0000000..e08f732
--- /dev/null
+++ b/ISLP_labs_R4DS/Ch09-svm-lab copy.Rmd
@@ -0,0 +1,559 @@
+---
+jupyter:
+  jupytext:
+    cell_metadata_filter: -all
+    formats: Rmd,ipynb
+    main_language: python
+    text_representation:
+      extension: .Rmd
+      format_name: rmarkdown
+      format_version: '1.2'
+    jupytext_version: 1.14.7
+---
+
+
+# Chapter 9
+
+
+
+
+# Lab: Support Vector Machines
+In this lab, we use the `sklearn.svm` library to demonstrate the support
+vector classifier and the support vector machine.
+
+We import some of our usual libraries.
+
+```{python}
+import numpy as np
+from matplotlib.pyplot import subplots, cm
+import sklearn.model_selection as skm
+from ISLP import load_data, confusion_table
+
+```
+We also collect the new imports
+needed for this lab.
+
+```{python}
+from sklearn.svm import SVC
+from ISLP.svm import plot as plot_svm
+from sklearn.metrics import RocCurveDisplay
+
+```
+
+We will use the function `RocCurveDisplay.from_estimator()` to
+produce several ROC plots, using a shorthand `roc_curve`.

```{python}
roc_curve = RocCurveDisplay.from_estimator # shorthand

```

## Support Vector Classifier

We now use the `SupportVectorClassifier()` function (abbreviated `SVC()`) from `sklearn` to fit the support vector
classifier for a given value of the parameter `C`. The
`C` argument allows us to specify the cost of a violation to
the margin. When the `C` argument is small, then the margins
will be wide and many support vectors will be on the margin or will
violate the margin. When the `C` argument is large, then the
margins will be narrow and there will be few support vectors on the
margin or violating the margin.

Here we demonstrate
the use of `SVC()` on a two-dimensional example, so that we can
plot the resulting decision boundary. We begin by generating the
observations, which belong to two classes, and checking whether the
classes are linearly separable.

```{python}
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 2))
y = np.array([-1]*25+[1]*25)
X[y==1] += 1
fig, ax = subplots(figsize=(8,8))
ax.scatter(X[:,0],
           X[:,1],
           c=y,
           cmap=cm.coolwarm);

```
They are not. We now fit the classifier.

```{python}
svm_linear = SVC(C=10, kernel='linear')
svm_linear.fit(X, y)

```

The support vector classifier with two features can
be visualized by plotting values of its *decision function*.
We have included a function for this in the `ISLP` package (inspired by a similar
example in the `sklearn` docs).

```{python}
fig, ax = subplots(figsize=(8,8))
plot_svm(X,
         y,
         svm_linear,
         ax=ax)

```

The decision
boundary between the two classes is linear (because we used the
argument `kernel='linear'`). The support vectors are marked with `+`
and the remaining observations are plotted as circles.

What if we instead used a smaller value of the cost parameter?

```{python}
svm_linear_small = SVC(C=0.1, kernel='linear')
svm_linear_small.fit(X, y)
fig, ax = subplots(figsize=(8,8))
plot_svm(X,
         y,
         svm_linear_small,
         ax=ax)

```
With a smaller value of the cost parameter, we
obtain a larger number of support vectors, because the margin is now
wider. For linear kernels, we can extract the
coefficients of the linear decision boundary as follows:

```{python}
svm_linear.coef_

```

Since the support vector machine is an estimator in `sklearn`, we
can use the usual machinery to tune it.

```{python}
kfold = skm.KFold(5,
                  random_state=0,
                  shuffle=True)
grid = skm.GridSearchCV(svm_linear,
                        {'C':[0.001,0.01,0.1,1,5,10,100]},
                        refit=True,
                        cv=kfold,
                        scoring='accuracy')
grid.fit(X, y)
grid.best_params_

```

We can easily access the cross-validation errors for each of these models
in `grid.cv_results_`. This prints out a lot of detail, so we
extract the accuracy results only.

```{python}
grid.cv_results_['mean_test_score']

```
We see that `C=1` results in the highest cross-validation
accuracy of 0.74, though
the accuracy is the same for several values of `C`.
The classifier `grid.best_estimator_` can be used to predict the class
label on a set of test observations. Let’s generate a test data set.

```{python}
X_test = rng.standard_normal((20, 2))
y_test = np.array([-1]*10+[1]*10)
X_test[y_test==1] += 1

```

Now we predict the class labels of these test observations. Here we
use the best model selected by cross-validation in order to make the
predictions.
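As an aside (an illustrative check, not part of the original lab): because we passed `refit=True` above, the `grid` object itself exposes a `predict()` method that delegates to `grid.best_estimator_`, so the two give identical predictions.

```{python}
# Illustrative check: with refit=True, GridSearchCV.predict()
# uses the refitted best estimator.
np.all(grid.predict(X_test) == grid.best_estimator_.predict(X_test))
```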
+ +```{python} +best_ = grid.best_estimator_ +y_test_hat = best_.predict(X_test) +confusion_table(y_test_hat, y_test) + +``` + +Thus, with this value of `C`, +70% of the test +observations are correctly classified. What if we had instead used +`C=0.001`? + +```{python} +svm_ = SVC(C=0.001, + kernel='linear').fit(X, y) +y_test_hat = svm_.predict(X_test) +confusion_table(y_test_hat, y_test) + +``` + +In this case 60% of test observations are correctly classified. + +We now consider a situation in which the two classes are linearly +separable. Then we can find an optimal separating hyperplane using the +`SVC()` estimator. We first +further separate the two classes in our simulated data so that they +are linearly separable: + +```{python} +X[y==1] += 1.9; +fig, ax = subplots(figsize=(8,8)) +ax.scatter(X[:,0], X[:,1], c=y, cmap=cm.coolwarm); + +``` + +Now the observations are just barely linearly separable. + +```{python} +svm_ = SVC(C=1e5, kernel='linear').fit(X, y) +y_hat = svm_.predict(X) +confusion_table(y_hat, y) + +``` + +We fit the +support vector classifier and plot the resulting hyperplane, using a +very large value of `C` so that no observations are +misclassified. + +```{python} +fig, ax = subplots(figsize=(8,8)) +plot_svm(X, + y, + svm_, + ax=ax) + +``` +Indeed no training errors were made and only three support vectors were used. +In fact, the large value of `C` also means that these three support points are *on the margin*, and define it. +One may wonder how good the classifier could be on test data that depends on only three data points! + We now try a smaller +value of `C`. + +```{python} +svm_ = SVC(C=0.1, kernel='linear').fit(X, y) +y_hat = svm_.predict(X) +confusion_table(y_hat, y) + +``` + +Using `C=0.1`, we again do not misclassify any training observations, but we +also obtain a much wider margin and make use of twelve support +vectors. These jointly define the orientation of the decision boundary, and since there are more of them, it is more stable. It seems possible that this model will perform better on test +data than the model with `C=1e5` (and indeed, a simple experiment with a large test set would bear this out). + +```{python} +fig, ax = subplots(figsize=(8,8)) +plot_svm(X, + y, + svm_, + ax=ax) + +``` + + +## Support Vector Machine +In order to fit an SVM using a non-linear kernel, we once again use +the `SVC()` estimator. However, now we use a different value +of the parameter `kernel`. To fit an SVM with a polynomial +kernel we use `kernel="poly"`, and to fit an SVM with a +radial kernel we use +`kernel="rbf"`. In the former case we also use the +`degree` argument to specify a degree for the polynomial kernel +(this is $d$ in (9.22)), and in the latter case we use +`gamma` to specify a value of $\gamma$ for the radial basis +kernel (9.24). + +We first generate some data with a non-linear class boundary, as follows: + +```{python} +X = rng.standard_normal((200, 2)) +X[:100] += 2 +X[100:150] -= 2 +y = np.array([1]*150+[2]*50) + +``` + +Plotting the data makes it clear that the class boundary is indeed non-linear. + +```{python} +fig, ax = subplots(figsize=(8,8)) +ax.scatter(X[:,0], + X[:,1], + c=y, + cmap=cm.coolwarm) + +``` + + +The data is randomly split into training and testing groups. 
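As a reminder (restating (9.24) from the text), the radial kernel takes the form
\begin{equation*}
K(x_i, x_{i'}) = \exp\Bigl(-\gamma \sum_{j=1}^p (x_{ij} - x_{i'j})^2\Bigr),
\end{equation*}
so larger values of $\gamma$ make the fit more local, and hence more flexible.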
We then
fit the training data using the `SVC()` estimator with a
radial kernel and $\gamma=1$:

```{python}
(X_train,
 X_test,
 y_train,
 y_test) = skm.train_test_split(X,
                                y,
                                test_size=0.5,
                                random_state=0)
svm_rbf = SVC(kernel="rbf", gamma=1, C=1)
svm_rbf.fit(X_train, y_train)

```

The plot shows that the resulting SVM has a decidedly non-linear
boundary.

```{python}
fig, ax = subplots(figsize=(8,8))
plot_svm(X_train,
         y_train,
         svm_rbf,
         ax=ax)

```

We can see from the figure that there are a fair number of training
errors in this SVM fit. If we increase the value of `C`, we
can reduce the number of training errors. However, this comes at the
price of a more irregular decision boundary that seems to be at risk
of overfitting the data.

```{python}
svm_rbf = SVC(kernel="rbf", gamma=1, C=1e5)
svm_rbf.fit(X_train, y_train)
fig, ax = subplots(figsize=(8,8))
plot_svm(X_train,
         y_train,
         svm_rbf,
         ax=ax)

```

We can perform cross-validation using `skm.GridSearchCV()` to select the
best choice of $\gamma$ and `C` for an SVM with a radial
kernel:

```{python}
kfold = skm.KFold(5,
                  random_state=0,
                  shuffle=True)
grid = skm.GridSearchCV(svm_rbf,
                        {'C':[0.1,1,10,100,1000],
                         'gamma':[0.5,1,2,3,4]},
                        refit=True,
                        cv=kfold,
                        scoring='accuracy');
grid.fit(X_train, y_train)
grid.best_params_

```

The best choice of parameters under five-fold CV is achieved at `C=1`
and `gamma=0.5`, though several other combinations achieve the same
cross-validated accuracy.

```{python}
best_svm = grid.best_estimator_
fig, ax = subplots(figsize=(8,8))
plot_svm(X_train,
         y_train,
         best_svm,
         ax=ax)

y_hat_test = best_svm.predict(X_test)
confusion_table(y_hat_test, y_test)

```

With these parameters, 12% of test
observations are misclassified by this SVM.


## ROC Curves

SVMs and support vector classifiers output class labels for each
observation. However, it is also possible to obtain *fitted values*
for each observation, which are the numerical scores used to
obtain the class labels. For instance, in the case of a support vector
classifier, the fitted value for an observation $X= (X_1, X_2, \ldots,
X_p)^T$ takes the form $\hat{\beta}_0 + \hat{\beta}_1 X_1 +
\hat{\beta}_2 X_2 + \ldots + \hat{\beta}_p X_p$. For an SVM with a
non-linear kernel, the equation that yields the fitted value is given
in (9.23). The sign of the fitted value
determines on which side of the decision boundary the observation
lies. Therefore, the relationship between the fitted value and the
class prediction for a given observation is simple: if the fitted
value exceeds zero then the observation is assigned to one class, and
if it is less than zero then it is assigned to the other.
By changing this threshold from zero to some positive value,
we skew the classifications in favor of one class versus the other.
By considering a range of these thresholds, positive and negative, we produce the ingredients for a ROC plot.
We can access these values by calling the `decision_function()`
method of a fitted SVM estimator.

The function `RocCurveDisplay.from_estimator()` (which we have abbreviated to `roc_curve()`) will produce a plot of a ROC curve. It takes a fitted estimator as its first argument, followed
by a model matrix $X$ and labels $y$. The argument `name` is used in the legend,
while `color` is used for the color of the line. Results are plotted
on our axis object `ax`.
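Before producing the plot, we can peek directly at a few of these fitted values (an illustrative aside, not in the original lab); `decision_function()` returns one signed score per observation, and these scores are what the ROC curve thresholds.

```{python}
# Illustrative: the signed scores underlying the ROC curve.
best_svm.decision_function(X_train)[:5]
```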
+ +```{python} +fig, ax = subplots(figsize=(8,8)) +roc_curve(best_svm, + X_train, + y_train, + name='Training', + color='r', + ax=ax); + +``` + In this example, the SVM appears to provide accurate predictions. By increasing +$\gamma$ we can produce a more flexible fit and generate further +improvements in accuracy. + +```{python} +svm_flex = SVC(kernel="rbf", + gamma=50, + C=1) +svm_flex.fit(X_train, y_train) +fig, ax = subplots(figsize=(8,8)) +roc_curve(svm_flex, + X_train, + y_train, + name='Training $\gamma=50$', + color='r', + ax=ax); + +``` + +However, these ROC curves are all on the training data. We are really +more interested in the level of prediction accuracy on the test +data. When we compute the ROC curves on the test data, the model with +$\gamma=0.5$ appears to provide the most accurate results. + +```{python} +roc_curve(svm_flex, + X_test, + y_test, + name='Test $\gamma=50$', + color='b', + ax=ax) +fig; + +``` + +Let’s look at our tuned SVM. + +```{python} +fig, ax = subplots(figsize=(8,8)) +for (X_, y_, c, name) in zip( + (X_train, X_test), + (y_train, y_test), + ('r', 'b'), + ('CV tuned on training', + 'CV tuned on test')): + roc_curve(best_svm, + X_, + y_, + name=name, + ax=ax, + color=c) + +``` + +## SVM with Multiple Classes + +If the response is a factor containing more than two levels, then the +`SVC()` function will perform multi-class classification using +either the one-versus-one approach (when `decision_function_shape=='ovo'`) +or one-versus-rest {One-versus-rest is also known as one-versus-all.} (when `decision_function_shape=='ovr'`). +We explore that setting briefly here by +generating a third class of observations. + +```{python} +rng = np.random.default_rng(123) +X = np.vstack([X, rng.standard_normal((50, 2))]) +y = np.hstack([y, [0]*50]) +X[y==0,1] += 2 +fig, ax = subplots(figsize=(8,8)) +ax.scatter(X[:,0], X[:,1], c=y, cmap=cm.coolwarm); + +``` + +We now fit an SVM to the data: + +```{python} +svm_rbf_3 = SVC(kernel="rbf", + C=10, + gamma=1, + decision_function_shape='ovo'); +svm_rbf_3.fit(X, y) +fig, ax = subplots(figsize=(8,8)) +plot_svm(X, + y, + svm_rbf_3, + scatter_cmap=cm.tab10, + ax=ax) + +``` +The `sklearn.svm` library can also be used to perform support vector +regression with a numerical response using the estimator `SupportVectorRegression()`. + + +## Application to Gene Expression Data + +We now examine the `Khan` data set, which consists of a number of +tissue samples corresponding to four distinct types of small round +blue cell tumors. For each tissue sample, gene expression measurements +are available. The data set consists of training data, `xtrain` +and `ytrain`, and testing data, `xtest` and `ytest`. + +We examine the dimension of the data: + +```{python} +Khan = load_data('Khan') +Khan['xtrain'].shape, Khan['xtest'].shape + +``` + +This data set consists of expression measurements for 2,308 +genes. The training and test sets consist of 63 and 20 +observations, respectively. + +We will use a support vector approach to predict cancer subtype using +gene expression measurements. In this data set, there is a very +large number of features relative to the number of observations. This +suggests that we should use a linear kernel, because the additional +flexibility that will result from using a polynomial or radial kernel +is unnecessary. 

```{python}
khan_linear = SVC(kernel='linear', C=10)
khan_linear.fit(Khan['xtrain'], Khan['ytrain'])
confusion_table(khan_linear.predict(Khan['xtrain']),
                Khan['ytrain'])

```

We see that there are *no* training
errors. In fact, this is not surprising, because the large number of
variables relative to the number of observations implies that it is
easy to find hyperplanes that fully separate the classes. We are more
interested in the support vector classifier’s performance on the
test observations.

```{python}
confusion_table(khan_linear.predict(Khan['xtest']),
                Khan['ytest'])

```

We see that using `C=10` yields two test set errors on these data.


diff --git a/ISLP_labs_R4DS/Ch10-deeplearning-lab copy.Rmd b/ISLP_labs_R4DS/Ch10-deeplearning-lab copy.Rmd
new file mode 100644
index 0000000..ce234b5
--- /dev/null
+++ b/ISLP_labs_R4DS/Ch10-deeplearning-lab copy.Rmd
@@ -0,0 +1,1847 @@
---
jupyter:
  jupytext:
    cell_metadata_filter: -all
    formats: Rmd,ipynb
    text_representation:
      extension: .Rmd
      format_name: rmarkdown
      format_version: '1.2'
      jupytext_version: 1.14.7
  kernelspec:
    display_name: Python 3 (ipykernel)
    language: python
    name: python3
---


# Chapter 10

# Lab: Deep Learning
In this section we demonstrate how to fit the examples discussed
in the text. We use the `Python` `torch` package, along with the
`pytorch_lightning` package which provides utilities to simplify
fitting and evaluating models. This code can be impressively fast
with certain special processors, such as Apple’s new M1 chip. The package is well-structured, flexible, and will feel comfortable
to `Python` users. A good companion is the site
[pytorch.org/tutorials](https://pytorch.org/tutorials/beginner/basics/intro.html).
Much of our code is adapted from there, as well as the `pytorch_lightning` documentation.

We start with several standard imports that we have seen before.

```{python}
import numpy as np, pandas as pd
from matplotlib.pyplot import subplots
from sklearn.linear_model import \
     (LinearRegression,
      LogisticRegression,
      Lasso)
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import KFold
from sklearn.pipeline import Pipeline
from ISLP import load_data
from ISLP.models import ModelSpec as MS
from sklearn.model_selection import \
     (train_test_split,
      GridSearchCV)
```


### Torch-Specific Imports
There are a number of imports for `torch`. (These are not
included with `ISLP`, so must be installed separately.)
First we import the main library
and essential tools used to specify sequentially-structured networks.

```{python}
import torch
from torch import nn
from torch.optim import RMSprop
from torch.utils.data import TensorDataset

```

There are several other helper packages for `torch`. For instance,
the `torchmetrics` package has utilities to compute
various metrics to evaluate performance when fitting
a model. The `torchinfo` package provides a useful
summary of the layers of a model. We use the `read_image()`
function when loading test images in Section 10.9.4.

```{python}
from torchmetrics import (MeanAbsoluteError,
                          R2Score)
from torchinfo import summary

```

The package `pytorch_lightning` is a somewhat higher-level
interface to `torch` that simplifies the specification and
fitting of
models by reducing the amount of boilerplate code needed
(compared to using `torch` alone).
+ +```{python} +from pytorch_lightning import Trainer +from pytorch_lightning.loggers import CSVLogger + +``` + +In order to reproduce results we use `seed_everything()`. We will also instruct `torch` to use deterministic algorithms +where possible. + +```{python} +from pytorch_lightning import seed_everything +seed_everything(0, workers=True) +torch.use_deterministic_algorithms(True, warn_only=True) + +``` + +We will use several datasets shipped with `torchvision` for our +examples: a pretrained network for image classification, +as well as some transforms used for preprocessing. + +```{python} +from torchvision.io import read_image +from torchvision.datasets import MNIST, CIFAR100 +from torchvision.models import (resnet50, + ResNet50_Weights) +from torchvision.transforms import (Resize, + Normalize, + CenterCrop, + ToTensor) +``` +We have provided a few utilities in `ISLP` specifically for this lab. +The `SimpleDataModule` and `SimpleModule` are simple +versions of objects used in `pytorch_lightning`, the +high-level module for fitting `torch` models. Although more advanced +uses such as computing on graphical processing units (GPUs) and parallel data processing +are possible in this module, we will not be focusing much on these +in this lab. The `ErrorTracker` handles +collections of targets and predictions over each mini-batch +in the validation or test stage, allowing computation +of the metric over the entire validation or test data set. + +```{python} +from ISLP.torch import (SimpleDataModule, + SimpleModule, + ErrorTracker, + rec_num_workers) + +``` + +In addition we have included some helper +functions to load the +`IMDb` database, as well as a lookup that maps integers +to particular keys in the database. We’ve included +a slightly modified copy of the preprocessed +`IMDb` data from `keras`, a separate package +for fitting deep learning models. This saves us significant +preprocessing and allows us to focus on specifying and fitting +the models themselves. + +```{python} +from ISLP.torch.imdb import (load_lookup, + load_tensor, + load_sparse, + load_sequential) + +``` + +Finally, we introduce some utility imports not directly related to +`torch`. +The `glob()` function from the `glob` module is used +to find all files matching wildcard characters, which we will use +in our example applying the `ResNet50` model +to some of our own images. +The `json` module will be used to load +a JSON file for looking up classes to identify the labels of the +pictures in the `ResNet50` example. + +```{python} +from glob import glob +import json + +``` + + +## Single Layer Network on Hitters Data +We start by fitting the models in Section 10.6 on the `Hitters` data. + +```{python} +Hitters = load_data('Hitters').dropna() +n = Hitters.shape[0] + +``` + We will fit two linear models (least squares and lasso) and compare their performance +to that of a neural network. For this comparison we will use mean absolute error on a validation dataset. +\begin{equation*} +\begin{split} +\mbox{MAE}(y,\hat{y}) = \frac{1}{n} \sum_{i=1}^n |y_i-\hat{y}_i|. +\end{split} +\end{equation*} +We set up the model matrix and the response. + +```{python} +model = MS(Hitters.columns.drop('Salary'), intercept=False) +X = model.fit_transform(Hitters).to_numpy() +Y = Hitters['Salary'].to_numpy() + +``` +The `to_numpy()` method above converts `pandas` +data frames or series to `numpy` arrays. +We do this because we will need to use `sklearn` to fit the lasso model, +and it requires this conversion. 
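As an aside, the MAE criterion displayed above is, in code, simply the following (a small convenience sketch; the helper name `mae` is ours, and below we compute the same quantity inline rather than calling it):

```{python}
# Sketch: mean absolute error as defined in the formula above.
def mae(y, yhat):
    return np.abs(y - yhat).mean()
```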
We also use a linear regression method from `sklearn`, rather than the method
in Chapter 3 from `statsmodels`, to facilitate the comparisons.


We now split the data into test and training, fixing the random
state used by `sklearn` to do the split.

```{python}
(X_train,
 X_test,
 Y_train,
 Y_test) = train_test_split(X,
                            Y,
                            test_size=1/3,
                            random_state=1)
```

### Linear Models
We fit the linear model and evaluate the test error directly.

```{python}
hit_lm = LinearRegression().fit(X_train, Y_train)
Yhat_test = hit_lm.predict(X_test)
np.abs(Yhat_test - Y_test).mean()
```

Next we fit the lasso using `sklearn`. We are using
mean absolute error to select and evaluate a model, rather than mean squared error.
The specialized solver we used in Section 6.5.2 uses only mean squared error. So here, with a bit more work, we create a cross-validation grid and perform the cross-validation directly.

We encode a pipeline with two steps: we first normalize the features using a `StandardScaler()` transform,
and then fit the lasso without further normalization.

```{python}
scaler = StandardScaler(with_mean=True, with_std=True)
lasso = Lasso(warm_start=True, max_iter=30000)
standard_lasso = Pipeline(steps=[('scaler', scaler),
                                 ('lasso', lasso)])
```

We need to create a grid of values for $\lambda$. As is common practice,
we choose a grid of 100 values of $\lambda$, uniform on the log scale from `lam_max` down to `0.01*lam_max`. Here `lam_max` is the smallest value of
$\lambda$ with an all-zero solution. This value equals the largest absolute inner-product between any predictor and the (centered) response. {The derivation of this result is beyond the scope of this book.}

```{python}
X_s = scaler.fit_transform(X_train)
n = X_s.shape[0]
lam_max = np.fabs(X_s.T.dot(Y_train - Y_train.mean())).max() / n
param_grid = {'alpha': np.exp(np.linspace(0, np.log(0.01), 100))
              * lam_max}
```
Note that we had to transform the data first, since the scale of the variables impacts the choice of $\lambda$.
We now perform cross-validation using this sequence of $\lambda$ values.

```{python}
cv = KFold(10,
           shuffle=True,
           random_state=1)
grid = GridSearchCV(lasso,
                    param_grid,
                    cv=cv,
                    scoring='neg_mean_absolute_error')
grid.fit(X_train, Y_train);
```

We extract the lasso model with best cross-validated mean absolute error, and evaluate its
performance on `X_test` and `Y_test`, which were not used in
cross-validation.

```{python}
trained_lasso = grid.best_estimator_
Yhat_test = trained_lasso.predict(X_test)
np.fabs(Yhat_test - Y_test).mean()
```
This is similar to the results we got for the linear model fit by least squares. However, these results can vary a lot for different train/test splits; we encourage the reader to try a different seed in code block 12 and rerun the subsequent code up to this point.

### Specifying a Network: Classes and Inheritance
To fit the neural network, we first set up a model structure
that describes the network.
Doing so requires us to define new classes specific to the model we wish to fit.
Typically this is done in `pytorch` by sub-classing a generic
representation of a network, which is the approach we take here.
Although this example is simple, we will go through the steps in some detail, since it will serve us well
for the more complex examples to follow.
+ + +```{python} +class HittersModel(nn.Module): + + def __init__(self, input_size): + super(HittersModel, self).__init__() + self.flatten = nn.Flatten() + self.sequential = nn.Sequential( + nn.Linear(input_size, 50), + nn.ReLU(), + nn.Dropout(0.4), + nn.Linear(50, 1)) + + def forward(self, x): + x = self.flatten(x) + return torch.flatten(self.sequential(x)) + +``` + +The `class` statement identifies the code chunk as a +declaration for a class `HittersModel` +that inherits from the base class `nn.Module`. This base +class is ubiquitous in `torch` and represents the +mappings in the neural networks. + +Indented beneath the `class` statement are the methods of this class: +in this case `__init__` and `forward`. The `__init__` method is +called when an instance of the class is created as in the cell +below. In the methods, `self` always refers to an instance of the +class. In the `__init__` method, we have attached two objects to +`self` as attributes: `flatten` and `sequential`. These are used in +the `forward` method to describe the map that this module implements. + +There is one additional line in the `__init__` method, which +is a call to +`super()`. This function allows subclasses (i.e. `HittersModel`) +to access methods of the class they inherit from. For example, +the class `nn.Module` has its own `__init__` method, which is different from +the `HittersModel.__init__()` method we’ve written above. +Using `super()` allows us to call the method of the base class. For +`torch` models, we will always be making this `super()` call as it is necessary +for the model to be properly interpreted by `torch`. + +The object `nn.Module` has more methods than simply `__init__` and `forward`. These +methods are directly accessible to `HittersModel` instances because of this inheritance. +One such method we will see shortly is the `eval()` method, used +to disable dropout for when we want to evaluate the model on test data. + +```{python} +hit_model = HittersModel(X.shape[1]) + +``` + +The object `self.sequential` is a composition of four maps. The +first maps the 19 features of `Hitters` to 50 dimensions, introducing $50\times 19+50$ parameters +for the weights and *intercept* of the map (often called the *bias*). This layer +is then mapped to a ReLU layer followed by a 40% dropout layer, and finally a +linear map down to 1 dimension, again with a bias. The total number of +trainable parameters is therefore $50\times 19+50+50+1=1051$. + + + + +The package `torchinfo` provides a `summary()` function that neatly summarizes +this information. We specify the size of the input and see the size +of each tensor as it passes through layers of the network. + +```{python} +summary(hit_model, + input_size=X_train.shape, + col_names=['input_size', + 'output_size', + 'num_params']) + +``` +We have truncated the end of the output slightly, here and in subsequent uses. + +We now need to transform our training data into a form accessible to `torch`. +The basic +datatype in `torch` is a `tensor`, which is very similar +to an `ndarray` from early chapters. +We also note here that `torch` typically +works with 32-bit (*single precision*) +rather than 64-bit (*double precision*) floating point numbers. +We therefore convert our data to `np.float32` before +forming the tensor. +The $X$ and $Y$ tensors are then arranged into a `Dataset` +recognized by `torch` +using `TensorDataset()`. 
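As a quick check of the parameter arithmetic above (an illustrative aside, not in the original lab), we can total the trainable parameters directly:

```{python}
# Illustrative: total trainable parameters, matching 50*19+50+50+1.
sum(p.numel() for p in hit_model.parameters() if p.requires_grad)
```

Returning to the data preparation, we convert the training arrays and wrap them as tensors.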
+ +```{python} +X_train_t = torch.tensor(X_train.astype(np.float32)) +Y_train_t = torch.tensor(Y_train.astype(np.float32)) +hit_train = TensorDataset(X_train_t, Y_train_t) +``` +We do the same for the test data. + +```{python} +X_test_t = torch.tensor(X_test.astype(np.float32)) +Y_test_t = torch.tensor(Y_test.astype(np.float32)) +hit_test = TensorDataset(X_test_t, Y_test_t) + +``` + +Finally, this dataset is passed to a `DataLoader()` which ultimately +passes data into our network. While this may seem +like a lot of overhead, this structure is helpful for more +complex tasks where data may live on different machines, +or where data must be passed to a GPU. +We provide a helper function `SimpleDataModule()` in `ISLP` to make this task easier for +standard usage. +One of its arguments is `num_workers`, which indicates +how many processes we will use +for loading the data. For small +data like `Hitters` this will have little effect, but +it does provide an advantage for the `MNIST` and `CIFAR100` examples below. +The `torch` package will inspect the process running and determine a +maximum number of workers. {This depends on the computing hardware and the number of cores available.} We’ve included a function +`rec_num_workers()` to compute this so we know how many +workers might be reasonable (here the max was 16). + +```{python} +max_num_workers = rec_num_workers() +``` + +The general training setup in `pytorch_lightning` involves +training, validation and test data. These are each +represented by different data loaders. During each epoch, +we run a training step to learn the model and a validation +step to track the error. The test data is typically +used at the end of training to evaluate the model. + +In this case, as we had split only into test and training, +we’ll use the test data as validation data with the +argument `validation=hit_test`. The +`validation` argument can be a float between 0 and 1, an +integer, or a +`Dataset`. If a float (respectively, integer), it is interpreted +as a percentage (respectively number) of the *training* observations to be used for validation. +If it is a `Dataset`, it is passed directly to a data loader. + +```{python} +hit_dm = SimpleDataModule(hit_train, + hit_test, + batch_size=32, + num_workers=min(4, max_num_workers), + validation=hit_test) + +``` + +Next we must provide a `pytorch_lightning` module that controls +the steps performed during the training process. We provide methods for our +`SimpleModule()` that simply record the value +of the loss function and any additional +metrics at the end of each epoch. These operations +are controlled by the methods `SimpleModule.[training/test/validation]_step()`, though +we will not be modifying these in our examples. + +```{python} +hit_module = SimpleModule.regression(hit_model, + metrics={'mae':MeanAbsoluteError()}) + +``` + + By using the `SimpleModule.regression()` method, we indicate that we will use squared-error loss as in +(10.23). +We have also asked for mean absolute error to be tracked as well +in the metrics that are logged. + +We log our results via `CSVLogger()`, which in this case stores the results in a CSV file within a directory `logs/hitters`. After the fitting is complete, this allows us to load the +results as a `pd.DataFrame()` and visualize them below. There are +several ways to log the results within `pytorch_lightning`, though +we will not cover those here in detail. 
+ +```{python} +hit_logger = CSVLogger('logs', name='hitters') +``` + +Finally we are ready to train our model and log the results. We +use the `Trainer()` object from `pytorch_lightning` +to do this work. The argument `datamodule=hit_dm` tells the trainer +how training/validation/test logs are produced, +while the first argument `hit_module` +specifies the network architecture +as well as the training/validation/test steps. +The `callbacks` argument allows for +several tasks to be carried out at various +points while training a model. Here +our `ErrorTracker()` callback will enable +us to compute validation error while training +and, finally, the test error. +We now fit the model for 50 epochs. + +```{python} +hit_trainer = Trainer(deterministic=True, + max_epochs=50, + log_every_n_steps=5, + logger=hit_logger, + callbacks=[ErrorTracker()]) +hit_trainer.fit(hit_module, datamodule=hit_dm) +``` +At each step of SGD, the algorithm randomly selects 32 training observations for +the computation of the gradient. Recall from Section 10.7 +that an epoch amounts to the number of SGD steps required to process $n$ +observations. Since the training set has +$n=175$, and we specified a `batch_size` of 32 in the construction of `hit_dm`, an epoch is $175/32=5.5$ SGD steps. + +After having fit the model, we can evaluate performance on our test +data using the `test()` method of our trainer. + +```{python} +hit_trainer.test(hit_module, datamodule=hit_dm) + +``` + + +The results of the fit have been logged into a CSV file. We can find the +results specific to this run in the `experiment.metrics_file_path` +attribute of our logger. Note that each time the model is fit, the logger will output +results into a new subdirectory of our directory `logs/hitters`. + +We now create a plot of the MAE (mean absolute error) as a function of +the number of epochs. +First we retrieve the logged summaries. + +```{python} +hit_results = pd.read_csv(hit_logger.experiment.metrics_file_path) +``` + +Since we will produce similar plots in later examples, we write a +simple generic function to produce this plot. + +```{python} +def summary_plot(results, + ax, + col='loss', + valid_legend='Validation', + training_legend='Training', + ylabel='Loss', + fontsize=20): + for (column, + color, + label) in zip([f'train_{col}_epoch', + f'valid_{col}'], + ['black', + 'red'], + [training_legend, + valid_legend]): + results.plot(x='epoch', + y=column, + label=label, + marker='o', + color=color, + ax=ax) + ax.set_xlabel('Epoch') + ax.set_ylabel(ylabel) + return ax +``` +We now set up our axes, and use our function to produce the MAE plot. + +```{python} +fig, ax = subplots(1, 1, figsize=(6, 6)) +ax = summary_plot(hit_results, + ax, + col='mae', + ylabel='MAE', + valid_legend='Validation (=Test)') +ax.set_ylim([0, 400]) +ax.set_xticks(np.linspace(0, 50, 11).astype(int)); +``` + + +We can predict directly from the final model, and +evaluate its performance on the test data. +Before fitting, we call the `eval()` method +of `hit_model`. +This tells +`torch` to effectively consider this model to be fitted, so that +we can use it to predict on new data. For our model here, +the biggest change is that the dropout layers will +be turned off, i.e. no weights will be randomly +dropped in predicting on new data. + +```{python} +hit_model.eval() +preds = hit_module(X_test_t) +torch.abs(Y_test_t - preds).mean() +``` + + + +### Cleanup +In setting up our data module, we had initiated +several worker processes that will remain running. 
+We delete all references to the torch objects to ensure these processes +will be killed. + + +```{python} +del(Hitters, + hit_model, hit_dm, + hit_logger, + hit_test, hit_train, + X, Y, + X_test, X_train, + Y_test, Y_train, + X_test_t, Y_test_t, + hit_trainer, hit_module) + +``` + + +## Multilayer Network on the MNIST Digit Data +The `torchvision` package comes with a number of example datasets, +including the `MNIST` digit data. Our first step is to retrieve +the training and test data sets; the `MNIST()` function within +`torchvision.datasets` is provided for this purpose. The +data will be downloaded the first time this function is executed, and stored in the directory `data/MNIST`. + +```{python} +(mnist_train, + mnist_test) = [MNIST(root='data', + train=train, + download=True, + transform=ToTensor()) + for train in [True, False]] +mnist_train + +``` + +There are 60,000 images in the training data and 10,000 in the test +data. The images are $28\times 28$, and stored as a matrix of pixels. We +need to transform each one into a vector. + +Neural networks are somewhat sensitive to the scale of the inputs, much as ridge and +lasso regularization are affected by scaling. Here the inputs are eight-bit +grayscale values between 0 and 255, so we rescale to the unit +interval. {Note: eight bits means $2^8$, which equals 256. Since the convention +is to start at $0$, the possible values range from $0$ to $255$.} +This transformation, along with some reordering +of the axes, is performed by the `ToTensor()` transform +from the `torchvision.transforms` package. + +As in our `Hitters` example, we form a data module +from the training and test datasets, setting aside 20% +of the training images for validation. + +```{python} +mnist_dm = SimpleDataModule(mnist_train, + mnist_test, + validation=0.2, + num_workers=max_num_workers, + batch_size=256) + +``` + +Let’s take a look at the data that will get fed into our network. We loop through the first few +chunks of the test dataset, breaking after 2 batches: + +```{python} +for idx, (X_ ,Y_) in enumerate(mnist_dm.train_dataloader()): + print('X: ', X_.shape) + print('Y: ', Y_.shape) + if idx >= 1: + break + +``` + + +We see that the $X$ for each batch consists of 256 images of size `1x28x28`. +Here the `1` indicates a single channel (greyscale). For RGB images such as `CIFAR100` below, +we will see that the `1` in the size will be replaced by `3` for the three RGB channels. + +Now we are ready to specify our neural network. + +```{python} +class MNISTModel(nn.Module): + def __init__(self): + super(MNISTModel, self).__init__() + self.layer1 = nn.Sequential( + nn.Flatten(), + nn.Linear(28*28, 256), + nn.ReLU(), + nn.Dropout(0.4)) + self.layer2 = nn.Sequential( + nn.Linear(256, 128), + nn.ReLU(), + nn.Dropout(0.3)) + self._forward = nn.Sequential( + self.layer1, + self.layer2, + nn.Linear(128, 10)) + def forward(self, x): + return self._forward(x) +``` + +We see that in the first layer, each `1x28x28` image is flattened, then mapped to +256 dimensions where we apply a ReLU activation with 40% dropout. +A second layer maps the first layer’s output down to +128 dimensions, applying a ReLU activation with 30% dropout. Finally, +the 128 dimensions are mapped down to 10, the number of classes in the +`MNIST` data. + +```{python} +mnist_model = MNISTModel() + +``` + +We can check that the model produces output of expected size based +on our existing batch `X_` above. + +```{python} +mnist_model(X_).size() +``` + +Let’s take a look at the summary of the model. 
Instead of an `input_size` we can pass +a tensor of correct shape. In this case, we pass through the final +batched `X_` from above. + +```{python} +summary(mnist_model, + input_data=X_, + col_names=['input_size', + 'output_size', + 'num_params']) +``` + +Having set up both the model and the data module, fitting this model is +now almost identical to the `Hitters` example. In contrast to our regression model, here we will use the +`SimpleModule.classification()` method which +uses the cross-entropy loss function instead of mean squared error. It must be supplied with the number of +classes in the problem. + +```{python} +mnist_module = SimpleModule.classification(mnist_model, + num_classes=10) +mnist_logger = CSVLogger('logs', name='MNIST') + +``` + +Now we are ready to go. The final step is to supply training data, and fit the model. + +```{python} +mnist_trainer = Trainer(deterministic=True, + max_epochs=30, + logger=mnist_logger, + callbacks=[ErrorTracker()]) +mnist_trainer.fit(mnist_module, + datamodule=mnist_dm) + +``` +We have suppressed the output here, which is a progress report on the +fitting of the model, grouped by epoch. This is very useful, since on +large datasets fitting can take time. Fitting this model took 245 +seconds on a MacBook Pro with an Apple M1 Pro chip with 10 cores and 16 GB of RAM. +Here we specified a +validation split of 20%, so training is actually performed on +80% of the 60,000 observations in the training set. This is an +alternative to actually supplying validation data, like we did for the `Hitters` data. +SGD uses batches +of 256 observations in computing the gradient, and doing the +arithmetic, we see that an epoch corresponds to 188 gradient steps. + + +`SimpleModule.classification()` includes +an accuracy metric by default. Other +classification metrics can be added from `torchmetrics`. +We will use our `summary_plot()` function to display +accuracy across epochs. + +```{python} +mnist_results = pd.read_csv(mnist_logger.experiment.metrics_file_path) +fig, ax = subplots(1, 1, figsize=(6, 6)) +summary_plot(mnist_results, + ax, + col='accuracy', + ylabel='Accuracy') +ax.set_ylim([0.5, 1]) +ax.set_ylabel('Accuracy') +ax.set_xticks(np.linspace(0, 30, 7).astype(int)); + +``` +Once again we evaluate the accuracy using the `test()` method of our trainer. This model achieves +97% accuracy on the test data. + +```{python} +mnist_trainer.test(mnist_module, + datamodule=mnist_dm) +``` + +Table 10.1 also reports the error rates resulting from LDA (Chapter 4) and multiclass logistic +regression. For LDA we refer the reader to Section 4.7.3. +Although we could use the `sklearn` function `LogisticRegression()` to fit +multiclass logistic regression, we are set up here to fit such a model +with `torch`. +We just have an input layer and an output layer, and omit the hidden layers! + +```{python} +class MNIST_MLR(nn.Module): + def __init__(self): + super(MNIST_MLR, self).__init__() + self.linear = nn.Sequential(nn.Flatten(), + nn.Linear(784, 10)) + def forward(self, x): + return self.linear(x) + +mlr_model = MNIST_MLR() +mlr_module = SimpleModule.classification(mlr_model, + num_classes=10) +mlr_logger = CSVLogger('logs', name='MNIST_MLR') +``` + +```{python} +mlr_trainer = Trainer(deterministic=True, + max_epochs=30, + callbacks=[ErrorTracker()]) +mlr_trainer.fit(mlr_module, datamodule=mnist_dm) +``` +We fit the model just as before and compute the test results. 
+ +```{python} +mlr_trainer.test(mlr_module, + datamodule=mnist_dm) +``` +The accuracy is above 90% even for this pretty simple model. + +As in the `Hitters` example, we delete some of +the objects we created above. + +```{python} +del(mnist_test, + mnist_train, + mnist_model, + mnist_dm, + mnist_trainer, + mnist_module, + mnist_results, + mlr_model, + mlr_module, + mlr_trainer) +``` + + +## Convolutional Neural Networks +In this section we fit a CNN to the `CIFAR100` data, which is available in the `torchvision` +package. It is arranged in a similar fashion as the `MNIST` data. + +```{python} +(cifar_train, + cifar_test) = [CIFAR100(root="data", + train=train, + download=True) + for train in [True, False]] +``` + +```{python} +transform = ToTensor() +cifar_train_X = torch.stack([transform(x) for x in + cifar_train.data]) +cifar_test_X = torch.stack([transform(x) for x in + cifar_test.data]) +cifar_train = TensorDataset(cifar_train_X, + torch.tensor(cifar_train.targets)) +cifar_test = TensorDataset(cifar_test_X, + torch.tensor(cifar_test.targets)) +``` + +The `CIFAR100` dataset consists of 50,000 training images, each represented by a three-dimensional tensor: +each three-color image is represented as a set of three channels, each of which consists of +$32\times 32$ eight-bit pixels. We standardize as we did for the +digits, but keep the array structure. This is accomplished with the `ToTensor()` transform. + +Creating the data module is similar to the `MNIST` example. + +```{python} +cifar_dm = SimpleDataModule(cifar_train, + cifar_test, + validation=0.2, + num_workers=max_num_workers, + batch_size=128) + +``` +We again look at the shape of typical batches in our data loaders. + +```{python} +for idx, (X_ ,Y_) in enumerate(cifar_dm.train_dataloader()): + print('X: ', X_.shape) + print('Y: ', Y_.shape) + if idx >= 1: + break + +``` + + +Before we start, we look at some of the training images; similar code produced +Figure 10.5 on page 447. The example below also illustrates +that `TensorDataset` objects can be indexed with integers --- we are choosing +random images from the training data by indexing `cifar_train`. In order to display correctly, +we must reorder the dimensions by a call to `np.transpose()`. + +```{python} +fig, axes = subplots(5, 5, figsize=(10,10)) +rng = np.random.default_rng(4) +indices = rng.choice(np.arange(len(cifar_train)), 25, + replace=False).reshape((5,5)) +for i in range(5): + for j in range(5): + idx = indices[i,j] + axes[i,j].imshow(np.transpose(cifar_train[idx][0], + [1,2,0]), + interpolation=None) + axes[i,j].set_xticks([]) + axes[i,j].set_yticks([]) + +``` +Here the `imshow()` method recognizes from the shape of its argument that it is a 3-dimensional array, with the last dimension indexing the three RGB color channels. + +We specify a moderately-sized CNN for +demonstration purposes, similar in structure to Figure 10.8. +We use several layers, each consisting of convolution, ReLU, and max-pooling steps. +We first define a module that defines one of these layers. As in our +previous examples, we overwrite the `__init__()` and `forward()` methods +of `nn.Module`. This user-defined module can now be used in ways just like +`nn.Linear()` or `nn.Dropout()`. 
+ +```{python} +class BuildingBlock(nn.Module): + + def __init__(self, + in_channels, + out_channels): + + super(BuildingBlock, self).__init__() + self.conv = nn.Conv2d(in_channels=in_channels, + out_channels=out_channels, + kernel_size=(3,3), + padding='same') + self.activation = nn.ReLU() + self.pool = nn.MaxPool2d(kernel_size=(2,2)) + + def forward(self, x): + return self.pool(self.activation(self.conv(x))) + +``` + +Notice that we used the `padding = "same"` argument to +`nn.Conv2d()`, which ensures that the output channels have the +same dimension as the input channels. There are 32 channels in the first +hidden layer, in contrast to the three channels in the input layer. We +use a $3\times 3$ convolution filter for each channel in all the layers. Each +convolution is followed by a max-pooling layer over $2\times2$ +blocks. + +In forming our deep learning model for the `CIFAR100` data, we use several of our `BuildingBlock()` +modules sequentially. This simple example +illustrates some of the power of `torch`. Users can +define modules of their own, which can be combined in other +modules. Ultimately, everything is fit by a generic trainer. + +```{python} +class CIFARModel(nn.Module): + + def __init__(self): + super(CIFARModel, self).__init__() + sizes = [(3,32), + (32,64), + (64,128), + (128,256)] + self.conv = nn.Sequential(*[BuildingBlock(in_, out_) + for in_, out_ in sizes]) + + self.output = nn.Sequential(nn.Dropout(0.5), + nn.Linear(2*2*256, 512), + nn.ReLU(), + nn.Linear(512, 100)) + def forward(self, x): + val = self.conv(x) + val = torch.flatten(val, start_dim=1) + return self.output(val) + +``` + +We build the model and look at the summary. (We had created examples of `X_` earlier.) + +```{python} +cifar_model = CIFARModel() +summary(cifar_model, + input_data=X_, + col_names=['input_size', + 'output_size', + 'num_params']) +``` + + +The total number of trainable parameters is 964,516. +By studying the size of the parameters, we can see that the channels halve in both +dimensions +after each of these max-pooling operations. After the last of these we +have a layer with 256 channels of dimension $2\times 2$. These are then +flattened to a dense layer of size 1,024; +in other words, each of the $2\times 2$ matrices is turned into a +$4$-vector, and put side-by-side in one layer. This is followed by a +dropout regularization layer, then +another dense layer of size 512, and finally, the +output layer. + +Up to now, we have been using a default +optimizer in `SimpleModule()`. For these data, +experiments show that a smaller learning rate performs +better than the default 0.01. We use a +custom optimizer here with a learning rate of 0.001. +Besides this, the logging and training +follow a similar pattern to our previous examples. The optimizer +takes an argument `params` that informs +the optimizer which parameters are involved in SGD (stochastic gradient descent). + +We saw earlier that entries of a module’s parameters are tensors. In passing +the parameters to the optimizer we are doing more than +simply passing arrays; part of the structure of the graph +is encoded in the tensors themselves. 
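For instance (an illustrative peek, not in the original lab), each entry is a named tensor whose shape mirrors the layer that owns it:

```{python}
# Illustrative: the first few parameter tensors handed to the optimizer.
for name, p in list(cifar_model.named_parameters())[:3]:
    print(name, tuple(p.shape))
```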

```{python}
cifar_optimizer = RMSprop(cifar_model.parameters(), lr=0.001)
cifar_module = SimpleModule.classification(cifar_model,
                                           num_classes=100,
                                           optimizer=cifar_optimizer)
cifar_logger = CSVLogger('logs', name='CIFAR100')

```

```{python}
cifar_trainer = Trainer(deterministic=True,
                        max_epochs=30,
                        logger=cifar_logger,
                        callbacks=[ErrorTracker()])
cifar_trainer.fit(cifar_module,
                  datamodule=cifar_dm)

```

This model can take 10 minutes or more to run and achieves about 42% accuracy on the test
data. Although this is not terrible for 100-class data (a random
classifier gets 1% accuracy), searching the web we see results around
75%. Typically it takes a lot of architecture carpentry,
fiddling with regularization, and time, to achieve such results.

Let’s take a look at the validation and training accuracy
across epochs.

```{python}
log_path = cifar_logger.experiment.metrics_file_path
cifar_results = pd.read_csv(log_path)
fig, ax = subplots(1, 1, figsize=(6, 6))
summary_plot(cifar_results,
             ax,
             col='accuracy',
             ylabel='Accuracy')
ax.set_xticks(np.linspace(0, 10, 6).astype(int))
ax.set_ylabel('Accuracy')
ax.set_ylim([0, 1]);
```
Finally, we evaluate our model on our test data.

```{python}
cifar_trainer.test(cifar_module,
                   datamodule=cifar_dm)

```


### Hardware Acceleration
As deep learning has become ubiquitous in machine learning, hardware
manufacturers have produced special libraries that can
often speed up the gradient-descent steps.

For instance, Mac OS devices with the M1 chip may have the *Metal* programming framework
enabled, which can speed up the `torch`
computations. We present an example of how to use this acceleration.

The main changes are to the `Trainer()` call as well as to the metrics
that will be evaluated on the data. These metrics must be told where
the data will be located at evaluation time. This is
accomplished with a call to the `to()` method of the metrics.

```{python}
try:
    for name, metric in cifar_module.metrics.items():
        cifar_module.metrics[name] = metric.to('mps')
    cifar_trainer_mps = Trainer(accelerator='mps',
                                deterministic=True,
                                max_epochs=30)
    cifar_trainer_mps.fit(cifar_module,
                          datamodule=cifar_dm)
    cifar_trainer_mps.test(cifar_module,
                           datamodule=cifar_dm)
except:
    pass
```
This yields approximately two- or three-fold acceleration for each epoch.
We have protected this code block using `try:` and `except:`
clauses; if it works, we get the speedup, and if it fails, nothing happens.


## Using Pretrained CNN Models
We now show how to use a CNN pretrained on the `imagenet` database to classify natural
images, and demonstrate how we produced Figure 10.10.
We copied six JPEG images from a digital photo album into the
directory `book_images`. These images are available
from the data section of the ISLP book website. Download `book_images.zip`; when
clicked it creates the `book_images` directory.

The pretrained network we use is called `resnet50`; specification details can be found on the web.
We will read in the images, and
convert them into the array format expected by the `torch`
software to match the specifications in `resnet50`.
The conversion involves a resize, a crop and then a predefined standardization for each of the three channels.
We now read in the images and preprocess them.

```{python}
resize = Resize((232,232), antialias=True)
crop = CenterCrop(224)
normalize = Normalize([0.485,0.456,0.406],
                      [0.229,0.224,0.225])
imgfiles = sorted([f for f in glob('book_images/*')])
imgs = torch.stack([torch.div(crop(resize(read_image(f))), 255)
                    for f in imgfiles])
imgs = normalize(imgs)
imgs.size()
```


We now set up the trained network with the weights we read in code block 6. The model has 50 layers, with a fair bit of complexity.

```{python}
resnet_model = resnet50(weights=ResNet50_Weights.DEFAULT)
summary(resnet_model,
        input_data=imgs,
        col_names=['input_size',
                   'output_size',
                   'num_params'])

```
We set the mode to `eval()` to ensure that the model is ready to predict on new data.

```{python}
resnet_model.eval()
```
Inspecting the output above, we see that when setting up the
`resnet_model`, the authors defined a `Bottleneck`, much like our
`BuildingBlock` module.

We now feed our six images through the fitted network.

```{python}
img_preds = resnet_model(imgs)

```

Let’s look at the predicted probabilities for each of the top 3 choices. First we compute
the probabilities by applying the softmax to the logits in `img_preds`. Note that
we have had to call the `detach()` method on the tensor `img_preds` in order to convert
it to a more familiar `ndarray`.

```{python}
img_probs = np.exp(np.asarray(img_preds.detach()))
img_probs /= img_probs.sum(1)[:,None]

```

In order to see the class labels, we must download the index file associated with `imagenet`. {This is available from the book website and [s3.amazonaws.com/deep-learning-models/image-models/imagenet_class_index.json](https://s3.amazonaws.com/deep-learning-models/image-models/imagenet_class_index.json).}

```{python}
labs = json.load(open('imagenet_class_index.json'))
class_labels = pd.DataFrame([(int(k), v[1]) for k, v in
                             labs.items()],
                            columns=['idx', 'label'])
class_labels = class_labels.set_index('idx')
class_labels = class_labels.sort_index()

```

We’ll now construct a data frame for each image file
with the three labels having the highest probabilities as
estimated by the model above.

```{python}
for i, imgfile in enumerate(imgfiles):
    img_df = class_labels.copy()
    img_df['prob'] = img_probs[i]
    img_df = img_df.sort_values(by='prob', ascending=False)[:3]
    print(f'Image: {imgfile}')
    print(img_df.reset_index().drop(columns=['idx']))

```


We see that the model
is quite confident about `Flamingo.jpg`, but a little less so for the
other images.

We end this section with our usual cleanup.

```{python}
del(cifar_test,
    cifar_train,
    cifar_dm,
    cifar_module,
    cifar_logger,
    cifar_optimizer,
    cifar_trainer)
```


## IMDB Document Classification
We now implement models for sentiment classification (Section 10.4) on the `IMDB`
dataset. As mentioned above (code block 8), we are using
a preprocessed version of the `IMDB` dataset found in the
`keras` package. As `keras` uses `tensorflow`, a different
tensor and deep learning library, we have
converted the data to be suitable for `torch`. The
code used to convert from `keras` is
available in the module `ISLP.torch._make_imdb`. It
requires some of the `keras` packages to run. These data use a dictionary of size 10,000.
+ +We have stored three different representations of the review data for this lab: + +* `load_tensor()`, a sparse tensor version usable by `torch`; +* `load_sparse()`, a sparse matrix version usable by `sklearn`, since we will compare with a lasso fit; +* `load_sequential()`, a padded +version of the original sequence representation, limited to the last +500 words of each review. + + + +```{python} +(imdb_seq_train, + imdb_seq_test) = load_sequential(root='data/IMDB') +padded_sample = np.asarray(imdb_seq_train.tensors[0][0]) +sample_review = padded_sample[padded_sample > 0][:12] +sample_review[:12] + +``` +The datasets `imdb_seq_train` and `imdb_seq_test` are +both instances of the class `TensorDataset`. The +tensors used to construct them can be found in the `tensors` attribute, with +the first tensor the features `X` and the second the outcome `Y`. +We have taken the first row of features and stored it as `padded_sample`. In the preprocessing +used to form these data, sequences were padded with 0s in the beginning if they were +not long enough, hence we remove this padding by restricting to entries where +`padded_sample > 0`. We then provide the first 12 words of the sample review. + +We can find these words in the `lookup` dictionary from the `ISLP.torch.imdb` module. + +```{python} +lookup = load_lookup(root='data/IMDB') +' '.join(lookup[i] for i in sample_review) +``` + +For our first model, we have created a binary feature for each +of the 10,000 possible words in the dataset, with an entry of one +in the $i,j$ entry if word $j$ appears in review $i$. As most reviews +are quite short, such a feature matrix has over 98% zeros. These data +are accessed using `load_tensor()` from the `ISLP` library. + +```{python} +max_num_workers=10 +(imdb_train, + imdb_test) = load_tensor(root='data/IMDB') +imdb_dm = SimpleDataModule(imdb_train, + imdb_test, + validation=2000, + num_workers=min(6, max_num_workers), + batch_size=512) + +``` +We’ll use a two-layer model for our first model. + +```{python} +class IMDBModel(nn.Module): + + def __init__(self, input_size): + super(IMDBModel, self).__init__() + self.dense1 = nn.Linear(input_size, 16) + self.activation = nn.ReLU() + self.dense2 = nn.Linear(16, 16) + self.output = nn.Linear(16, 1) + + def forward(self, x): + val = x + for _map in [self.dense1, + self.activation, + self.dense2, + self.activation, + self.output]: + val = _map(val) + return torch.flatten(val) + +``` +We now instantiate our model and look at a summary. + +```{python} +imdb_model = IMDBModel(imdb_test.tensors[0].size()[1]) +summary(imdb_model, + input_size=imdb_test.tensors[0].size(), + col_names=['input_size', + 'output_size', + 'num_params']) + +``` + +We’ll again use +a smaller learning rate for these data, +hence we pass an `optimizer` to the +`SimpleModule`. +Since the reviews are classified into +positive or negative sentiment, we use +`SimpleModule.binary_classification()`. {Our use of + `binary_classification()` instead of `classification()` is + due to some subtlety in how `torchmetrics.Accuracy()` works, +as well as the data type of the targets.} + +```{python} +imdb_optimizer = RMSprop(imdb_model.parameters(), lr=0.001) +imdb_module = SimpleModule.binary_classification( + imdb_model, + optimizer=imdb_optimizer) + +``` + +Having loaded the datasets into a data module +and created a `SimpleModule`, the remaining steps +are familiar. 
+
+```{python}
+imdb_logger = CSVLogger('logs', name='IMDB')
+imdb_trainer = Trainer(deterministic=True,
+                       max_epochs=30,
+                       logger=imdb_logger,
+                       callbacks=[ErrorTracker()])
+imdb_trainer.fit(imdb_module,
+                 datamodule=imdb_dm)
+```
+
+Evaluating the test error yields roughly 86% accuracy.
+
+```{python}
+test_results = imdb_trainer.test(imdb_module, datamodule=imdb_dm)
+test_results
+```
+
+
+### Comparison to Lasso
+We now fit a lasso logistic regression model
+using `LogisticRegression()` from `sklearn`. Since `sklearn` does not recognize
+the sparse tensors of `torch`, we use a sparse
+matrix that is recognized by `sklearn`.
+
+```{python}
+((X_train, Y_train),
+ (X_valid, Y_valid),
+ (X_test, Y_test)) = load_sparse(validation=2000,
+                                 random_state=0,
+                                 root='data/IMDB')
+
+```
+
+Similar to what we did in
+Section 10.9.1,
+we construct a series of 50 values for the lasso regularization parameter $\lambda$.
+
+```{python}
+lam_max = np.abs(X_train.T * (Y_train - Y_train.mean())).max()
+lam_val = lam_max * np.exp(np.linspace(np.log(1),
+                                       np.log(1e-4), 50))
+
+```
+With `LogisticRegression()` the regularization parameter
+$C$ is specified as the inverse of $\lambda$. There are several
+solvers for logistic regression; here we use `liblinear`, which
+works well with the sparse input format.
+
+```{python}
+logit = LogisticRegression(penalty='l1',
+                           C=1/lam_max,
+                           solver='liblinear',
+                           warm_start=True,
+                           fit_intercept=True)
+
+```
+The path of 50 values takes approximately 40 seconds to run.
+
+```{python}
+coefs = []
+intercepts = []
+
+for l in lam_val:
+    logit.C = 1/l
+    logit.fit(X_train, Y_train)
+    coefs.append(logit.coef_.copy())
+    intercepts.append(logit.intercept_)
+
+```
+
+The coefficients and intercepts have an extraneous dimension, which can be removed
+by the `np.squeeze()` function.
+
+```{python}
+coefs = np.squeeze(coefs)
+intercepts = np.squeeze(intercepts)
+
+```
+We’ll now make a plot to compare our neural network results with the
+lasso.
+
+```{python}
+# %%capture
+fig, axes = subplots(1, 2, figsize=(16, 8), sharey=True)
+for ((X_, Y_),
+     data_,
+     color) in zip([(X_train, Y_train),
+                    (X_valid, Y_valid),
+                    (X_test, Y_test)],
+                   ['Training', 'Validation', 'Test'],
+                   ['black', 'red', 'blue']):
+    linpred_ = X_ * coefs.T + intercepts[None,:]
+    label_ = np.array(linpred_ > 0)
+    accuracy_ = np.array([np.mean(Y_ == l) for l in label_.T])
+    axes[0].plot(-np.log(lam_val / X_train.shape[0]),
+                 accuracy_,
+                 '.--',
+                 color=color,
+                 markersize=13,
+                 linewidth=2,
+                 label=data_)
+axes[0].legend()
+axes[0].set_xlabel(r'$-\log(\lambda)$', fontsize=20)
+axes[0].set_ylabel('Accuracy', fontsize=20)
+
+```
+Notice the use of `%%capture`, which suppresses the displaying of the partially completed figure. This is useful
+when making a complex figure, since the steps can be spread across two or more cells.
+We now add a plot of the neural network accuracy across epochs, and display the composed figure by simply entering its name at the end of the cell.
+
+```{python}
+imdb_results = pd.read_csv(imdb_logger.experiment.metrics_file_path)
+summary_plot(imdb_results,
+             axes[1],
+             col='accuracy',
+             ylabel='Accuracy')
+axes[1].set_xticks(np.linspace(0, 30, 7).astype(int))
+axes[1].set_ylabel('Accuracy', fontsize=20)
+axes[1].set_xlabel('Epoch', fontsize=20)
+axes[1].set_ylim([0.5, 1]);
+axes[1].axhline(test_results[0]['test_accuracy'],
+                color='blue',
+                linestyle='--',
+                linewidth=3)
+fig
+```
+From the graphs we see that the accuracy of the lasso logistic regression peaks at about $0.88$, as it does for the neural network.
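+
+As a quick follow-up (our addition, not part of the original lab), we can locate the value of $\lambda$ at which the validation accuracy of the lasso peaks, recomputing the validation accuracies exactly as in the plotting loop above.
+
+```{python}
+# Our addition: find the lambda that maximizes validation accuracy.
+# X_valid, Y_valid, coefs, intercepts and lam_val are all defined above.
+val_linpred = X_valid * coefs.T + intercepts[None,:]
+val_acc = np.array([np.mean(Y_valid == (l > 0))
+                    for l in np.asarray(val_linpred).T])
+best = np.argmax(val_acc)
+lam_val[best], val_acc[best]
+```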
+
+Once again, we end with a cleanup.
+
+```{python}
+del(imdb_model,
+    imdb_trainer,
+    imdb_logger,
+    imdb_dm,
+    imdb_train,
+    imdb_test)
+```
+
+
+## Recurrent Neural Networks
+In this lab we fit the models illustrated in
+Section 10.5.
+
+
+### Sequential Models for Document Classification
+Here we fit a simple LSTM RNN for sentiment prediction to
+the `IMDB` movie-review data, as discussed in Section 10.5.1.
+For an RNN we use the sequence of words in a document, taking their
+order into account. We loaded the preprocessed
+data at the beginning of
+Section 10.9.5.
+A script that details the preprocessing can be found in the
+`ISLP` library. Notably, since more than 90% of the documents
+had fewer than 500 words, we set the document length to 500. For
+longer documents, we used the last 500 words, and for shorter
+documents, we padded the front with blanks.
+
+
+```{python}
+imdb_seq_dm = SimpleDataModule(imdb_seq_train,
+                               imdb_seq_test,
+                               validation=2000,
+                               batch_size=300,
+                               num_workers=min(6, max_num_workers)
+                               )
+
+```
+
+The first layer of the RNN is an embedding layer of size 32, which will be
+learned during training. This layer one-hot encodes each document
+as a matrix of dimension $500 \times 10,003$, and then maps these
+$10,003$ dimensions down to $32$. {The extra 3 dimensions
+correspond to commonly occurring non-word entries in the reviews.}
+ Since each word is represented by an
+integer, this is effectively achieved by the creation of an embedding
+matrix of size $10,003\times 32$; each of the 500 integers in the
+document is then mapped to the appropriate 32 real numbers by
+indexing the corresponding rows of this matrix.
+
+
+The second layer is an LSTM with 32 units, and the output
+layer is a single logit for the binary classification task.
+In the last line of the `forward()` method below,
+we take the last 32-dimensional output of the LSTM and map it to our response.
+
+```{python}
+class LSTMModel(nn.Module):
+    def __init__(self, input_size):
+        super(LSTMModel, self).__init__()
+        self.embedding = nn.Embedding(input_size, 32)
+        self.lstm = nn.LSTM(input_size=32,
+                            hidden_size=32,
+                            batch_first=True)
+        self.dense = nn.Linear(32, 1)
+    def forward(self, x):
+        val, (h_n, c_n) = self.lstm(self.embedding(x))
+        return torch.flatten(self.dense(val[:,-1]))
+```
+We instantiate and take a look at the summary of the model, using the
+first 10 documents in the corpus.
+
+```{python}
+lstm_model = LSTMModel(X_test.shape[-1])
+summary(lstm_model,
+        input_data=imdb_seq_train.tensors[0][:10],
+        col_names=['input_size',
+                   'output_size',
+                   'num_params'])
+
+```
+
+The 10,003 is suppressed in the summary, but we see it in the
+parameter count, since $10,003\times 32=320,096$.
+
+```{python}
+lstm_module = SimpleModule.binary_classification(lstm_model)
+lstm_logger = CSVLogger('logs', name='IMDB_LSTM')
+
+```
+
+```{python}
+lstm_trainer = Trainer(deterministic=True,
+                       max_epochs=20,
+                       logger=lstm_logger,
+                       callbacks=[ErrorTracker()])
+lstm_trainer.fit(lstm_module,
+                 datamodule=imdb_seq_dm)
+
+```
+The rest is now similar to other networks we have fit. We
+track the test performance as the network is fit, and see that it attains 85% accuracy.
+
+```{python}
+lstm_trainer.test(lstm_module, datamodule=imdb_seq_dm)
+```
+
+We once again show the learning progress, followed by cleanup.
+
+```{python}
+lstm_results = pd.read_csv(lstm_logger.experiment.metrics_file_path)
+fig, ax = subplots(1, 1, figsize=(6, 6))
+summary_plot(lstm_results,
+             ax,
+             col='accuracy',
+             ylabel='Accuracy')
+ax.set_xticks(np.linspace(0, 20, 5).astype(int))
+ax.set_ylabel('Accuracy')
+ax.set_ylim([0.5, 1])
+
+```
+
+
+```{python}
+del(lstm_model,
+    lstm_trainer,
+    lstm_logger,
+    imdb_seq_dm,
+    imdb_seq_train,
+    imdb_seq_test)
+
+```
+
+
+### Time Series Prediction
+We now show how to fit the models in Section 10.5.2
+for time series prediction.
+We first load and standardize the data.
+
+```{python}
+NYSE = load_data('NYSE')
+cols = ['DJ_return', 'log_volume', 'log_volatility']
+X = pd.DataFrame(StandardScaler(
+                     with_mean=True,
+                     with_std=True).fit_transform(NYSE[cols]),
+                 columns=NYSE[cols].columns,
+                 index=NYSE.index)
+
+```
+
+Next we set up the lagged versions of the data, dropping
+any rows with missing values using the `dropna()` method.
+
+```{python}
+for lag in range(1, 6):
+    for col in cols:
+        newcol = np.zeros(X.shape[0]) * np.nan
+        newcol[lag:] = X[col].values[:-lag]
+        X.insert(len(X.columns), "{0}_{1}".format(col, lag), newcol)
+X.insert(len(X.columns), 'train', NYSE['train'])
+X = X.dropna()
+
+```
+
+Finally, we extract the response and the training indicator, and drop the current day’s `DJ_return` and
+`log_volatility` so that we predict only from previous days’ data.
+
+```{python}
+Y, train = X['log_volume'], X['train']
+X = X.drop(columns=['train'] + cols)
+X.columns
+
+```
+
+
+We first fit a simple linear model and compute the $R^2$ on the test data using
+the `score()` method.
+
+```{python}
+M = LinearRegression()
+M.fit(X[train], Y[train])
+M.score(X[~train], Y[~train])
+```
+
+We refit this model, including the factor variable `day_of_week`.
+For a categorical series in `pandas`, we can form the indicators
+using the `get_dummies()` function.
+
+```{python}
+X_day = pd.merge(X,
+                 pd.get_dummies(NYSE['day_of_week']),
+                 on='date')
+```
+ Note that we do not have
+to reinstantiate the linear regression model
+as its `fit()` method accepts a design matrix and a response directly.
+
+```{python}
+M.fit(X_day[train], Y[train])
+M.score(X_day[~train], Y[~train])
+```
+This model achieves an $R^2$ of about 46%.
+
+
+To fit the RNN, we must reshape the data, as it will expect 5 lagged
+versions of each feature, as indicated by the `(5,3)` input shape expected by
+the `nn.RNN()` layer below. We first
+ensure the columns of our data frame are such that a reshaped
+matrix will have the variables correctly lagged. We use the
+`reindex()` method to do this.
+
+For an input shape `(5,3)`, each row represents a lagged version of the three variables.
+The `nn.RNN()` layer also expects the first row of each
+observation to be earliest in time, so we must reverse the current order. Hence we loop over
+`range(5,0,-1)` below, which is
+an example of using a `slice()` to index
+iterable objects. The general notation is `start:end:step`.
+
+```{python}
+ordered_cols = []
+for lag in range(5,0,-1):
+    for col in cols:
+        ordered_cols.append('{0}_{1}'.format(col, lag))
+X = X.reindex(columns=ordered_cols)
+X.columns
+
+```
+We now reshape the data.
+
+```{python}
+X_rnn = X.to_numpy().reshape((-1,5,3))
+X_rnn.shape
+```
+By specifying the first size as -1, `numpy.reshape()` deduces its size based on the remaining arguments.
+
+Now we are ready to proceed with the RNN, which uses 12 hidden units, and 10%
+dropout.
+After passing through the RNN, we extract the final time point as `val[:,-1]`
+in `forward()` below.
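+
+(If the `[:,-1]` indexing is unfamiliar, here is a toy illustration of our own, not part of the original lab: it selects the last entry along the second axis, i.e. the last time step for every observation.)
+
+```{python}
+# Toy illustration (our addition): [:, -1] picks out the last time step
+# for each observation in a (batch, time) tensor.
+import torch
+demo = torch.arange(6).reshape(2, 3)
+demo, demo[:, -1]
+```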
That final output is passed through a 10% dropout layer and then
+through a linear layer, and the result is flattened.
+
+```{python}
+class NYSEModel(nn.Module):
+    def __init__(self):
+        super(NYSEModel, self).__init__()
+        self.rnn = nn.RNN(3,
+                          12,
+                          batch_first=True)
+        self.dense = nn.Linear(12, 1)
+        self.dropout = nn.Dropout(0.1)
+    def forward(self, x):
+        val, h_n = self.rnn(x)
+        val = self.dense(self.dropout(val[:,-1]))
+        return torch.flatten(val)
+nyse_model = NYSEModel()
+```
+
+We fit the model in a similar fashion to previous networks. We
+supply the `fit` function with test data as validation data, so that when
+we monitor its progress and plot the history we can see the
+progress on the test data. Of course we should not use this as a basis for
+early stopping, since then the test performance would be biased.
+
+We form the training dataset just as we did in
+our `Hitters` example.
+
+```{python}
+datasets = []
+for mask in [train, ~train]:
+    X_rnn_t = torch.tensor(X_rnn[mask].astype(np.float32))
+    Y_t = torch.tensor(Y[mask].astype(np.float32))
+    datasets.append(TensorDataset(X_rnn_t, Y_t))
+nyse_train, nyse_test = datasets
+
+```
+
+Following our usual pattern, we inspect the summary.
+
+```{python}
+summary(nyse_model,
+        input_data=X_rnn_t,
+        col_names=['input_size',
+                   'output_size',
+                   'num_params'])
+
+```
+We again put the two datasets into a data module, with a
+batch size of 64.
+
+```{python}
+nyse_dm = SimpleDataModule(nyse_train,
+                           nyse_test,
+                           num_workers=min(4, max_num_workers),
+                           validation=nyse_test,
+                           batch_size=64)
+```
+We run some data through our model to be sure the sizes match up correctly.
+
+```{python}
+for idx, (x, y) in enumerate(nyse_dm.train_dataloader()):
+    out = nyse_model(x)
+    print(y.size(), out.size())
+    if idx >= 2:
+        break
+
+```
+
+We follow our previous example for setting up a trainer for a
+regression problem, requesting the $R^2$ metric
+to be computed at each epoch.
+
+```{python}
+nyse_optimizer = RMSprop(nyse_model.parameters(),
+                         lr=0.001)
+nyse_module = SimpleModule.regression(nyse_model,
+                                      optimizer=nyse_optimizer,
+                                      metrics={'r2':R2Score()})
+
+```
+
+Fitting the model should by now be familiar.
+The results on the test data are very similar to the linear AR model.
+
+```{python}
+nyse_trainer = Trainer(deterministic=True,
+                       max_epochs=200,
+                       callbacks=[ErrorTracker()])
+nyse_trainer.fit(nyse_module,
+                 datamodule=nyse_dm)
+nyse_trainer.test(nyse_module,
+                  datamodule=nyse_dm)
+```
+
+
+We could also fit a model without the `nn.RNN()` layer by just
+using a `nn.Flatten()` layer instead. This would be a nonlinear AR model. If in addition we excluded the
+hidden layer, this would be equivalent to our earlier linear AR model.
+
+Instead we will fit a nonlinear AR model using the feature set `X_day` that includes the `day_of_week` indicators.
+To do so, we
+must first create our test and training datasets and a corresponding
+data module. This may seem a little burdensome, but is part of the
+general pipeline for `torch`.
+
+```{python}
+datasets = []
+for mask in [train, ~train]:
+    X_day_t = torch.tensor(
+                  np.asarray(X_day[mask]).astype(np.float32))
+    Y_t = torch.tensor(np.asarray(Y[mask]).astype(np.float32))
+    datasets.append(TensorDataset(X_day_t, Y_t))
+day_train, day_test = datasets
+```
+
+Creating a data module follows a familiar pattern.
+
+```{python}
+day_dm = SimpleDataModule(day_train,
+                          day_test,
+                          num_workers=min(4, max_num_workers),
+                          validation=day_test,
+                          batch_size=64)
+
+```
+
+We build a `NonLinearARModel()` that takes the 20 features as input and has a hidden layer with 32 units. The remaining steps are familiar.
+
+```{python}
+class NonLinearARModel(nn.Module):
+    def __init__(self):
+        super(NonLinearARModel, self).__init__()
+        self._forward = nn.Sequential(nn.Flatten(),
+                                      nn.Linear(20, 32),
+                                      nn.ReLU(),
+                                      nn.Dropout(0.5),
+                                      nn.Linear(32, 1))
+    def forward(self, x):
+        return torch.flatten(self._forward(x))
+
+```
+
+```{python}
+nl_model = NonLinearARModel()
+nl_optimizer = RMSprop(nl_model.parameters(),
+                       lr=0.001)
+nl_module = SimpleModule.regression(nl_model,
+                                    optimizer=nl_optimizer,
+                                    metrics={'r2':R2Score()})
+
+```
+
+We continue with the usual training steps, fit the model,
+and evaluate the test error. We see the test $R^2$ is a slight improvement over the linear AR model that also includes `day_of_week`.
+
+```{python}
+nl_trainer = Trainer(deterministic=True,
+                     max_epochs=20,
+                     callbacks=[ErrorTracker()])
+nl_trainer.fit(nl_module, datamodule=day_dm)
+nl_trainer.test(nl_module, datamodule=day_dm)
+```
+
+
+
+
diff --git a/ISLP_labs_R4DS/Ch11-surv-lab copy.Rmd b/ISLP_labs_R4DS/Ch11-surv-lab copy.Rmd
new file mode 100644
index 0000000..2e3bdd0
--- /dev/null
+++ b/ISLP_labs_R4DS/Ch11-surv-lab copy.Rmd
@@ -0,0 +1,559 @@
+---
+jupyter:
+  jupytext:
+    cell_metadata_filter: -all
+    formats: Rmd,ipynb
+    main_language: python
+    text_representation:
+      extension: .Rmd
+      format_name: rmarkdown
+      format_version: '1.2'
+      jupytext_version: 1.14.7
+---
+
+
+# Chapter 11
+
+
+
+
+# Lab: Survival Analysis
+In this lab, we perform survival analyses on three separate data
+sets. In Section 11.8.1 we analyze the `BrainCancer`
+data that was first described in Section 11.3. In Section 11.8.2, we examine the `Publication`
+data from Section 11.5.4. Finally, Section 11.8.3 explores
+a simulated call-center data set.
+
+We begin by importing some of our libraries at this top
+level. This makes the code more readable, as scanning the first few
+lines of the notebook tells us what libraries are used in this
+notebook.
+
+```{python}
+from matplotlib.pyplot import subplots
+import numpy as np
+import pandas as pd
+from ISLP.models import ModelSpec as MS
+from ISLP import load_data
+
+```
+
+We also collect the new imports
+needed for this lab.
+
+```{python}
+from lifelines import \
+     (KaplanMeierFitter,
+      CoxPHFitter)
+from lifelines.statistics import \
+     (logrank_test,
+      multivariate_logrank_test)
+from ISLP.survival import sim_time
+
+```
+
+## Brain Cancer Data
+
+We begin with the `BrainCancer` data set, contained in the `ISLP` package.
+
+```{python}
+BrainCancer = load_data('BrainCancer')
+BrainCancer.columns
+
+```
+
+The rows index the 88 patients, while the 8 columns contain the predictors and outcome variables.
+We first briefly examine the data.
+
+```{python}
+BrainCancer['sex'].value_counts()
+
+```
+
+
+```{python}
+BrainCancer['diagnosis'].value_counts()
+
+```
+
+
+```{python}
+BrainCancer['status'].value_counts()
+
+```
+
+
+Before beginning an analysis, it is important to know how the
+`status` variable has been coded. Most software
+uses the convention that a `status` of 1 indicates an
+uncensored observation (often death), and a `status` of 0 indicates a censored
+observation. But some scientists might use the opposite coding.
For the `BrainCancer` data set, 35 patients died
+before the end of the study, so we are using the conventional coding.
+
+To begin the analysis, we re-create the Kaplan-Meier survival curve shown in Figure 11.2. The main
+package we will use for survival analysis
+is `lifelines`.
+The variable `time` corresponds to $y_i$, the time to the $i$th event (either censoring or
+death). The first argument to `km.fit` is the event time, and the
+second argument is the censoring variable, with a 1 indicating an observed
+failure time. The `plot()` method produces a survival curve with pointwise confidence
+intervals. By default, these are 95% confidence intervals, but this can be changed
+by setting the `alpha` argument to one minus the desired
+confidence level.
+
+```{python}
+fig, ax = subplots(figsize=(8,8))
+km = KaplanMeierFitter()
+km_brain = km.fit(BrainCancer['time'], BrainCancer['status'])
+km_brain.plot(label='Kaplan Meier estimate', ax=ax)
+
+```
+
+Next we create Kaplan-Meier survival curves that are stratified by
+`sex`, in order to reproduce Figure 11.3.
+We do this using the `groupby()` method of a dataframe.
+This method returns a generator that can
+be iterated over in the `for` loop. In this case,
+the items in the `for` loop are 2-tuples representing
+the groups: the first entry is the value
+of the grouping column `sex` while the second value
+is the dataframe consisting of all rows in the
+dataframe matching that value of `sex`.
+We will want to use this data below
+in the log-rank test, hence we store this
+information in the dictionary `by_sex`. Finally,
+we have also used the notion of
+ *string interpolation* to automatically
+label the different lines in the plot. String
+interpolation is a powerful technique to format strings ---
+`Python` has many ways to facilitate such operations.
+
+```{python}
+fig, ax = subplots(figsize=(8,8))
+by_sex = {}
+for sex, df in BrainCancer.groupby('sex'):
+    by_sex[sex] = df
+    km_sex = km.fit(df['time'], df['status'])
+    km_sex.plot(label='Sex=%s' % sex, ax=ax)
+
+```
+
+As discussed in Section 11.4, we can perform a
+log-rank test to compare the survival of males to females. We use
+the `logrank_test()` function from the `lifelines.statistics` module.
+The first two arguments are the event times, with the next two
+denoting the corresponding (optional) censoring indicators.
+
+```{python}
+logrank_test(by_sex['Male']['time'],
+             by_sex['Female']['time'],
+             by_sex['Male']['status'],
+             by_sex['Female']['status'])
+
+```
+
+
+The resulting $p$-value is $0.23$, indicating no evidence of a
+difference in survival between the two sexes.
+
+Next, we use the `CoxPHFitter()` estimator
+from `lifelines` to fit Cox proportional hazards models.
+To begin, we consider a model that uses `sex` as the only predictor.
+
+```{python}
+coxph = CoxPHFitter # shorthand
+sex_df = BrainCancer[['time', 'status', 'sex']]
+model_df = MS(['time', 'status', 'sex'],
+              intercept=False).fit_transform(sex_df)
+cox_fit = coxph().fit(model_df,
+                      'time',
+                      'status')
+cox_fit.summary[['coef', 'se(coef)', 'p']]
+
+```
+
+The first argument to `fit` should be a data frame containing
+at least the event time (the second argument `time` in this case),
+as well as an optional censoring variable (the argument `status` in this case).
+Note also that the Cox model does not include an intercept, which is why
+we used the `intercept=False` argument to `ModelSpec` above.
+The `summary()` method delivers many columns; we chose to abbreviate its output here.
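+
+As a quick aside (our addition, not part of the original lab), the full set of available columns can be listed directly, since `summary` is a pandas data frame:
+
+```{python}
+# Our addition: `summary` is a pandas DataFrame, so we can inspect
+# all of the columns that lifelines makes available.
+cox_fit.summary.columns
+```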
+
+It is possible to obtain the likelihood ratio test comparing this model to the one
+with no features as follows:
+
+```{python}
+cox_fit.log_likelihood_ratio_test()
+
+```
+
+Regardless of which test we use, we see that there is no clear
+evidence for a difference in survival between males and females. As
+we learned in this chapter, the score test from the Cox model is
+exactly equal to the log rank test statistic!
+
+Now we fit a model that makes use of additional predictors. We first note
+that one of our `diagnosis` values is missing, hence
+we drop that observation before continuing.
+
+```{python}
+cleaned = BrainCancer.dropna()
+all_MS = MS(cleaned.columns, intercept=False)
+all_df = all_MS.fit_transform(cleaned)
+fit_all = coxph().fit(all_df,
+                      'time',
+                      'status')
+fit_all.summary[['coef', 'se(coef)', 'p']]
+
+```
+
+The `diagnosis` variable has been coded so that the baseline
+corresponds to meningioma. The results indicate that the risk associated with HG glioma
+is more than eight times (i.e. $e^{2.15}=8.62$) the risk associated
+with meningioma. In other words, after adjusting for the other
+predictors, patients with HG glioma have much worse survival compared
+to those with meningioma. In addition, larger values of the Karnofsky
+index, `ki`, are associated with lower risk, i.e. longer survival.
+
+Finally, we plot estimated survival curves for each diagnosis category,
+adjusting for the other predictors. To make these plots, we set the
+values of the other predictors equal to the mean for quantitative variables
+and equal to the mode for categorical. To do this, we use the
+`apply()` method with `axis=0`, which applies the function
+`representative` to each column; `representative` checks whether a
+column is categorical or not, returning the mode or the mean accordingly.
+
+```{python}
+levels = cleaned['diagnosis'].unique()
+def representative(series):
+    if hasattr(series.dtype, 'categories'):
+        return pd.Series.mode(series)
+    else:
+        return series.mean()
+modal_data = cleaned.apply(representative, axis=0)
+
+```
+
+We make four
+copies of these representative values and assign the `diagnosis` column to be the four different
+diagnoses.
+
+```{python}
+modal_df = pd.DataFrame(
+              [modal_data.iloc[0] for _ in range(len(levels))])
+modal_df['diagnosis'] = levels
+modal_df
+
+```
+
+We then construct the model matrix based on the model specification `all_MS` used to fit
+the model, and name the rows according to the levels of `diagnosis`.
+
+```{python}
+modal_X = all_MS.transform(modal_df)
+modal_X.index = levels
+modal_X
+
+```
+
+We can use the `predict_survival_function()` method to obtain the estimated survival function.
+
+```{python}
+predicted_survival = fit_all.predict_survival_function(modal_X)
+predicted_survival
+
+```
+This returns a data frame,
+whose `plot()` method yields the different survival curves. To avoid clutter in
+the plots, we do not display confidence intervals.
+
+```{python}
+fig, ax = subplots(figsize=(8, 8))
+predicted_survival.plot(ax=ax);
+
+```
+
+
+## Publication Data
+The `Publication` data presented in Section 11.5.4 can be
+found in the `ISLP` package.
+We first reproduce Figure 11.5 by plotting the Kaplan-Meier curves
+stratified on the `posres` variable, which records whether the
+study had a positive or negative result.
+ +```{python} +fig, ax = subplots(figsize=(8,8)) +Publication = load_data('Publication') +by_result = {} +for result, df in Publication.groupby('posres'): + by_result[result] = df + km_result = km.fit(df['time'], df['status']) + km_result.plot(label='Result=%d' % result, ax=ax) + +``` + +As discussed previously, the $p$-values from fitting Cox’s +proportional hazards model to the `posres` variable are quite +large, providing no evidence of a difference in time-to-publication +between studies with positive versus negative results. + +```{python} +posres_df = MS(['posres', + 'time', + 'status'], + intercept=False).fit_transform(Publication) +posres_fit = coxph().fit(posres_df, + 'time', + 'status') +posres_fit.summary[['coef', 'se(coef)', 'p']] + +``` + + +However, the results change dramatically when we include other +predictors in the model. Here we exclude the funding mechanism +variable. + +```{python} +model = MS(Publication.columns.drop('mech'), + intercept=False) +coxph().fit(model.fit_transform(Publication), + 'time', + 'status').summary[['coef', 'se(coef)', 'p']] + +``` + +We see that there are a number of statistically significant variables, +including whether the trial focused on a clinical endpoint, the impact +of the study, and whether the study had positive or negative results. + + +## Call Center Data + +In this section, we will simulate survival data using the relationship +between cumulative hazard and +the survival function explored in Exercise 8. +Our simulated data will represent the observed +wait times (in seconds) for 2,000 customers who have phoned a call +center. In this context, censoring occurs if a customer hangs up +before his or her call is answered. + +There are three covariates: `Operators` (the number of call +center operators available at the time of the call, which can range +from $5$ to $15$), `Center` (either A, B, or C), and +`Time` of day (Morning, Afternoon, or Evening). We generate data +for these covariates so that all possibilities are equally likely: for +instance, morning, afternoon and evening calls are equally likely, and +any number of operators from $5$ to $15$ is equally likely. + +```{python} +rng = np.random.default_rng(10) +N = 2000 +Operators = rng.choice(np.arange(5, 16), + N, + replace=True) +Center = rng.choice(['A', 'B', 'C'], + N, + replace=True) +Time = rng.choice(['Morn.', 'After.', 'Even.'], + N, + replace=True) +D = pd.DataFrame({'Operators': Operators, + 'Center': pd.Categorical(Center), + 'Time': pd.Categorical(Time)}) +``` + +We then build a model matrix (omitting the intercept) + +```{python} +model = MS(['Operators', + 'Center', + 'Time'], + intercept=False) +X = model.fit_transform(D) +``` + +It is worthwhile to take a peek at the model matrix `X`, so +that we can be sure that we understand how the variables have been coded. By default, +the levels of categorical variables are sorted and, as usual, the first column of the one-hot encoding +of the variable is dropped. + +```{python} +X[:5] + +``` + +Next, we specify the coefficients and the hazard function. + +```{python} +true_beta = np.array([0.04, -0.3, 0, 0.2, -0.2]) +true_linpred = X.dot(true_beta) +hazard = lambda t: 1e-5 * t + +``` + +Here, we have set the coefficient associated with `Operators` to +equal $0.04$; in other words, each additional operator leads to a +$e^{0.04}=1.041$-fold increase in the “risk” that the call will be +answered, given the `Center` and `Time` covariates. 
This +makes sense: the greater the number of operators at hand, the shorter +the wait time! The coefficient associated with `Center == B` is +$-0.3$, and `Center == A` is treated as the baseline. This means +that the risk of a call being answered at Center B is 0.74 times the +risk that it will be answered at Center A; in other words, the wait +times are a bit longer at Center B. + +Recall from Section 2.3.7 the use of `lambda` +for creating short functions on the fly. +We use the function +`sim_time()` from the `ISLP.survival` package. This function +uses the relationship between the survival function +and cumulative hazard $S(t) = \exp(-H(t))$ and the specific +form of the cumulative hazard function in the Cox model +to simulate data based on values of the linear predictor +`true_linpred` and the cumulative hazard. + We need to provide the cumulative hazard function, which we do here. + +```{python} +cum_hazard = lambda t: 1e-5 * t**2 / 2 + +``` +We are now ready to generate data under the Cox proportional hazards +model. We truncate the maximum time to 1000 seconds to keep +simulated wait times reasonable. The function +`sim_time()` takes a linear predictor, +a cumulative hazard function and a +random number generator. + +```{python} +W = np.array([sim_time(l, cum_hazard, rng) + for l in true_linpred]) +D['Wait time'] = np.clip(W, 0, 1000) + +``` + +We now simulate our censoring variable, for which we assume +90% of calls were answered (`Failed==1`) before the +customer hung up (`Failed==0`). + +```{python} +D['Failed'] = rng.choice([1, 0], + N, + p=[0.9, 0.1]) +D[:5] + +``` + + +```{python} +D['Failed'].mean() + +``` + +We now plot Kaplan-Meier survival curves. First, we stratify by `Center`. + +```{python} +fig, ax = subplots(figsize=(8,8)) +by_center = {} +for center, df in D.groupby('Center'): + by_center[center] = df + km_center = km.fit(df['Wait time'], df['Failed']) + km_center.plot(label='Center=%s' % center, ax=ax) +ax.set_title("Probability of Still Being on Hold") + +``` + +Next, we stratify by `Time`. + +```{python} +fig, ax = subplots(figsize=(8,8)) +by_time = {} +for time, df in D.groupby('Time'): + by_time[time] = df + km_time = km.fit(df['Wait time'], df['Failed']) + km_time.plot(label='Time=%s' % time, ax=ax) +ax.set_title("Probability of Still Being on Hold") + +``` + +It seems that calls at Call Center B take longer to be answered than +calls at Centers A and C. Similarly, it appears that wait times are +longest in the morning and shortest in the evening hours. We can use a +log-rank test to determine whether these differences are statistically +significant using the function `multivariate_logrank_test()`. + +```{python} +multivariate_logrank_test(D['Wait time'], + D['Center'], + D['Failed']) + +``` + + +Next, we consider the effect of `Time`. + +```{python} +multivariate_logrank_test(D['Wait time'], + D['Time'], + D['Failed']) + +``` + + +As in the case of a categorical variable with 2 levels, these +results are similar to the likelihood ratio test +from the Cox proportional hazards model. First, we +look at the results for `Center`. + +```{python} +X = MS(['Wait time', + 'Failed', + 'Center'], + intercept=False).fit_transform(D) +F = coxph().fit(X, 'Wait time', 'Failed') +F.log_likelihood_ratio_test() + +``` + + +Next, we look at the results for `Time`. 
+
+```{python}
+X = MS(['Wait time',
+        'Failed',
+        'Time'],
+       intercept=False).fit_transform(D)
+F = coxph().fit(X, 'Wait time', 'Failed')
+F.log_likelihood_ratio_test()
+
+```
+
+
+We find that differences between centers are highly significant, as
+are differences between times of day.
+
+Finally, we fit Cox's proportional hazards model to the data.
+
+```{python}
+X = MS(D.columns,
+       intercept=False).fit_transform(D)
+fit_queuing = coxph().fit(
+                  X,
+                  'Wait time',
+                  'Failed')
+fit_queuing.summary[['coef', 'se(coef)', 'p']]
+
+```
+
+
+The $p$-values for Center B and evening time
+are very small. It is also clear that the
+hazard --- that is, the instantaneous risk that a call will be
+answered --- increases with the number of operators. Since we
+generated the data ourselves, we know that the true coefficients for
+ `Operators`,  `Center = B`, `Center = C`,
+`Time = Even.` and `Time = Morn.` are $0.04$, $-0.3$,
+$0$, $0.2$, and $-0.2$, respectively. The coefficient estimates
+from the fitted Cox model are fairly accurate.
+
+
diff --git a/ISLP_labs_R4DS/Ch12-unsup-lab copy.Rmd b/ISLP_labs_R4DS/Ch12-unsup-lab copy.Rmd
new file mode 100644
index 0000000..164607a
--- /dev/null
+++ b/ISLP_labs_R4DS/Ch12-unsup-lab copy.Rmd
@@ -0,0 +1,910 @@
+---
+jupyter:
+  jupytext:
+    cell_metadata_filter: -all
+    formats: Rmd,ipynb
+    main_language: python
+    text_representation:
+      extension: .Rmd
+      format_name: rmarkdown
+      format_version: '1.2'
+      jupytext_version: 1.14.7
+---
+
+
+# Chapter 12
+
+# Lab: Unsupervised Learning
+In this lab we demonstrate PCA and clustering on several datasets.
+As in other labs, we import some of our libraries at this top
+level. This makes the code more readable, as scanning the first few
+lines of the notebook tells us what libraries are used in this
+notebook.
+
+```{python}
+import numpy as np
+import pandas as pd
+import matplotlib.pyplot as plt
+from statsmodels.datasets import get_rdataset
+from sklearn.decomposition import PCA
+from sklearn.preprocessing import StandardScaler
+from ISLP import load_data
+
+```
+We also collect the new imports
+needed for this lab.
+
+```{python}
+from sklearn.cluster import \
+     (KMeans,
+      AgglomerativeClustering)
+from scipy.cluster.hierarchy import \
+     (dendrogram,
+      cut_tree)
+from ISLP.cluster import compute_linkage
+
+```
+
+## Principal Components Analysis
+In this lab, we perform PCA on `USArrests`, a data set in the
+`R` computing environment.
+We retrieve the data using `get_rdataset()`, which can fetch data from
+many standard `R` packages.
+
+The rows of the data set contain the 50 states, in alphabetical order.
+
+```{python}
+USArrests = get_rdataset('USArrests').data
+USArrests
+
+```
+
+The columns of the data set contain the four variables.
+
+```{python}
+USArrests.columns
+
+```
+
+We first briefly examine the data. We notice that the variables have vastly different means.
+
+```{python}
+USArrests.mean()
+
+```
+
+
+Dataframes have several useful methods for computing
+column-wise summaries. We can also examine the
+variance of the four variables using the `var()` method.
+
+```{python}
+USArrests.var()
+
+```
+
+Not surprisingly, the variables also have vastly different variances.
+The `UrbanPop` variable measures the percentage of the population
+in each state living in an urban area, which is not a comparable
+number to the number of rapes in each state per 100,000 individuals.
+PCA looks for derived variables that account for most of the variance in the data set.
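+
+To see just how lopsided the raw variances are, we can compute each variable's share of the total variance (a quick check of our own, not part of the original lab):
+
+```{python}
+# Our addition: each variable's share of the total variance.
+# Assault accounts for the vast majority.
+USArrests.var() / USArrests.var().sum()
+```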
+
+If we do not scale the variables before performing PCA, then the principal components
+would mostly be driven by the
+`Assault` variable, since it has by far the largest
+variance. So if the variables are measured in different units or vary widely in scale, it is recommended to standardize the variables to have standard deviation one before performing PCA.
+Typically we set the means to zero as well.
+
+This scaling can be done via the `StandardScaler()` transform imported above. We first `fit` the
+scaler, which computes the necessary means and standard
+deviations, and then apply it to our data using the
+`transform` method. As before, we combine these steps using the `fit_transform()` method.
+
+
+```{python}
+scaler = StandardScaler(with_std=True,
+                        with_mean=True)
+USArrests_scaled = scaler.fit_transform(USArrests)
+
+```
+Having scaled the data, we can then
+perform principal components analysis using the `PCA()` transform
+from the `sklearn.decomposition` package.
+
+```{python}
+pcaUS = PCA()
+
+```
+(By default, the `PCA()` transform centers the variables to have
+mean zero though it does not scale them.) The transform `pcaUS`
+can now be fit to the data with its `fit()` method. Once the `fit` method has been called, the `pcaUS` object contains a number of useful quantities.
+
+```{python}
+pcaUS.fit(USArrests_scaled)
+
+```
+
+After fitting, the `mean_` attribute corresponds to the means
+of the variables. In this case, since we centered and scaled the data with
+`scaler()`, the means will all be 0.
+
+```{python}
+pcaUS.mean_
+
+```
+
+The scores can be computed using the `transform()` method
+of `pcaUS` after it has been fit.
+
+```{python}
+scores = pcaUS.transform(USArrests_scaled)
+
+```
+We will plot these scores a bit further down.
+The `components_` attribute provides the principal component loadings:
+each row of `pcaUS.components_` contains the corresponding
+principal component loading vector.
+
+
+```{python}
+pcaUS.components_
+
+```
+
+The `biplot` is a common visualization method used with
+PCA. It is not built in as a standard
+part of `sklearn`, though there are `Python`
+packages that do produce such plots. Here we
+make a simple biplot manually.
+
+```{python}
+i, j = 0, 1 # which components
+fig, ax = plt.subplots(1, 1, figsize=(8, 8))
+ax.scatter(scores[:,0], scores[:,1])
+ax.set_xlabel('PC%d' % (i+1))
+ax.set_ylabel('PC%d' % (j+1))
+for k in range(pcaUS.components_.shape[1]):
+    ax.arrow(0, 0, pcaUS.components_[i,k], pcaUS.components_[j,k])
+    ax.text(pcaUS.components_[i,k],
+            pcaUS.components_[j,k],
+            USArrests.columns[k])
+
+```
+Notice that this figure is a reflection of Figure 12.1 through the $y$-axis. Recall that the
+principal components are only unique up to a sign change, so we can
+reproduce that figure by flipping the
+signs of the second set of scores and loadings.
+We also increase the length of the arrows to emphasize the loadings.
+ +```{python} +scale_arrow = s_ = 2 +scores[:,1] *= -1 +pcaUS.components_[1] *= -1 # flip the y-axis +fig, ax = plt.subplots(1, 1, figsize=(8, 8)) +ax.scatter(scores[:,0], scores[:,1]) +ax.set_xlabel('PC%d' % (i+1)) +ax.set_ylabel('PC%d' % (j+1)) +for k in range(pcaUS.components_.shape[1]): + ax.arrow(0, 0, s_*pcaUS.components_[i,k], s_*pcaUS.components_[j,k]) + ax.text(s_*pcaUS.components_[i,k], + s_*pcaUS.components_[j,k], + USArrests.columns[k]) + +``` + +The standard deviations of the principal component scores are as follows: + +```{python} +scores.std(0, ddof=1) +``` + + +The variance of each score can be extracted directly from the `pcaUS` object via +the `explained_variance_` attribute. + +```{python} +pcaUS.explained_variance_ + +``` +The proportion of variance explained by each principal +component (PVE) is stored as `explained_variance_ratio_`: + +```{python} +pcaUS.explained_variance_ratio_ + +``` +We see that the first principal component explains 62.0% of the +variance in the data, the next principal component explains 24.7% +of the variance, and so forth. +We can plot the PVE explained by each component, as well as the cumulative PVE. We first +plot the proportion of variance explained. + +```{python} +# %%capture +fig, axes = plt.subplots(1, 2, figsize=(15, 6)) +ticks = np.arange(pcaUS.n_components_)+1 +ax = axes[0] +ax.plot(ticks, + pcaUS.explained_variance_ratio_, + marker='o') +ax.set_xlabel('Principal Component'); +ax.set_ylabel('Proportion of Variance Explained') +ax.set_ylim([0,1]) +ax.set_xticks(ticks) + +``` +Notice the use of `%%capture`, which suppresses the displaying of the partially completed figure. + +```{python} +ax = axes[1] +ax.plot(ticks, + pcaUS.explained_variance_ratio_.cumsum(), + marker='o') +ax.set_xlabel('Principal Component') +ax.set_ylabel('Cumulative Proportion of Variance Explained') +ax.set_ylim([0, 1]) +ax.set_xticks(ticks) +fig + +``` +The result is similar to that shown in Figure 12.3. Note +that the method `cumsum()` computes the cumulative sum of +the elements of a numeric vector. For instance: + +```{python} +a = np.array([1,2,8,-3]) +np.cumsum(a) + +``` +## Matrix Completion + +We now re-create the analysis carried out on the `USArrests` data in +Section 12.3. + +We saw in Section 12.2.2 that solving the optimization +problem (12.6) on a centered data matrix $\bf X$ is +equivalent to computing the first $M$ principal +components of the data. We use our scaled +and centered `USArrests` data as $\bf X$ below. The *singular value decomposition* +(SVD) is a general algorithm for solving +(12.6). + +```{python} +X = USArrests_scaled +U, D, V = np.linalg.svd(X, full_matrices=False) +U.shape, D.shape, V.shape + +``` +The `np.linalg.svd()` function returns three components, `U`, `D` and `V`. The matrix `V` is equivalent to the +loading matrix from principal components (up to an unimportant sign flip). Using the `full_matrices=False` option ensures that +for a tall matrix the shape of `U` is the same as the shape of `X`. + +```{python} +V + +``` + +```{python} +pcaUS.components_ + +``` +The matrix `U` corresponds to a *standardized* version of the PCA score matrix (each column standardized to have sum-of-squares one). If we multiply each column of `U` by the corresponding element of `D`, we recover the PCA scores exactly (up to a meaningless sign flip). 
+ +```{python} +(U * D[None,:])[:3] + +``` + +```{python} +scores[:3] + +``` +While it would be possible to carry out this lab using the `PCA()` estimator, +here we use the `np.linalg.svd()` function in order to illustrate its use. + +We now omit 20 entries in the $50\times 4$ data matrix at random. We do so +by first selecting 20 rows (states) at random, and then selecting one +of the four entries in each row at random. This ensures that every row has +at least three observed values. + +```{python} +n_omit = 20 +np.random.seed(15) +r_idx = np.random.choice(np.arange(X.shape[0]), + n_omit, + replace=False) +c_idx = np.random.choice(np.arange(X.shape[1]), + n_omit, + replace=True) +Xna = X.copy() +Xna[r_idx, c_idx] = np.nan + +``` + +Here the array `r_idx` +contains 20 integers from 0 to 49; this represents the states (rows of `X`) that are selected to contain missing values. And `c_idx` contains +20 integers from 0 to 3, representing the features (columns in `X`) that contain the missing values for each of the selected states. + +We now write some code to implement Algorithm 12.1. +We first write a function that takes in a matrix, and returns an approximation to the matrix using the `svd()` function. +This will be needed in Step 2 of Algorithm 12.1. + +```{python} +def low_rank(X, M=1): + U, D, V = np.linalg.svd(X) + L = U[:,:M] * D[None,:M] + return L.dot(V[:M]) + +``` +To conduct Step 1 of the algorithm, we initialize `Xhat` --- this is $\tilde{\bf X}$ in Algorithm 12.1 --- by replacing +the missing values with the column means of the non-missing entries. These are stored in +`Xbar` below after running `np.nanmean()` over the row axis. +We make a copy so that when we assign values to `Xhat` below we do not also overwrite the +values in `Xna`. + +```{python} +Xhat = Xna.copy() +Xbar = np.nanmean(Xhat, axis=0) +Xhat[r_idx, c_idx] = Xbar[c_idx] + +``` + +Before we begin Step 2, we set ourselves up to measure the progress of our +iterations: + +```{python} +thresh = 1e-7 +rel_err = 1 +count = 0 +ismiss = np.isnan(Xna) +mssold = np.mean(Xhat[~ismiss]**2) +mss0 = np.mean(Xna[~ismiss]**2) + +``` +Here `ismiss` is a logical matrix with the same dimensions as `Xna`; +a given element is `True` if the corresponding matrix element is missing. The notation `~ismiss` negates this boolean vector. This is useful +because it allows us to access both the missing and non-missing entries. We store the mean of the squared non-missing elements in `mss0`. +We store the mean squared error of the non-missing elements of the old version of `Xhat` in `mssold` (which currently +agrees with `mss0`). We plan to store the mean squared error of the non-missing elements of the current version of `Xhat` in `mss`, and will then +iterate Step 2 of Algorithm 12.1 until the *relative error*, defined as +`(mssold - mss) / mss0`, falls below `thresh = 1e-7`. + {Algorithm 12.1 tells us to iterate Step 2 until (12.14) is no longer decreasing. Determining whether (12.14) is decreasing requires us only to keep track of `mssold - mss`. However, in practice, we keep track of `(mssold - mss) / mss0` instead: this makes it so that the number of iterations required for Algorithm 12.1 to converge does not depend on whether we multiplied the raw data $\bf X$ by a constant factor.} + +In Step 2(a) of Algorithm 12.1, we approximate `Xhat` using `low_rank()`; we call this `Xapp`. In Step 2(b), we use `Xapp` to update the estimates for elements in `Xhat` that are missing in `Xna`. Finally, in Step 2(c), we compute the relative error. 
These three steps are contained in the following `while` loop: + +```{python} +while rel_err > thresh: + count += 1 + # Step 2(a) + Xapp = low_rank(Xhat, M=1) + # Step 2(b) + Xhat[ismiss] = Xapp[ismiss] + # Step 2(c) + mss = np.mean(((Xna - Xapp)[~ismiss])**2) + rel_err = (mssold - mss) / mss0 + mssold = mss + print("Iteration: {0}, MSS:{1:.3f}, Rel.Err {2:.2e}" + .format(count, mss, rel_err)) + +``` + +We see that after eight iterations, the relative error has fallen below `thresh = 1e-7`, and so the algorithm terminates. When this happens, the mean squared error of the non-missing elements equals 0.381. + +Finally, we compute the correlation between the 20 imputed values +and the actual values: + +```{python} +np.corrcoef(Xapp[ismiss], X[ismiss])[0,1] + +``` + + +In this lab, we implemented Algorithm 12.1 ourselves for didactic purposes. However, a reader who wishes to apply matrix completion to their data might look to more specialized `Python` implementations. + + +## Clustering + + +### $K$-Means Clustering + +The estimator `sklearn.cluster.KMeans()` performs $K$-means clustering in +`Python`. We begin with a simple simulated example in which there +truly are two clusters in the data: the first 25 observations have a +mean shift relative to the next 25 observations. + +```{python} +np.random.seed(0); +X = np.random.standard_normal((50,2)); +X[:25,0] += 3; +X[:25,1] -= 4; + +``` +We now perform $K$-means clustering with $K=2$. + +```{python} +kmeans = KMeans(n_clusters=2, + random_state=2, + n_init=20).fit(X) + +``` +We specify `random_state` to make the results reproducible. The cluster assignments of the 50 observations are contained in `kmeans.labels_`. + +```{python} +kmeans.labels_ + +``` +The $K$-means clustering perfectly separated the observations into two +clusters even though we did not supply any group information to +`KMeans()`. We can plot the data, with each observation +colored according to its cluster assignment. + +```{python} +fig, ax = plt.subplots(1, 1, figsize=(8,8)) +ax.scatter(X[:,0], X[:,1], c=kmeans.labels_) +ax.set_title("K-Means Clustering Results with K=2"); + +``` + +Here the observations can be easily plotted because they are +two-dimensional. If there were more than two variables then we could +instead perform PCA and plot the first two principal component score +vectors to represent the clusters. + +In this example, we knew that there really +were two clusters because we generated the data. However, for real +data, we do not know the true number of clusters, nor whether they exist in any precise way. We could +instead have performed $K$-means clustering on this example with +$K=3$. + +```{python} +kmeans = KMeans(n_clusters=3, + random_state=3, + n_init=20).fit(X) +fig, ax = plt.subplots(figsize=(8,8)) +ax.scatter(X[:,0], X[:,1], c=kmeans.labels_) +ax.set_title("K-Means Clustering Results with K=3"); + +``` +When $K=3$, $K$-means clustering splits up the two clusters. +We have used the `n_init` argument to run the $K$-means with 20 +initial cluster assignments (the default is 10). If a +value of `n_init` greater than one is used, then $K$-means +clustering will be performed using multiple random assignments in +Step 1 of Algorithm 12.2, and the `KMeans()` +function will report only the best results. Here we compare using +`n_init=1` to `n_init=20`. 
+
+```{python}
+kmeans1 = KMeans(n_clusters=3,
+                 random_state=3,
+                 n_init=1).fit(X)
+kmeans20 = KMeans(n_clusters=3,
+                  random_state=3,
+                  n_init=20).fit(X);
+kmeans1.inertia_, kmeans20.inertia_
+
+```
+Note that `kmeans.inertia_` is the total within-cluster sum
+of squares, which we seek to minimize by performing $K$-means
+clustering (12.17).
+
+We *strongly* recommend always running $K$-means clustering with
+a large value of `n_init`, such as 20 or 50, since otherwise an
+undesirable local optimum may be obtained.
+
+When performing $K$-means clustering, in addition to using multiple
+initial cluster assignments, it is also important to set a random seed
+using the `random_state` argument to `KMeans()`. This way, the initial
+cluster assignments in Step 1 can be replicated, and the $K$-means
+output will be fully reproducible.
+
+
+### Hierarchical Clustering
+
+The `AgglomerativeClustering()` class from
+the `sklearn.cluster` package implements hierarchical clustering.
+As its
+name is long, we use the shorthand `HClust` for *hierarchical clustering*. Note that this will not change the return type
+when using this method, so instances will still be of class `AgglomerativeClustering`.
+In the following example we use the data from the previous section to plot the hierarchical clustering
+dendrogram using complete, single, and average linkage clustering
+with Euclidean distance as the dissimilarity measure. We begin by
+clustering observations using complete linkage.
+
+```{python}
+HClust = AgglomerativeClustering
+hc_comp = HClust(distance_threshold=0,
+                 n_clusters=None,
+                 linkage='complete')
+hc_comp.fit(X)
+
+```
+
+This computes the entire dendrogram.
+We could just as easily perform hierarchical clustering with average or single linkage instead:
+
+```{python}
+hc_avg = HClust(distance_threshold=0,
+                n_clusters=None,
+                linkage='average');
+hc_avg.fit(X)
+hc_sing = HClust(distance_threshold=0,
+                 n_clusters=None,
+                 linkage='single');
+hc_sing.fit(X);
+
+```
+
+To use a precomputed distance matrix, we provide an additional
+argument `metric="precomputed"`. In the code below, the first four lines compute the $50\times 50$ pairwise-distance matrix.
+
+```{python}
+D = np.zeros((X.shape[0], X.shape[0]));
+for i in range(X.shape[0]):
+    x_ = np.multiply.outer(np.ones(X.shape[0]), X[i])
+    D[i] = np.sqrt(np.sum((X - x_)**2, 1));
+hc_sing_pre = HClust(distance_threshold=0,
+                     n_clusters=None,
+                     metric='precomputed',
+                     linkage='single')
+hc_sing_pre.fit(D)
+
+```
+
+We use
+`dendrogram()` from `scipy.cluster.hierarchy` to plot the dendrogram. However,
+`dendrogram()` expects a so-called *linkage-matrix representation*
+of the clustering, which is not provided by `AgglomerativeClustering()`,
+but can be computed. The function `compute_linkage()` in the
+`ISLP.cluster` package is provided for this purpose.
+
+We can now plot the dendrograms. The numbers at the bottom of the plot
+identify each observation. The `dendrogram()` function has a default method to
+color different branches of the tree that suggests a pre-defined cut of the tree at a particular depth.
+We prefer to overwrite this default by setting this threshold to be infinite. Since we want this behavior for many dendrograms, we store these values in a dictionary `cargs` and pass this as keyword arguments using the notation `**cargs`.
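+
+(If the `**` notation is unfamiliar, here is a toy illustration of our own, not from the original lab: it unpacks a dictionary into keyword arguments.)
+
+```{python}
+# Toy illustration (our addition): ** unpacks a dict into keyword
+# arguments, so f(**{'a': 1, 'b': 2}) is the same as f(a=1, b=2).
+def f(a, b):
+    return a + b
+f(**{'a': 1, 'b': 2})
+```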
+ +```{python} +cargs = {'color_threshold':-np.inf, + 'above_threshold_color':'black'} +linkage_comp = compute_linkage(hc_comp) +fig, ax = plt.subplots(1, 1, figsize=(8, 8)) +dendrogram(linkage_comp, + ax=ax, + **cargs); + +``` + +We may want to color branches of the tree above +and below a cut-threshold differently. This can be achieved +by changing the `color_threshold`. Let’s cut the tree at a height of 4, +coloring links that merge above 4 in black. + +```{python} +fig, ax = plt.subplots(1, 1, figsize=(8, 8)) +dendrogram(linkage_comp, + ax=ax, + color_threshold=4, + above_threshold_color='black'); + +``` + +To determine the cluster labels for each observation associated with a +given cut of the dendrogram, we can use the `cut_tree()` +function from `scipy.cluster.hierarchy`: + +```{python} +cut_tree(linkage_comp, n_clusters=4).T + +``` + +This can also be achieved by providing an argument `n_clusters` +to `HClust()`; however each cut would require recomputing +the clustering. Similarly, trees may be cut by distance threshold +with an argument of `distance_threshold` to `HClust()` +or `height` to `cut_tree()`. + +```{python} +cut_tree(linkage_comp, height=5) + +``` + +To scale the variables before performing hierarchical clustering of +the observations, we use `StandardScaler()` as in our PCA example: + +```{python} +scaler = StandardScaler() +X_scale = scaler.fit_transform(X) +hc_comp_scale = HClust(distance_threshold=0, + n_clusters=None, + linkage='complete').fit(X_scale) +linkage_comp_scale = compute_linkage(hc_comp_scale) +fig, ax = plt.subplots(1, 1, figsize=(8, 8)) +dendrogram(linkage_comp_scale, ax=ax, **cargs) +ax.set_title("Hierarchical Clustering with Scaled Features"); + +``` + +Correlation-based distances between observations can be used for +clustering. The correlation between two observations measures the +similarity of their feature values. {Suppose each observation has + $p$ features, each a single numerical value. We measure the + similarity of two such observations by computing the + correlation of these $p$ pairs of numbers.} +With $n$ observations, the $n\times n$ correlation matrix can then be used as a similarity (or affinity) matrix, i.e. so that one minus the correlation matrix is the dissimilarity matrix used for clustering. + +Note that using correlation only makes sense for +data with at least three features since the absolute correlation +between any two observations with measurements on two features is +always one. Hence, we will cluster a three-dimensional data set. + +```{python} +X = np.random.standard_normal((30, 3)) +corD = 1 - np.corrcoef(X) +hc_cor = HClust(linkage='complete', + distance_threshold=0, + n_clusters=None, + metric='precomputed') +hc_cor.fit(corD) +linkage_cor = compute_linkage(hc_cor) +fig, ax = plt.subplots(1, 1, figsize=(8, 8)) +dendrogram(linkage_cor, ax=ax, **cargs) +ax.set_title("Complete Linkage with Correlation-Based Dissimilarity"); + +``` + + +## NCI60 Data Example +Unsupervised techniques are often used in the analysis of genomic +data. In particular, PCA and hierarchical clustering are popular +tools. We illustrate these techniques on the `NCI60` cancer cell line +microarray data, which consists of 6830 gene expression +measurements on 64 cancer cell lines. + +```{python} +NCI60 = load_data('NCI60') +nci_labs = NCI60['labels'] +nci_data = NCI60['data'] + +``` + +Each cell line is labeled with a cancer type. We do not make use of +the cancer types in performing PCA and clustering, as these are +unsupervised techniques. 
But after performing PCA and clustering, we +will check to see the extent to which these cancer types agree with +the results of these unsupervised techniques. + +The data has 64 rows and 6830 columns. + +```{python} +nci_data.shape + +``` + + +We begin by examining the cancer types for the cell lines. + + +```{python} +nci_labs.value_counts() + +``` + + +### PCA on the NCI60 Data + +We first perform PCA on the data after scaling the variables (genes) +to have standard deviation one, although here one could reasonably argue +that it is better not to scale the genes as they are measured in the same units. + +```{python} +scaler = StandardScaler() +nci_scaled = scaler.fit_transform(nci_data) +nci_pca = PCA() +nci_scores = nci_pca.fit_transform(nci_scaled) + +``` + +We now plot the first few principal component score vectors, in order +to visualize the data. The observations (cell lines) corresponding to +a given cancer type will be plotted in the same color, so that we can +see to what extent the observations within a cancer type are similar +to each other. + +```{python} +cancer_types = list(np.unique(nci_labs)) +nci_groups = np.array([cancer_types.index(lab) + for lab in nci_labs.values]) +fig, axes = plt.subplots(1, 2, figsize=(15,6)) +ax = axes[0] +ax.scatter(nci_scores[:,0], + nci_scores[:,1], + c=nci_groups, + marker='o', + s=50) +ax.set_xlabel('PC1'); ax.set_ylabel('PC2') +ax = axes[1] +ax.scatter(nci_scores[:,0], + nci_scores[:,2], + c=nci_groups, + marker='o', + s=50) +ax.set_xlabel('PC1'); ax.set_ylabel('PC3'); + +``` +On the whole, cell lines corresponding to a single cancer type do tend to +have similar values on the first few principal component score +vectors. This indicates that cell lines from the same cancer type tend +to have pretty similar gene expression levels. + + + + + +We can also plot the percent variance +explained by the principal components as well as the cumulative percent variance explained. +This is similar to the plots we made earlier for the `USArrests` data. + +```{python} +fig, axes = plt.subplots(1, 2, figsize=(15,6)) +ax = axes[0] +ticks = np.arange(nci_pca.n_components_)+1 +ax.plot(ticks, + nci_pca.explained_variance_ratio_, + marker='o') +ax.set_xlabel('Principal Component'); +ax.set_ylabel('PVE') +ax = axes[1] +ax.plot(ticks, + nci_pca.explained_variance_ratio_.cumsum(), + marker='o'); +ax.set_xlabel('Principal Component') +ax.set_ylabel('Cumulative PVE'); + +``` +We see that together, the first seven principal components explain +around 40% of the variance in the data. This is not a huge amount +of the variance. However, looking at the scree plot, we see that while +each of the first seven principal components explain a substantial +amount of variance, there is a marked decrease in the variance +explained by further principal components. That is, there is an +*elbow* in the plot after approximately the seventh +principal component. This suggests that there may be little benefit +to examining more than seven or so principal components (though even +examining seven principal components may be difficult). + + +### Clustering the Observations of the NCI60 Data + +We now perform hierarchical clustering of the cell lines in the `NCI60` data using +complete, single, and average linkage. Once again, the goal is to find out whether or not the observations cluster into distinct types of cancer. Euclidean +distance is used as the dissimilarity measure. We first write a short +function to produce +the three dendrograms. 
### Clustering the Observations of the NCI60 Data

We now perform hierarchical clustering of the cell lines in the `NCI60` data using
complete, single, and average linkage. Once again, the goal is to find out whether or not the observations cluster into distinct types of cancer. Euclidean
distance is used as the dissimilarity measure. We first write a short
function to produce the three dendrograms.

```{python}
def plot_nci(linkage, ax, cut=-np.inf):
    cargs = {'above_threshold_color':'black',
             'color_threshold':cut}
    hc = HClust(n_clusters=None,
                distance_threshold=0,
                linkage=linkage.lower()).fit(nci_scaled)
    linkage_ = compute_linkage(hc)
    dendrogram(linkage_,
               ax=ax,
               labels=np.asarray(nci_labs),
               leaf_font_size=10,
               **cargs)
    ax.set_title('%s Linkage' % linkage)
    return hc

```

Let’s plot our results.

```{python}
fig, axes = plt.subplots(3, 1, figsize=(15,30))
ax = axes[0]; hc_comp = plot_nci('Complete', ax)
ax = axes[1]; hc_avg = plot_nci('Average', ax)
ax = axes[2]; hc_sing = plot_nci('Single', ax)

```
We see that the choice of linkage certainly does affect the results
obtained. Typically, single linkage will tend to yield *trailing*
clusters: very large clusters onto which individual observations
attach one-by-one. On the other hand, complete and average linkage
tend to yield more balanced, attractive clusters. For this reason,
complete and average linkage are generally preferred to single
linkage. Clearly cell lines within a single cancer type do tend to
cluster together, although the clustering is not perfect. We will use
complete linkage hierarchical clustering for the analysis that
follows.

We can cut the dendrogram at the height that will yield a particular
number of clusters, say four:

```{python}
linkage_comp = compute_linkage(hc_comp)
comp_cut = cut_tree(linkage_comp, n_clusters=4).reshape(-1)
pd.crosstab(nci_labs['label'],
            pd.Series(comp_cut, name='Complete'))

```

There are some clear patterns. All the leukemia cell lines fall in
one cluster, while the breast cancer cell lines are spread out over
three different clusters.

We can plot a cut on the dendrogram that produces these four clusters:

```{python}
fig, ax = plt.subplots(figsize=(10,10))
plot_nci('Complete', ax, cut=140)
ax.axhline(140, c='r', linewidth=4);

```

The `axhline()` function draws a horizontal line on top of any
existing set of axes. The argument `140` plots a horizontal
line at height 140 on the dendrogram; this is a height that
results in four distinct clusters. It is easy to verify that the
resulting clusters are the same as the ones we obtained in
`comp_cut`.

We claimed earlier in Section 12.4.2 that
$K$-means clustering and hierarchical clustering with the dendrogram
cut to obtain the same number of clusters can yield very different
results. How do these `NCI60` hierarchical clustering results compare
to what we get if we perform $K$-means clustering with $K=4$?

```{python}
nci_kmeans = KMeans(n_clusters=4,
                    random_state=0,
                    n_init=20).fit(nci_scaled)
pd.crosstab(pd.Series(comp_cut, name='HClust'),
            pd.Series(nci_kmeans.labels_, name='K-means'))

```

We see that the four clusters obtained using hierarchical clustering
and $K$-means clustering are somewhat different. First we note
that the labels in the two clusterings are arbitrary: swapping
the cluster identifiers does not change the clustering. We see here
that cluster 3 in $K$-means clustering is identical to cluster 2 in
hierarchical clustering. However, the other clusters differ: for
instance, cluster 0 in $K$-means clustering contains a portion of the
observations assigned to cluster 0 by hierarchical clustering, as well
as all of the observations assigned to cluster 1 by hierarchical
clustering.
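Since the labels are arbitrary, one way to compare the two clusterings more directly is to relabel the $K$-means clusters so that they agree with the hierarchical clusters as much as possible. The sketch below does this by maximizing the diagonal of the contingency table with `scipy.optimize.linear_sum_assignment`; this matching step is our own illustration, not part of the original lab, and it assumes `comp_cut` and `nci_kmeans` from the code above.

```{python}
from scipy.optimize import linear_sum_assignment

# Contingency table between the two clusterings
ct = pd.crosstab(pd.Series(comp_cut, name='HClust'),
                 pd.Series(nci_kmeans.labels_, name='K-means'))

# Choose the K-means-to-HClust relabeling that maximizes agreement
row_ind, col_ind = linear_sum_assignment(ct.values, maximize=True)
mapping = dict(zip(ct.columns[col_ind], ct.index[row_ind]))
relabeled = np.array([mapping[l] for l in nci_kmeans.labels_])

# Fraction of cell lines on which the aligned clusterings agree
(relabeled == comp_cut).mean()

```

After this alignment, the agreement rate summarizes in a single number how similar the two clusterings are.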
+ +Rather than performing hierarchical clustering on the entire data +matrix, we can also perform hierarchical clustering on the first few +principal component score vectors, regarding these first few components +as a less noisy version of the data. + +```{python} +hc_pca = HClust(n_clusters=None, + distance_threshold=0, + linkage='complete' + ).fit(nci_scores[:,:5]) +linkage_pca = compute_linkage(hc_pca) +fig, ax = plt.subplots(figsize=(8,8)) +dendrogram(linkage_pca, + labels=np.asarray(nci_labs), + leaf_font_size=10, + ax=ax, + **cargs) +ax.set_title("Hier. Clust. on First Five Score Vectors") +pca_labels = pd.Series(cut_tree(linkage_pca, + n_clusters=4).reshape(-1), + name='Complete-PCA') +pd.crosstab(nci_labs['label'], pca_labels) + +``` + + + + + diff --git a/ISLP_labs_R4DS/Ch13-multiple-lab copy.Rmd b/ISLP_labs_R4DS/Ch13-multiple-lab copy.Rmd new file mode 100644 index 0000000..59d7b73 --- /dev/null +++ b/ISLP_labs_R4DS/Ch13-multiple-lab copy.Rmd @@ -0,0 +1,580 @@ +--- +jupyter: + jupytext: + cell_metadata_filter: -all + formats: Rmd,ipynb + main_language: python + text_representation: + extension: .Rmd + format_name: rmarkdown + format_version: '1.2' + jupytext_version: 1.14.7 +--- + + +# Chapter 13 + +# Lab: Multiple Testing + + + +We include our usual imports seen in earlier labs. + +```{python} +import numpy as np +import pandas as pd +import matplotlib.pyplot as plt +import statsmodels.api as sm +from ISLP import load_data + +``` + +We also collect the new imports +needed for this lab. + +```{python} +from scipy.stats import \ + (ttest_1samp, + ttest_rel, + ttest_ind, + t as t_dbn) +from statsmodels.stats.multicomp import \ + pairwise_tukeyhsd +from statsmodels.stats.multitest import \ + multipletests as mult_test + +``` + + +## Review of Hypothesis Tests +We begin by performing some one-sample $t$-tests. + +First we create 100 variables, each consisting of 10 observations. The +first 50 variables have mean $0.5$ and variance $1$, while the others +have mean $0$ and variance $1$. + +```{python} +rng = np.random.default_rng(12) +X = rng.standard_normal((10, 100)) +true_mean = np.array([0.5]*50 + [0]*50) +X += true_mean[None,:] + +``` + +To begin, we use `ttest_1samp()` from the +`scipy.stats` module to test $H_{0}: \mu_1=0$, the null +hypothesis that the first variable has mean zero. + +```{python} +result = ttest_1samp(X[:,0], 0) +result.pvalue + +``` + +The $p$-value comes out to 0.931, which is not low enough to +reject the null hypothesis at level $\alpha=0.05$. In this case, +$\mu_1=0.5$, so the null hypothesis is false. Therefore, we have made +a Type II error by failing to reject the null hypothesis when the null +hypothesis is false. + +We now test $H_{0,j}: \mu_j=0$ for $j=1,\ldots,100$. We compute the +100 $p$-values, and then construct a vector recording whether the +$j$th $p$-value is less than or equal to 0.05, in which case we reject +$H_{0j}$, or greater than 0.05, in which case we do not reject +$H_{0j}$, for $j=1,\ldots,100$. + +```{python} +p_values = np.empty(100) +for i in range(100): + p_values[i] = ttest_1samp(X[:,i], 0).pvalue +decision = pd.cut(p_values, + [0, 0.05, 1], + labels=['Reject H0', + 'Do not reject H0']) +truth = pd.Categorical(true_mean == 0, + categories=[True, False], + ordered=True) + +``` +Since this is a simulated data set, we can create a $2 \times 2$ table +similar to Table 13.2. 
+ +```{python} +pd.crosstab(decision, + truth, + rownames=['Decision'], + colnames=['H0']) + +``` +Therefore, at level $\alpha=0.05$, we reject 15 of the 50 false +null hypotheses, and we incorrectly reject 5 of the true null +hypotheses. Using the notation from Section 13.3, we have +$V=5$, $S=15$, $U=45$ and $W=35$. +We have set $\alpha=0.05$, which means that we expect to reject around +5% of the true null hypotheses. This is in line with the $2 \times 2$ +table above, which indicates that we rejected $V=5$ of the $50$ true +null hypotheses. + +In the simulation above, for the false null hypotheses, the ratio of +the mean to the standard deviation was only $0.5/1 = 0.5$. This +amounts to quite a weak signal, and it resulted in a high number of +Type II errors. Let’s instead simulate data with a stronger signal, +so that the ratio of the mean to the standard deviation for the false +null hypotheses equals $1$. We make only 10 Type II errors. + + +```{python} +true_mean = np.array([1]*50 + [0]*50) +X = rng.standard_normal((10, 100)) +X += true_mean[None,:] +for i in range(100): + p_values[i] = ttest_1samp(X[:,i], 0).pvalue +decision = pd.cut(p_values, + [0, 0.05, 1], + labels=['Reject H0', + 'Do not reject H0']) +truth = pd.Categorical(true_mean == 0, + categories=[True, False], + ordered=True) +pd.crosstab(decision, + truth, + rownames=['Decision'], + colnames=['H0']) + +``` + + + +## Family-Wise Error Rate +Recall from (13.5) that if the null hypothesis is true +for each of $m$ independent hypothesis tests, then the FWER is equal +to $1-(1-\alpha)^m$. We can use this expression to compute the FWER +for $m=1,\ldots, 500$ and $\alpha=0.05$, $0.01$, and $0.001$. +We plot the FWER for these values of $\alpha$ in order to +reproduce Figure 13.2. + +```{python} +m = np.linspace(1, 501) +fig, ax = plt.subplots() +[ax.plot(m, + 1 - (1 - alpha)**m, + label=r'$\alpha=%s$' % str(alpha)) + for alpha in [0.05, 0.01, 0.001]] +ax.set_xscale('log') +ax.set_xlabel('Number of Hypotheses') +ax.set_ylabel('Family-Wise Error Rate') +ax.legend() +ax.axhline(0.05, c='k', ls='--'); + +``` + +As discussed previously, even for moderate values of $m$ such as $50$, +the FWER exceeds $0.05$ unless $\alpha$ is set to a very low value, +such as $0.001$. Of course, the problem with setting $\alpha$ to such +a low value is that we are likely to make a number of Type II errors: +in other words, our power is very low. + +We now conduct a one-sample $t$-test for each of the first five +managers in the +`Fund` dataset, in order to test the null +hypothesis that the $j$th fund manager’s mean return equals zero, +$H_{0,j}: \mu_j=0$. + +```{python} +Fund = load_data('Fund') +fund_mini = Fund.iloc[:,:5] +fund_mini_pvals = np.empty(5) +for i in range(5): + fund_mini_pvals[i] = ttest_1samp(fund_mini.iloc[:,i], 0).pvalue +fund_mini_pvals + +``` + +The $p$-values are low for Managers One and Three, and high for the +other three managers. However, we cannot simply reject $H_{0,1}$ and +$H_{0,3}$, since this would fail to account for the multiple testing +that we have performed. Instead, we will conduct Bonferroni’s method +and Holm’s method to control the FWER. + +To do this, we use the `multipletests()` function from the +`statsmodels` module (abbreviated to `mult_test()`). Given the $p$-values, +for methods like Holm and Bonferroni the function outputs +adjusted $p$-values, which +can be thought of as a new set of $p$-values that have been corrected +for multiple testing. 
If the adjusted $p$-value for a given hypothesis
is less than or equal to $\alpha$, then that hypothesis can be
rejected while maintaining an FWER of no more than $\alpha$. In other
words, for such methods, the adjusted $p$-values resulting from the `multipletests()`
function can simply be compared to the desired FWER in order to
determine whether or not to reject each hypothesis. We will later
see that we can use the same function to control FDR as well.

The `mult_test()` function takes $p$-values and a `method` argument, as well as an optional
`alpha` argument. It returns the decisions (`reject` below)
as well as the adjusted $p$-values (`bonf`).

```{python}
reject, bonf = mult_test(fund_mini_pvals, method = "bonferroni")[:2]
reject

```

The $p$-values `bonf` are simply the `fund_mini_pvals` multiplied by 5 and truncated to be less than
or equal to 1.

```{python}
bonf, np.minimum(fund_mini_pvals * 5, 1)

```

Therefore, using Bonferroni’s method, we are able to reject the null hypothesis only for Manager
One while controlling the FWER at $0.05$.

By contrast, using Holm’s method, the adjusted $p$-values indicate
that we can reject the null
hypotheses for Managers One and Three at an FWER of $0.05$.

```{python}
mult_test(fund_mini_pvals, method = "holm", alpha=0.05)[:2]

```

As discussed previously, Manager One seems to perform particularly
well, whereas Manager Two has poor performance.

```{python}
fund_mini.mean()

```

Is there evidence of a meaningful difference in performance between
these two managers? We can check this by performing a paired $t$-test using the `ttest_rel()` function
from `scipy.stats`:

```{python}
ttest_rel(fund_mini['Manager1'],
          fund_mini['Manager2']).pvalue

```

The test results in a $p$-value of 0.038,
suggesting a statistically significant difference.

However, we decided to perform this test only after examining the data
and noting that Managers One and Two had the highest and lowest mean
performances. In a sense, this means that we have implicitly
performed ${5 \choose 2} = 5(5-1)/2=10$ hypothesis tests, rather than
just one, as discussed in Section 13.3.2. Hence, we use the
`pairwise_tukeyhsd()` function from
`statsmodels.stats.multicomp` to apply Tukey’s method
in order to adjust for multiple testing. This function takes as input
a vector of responses and a vector of group labels, and carries out
the comparisons implied by an *ANOVA* regression model, which is
essentially just a linear regression in which all of the predictors
are qualitative. In this case, the response consists of the monthly
excess returns achieved by each manager, and the group label indicates
the manager to which each return corresponds.

```{python}
returns = np.hstack([fund_mini.iloc[:,i] for i in range(5)])
managers = np.hstack([[i+1]*50 for i in range(5)])
tukey = pairwise_tukeyhsd(returns, managers)
print(tukey.summary())

```

The `pairwise_tukeyhsd()` function provides confidence intervals
for the difference between each pair of managers (`lower` and
`upper`), as well as a $p$-value. All of these quantities have
been adjusted for multiple testing. Notice that the $p$-value for the
difference between Managers One and Two has increased from $0.038$ to
$0.186$, so there is no longer clear evidence of a difference between
the managers’ performances. We can plot the confidence intervals for
the pairwise comparisons using the `plot_simultaneous()` method
of `tukey`.
Any pair of intervals that do not overlap indicates a significant difference at the nominal level of 0.05. In this case,
no differences are considered significant, as reported in the table above.

```{python}
fig, ax = plt.subplots(figsize=(8,8))
tukey.plot_simultaneous(ax=ax);

```

## False Discovery Rate
Now we perform hypothesis tests for all 2,000 fund managers in the
`Fund` dataset. We perform a one-sample $t$-test
of $H_{0,j}: \mu_j=0$, which states that the
$j$th fund manager’s mean return is zero.

```{python}
fund_pvalues = np.empty(2000)
for i, manager in enumerate(Fund.columns):
    fund_pvalues[i] = ttest_1samp(Fund[manager], 0).pvalue

```

There are far too many managers to consider trying to control the FWER.
Instead, we focus on controlling the FDR: that is, the expected fraction of rejected null hypotheses that are actually false positives.
The `multipletests()` function (abbreviated `mult_test()`) can be used to carry out the Benjamini--Hochberg procedure.

```{python}
fund_qvalues = mult_test(fund_pvalues, method = "fdr_bh")[1]
fund_qvalues[:10]

```

The *q-values* output by the
Benjamini--Hochberg procedure can be interpreted as the smallest FDR
threshold at which we would reject a particular null hypothesis. For
instance, a $q$-value of $0.1$ indicates that we can reject the
corresponding null hypothesis at an FDR of 10% or greater, but that
we cannot reject the null hypothesis at an FDR below 10%.

If we control the FDR at 10%, then for how many of the fund managers can we reject $H_{0,j}: \mu_j=0$?

```{python}
(fund_qvalues <= 0.1).sum()

```
We find that 146 of the 2,000 fund managers have a $q$-value of
0.1 or less; therefore, we are able to conclude that 146 of the fund managers
beat the market at an FDR of 10%. Only about 15 (10% of 146) of
these fund managers are likely to be false discoveries.

By contrast, if we had instead used Bonferroni’s method to control the
FWER at level $\alpha=0.1$, then we would have failed to reject any
null hypotheses!

```{python}
(fund_pvalues <= 0.1 / 2000).sum()

```

Figure 13.6 displays the ordered
$p$-values, $p_{(1)} \leq p_{(2)} \leq \cdots \leq p_{(2000)}$, for
the `Fund` dataset, as well as the threshold for rejection by the
Benjamini--Hochberg procedure. Recall that the Benjamini--Hochberg
procedure identifies the largest $p$-value such that $p_{(j)} < qj/m$
in order to control the FDR at level $q$, and rejects that null
hypothesis along with all others having smaller $p$-values. We can
carry out this procedure by hand as follows.

```{python}
sorted_ = np.sort(fund_pvalues)
m = sorted_.shape[0]
q = 0.1
sorted_set_ = np.where(sorted_ < q * np.linspace(1, m, m) / m)[0]
if sorted_set_.shape[0] > 0:
    selected_ = fund_pvalues < sorted_[sorted_set_].max()
    sorted_set_ = np.arange(sorted_set_.max())
else:
    selected_ = []
    sorted_set_ = []

```

We now reproduce the middle panel of Figure 13.6.

```{python}
fig, ax = plt.subplots()
ax.scatter(np.arange(0, sorted_.shape[0]) + 1,
           sorted_, s=10)
ax.set_yscale('log')
ax.set_xscale('log')
ax.set_ylabel('P-Value')
ax.set_xlabel('Index')
ax.scatter(sorted_set_+1, sorted_[sorted_set_], c='r', s=20)
ax.axline((0, 0), (1,q/m), c='k', ls='--', linewidth=3);

```

## A Re-Sampling Approach
Here, we implement the re-sampling approach to hypothesis testing
using the `Khan` dataset, which we investigated in
Section 13.5. First, we merge the training and
testing data, which results in observations on 83 patients for
2,308 genes.

```{python}
Khan = load_data('Khan')
D = pd.concat([Khan['xtrain'], Khan['xtest']])
D['Y'] = pd.concat([Khan['ytrain'], Khan['ytest']])
D['Y'].value_counts()

```

There are four classes of cancer.
For each gene, we compare the mean
expression in the second class (rhabdomyosarcoma) to the mean
expression in the fourth class (Burkitt’s lymphoma). Performing a
standard two-sample $t$-test
using `ttest_ind()` from `scipy.stats` on the $11$th
gene produces a test statistic of -2.09 and an associated $p$-value
of 0.0412, suggesting modest evidence of a difference in mean
expression levels between the two cancer types.

```{python}
D2 = D[lambda df:df['Y'] == 2]
D4 = D[lambda df:df['Y'] == 4]
gene_11 = 'G0011'
observedT, pvalue = ttest_ind(D2[gene_11],
                              D4[gene_11],
                              equal_var=True)
observedT, pvalue

```

However, this $p$-value relies on the assumption that under the null
hypothesis of no difference between the two groups, the test statistic
follows a $t$-distribution with $29+25-2=52$ degrees of freedom.
Instead of using this theoretical null distribution, we can randomly
split the 54 patients into two groups of 29 and 25, and compute a new
test statistic. Under the null hypothesis of no difference between
the groups, this new test statistic should have the same distribution
as our original one. Repeating this process 10,000 times allows us to
approximate the null distribution of the test statistic. We then
compute the fraction of the re-sampled test statistics that exceed our
observed test statistic in absolute value.

```{python}
B = 10000
Tnull = np.empty(B)
D_ = np.hstack([D2[gene_11], D4[gene_11]])
n_ = D2[gene_11].shape[0]
D_null = D_.copy()
for b in range(B):
    rng.shuffle(D_null)
    ttest_ = ttest_ind(D_null[:n_],
                       D_null[n_:],
                       equal_var=True)
    Tnull[b] = ttest_.statistic
(np.abs(Tnull) > np.abs(observedT)).mean()

```

This fraction, 0.0398,
is our re-sampling-based $p$-value.
It is almost identical to the $p$-value of 0.0412 obtained using the theoretical null distribution.
We can plot a histogram of the re-sampling-based test statistics in order to reproduce Figure 13.7.

```{python}
fig, ax = plt.subplots(figsize=(8,8))
ax.hist(Tnull,
        bins=100,
        density=True,
        facecolor='y',
        label='Null')
xval = np.linspace(-4.2, 4.2, 1001)
ax.plot(xval,
        t_dbn.pdf(xval, D_.shape[0]-2),
        c='r')
ax.axvline(observedT,
           c='b',
           label='Observed')
ax.legend()
ax.set_xlabel("Null Distribution of Test Statistic");

```
The re-sampling-based null distribution is almost identical to the theoretical null distribution, which is displayed in red.
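As a cross-check, the theoretical $p$-value quoted above can be recomputed directly from the $t$-distribution. This is a small sketch (our addition), assuming `observedT`, `D_`, and `t_dbn` from the preceding code are still in scope.

```{python}
# Two-sided p-value from the t distribution with 29 + 25 - 2 = 52
# degrees of freedom; this should match the 0.0412 reported above
2 * t_dbn.sf(np.abs(observedT), D_.shape[0] - 2)

```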
Finally, we implement the plug-in re-sampling FDR approach outlined in
Algorithm 13.4. Depending on the speed of your
computer, calculating the FDR for all 2,308 genes in the `Khan`
dataset may take a while. Hence, we will illustrate the approach on a
random subset of 100 genes. For each gene, we first compute the
observed test statistic, and then produce 10,000 re-sampled test
statistics. This may take a few minutes to run. If you are in a rush,
then you could set `B` equal to a smaller value (e.g. `B=500`).

```{python}
m, B = 100, 10000
idx = rng.choice(Khan['xtest'].columns, m, replace=False)
T_vals = np.empty(m)
Tnull_vals = np.empty((m, B))

for j in range(m):
    col = idx[j]
    T_vals[j] = ttest_ind(D2[col],
                          D4[col],
                          equal_var=True).statistic
    D_ = np.hstack([D2[col], D4[col]])
    D_null = D_.copy()
    for b in range(B):
        rng.shuffle(D_null)
        ttest_ = ttest_ind(D_null[:n_],
                           D_null[n_:],
                           equal_var=True)
        Tnull_vals[j,b] = ttest_.statistic

```

Next, we compute the number of rejected null hypotheses $R$, the
estimated number of false positives $\widehat{V}$, and the estimated
FDR, for a range of threshold values $c$ in
Algorithm 13.4. The threshold values are chosen
using the absolute values of the test statistics from the 100 genes.

```{python}
cutoffs = np.sort(np.abs(T_vals))
FDRs, Rs, Vs = np.empty((3, m))
for j in range(m):
    R = np.sum(np.abs(T_vals) >= cutoffs[j])
    V = np.sum(np.abs(Tnull_vals) >= cutoffs[j]) / B
    Rs[j] = R
    Vs[j] = V
    FDRs[j] = V / R

```

Now, for any given FDR, we can find the genes that will be
rejected. For example, with the FDR controlled at 0.1, we reject 15 of the
100 null hypotheses. On average, we would expect about one or two of
these genes (i.e. 10% of 15) to be false discoveries. At an FDR of
0.2, we can reject the null hypothesis for 28 genes, of which we
expect around six to be false discoveries.

The variable `idx` stores the identities of the 100 randomly selected
genes. Let’s look at the genes whose estimated FDR is less than 0.1.

```{python}
sorted(idx[np.abs(T_vals) >= cutoffs[FDRs < 0.1].min()])

```

At an FDR threshold of 0.2, more genes are selected, at the cost of a higher expected
proportion of false discoveries.

```{python}
sorted(idx[np.abs(T_vals) >= cutoffs[FDRs < 0.2].min()])

```

The next code block generates Figure 13.11, which is similar
to Figure 13.9,
except that it is based on only a subset of the genes.
+ +```{python} +fig, ax = plt.subplots() +ax.plot(Rs, FDRs, 'b', linewidth=3) +ax.set_xlabel("Number of Rejections") +ax.set_ylabel("False Discovery Rate"); + +``` + + diff --git a/ISLP_labs_R4DS/imagenet_class_index copy.json b/ISLP_labs_R4DS/imagenet_class_index copy.json new file mode 100644 index 0000000..5fe0dfe --- /dev/null +++ b/ISLP_labs_R4DS/imagenet_class_index copy.json @@ -0,0 +1 @@ +{"0": ["n01440764", "tench"], "1": ["n01443537", "goldfish"], "2": ["n01484850", "great_white_shark"], "3": ["n01491361", "tiger_shark"], "4": ["n01494475", "hammerhead"], "5": ["n01496331", "electric_ray"], "6": ["n01498041", "stingray"], "7": ["n01514668", "cock"], "8": ["n01514859", "hen"], "9": ["n01518878", "ostrich"], "10": ["n01530575", "brambling"], "11": ["n01531178", "goldfinch"], "12": ["n01532829", "house_finch"], "13": ["n01534433", "junco"], "14": ["n01537544", "indigo_bunting"], "15": ["n01558993", "robin"], "16": ["n01560419", "bulbul"], "17": ["n01580077", "jay"], "18": ["n01582220", "magpie"], "19": ["n01592084", "chickadee"], "20": ["n01601694", "water_ouzel"], "21": ["n01608432", "kite"], "22": ["n01614925", "bald_eagle"], "23": ["n01616318", "vulture"], "24": ["n01622779", "great_grey_owl"], "25": ["n01629819", "European_fire_salamander"], "26": ["n01630670", "common_newt"], "27": ["n01631663", "eft"], "28": ["n01632458", "spotted_salamander"], "29": ["n01632777", "axolotl"], "30": ["n01641577", "bullfrog"], "31": ["n01644373", "tree_frog"], "32": ["n01644900", "tailed_frog"], "33": ["n01664065", "loggerhead"], "34": ["n01665541", "leatherback_turtle"], "35": ["n01667114", "mud_turtle"], "36": ["n01667778", "terrapin"], "37": ["n01669191", "box_turtle"], "38": ["n01675722", "banded_gecko"], "39": ["n01677366", "common_iguana"], "40": ["n01682714", "American_chameleon"], "41": ["n01685808", "whiptail"], "42": ["n01687978", "agama"], "43": ["n01688243", "frilled_lizard"], "44": ["n01689811", "alligator_lizard"], "45": ["n01692333", "Gila_monster"], "46": ["n01693334", "green_lizard"], "47": ["n01694178", "African_chameleon"], "48": ["n01695060", "Komodo_dragon"], "49": ["n01697457", "African_crocodile"], "50": ["n01698640", "American_alligator"], "51": ["n01704323", "triceratops"], "52": ["n01728572", "thunder_snake"], "53": ["n01728920", "ringneck_snake"], "54": ["n01729322", "hognose_snake"], "55": ["n01729977", "green_snake"], "56": ["n01734418", "king_snake"], "57": ["n01735189", "garter_snake"], "58": ["n01737021", "water_snake"], "59": ["n01739381", "vine_snake"], "60": ["n01740131", "night_snake"], "61": ["n01742172", "boa_constrictor"], "62": ["n01744401", "rock_python"], "63": ["n01748264", "Indian_cobra"], "64": ["n01749939", "green_mamba"], "65": ["n01751748", "sea_snake"], "66": ["n01753488", "horned_viper"], "67": ["n01755581", "diamondback"], "68": ["n01756291", "sidewinder"], "69": ["n01768244", "trilobite"], "70": ["n01770081", "harvestman"], "71": ["n01770393", "scorpion"], "72": ["n01773157", "black_and_gold_garden_spider"], "73": ["n01773549", "barn_spider"], "74": ["n01773797", "garden_spider"], "75": ["n01774384", "black_widow"], "76": ["n01774750", "tarantula"], "77": ["n01775062", "wolf_spider"], "78": ["n01776313", "tick"], "79": ["n01784675", "centipede"], "80": ["n01795545", "black_grouse"], "81": ["n01796340", "ptarmigan"], "82": ["n01797886", "ruffed_grouse"], "83": ["n01798484", "prairie_chicken"], "84": ["n01806143", "peacock"], "85": ["n01806567", "quail"], "86": ["n01807496", "partridge"], "87": ["n01817953", "African_grey"], "88": ["n01818515", 
"macaw"], "89": ["n01819313", "sulphur-crested_cockatoo"], "90": ["n01820546", "lorikeet"], "91": ["n01824575", "coucal"], "92": ["n01828970", "bee_eater"], "93": ["n01829413", "hornbill"], "94": ["n01833805", "hummingbird"], "95": ["n01843065", "jacamar"], "96": ["n01843383", "toucan"], "97": ["n01847000", "drake"], "98": ["n01855032", "red-breasted_merganser"], "99": ["n01855672", "goose"], "100": ["n01860187", "black_swan"], "101": ["n01871265", "tusker"], "102": ["n01872401", "echidna"], "103": ["n01873310", "platypus"], "104": ["n01877812", "wallaby"], "105": ["n01882714", "koala"], "106": ["n01883070", "wombat"], "107": ["n01910747", "jellyfish"], "108": ["n01914609", "sea_anemone"], "109": ["n01917289", "brain_coral"], "110": ["n01924916", "flatworm"], "111": ["n01930112", "nematode"], "112": ["n01943899", "conch"], "113": ["n01944390", "snail"], "114": ["n01945685", "slug"], "115": ["n01950731", "sea_slug"], "116": ["n01955084", "chiton"], "117": ["n01968897", "chambered_nautilus"], "118": ["n01978287", "Dungeness_crab"], "119": ["n01978455", "rock_crab"], "120": ["n01980166", "fiddler_crab"], "121": ["n01981276", "king_crab"], "122": ["n01983481", "American_lobster"], "123": ["n01984695", "spiny_lobster"], "124": ["n01985128", "crayfish"], "125": ["n01986214", "hermit_crab"], "126": ["n01990800", "isopod"], "127": ["n02002556", "white_stork"], "128": ["n02002724", "black_stork"], "129": ["n02006656", "spoonbill"], "130": ["n02007558", "flamingo"], "131": ["n02009229", "little_blue_heron"], "132": ["n02009912", "American_egret"], "133": ["n02011460", "bittern"], "134": ["n02012849", "crane"], "135": ["n02013706", "limpkin"], "136": ["n02017213", "European_gallinule"], "137": ["n02018207", "American_coot"], "138": ["n02018795", "bustard"], "139": ["n02025239", "ruddy_turnstone"], "140": ["n02027492", "red-backed_sandpiper"], "141": ["n02028035", "redshank"], "142": ["n02033041", "dowitcher"], "143": ["n02037110", "oystercatcher"], "144": ["n02051845", "pelican"], "145": ["n02056570", "king_penguin"], "146": ["n02058221", "albatross"], "147": ["n02066245", "grey_whale"], "148": ["n02071294", "killer_whale"], "149": ["n02074367", "dugong"], "150": ["n02077923", "sea_lion"], "151": ["n02085620", "Chihuahua"], "152": ["n02085782", "Japanese_spaniel"], "153": ["n02085936", "Maltese_dog"], "154": ["n02086079", "Pekinese"], "155": ["n02086240", "Shih-Tzu"], "156": ["n02086646", "Blenheim_spaniel"], "157": ["n02086910", "papillon"], "158": ["n02087046", "toy_terrier"], "159": ["n02087394", "Rhodesian_ridgeback"], "160": ["n02088094", "Afghan_hound"], "161": ["n02088238", "basset"], "162": ["n02088364", "beagle"], "163": ["n02088466", "bloodhound"], "164": ["n02088632", "bluetick"], "165": ["n02089078", "black-and-tan_coonhound"], "166": ["n02089867", "Walker_hound"], "167": ["n02089973", "English_foxhound"], "168": ["n02090379", "redbone"], "169": ["n02090622", "borzoi"], "170": ["n02090721", "Irish_wolfhound"], "171": ["n02091032", "Italian_greyhound"], "172": ["n02091134", "whippet"], "173": ["n02091244", "Ibizan_hound"], "174": ["n02091467", "Norwegian_elkhound"], "175": ["n02091635", "otterhound"], "176": ["n02091831", "Saluki"], "177": ["n02092002", "Scottish_deerhound"], "178": ["n02092339", "Weimaraner"], "179": ["n02093256", "Staffordshire_bullterrier"], "180": ["n02093428", "American_Staffordshire_terrier"], "181": ["n02093647", "Bedlington_terrier"], "182": ["n02093754", "Border_terrier"], "183": ["n02093859", "Kerry_blue_terrier"], "184": ["n02093991", "Irish_terrier"], "185": 
["n02094114", "Norfolk_terrier"], "186": ["n02094258", "Norwich_terrier"], "187": ["n02094433", "Yorkshire_terrier"], "188": ["n02095314", "wire-haired_fox_terrier"], "189": ["n02095570", "Lakeland_terrier"], "190": ["n02095889", "Sealyham_terrier"], "191": ["n02096051", "Airedale"], "192": ["n02096177", "cairn"], "193": ["n02096294", "Australian_terrier"], "194": ["n02096437", "Dandie_Dinmont"], "195": ["n02096585", "Boston_bull"], "196": ["n02097047", "miniature_schnauzer"], "197": ["n02097130", "giant_schnauzer"], "198": ["n02097209", "standard_schnauzer"], "199": ["n02097298", "Scotch_terrier"], "200": ["n02097474", "Tibetan_terrier"], "201": ["n02097658", "silky_terrier"], "202": ["n02098105", "soft-coated_wheaten_terrier"], "203": ["n02098286", "West_Highland_white_terrier"], "204": ["n02098413", "Lhasa"], "205": ["n02099267", "flat-coated_retriever"], "206": ["n02099429", "curly-coated_retriever"], "207": ["n02099601", "golden_retriever"], "208": ["n02099712", "Labrador_retriever"], "209": ["n02099849", "Chesapeake_Bay_retriever"], "210": ["n02100236", "German_short-haired_pointer"], "211": ["n02100583", "vizsla"], "212": ["n02100735", "English_setter"], "213": ["n02100877", "Irish_setter"], "214": ["n02101006", "Gordon_setter"], "215": ["n02101388", "Brittany_spaniel"], "216": ["n02101556", "clumber"], "217": ["n02102040", "English_springer"], "218": ["n02102177", "Welsh_springer_spaniel"], "219": ["n02102318", "cocker_spaniel"], "220": ["n02102480", "Sussex_spaniel"], "221": ["n02102973", "Irish_water_spaniel"], "222": ["n02104029", "kuvasz"], "223": ["n02104365", "schipperke"], "224": ["n02105056", "groenendael"], "225": ["n02105162", "malinois"], "226": ["n02105251", "briard"], "227": ["n02105412", "kelpie"], "228": ["n02105505", "komondor"], "229": ["n02105641", "Old_English_sheepdog"], "230": ["n02105855", "Shetland_sheepdog"], "231": ["n02106030", "collie"], "232": ["n02106166", "Border_collie"], "233": ["n02106382", "Bouvier_des_Flandres"], "234": ["n02106550", "Rottweiler"], "235": ["n02106662", "German_shepherd"], "236": ["n02107142", "Doberman"], "237": ["n02107312", "miniature_pinscher"], "238": ["n02107574", "Greater_Swiss_Mountain_dog"], "239": ["n02107683", "Bernese_mountain_dog"], "240": ["n02107908", "Appenzeller"], "241": ["n02108000", "EntleBucher"], "242": ["n02108089", "boxer"], "243": ["n02108422", "bull_mastiff"], "244": ["n02108551", "Tibetan_mastiff"], "245": ["n02108915", "French_bulldog"], "246": ["n02109047", "Great_Dane"], "247": ["n02109525", "Saint_Bernard"], "248": ["n02109961", "Eskimo_dog"], "249": ["n02110063", "malamute"], "250": ["n02110185", "Siberian_husky"], "251": ["n02110341", "dalmatian"], "252": ["n02110627", "affenpinscher"], "253": ["n02110806", "basenji"], "254": ["n02110958", "pug"], "255": ["n02111129", "Leonberg"], "256": ["n02111277", "Newfoundland"], "257": ["n02111500", "Great_Pyrenees"], "258": ["n02111889", "Samoyed"], "259": ["n02112018", "Pomeranian"], "260": ["n02112137", "chow"], "261": ["n02112350", "keeshond"], "262": ["n02112706", "Brabancon_griffon"], "263": ["n02113023", "Pembroke"], "264": ["n02113186", "Cardigan"], "265": ["n02113624", "toy_poodle"], "266": ["n02113712", "miniature_poodle"], "267": ["n02113799", "standard_poodle"], "268": ["n02113978", "Mexican_hairless"], "269": ["n02114367", "timber_wolf"], "270": ["n02114548", "white_wolf"], "271": ["n02114712", "red_wolf"], "272": ["n02114855", "coyote"], "273": ["n02115641", "dingo"], "274": ["n02115913", "dhole"], "275": ["n02116738", "African_hunting_dog"], 
"276": ["n02117135", "hyena"], "277": ["n02119022", "red_fox"], "278": ["n02119789", "kit_fox"], "279": ["n02120079", "Arctic_fox"], "280": ["n02120505", "grey_fox"], "281": ["n02123045", "tabby"], "282": ["n02123159", "tiger_cat"], "283": ["n02123394", "Persian_cat"], "284": ["n02123597", "Siamese_cat"], "285": ["n02124075", "Egyptian_cat"], "286": ["n02125311", "cougar"], "287": ["n02127052", "lynx"], "288": ["n02128385", "leopard"], "289": ["n02128757", "snow_leopard"], "290": ["n02128925", "jaguar"], "291": ["n02129165", "lion"], "292": ["n02129604", "tiger"], "293": ["n02130308", "cheetah"], "294": ["n02132136", "brown_bear"], "295": ["n02133161", "American_black_bear"], "296": ["n02134084", "ice_bear"], "297": ["n02134418", "sloth_bear"], "298": ["n02137549", "mongoose"], "299": ["n02138441", "meerkat"], "300": ["n02165105", "tiger_beetle"], "301": ["n02165456", "ladybug"], "302": ["n02167151", "ground_beetle"], "303": ["n02168699", "long-horned_beetle"], "304": ["n02169497", "leaf_beetle"], "305": ["n02172182", "dung_beetle"], "306": ["n02174001", "rhinoceros_beetle"], "307": ["n02177972", "weevil"], "308": ["n02190166", "fly"], "309": ["n02206856", "bee"], "310": ["n02219486", "ant"], "311": ["n02226429", "grasshopper"], "312": ["n02229544", "cricket"], "313": ["n02231487", "walking_stick"], "314": ["n02233338", "cockroach"], "315": ["n02236044", "mantis"], "316": ["n02256656", "cicada"], "317": ["n02259212", "leafhopper"], "318": ["n02264363", "lacewing"], "319": ["n02268443", "dragonfly"], "320": ["n02268853", "damselfly"], "321": ["n02276258", "admiral"], "322": ["n02277742", "ringlet"], "323": ["n02279972", "monarch"], "324": ["n02280649", "cabbage_butterfly"], "325": ["n02281406", "sulphur_butterfly"], "326": ["n02281787", "lycaenid"], "327": ["n02317335", "starfish"], "328": ["n02319095", "sea_urchin"], "329": ["n02321529", "sea_cucumber"], "330": ["n02325366", "wood_rabbit"], "331": ["n02326432", "hare"], "332": ["n02328150", "Angora"], "333": ["n02342885", "hamster"], "334": ["n02346627", "porcupine"], "335": ["n02356798", "fox_squirrel"], "336": ["n02361337", "marmot"], "337": ["n02363005", "beaver"], "338": ["n02364673", "guinea_pig"], "339": ["n02389026", "sorrel"], "340": ["n02391049", "zebra"], "341": ["n02395406", "hog"], "342": ["n02396427", "wild_boar"], "343": ["n02397096", "warthog"], "344": ["n02398521", "hippopotamus"], "345": ["n02403003", "ox"], "346": ["n02408429", "water_buffalo"], "347": ["n02410509", "bison"], "348": ["n02412080", "ram"], "349": ["n02415577", "bighorn"], "350": ["n02417914", "ibex"], "351": ["n02422106", "hartebeest"], "352": ["n02422699", "impala"], "353": ["n02423022", "gazelle"], "354": ["n02437312", "Arabian_camel"], "355": ["n02437616", "llama"], "356": ["n02441942", "weasel"], "357": ["n02442845", "mink"], "358": ["n02443114", "polecat"], "359": ["n02443484", "black-footed_ferret"], "360": ["n02444819", "otter"], "361": ["n02445715", "skunk"], "362": ["n02447366", "badger"], "363": ["n02454379", "armadillo"], "364": ["n02457408", "three-toed_sloth"], "365": ["n02480495", "orangutan"], "366": ["n02480855", "gorilla"], "367": ["n02481823", "chimpanzee"], "368": ["n02483362", "gibbon"], "369": ["n02483708", "siamang"], "370": ["n02484975", "guenon"], "371": ["n02486261", "patas"], "372": ["n02486410", "baboon"], "373": ["n02487347", "macaque"], "374": ["n02488291", "langur"], "375": ["n02488702", "colobus"], "376": ["n02489166", "proboscis_monkey"], "377": ["n02490219", "marmoset"], "378": ["n02492035", "capuchin"], "379": ["n02492660", 
"howler_monkey"], "380": ["n02493509", "titi"], "381": ["n02493793", "spider_monkey"], "382": ["n02494079", "squirrel_monkey"], "383": ["n02497673", "Madagascar_cat"], "384": ["n02500267", "indri"], "385": ["n02504013", "Indian_elephant"], "386": ["n02504458", "African_elephant"], "387": ["n02509815", "lesser_panda"], "388": ["n02510455", "giant_panda"], "389": ["n02514041", "barracouta"], "390": ["n02526121", "eel"], "391": ["n02536864", "coho"], "392": ["n02606052", "rock_beauty"], "393": ["n02607072", "anemone_fish"], "394": ["n02640242", "sturgeon"], "395": ["n02641379", "gar"], "396": ["n02643566", "lionfish"], "397": ["n02655020", "puffer"], "398": ["n02666196", "abacus"], "399": ["n02667093", "abaya"], "400": ["n02669723", "academic_gown"], "401": ["n02672831", "accordion"], "402": ["n02676566", "acoustic_guitar"], "403": ["n02687172", "aircraft_carrier"], "404": ["n02690373", "airliner"], "405": ["n02692877", "airship"], "406": ["n02699494", "altar"], "407": ["n02701002", "ambulance"], "408": ["n02704792", "amphibian"], "409": ["n02708093", "analog_clock"], "410": ["n02727426", "apiary"], "411": ["n02730930", "apron"], "412": ["n02747177", "ashcan"], "413": ["n02749479", "assault_rifle"], "414": ["n02769748", "backpack"], "415": ["n02776631", "bakery"], "416": ["n02777292", "balance_beam"], "417": ["n02782093", "balloon"], "418": ["n02783161", "ballpoint"], "419": ["n02786058", "Band_Aid"], "420": ["n02787622", "banjo"], "421": ["n02788148", "bannister"], "422": ["n02790996", "barbell"], "423": ["n02791124", "barber_chair"], "424": ["n02791270", "barbershop"], "425": ["n02793495", "barn"], "426": ["n02794156", "barometer"], "427": ["n02795169", "barrel"], "428": ["n02797295", "barrow"], "429": ["n02799071", "baseball"], "430": ["n02802426", "basketball"], "431": ["n02804414", "bassinet"], "432": ["n02804610", "bassoon"], "433": ["n02807133", "bathing_cap"], "434": ["n02808304", "bath_towel"], "435": ["n02808440", "bathtub"], "436": ["n02814533", "beach_wagon"], "437": ["n02814860", "beacon"], "438": ["n02815834", "beaker"], "439": ["n02817516", "bearskin"], "440": ["n02823428", "beer_bottle"], "441": ["n02823750", "beer_glass"], "442": ["n02825657", "bell_cote"], "443": ["n02834397", "bib"], "444": ["n02835271", "bicycle-built-for-two"], "445": ["n02837789", "bikini"], "446": ["n02840245", "binder"], "447": ["n02841315", "binoculars"], "448": ["n02843684", "birdhouse"], "449": ["n02859443", "boathouse"], "450": ["n02860847", "bobsled"], "451": ["n02865351", "bolo_tie"], "452": ["n02869837", "bonnet"], "453": ["n02870880", "bookcase"], "454": ["n02871525", "bookshop"], "455": ["n02877765", "bottlecap"], "456": ["n02879718", "bow"], "457": ["n02883205", "bow_tie"], "458": ["n02892201", "brass"], "459": ["n02892767", "brassiere"], "460": ["n02894605", "breakwater"], "461": ["n02895154", "breastplate"], "462": ["n02906734", "broom"], "463": ["n02909870", "bucket"], "464": ["n02910353", "buckle"], "465": ["n02916936", "bulletproof_vest"], "466": ["n02917067", "bullet_train"], "467": ["n02927161", "butcher_shop"], "468": ["n02930766", "cab"], "469": ["n02939185", "caldron"], "470": ["n02948072", "candle"], "471": ["n02950826", "cannon"], "472": ["n02951358", "canoe"], "473": ["n02951585", "can_opener"], "474": ["n02963159", "cardigan"], "475": ["n02965783", "car_mirror"], "476": ["n02966193", "carousel"], "477": ["n02966687", "carpenter's_kit"], "478": ["n02971356", "carton"], "479": ["n02974003", "car_wheel"], "480": ["n02977058", "cash_machine"], "481": ["n02978881", "cassette"], 
"482": ["n02979186", "cassette_player"], "483": ["n02980441", "castle"], "484": ["n02981792", "catamaran"], "485": ["n02988304", "CD_player"], "486": ["n02992211", "cello"], "487": ["n02992529", "cellular_telephone"], "488": ["n02999410", "chain"], "489": ["n03000134", "chainlink_fence"], "490": ["n03000247", "chain_mail"], "491": ["n03000684", "chain_saw"], "492": ["n03014705", "chest"], "493": ["n03016953", "chiffonier"], "494": ["n03017168", "chime"], "495": ["n03018349", "china_cabinet"], "496": ["n03026506", "Christmas_stocking"], "497": ["n03028079", "church"], "498": ["n03032252", "cinema"], "499": ["n03041632", "cleaver"], "500": ["n03042490", "cliff_dwelling"], "501": ["n03045698", "cloak"], "502": ["n03047690", "clog"], "503": ["n03062245", "cocktail_shaker"], "504": ["n03063599", "coffee_mug"], "505": ["n03063689", "coffeepot"], "506": ["n03065424", "coil"], "507": ["n03075370", "combination_lock"], "508": ["n03085013", "computer_keyboard"], "509": ["n03089624", "confectionery"], "510": ["n03095699", "container_ship"], "511": ["n03100240", "convertible"], "512": ["n03109150", "corkscrew"], "513": ["n03110669", "cornet"], "514": ["n03124043", "cowboy_boot"], "515": ["n03124170", "cowboy_hat"], "516": ["n03125729", "cradle"], "517": ["n03126707", "crane"], "518": ["n03127747", "crash_helmet"], "519": ["n03127925", "crate"], "520": ["n03131574", "crib"], "521": ["n03133878", "Crock_Pot"], "522": ["n03134739", "croquet_ball"], "523": ["n03141823", "crutch"], "524": ["n03146219", "cuirass"], "525": ["n03160309", "dam"], "526": ["n03179701", "desk"], "527": ["n03180011", "desktop_computer"], "528": ["n03187595", "dial_telephone"], "529": ["n03188531", "diaper"], "530": ["n03196217", "digital_clock"], "531": ["n03197337", "digital_watch"], "532": ["n03201208", "dining_table"], "533": ["n03207743", "dishrag"], "534": ["n03207941", "dishwasher"], "535": ["n03208938", "disk_brake"], "536": ["n03216828", "dock"], "537": ["n03218198", "dogsled"], "538": ["n03220513", "dome"], "539": ["n03223299", "doormat"], "540": ["n03240683", "drilling_platform"], "541": ["n03249569", "drum"], "542": ["n03250847", "drumstick"], "543": ["n03255030", "dumbbell"], "544": ["n03259280", "Dutch_oven"], "545": ["n03271574", "electric_fan"], "546": ["n03272010", "electric_guitar"], "547": ["n03272562", "electric_locomotive"], "548": ["n03290653", "entertainment_center"], "549": ["n03291819", "envelope"], "550": ["n03297495", "espresso_maker"], "551": ["n03314780", "face_powder"], "552": ["n03325584", "feather_boa"], "553": ["n03337140", "file"], "554": ["n03344393", "fireboat"], "555": ["n03345487", "fire_engine"], "556": ["n03347037", "fire_screen"], "557": ["n03355925", "flagpole"], "558": ["n03372029", "flute"], "559": ["n03376595", "folding_chair"], "560": ["n03379051", "football_helmet"], "561": ["n03384352", "forklift"], "562": ["n03388043", "fountain"], "563": ["n03388183", "fountain_pen"], "564": ["n03388549", "four-poster"], "565": ["n03393912", "freight_car"], "566": ["n03394916", "French_horn"], "567": ["n03400231", "frying_pan"], "568": ["n03404251", "fur_coat"], "569": ["n03417042", "garbage_truck"], "570": ["n03424325", "gasmask"], "571": ["n03425413", "gas_pump"], "572": ["n03443371", "goblet"], "573": ["n03444034", "go-kart"], "574": ["n03445777", "golf_ball"], "575": ["n03445924", "golfcart"], "576": ["n03447447", "gondola"], "577": ["n03447721", "gong"], "578": ["n03450230", "gown"], "579": ["n03452741", "grand_piano"], "580": ["n03457902", "greenhouse"], "581": ["n03459775", "grille"], "582": 
["n03461385", "grocery_store"], "583": ["n03467068", "guillotine"], "584": ["n03476684", "hair_slide"], "585": ["n03476991", "hair_spray"], "586": ["n03478589", "half_track"], "587": ["n03481172", "hammer"], "588": ["n03482405", "hamper"], "589": ["n03483316", "hand_blower"], "590": ["n03485407", "hand-held_computer"], "591": ["n03485794", "handkerchief"], "592": ["n03492542", "hard_disc"], "593": ["n03494278", "harmonica"], "594": ["n03495258", "harp"], "595": ["n03496892", "harvester"], "596": ["n03498962", "hatchet"], "597": ["n03527444", "holster"], "598": ["n03529860", "home_theater"], "599": ["n03530642", "honeycomb"], "600": ["n03532672", "hook"], "601": ["n03534580", "hoopskirt"], "602": ["n03535780", "horizontal_bar"], "603": ["n03538406", "horse_cart"], "604": ["n03544143", "hourglass"], "605": ["n03584254", "iPod"], "606": ["n03584829", "iron"], "607": ["n03590841", "jack-o'-lantern"], "608": ["n03594734", "jean"], "609": ["n03594945", "jeep"], "610": ["n03595614", "jersey"], "611": ["n03598930", "jigsaw_puzzle"], "612": ["n03599486", "jinrikisha"], "613": ["n03602883", "joystick"], "614": ["n03617480", "kimono"], "615": ["n03623198", "knee_pad"], "616": ["n03627232", "knot"], "617": ["n03630383", "lab_coat"], "618": ["n03633091", "ladle"], "619": ["n03637318", "lampshade"], "620": ["n03642806", "laptop"], "621": ["n03649909", "lawn_mower"], "622": ["n03657121", "lens_cap"], "623": ["n03658185", "letter_opener"], "624": ["n03661043", "library"], "625": ["n03662601", "lifeboat"], "626": ["n03666591", "lighter"], "627": ["n03670208", "limousine"], "628": ["n03673027", "liner"], "629": ["n03676483", "lipstick"], "630": ["n03680355", "Loafer"], "631": ["n03690938", "lotion"], "632": ["n03691459", "loudspeaker"], "633": ["n03692522", "loupe"], "634": ["n03697007", "lumbermill"], "635": ["n03706229", "magnetic_compass"], "636": ["n03709823", "mailbag"], "637": ["n03710193", "mailbox"], "638": ["n03710637", "maillot"], "639": ["n03710721", "maillot"], "640": ["n03717622", "manhole_cover"], "641": ["n03720891", "maraca"], "642": ["n03721384", "marimba"], "643": ["n03724870", "mask"], "644": ["n03729826", "matchstick"], "645": ["n03733131", "maypole"], "646": ["n03733281", "maze"], "647": ["n03733805", "measuring_cup"], "648": ["n03742115", "medicine_chest"], "649": ["n03743016", "megalith"], "650": ["n03759954", "microphone"], "651": ["n03761084", "microwave"], "652": ["n03763968", "military_uniform"], "653": ["n03764736", "milk_can"], "654": ["n03769881", "minibus"], "655": ["n03770439", "miniskirt"], "656": ["n03770679", "minivan"], "657": ["n03773504", "missile"], "658": ["n03775071", "mitten"], "659": ["n03775546", "mixing_bowl"], "660": ["n03776460", "mobile_home"], "661": ["n03777568", "Model_T"], "662": ["n03777754", "modem"], "663": ["n03781244", "monastery"], "664": ["n03782006", "monitor"], "665": ["n03785016", "moped"], "666": ["n03786901", "mortar"], "667": ["n03787032", "mortarboard"], "668": ["n03788195", "mosque"], "669": ["n03788365", "mosquito_net"], "670": ["n03791053", "motor_scooter"], "671": ["n03792782", "mountain_bike"], "672": ["n03792972", "mountain_tent"], "673": ["n03793489", "mouse"], "674": ["n03794056", "mousetrap"], "675": ["n03796401", "moving_van"], "676": ["n03803284", "muzzle"], "677": ["n03804744", "nail"], "678": ["n03814639", "neck_brace"], "679": ["n03814906", "necklace"], "680": ["n03825788", "nipple"], "681": ["n03832673", "notebook"], "682": ["n03837869", "obelisk"], "683": ["n03838899", "oboe"], "684": ["n03840681", "ocarina"], "685": 
["n03841143", "odometer"], "686": ["n03843555", "oil_filter"], "687": ["n03854065", "organ"], "688": ["n03857828", "oscilloscope"], "689": ["n03866082", "overskirt"], "690": ["n03868242", "oxcart"], "691": ["n03868863", "oxygen_mask"], "692": ["n03871628", "packet"], "693": ["n03873416", "paddle"], "694": ["n03874293", "paddlewheel"], "695": ["n03874599", "padlock"], "696": ["n03876231", "paintbrush"], "697": ["n03877472", "pajama"], "698": ["n03877845", "palace"], "699": ["n03884397", "panpipe"], "700": ["n03887697", "paper_towel"], "701": ["n03888257", "parachute"], "702": ["n03888605", "parallel_bars"], "703": ["n03891251", "park_bench"], "704": ["n03891332", "parking_meter"], "705": ["n03895866", "passenger_car"], "706": ["n03899768", "patio"], "707": ["n03902125", "pay-phone"], "708": ["n03903868", "pedestal"], "709": ["n03908618", "pencil_box"], "710": ["n03908714", "pencil_sharpener"], "711": ["n03916031", "perfume"], "712": ["n03920288", "Petri_dish"], "713": ["n03924679", "photocopier"], "714": ["n03929660", "pick"], "715": ["n03929855", "pickelhaube"], "716": ["n03930313", "picket_fence"], "717": ["n03930630", "pickup"], "718": ["n03933933", "pier"], "719": ["n03935335", "piggy_bank"], "720": ["n03937543", "pill_bottle"], "721": ["n03938244", "pillow"], "722": ["n03942813", "ping-pong_ball"], "723": ["n03944341", "pinwheel"], "724": ["n03947888", "pirate"], "725": ["n03950228", "pitcher"], "726": ["n03954731", "plane"], "727": ["n03956157", "planetarium"], "728": ["n03958227", "plastic_bag"], "729": ["n03961711", "plate_rack"], "730": ["n03967562", "plow"], "731": ["n03970156", "plunger"], "732": ["n03976467", "Polaroid_camera"], "733": ["n03976657", "pole"], "734": ["n03977966", "police_van"], "735": ["n03980874", "poncho"], "736": ["n03982430", "pool_table"], "737": ["n03983396", "pop_bottle"], "738": ["n03991062", "pot"], "739": ["n03992509", "potter's_wheel"], "740": ["n03995372", "power_drill"], "741": ["n03998194", "prayer_rug"], "742": ["n04004767", "printer"], "743": ["n04005630", "prison"], "744": ["n04008634", "projectile"], "745": ["n04009552", "projector"], "746": ["n04019541", "puck"], "747": ["n04023962", "punching_bag"], "748": ["n04026417", "purse"], "749": ["n04033901", "quill"], "750": ["n04033995", "quilt"], "751": ["n04037443", "racer"], "752": ["n04039381", "racket"], "753": ["n04040759", "radiator"], "754": ["n04041544", "radio"], "755": ["n04044716", "radio_telescope"], "756": ["n04049303", "rain_barrel"], "757": ["n04065272", "recreational_vehicle"], "758": ["n04067472", "reel"], "759": ["n04069434", "reflex_camera"], "760": ["n04070727", "refrigerator"], "761": ["n04074963", "remote_control"], "762": ["n04081281", "restaurant"], "763": ["n04086273", "revolver"], "764": ["n04090263", "rifle"], "765": ["n04099969", "rocking_chair"], "766": ["n04111531", "rotisserie"], "767": ["n04116512", "rubber_eraser"], "768": ["n04118538", "rugby_ball"], "769": ["n04118776", "rule"], "770": ["n04120489", "running_shoe"], "771": ["n04125021", "safe"], "772": ["n04127249", "safety_pin"], "773": ["n04131690", "saltshaker"], "774": ["n04133789", "sandal"], "775": ["n04136333", "sarong"], "776": ["n04141076", "sax"], "777": ["n04141327", "scabbard"], "778": ["n04141975", "scale"], "779": ["n04146614", "school_bus"], "780": ["n04147183", "schooner"], "781": ["n04149813", "scoreboard"], "782": ["n04152593", "screen"], "783": ["n04153751", "screw"], "784": ["n04154565", "screwdriver"], "785": ["n04162706", "seat_belt"], "786": ["n04179913", "sewing_machine"], "787": 
["n04192698", "shield"], "788": ["n04200800", "shoe_shop"], "789": ["n04201297", "shoji"], "790": ["n04204238", "shopping_basket"], "791": ["n04204347", "shopping_cart"], "792": ["n04208210", "shovel"], "793": ["n04209133", "shower_cap"], "794": ["n04209239", "shower_curtain"], "795": ["n04228054", "ski"], "796": ["n04229816", "ski_mask"], "797": ["n04235860", "sleeping_bag"], "798": ["n04238763", "slide_rule"], "799": ["n04239074", "sliding_door"], "800": ["n04243546", "slot"], "801": ["n04251144", "snorkel"], "802": ["n04252077", "snowmobile"], "803": ["n04252225", "snowplow"], "804": ["n04254120", "soap_dispenser"], "805": ["n04254680", "soccer_ball"], "806": ["n04254777", "sock"], "807": ["n04258138", "solar_dish"], "808": ["n04259630", "sombrero"], "809": ["n04263257", "soup_bowl"], "810": ["n04264628", "space_bar"], "811": ["n04265275", "space_heater"], "812": ["n04266014", "space_shuttle"], "813": ["n04270147", "spatula"], "814": ["n04273569", "speedboat"], "815": ["n04275548", "spider_web"], "816": ["n04277352", "spindle"], "817": ["n04285008", "sports_car"], "818": ["n04286575", "spotlight"], "819": ["n04296562", "stage"], "820": ["n04310018", "steam_locomotive"], "821": ["n04311004", "steel_arch_bridge"], "822": ["n04311174", "steel_drum"], "823": ["n04317175", "stethoscope"], "824": ["n04325704", "stole"], "825": ["n04326547", "stone_wall"], "826": ["n04328186", "stopwatch"], "827": ["n04330267", "stove"], "828": ["n04332243", "strainer"], "829": ["n04335435", "streetcar"], "830": ["n04336792", "stretcher"], "831": ["n04344873", "studio_couch"], "832": ["n04346328", "stupa"], "833": ["n04347754", "submarine"], "834": ["n04350905", "suit"], "835": ["n04355338", "sundial"], "836": ["n04355933", "sunglass"], "837": ["n04356056", "sunglasses"], "838": ["n04357314", "sunscreen"], "839": ["n04366367", "suspension_bridge"], "840": ["n04367480", "swab"], "841": ["n04370456", "sweatshirt"], "842": ["n04371430", "swimming_trunks"], "843": ["n04371774", "swing"], "844": ["n04372370", "switch"], "845": ["n04376876", "syringe"], "846": ["n04380533", "table_lamp"], "847": ["n04389033", "tank"], "848": ["n04392985", "tape_player"], "849": ["n04398044", "teapot"], "850": ["n04399382", "teddy"], "851": ["n04404412", "television"], "852": ["n04409515", "tennis_ball"], "853": ["n04417672", "thatch"], "854": ["n04418357", "theater_curtain"], "855": ["n04423845", "thimble"], "856": ["n04428191", "thresher"], "857": ["n04429376", "throne"], "858": ["n04435653", "tile_roof"], "859": ["n04442312", "toaster"], "860": ["n04443257", "tobacco_shop"], "861": ["n04447861", "toilet_seat"], "862": ["n04456115", "torch"], "863": ["n04458633", "totem_pole"], "864": ["n04461696", "tow_truck"], "865": ["n04462240", "toyshop"], "866": ["n04465501", "tractor"], "867": ["n04467665", "trailer_truck"], "868": ["n04476259", "tray"], "869": ["n04479046", "trench_coat"], "870": ["n04482393", "tricycle"], "871": ["n04483307", "trimaran"], "872": ["n04485082", "tripod"], "873": ["n04486054", "triumphal_arch"], "874": ["n04487081", "trolleybus"], "875": ["n04487394", "trombone"], "876": ["n04493381", "tub"], "877": ["n04501370", "turnstile"], "878": ["n04505470", "typewriter_keyboard"], "879": ["n04507155", "umbrella"], "880": ["n04509417", "unicycle"], "881": ["n04515003", "upright"], "882": ["n04517823", "vacuum"], "883": ["n04522168", "vase"], "884": ["n04523525", "vault"], "885": ["n04525038", "velvet"], "886": ["n04525305", "vending_machine"], "887": ["n04532106", "vestment"], "888": ["n04532670", "viaduct"], "889": 
["n04536866", "violin"], "890": ["n04540053", "volleyball"], "891": ["n04542943", "waffle_iron"], "892": ["n04548280", "wall_clock"], "893": ["n04548362", "wallet"], "894": ["n04550184", "wardrobe"], "895": ["n04552348", "warplane"], "896": ["n04553703", "washbasin"], "897": ["n04554684", "washer"], "898": ["n04557648", "water_bottle"], "899": ["n04560804", "water_jug"], "900": ["n04562935", "water_tower"], "901": ["n04579145", "whiskey_jug"], "902": ["n04579432", "whistle"], "903": ["n04584207", "wig"], "904": ["n04589890", "window_screen"], "905": ["n04590129", "window_shade"], "906": ["n04591157", "Windsor_tie"], "907": ["n04591713", "wine_bottle"], "908": ["n04592741", "wing"], "909": ["n04596742", "wok"], "910": ["n04597913", "wooden_spoon"], "911": ["n04599235", "wool"], "912": ["n04604644", "worm_fence"], "913": ["n04606251", "wreck"], "914": ["n04612504", "yawl"], "915": ["n04613696", "yurt"], "916": ["n06359193", "web_site"], "917": ["n06596364", "comic_book"], "918": ["n06785654", "crossword_puzzle"], "919": ["n06794110", "street_sign"], "920": ["n06874185", "traffic_light"], "921": ["n07248320", "book_jacket"], "922": ["n07565083", "menu"], "923": ["n07579787", "plate"], "924": ["n07583066", "guacamole"], "925": ["n07584110", "consomme"], "926": ["n07590611", "hot_pot"], "927": ["n07613480", "trifle"], "928": ["n07614500", "ice_cream"], "929": ["n07615774", "ice_lolly"], "930": ["n07684084", "French_loaf"], "931": ["n07693725", "bagel"], "932": ["n07695742", "pretzel"], "933": ["n07697313", "cheeseburger"], "934": ["n07697537", "hotdog"], "935": ["n07711569", "mashed_potato"], "936": ["n07714571", "head_cabbage"], "937": ["n07714990", "broccoli"], "938": ["n07715103", "cauliflower"], "939": ["n07716358", "zucchini"], "940": ["n07716906", "spaghetti_squash"], "941": ["n07717410", "acorn_squash"], "942": ["n07717556", "butternut_squash"], "943": ["n07718472", "cucumber"], "944": ["n07718747", "artichoke"], "945": ["n07720875", "bell_pepper"], "946": ["n07730033", "cardoon"], "947": ["n07734744", "mushroom"], "948": ["n07742313", "Granny_Smith"], "949": ["n07745940", "strawberry"], "950": ["n07747607", "orange"], "951": ["n07749582", "lemon"], "952": ["n07753113", "fig"], "953": ["n07753275", "pineapple"], "954": ["n07753592", "banana"], "955": ["n07754684", "jackfruit"], "956": ["n07760859", "custard_apple"], "957": ["n07768694", "pomegranate"], "958": ["n07802026", "hay"], "959": ["n07831146", "carbonara"], "960": ["n07836838", "chocolate_sauce"], "961": ["n07860988", "dough"], "962": ["n07871810", "meat_loaf"], "963": ["n07873807", "pizza"], "964": ["n07875152", "potpie"], "965": ["n07880968", "burrito"], "966": ["n07892512", "red_wine"], "967": ["n07920052", "espresso"], "968": ["n07930864", "cup"], "969": ["n07932039", "eggnog"], "970": ["n09193705", "alp"], "971": ["n09229709", "bubble"], "972": ["n09246464", "cliff"], "973": ["n09256479", "coral_reef"], "974": ["n09288635", "geyser"], "975": ["n09332890", "lakeside"], "976": ["n09399592", "promontory"], "977": ["n09421951", "sandbar"], "978": ["n09428293", "seashore"], "979": ["n09468604", "valley"], "980": ["n09472597", "volcano"], "981": ["n09835506", "ballplayer"], "982": ["n10148035", "groom"], "983": ["n10565667", "scuba_diver"], "984": ["n11879895", "rapeseed"], "985": ["n11939491", "daisy"], "986": ["n12057211", "yellow_lady's_slipper"], "987": ["n12144580", "corn"], "988": ["n12267677", "acorn"], "989": ["n12620546", "hip"], "990": ["n12768682", "buckeye"], "991": ["n12985857", "coral_fungus"], "992": ["n12998815", 
"agaric"], "993": ["n13037406", "gyromitra"], "994": ["n13040303", "stinkhorn"], "995": ["n13044778", "earthstar"], "996": ["n13052670", "hen-of-the-woods"], "997": ["n13054560", "bolete"], "998": ["n13133613", "ear"], "999": ["n15075141", "toilet_tissue"]} \ No newline at end of file diff --git a/ISLP_labs_R4DS/requirements copy.txt b/ISLP_labs_R4DS/requirements copy.txt new file mode 100644 index 0000000..c917f5a --- /dev/null +++ b/ISLP_labs_R4DS/requirements copy.txt @@ -0,0 +1,16 @@ +numpy==1.24.2 +scipy==1.11.1 +pandas==1.5.3 +lxml==4.9.3 +scikit-learn==1.3.0 +joblib==1.3.1 +statsmodels==0.14.0 +lifelines==0.27.7 +pygam==0.9.0 +l0bnb==1.0.0 +torch==2.0.1 +torchvision==0.15.2 +pytorch-lightning==2.0.6 +torchinfo==1.8.0 +torchmetrics==1.0.1 +ISLP==0.3.19 diff --git a/_freeze/02_exercises/execute-results/html.json b/_freeze/02_exercises/execute-results/html.json new file mode 100644 index 0000000..787dc14 --- /dev/null +++ b/_freeze/02_exercises/execute-results/html.json @@ -0,0 +1,16 @@ +{ + "hash": "10f5956f7ff75c2eade85d8de4b1bbc2", + "result": { + "markdown": "# Applied Exercises {-}\n\n\n::: {.cell}\n\n```{.r .cell-code}\nlibrary(reticulate)\nos <- import(\"os\")\nos$listdir(\".\")\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n [1] \"03_exercises.qmd\" \".Rhistory\" \"03_main.qmd\" \n [4] \"02_main.qmd\" \".Rbuildignore\" \"examples.md\" \n [7] \"07_exercises.qmd\" \"10_exercises.qmd\" \"_quarto.yml\" \n[10] \"bookclub-islp.Rproj\" \"example_python.ipynb\" \"09_main.qmd\" \n[13] \"08_main.qmd\" \"03_video.qmd\" \"06_notes.qmd\" \n[16] \".DS_Store\" \"12_notes.qmd\" \"08_video.qmd\" \n[19] \"13_exercises.qmd\" \"04_exercises.qmd\" \"11_video.qmd\" \n[22] \"05_video.qmd\" \"images\" \"13_main.qmd\" \n[25] \"12_main.qmd\" \"01_notes.qmd\" \"02_video.qmd\" \n[28] \"how-to.qmd\" \"ISLP_labs_R4DS\" \"example_quarto.qmd\" \n[31] \"13_notes.qmd\" \"04_main.qmd\" \"07_notes.qmd\" \n[34] \"05_main.qmd\" \"09_video.qmd\" \"06_notes.ipynb\" \n[37] \"docs\" \"index.qmd\" \"04_video.qmd\" \n[40] \"09_exercises.qmd\" \"10_video.qmd\" \"02_notes.qmd\" \n[43] \"README.md\" \"some_array.npy\" \"08_exercises.qmd\" \n[46] \".gitignore\" \"figpath.png\" \"01_video.qmd\" \n[49] \"04_notes.qmd\" \".github\" \"style.css\" \n[52] \"10_notes.qmd\" \"cover.png\" \"13_video.qmd\" \n[55] \"07_video.qmd\" \"sidebars-toggle.html\" \"01_main.qmd\" \n[58] \"09_notes.qmd\" \"03_notes.qmd\" \"07_main.qmd\" \n[61] \"06_main.qmd\" \"05_exercises.qmd\" \".git\" \n[64] \"12_exercises.qmd\" \"02_exercises.rmarkdown\" \"data\" \n[67] \"11_exercises.qmd\" \"06_exercises.qmd\" \"_freeze\" \n[70] \"11_notes.qmd\" \".Rproj.user\" \".quarto\" \n[73] \"ISLP_data\" \"05_notes.qmd\" \"02_exercises.qmd\" \n[76] \"06_video.qmd\" \"10_main.qmd\" \"11_main.qmd\" \n[79] \"12_video.qmd\" \"08_notes.qmd\" \n```\n:::\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\n\nimport numpy as np\nimport matplotlib.pyplot as plt\n#%matplotlib inline\nfrom matplotlib.pyplot import subplots\nimport pandas as pd\nfrom ISLP import load_data\n```\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\npd.set_option('display.max_columns', None)\n#pd.set_option('display.max_rows', None)\n```\n:::\n\n\n\n## 8. This exercise relates to the `College` data set, which can be found in the file `College.csv` on the book website. It contains a number of variables for 777 different universities and colleges in the US. 
The variables are\n\n- `Private` : Public/private indicator\n- `Apps` : Number of applications received\n- `Accept` : Number of applicants accepted\n- `Enroll` : Number of new students enrolled\n- `Top10perc` : New students from top 10 % of high school class\n- `Top25perc` : New students from top 25 % of high school class\n- `F.Undergrad` : Number of full-time undergraduates\n- `P.Undergrad` : Number of part-time undergraduates\n- `Outstate` : Out-of-state tuition\n- `Room.Board` : Room and board costs\n- `Books` : Estimated book costs\n- `Personal` : Estimated personal spending\n- `PhD` : Percent of faculty with Ph.D.s\n- `Terminal` : Percent of faculty with terminal degree\n- `S.F.Ratio` : Student/faculty ratio\n- `perc.alumni` : Percent of alumni who donate\n- `Expend` : Instructional expenditure per student\n- `Grad.Rate` : Graduation rate\n\nBefore reading the data into `Python`, it can be viewed in Excel or a\ntext editor.\n\n### (a) Use the `pd.read_csv()` function to read the data into `Python`. Call the loaded data `college`. Make sure that you have the directory set to the correct location for the data.\n\n\n::: {.cell}\n\n```{.python .cell-code}\ncollege = pd.read_csv('ISLP_data/College.csv')\ncollege\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n Unnamed: 0 Private Apps Accept Enroll Top10perc \\\n0 Abilene Christian University Yes 1660 1232 721 23 \n1 Adelphi University Yes 2186 1924 512 16 \n2 Adrian College Yes 1428 1097 336 22 \n3 Agnes Scott College Yes 417 349 137 60 \n4 Alaska Pacific University Yes 193 146 55 16 \n.. ... ... ... ... ... ... \n772 Worcester State College No 2197 1515 543 4 \n773 Xavier University Yes 1959 1805 695 24 \n774 Xavier University of Louisiana Yes 2097 1915 695 34 \n775 Yale University Yes 10705 2453 1317 95 \n776 York College of Pennsylvania Yes 2989 1855 691 28 \n\n Top25perc F.Undergrad P.Undergrad Outstate Room.Board Books \\\n0 52 2885 537 7440 3300 450 \n1 29 2683 1227 12280 6450 750 \n2 50 1036 99 11250 3750 400 \n3 89 510 63 12960 5450 450 \n4 44 249 869 7560 4120 800 \n.. ... ... ... ... ... ... \n772 26 3089 2029 6797 3900 500 \n773 47 2849 1107 11520 4960 600 \n774 61 2793 166 6900 4200 617 \n775 99 5217 83 19840 6510 630 \n776 63 2988 1726 4990 3560 500 \n\n Personal PhD Terminal S.F.Ratio perc.alumni Expend Grad.Rate \n0 2200 70 78 18.1 12 7041 60 \n1 1500 29 30 12.2 16 10527 56 \n2 1165 53 66 12.9 30 8735 54 \n3 875 92 97 7.7 37 19016 59 \n4 1500 76 72 11.9 2 10922 15 \n.. ... ... ... ... ... ... ... \n772 1200 60 60 21.0 14 4469 40 \n773 1250 73 75 13.3 31 9189 83 \n774 781 67 75 14.4 20 8323 49 \n775 2115 96 96 5.8 49 40386 99 \n776 1250 75 75 18.1 28 4509 99 \n\n[777 rows x 19 columns]\n```\n:::\n:::\n\n\n\n### (b) Look at the data used in the notebook by creating and running a new cell with just the code `college` in it. You should notice that the first column is just the name of each university in a column named something like `Unnamed: 0`. We don’t really want `pandas` to treat this as data. However, it may be handy to have these names for later. Try the following commands and similarly look at the resulting data frames:\n\n\n\n\n::: {.cell}\n\n```{.python .cell-code}\n\ncollege2 = pd.read_csv('ISLP_data/College.csv', index_col=0)\n#college2\n\ncollege3 = college.rename({'Unnamed: 0': 'college'},\n axis=1)\n#college3\n\ncollege3 = college3.set_index('college')\n#college3\n```\n:::\n\n\n\nThis has used the first column in the file as an `index` for the data frame. 
This means that `pandas` has given each row a name corresponding to the appropriate university. Now you should see that the first data column is `Private`. Note that the names of the colleges appear on the left of the table. We also introduced a new python object above: a *dictionary*, which is specified by `(key, value)` pairs. Keep your modified version of the data with the following:\n\n\n::: {.cell}\n\n```{.python .cell-code}\ncollege = college3\ncollege\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n Private Apps Accept Enroll Top10perc \\\ncollege \nAbilene Christian University Yes 1660 1232 721 23 \nAdelphi University Yes 2186 1924 512 16 \nAdrian College Yes 1428 1097 336 22 \nAgnes Scott College Yes 417 349 137 60 \nAlaska Pacific University Yes 193 146 55 16 \n... ... ... ... ... ... \nWorcester State College No 2197 1515 543 4 \nXavier University Yes 1959 1805 695 24 \nXavier University of Louisiana Yes 2097 1915 695 34 \nYale University Yes 10705 2453 1317 95 \nYork College of Pennsylvania Yes 2989 1855 691 28 \n\n Top25perc F.Undergrad P.Undergrad Outstate \\\ncollege \nAbilene Christian University 52 2885 537 7440 \nAdelphi University 29 2683 1227 12280 \nAdrian College 50 1036 99 11250 \nAgnes Scott College 89 510 63 12960 \nAlaska Pacific University 44 249 869 7560 \n... ... ... ... ... \nWorcester State College 26 3089 2029 6797 \nXavier University 47 2849 1107 11520 \nXavier University of Louisiana 61 2793 166 6900 \nYale University 99 5217 83 19840 \nYork College of Pennsylvania 63 2988 1726 4990 \n\n Room.Board Books Personal PhD Terminal \\\ncollege \nAbilene Christian University 3300 450 2200 70 78 \nAdelphi University 6450 750 1500 29 30 \nAdrian College 3750 400 1165 53 66 \nAgnes Scott College 5450 450 875 92 97 \nAlaska Pacific University 4120 800 1500 76 72 \n... ... ... ... ... ... \nWorcester State College 3900 500 1200 60 60 \nXavier University 4960 600 1250 73 75 \nXavier University of Louisiana 4200 617 781 67 75 \nYale University 6510 630 2115 96 96 \nYork College of Pennsylvania 3560 500 1250 75 75 \n\n S.F.Ratio perc.alumni Expend Grad.Rate \ncollege \nAbilene Christian University 18.1 12 7041 60 \nAdelphi University 12.2 16 10527 56 \nAdrian College 12.9 30 8735 54 \nAgnes Scott College 7.7 37 19016 59 \nAlaska Pacific University 11.9 2 10922 15 \n... ... ... ... ... 
\nWorcester State College 21.0 14 4469 40 \nXavier University 13.3 31 9189 83 \nXavier University of Louisiana 14.4 20 8323 49 \nYale University 5.8 49 40386 99 \nYork College of Pennsylvania 18.1 28 4509 99 \n\n[777 rows x 18 columns]\n```\n:::\n:::\n\n\n### (c) Use the `describe()` method to produce a numerical summary of the variables in the data set.\n\n\n::: {.cell}\n\n```{.python .cell-code}\ncollege.describe()\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n Apps Accept Enroll Top10perc Top25perc \\\ncount 777.000000 777.000000 777.000000 777.000000 777.000000 \nmean 3001.638353 2018.804376 779.972973 27.558559 55.796654 \nstd 3870.201484 2451.113971 929.176190 17.640364 19.804778 \nmin 81.000000 72.000000 35.000000 1.000000 9.000000 \n25% 776.000000 604.000000 242.000000 15.000000 41.000000 \n50% 1558.000000 1110.000000 434.000000 23.000000 54.000000 \n75% 3624.000000 2424.000000 902.000000 35.000000 69.000000 \nmax 48094.000000 26330.000000 6392.000000 96.000000 100.000000 \n\n F.Undergrad P.Undergrad Outstate Room.Board Books \\\ncount 777.000000 777.000000 777.000000 777.000000 777.000000 \nmean 3699.907336 855.298584 10440.669241 4357.526384 549.380952 \nstd 4850.420531 1522.431887 4023.016484 1096.696416 165.105360 \nmin 139.000000 1.000000 2340.000000 1780.000000 96.000000 \n25% 992.000000 95.000000 7320.000000 3597.000000 470.000000 \n50% 1707.000000 353.000000 9990.000000 4200.000000 500.000000 \n75% 4005.000000 967.000000 12925.000000 5050.000000 600.000000 \nmax 31643.000000 21836.000000 21700.000000 8124.000000 2340.000000 \n\n Personal PhD Terminal S.F.Ratio perc.alumni \\\ncount 777.000000 777.000000 777.000000 777.000000 777.000000 \nmean 1340.642214 72.660232 79.702703 14.089704 22.743887 \nstd 677.071454 16.328155 14.722359 3.958349 12.391801 \nmin 250.000000 8.000000 24.000000 2.500000 0.000000 \n25% 850.000000 62.000000 71.000000 11.500000 13.000000 \n50% 1200.000000 75.000000 82.000000 13.600000 21.000000 \n75% 1700.000000 85.000000 92.000000 16.500000 31.000000 \nmax 6800.000000 103.000000 100.000000 39.800000 64.000000 \n\n Expend Grad.Rate \ncount 777.000000 777.00000 \nmean 9660.171171 65.46332 \nstd 5221.768440 17.17771 \nmin 3186.000000 10.00000 \n25% 6751.000000 53.00000 \n50% 8377.000000 65.00000 \n75% 10830.000000 78.00000 \nmax 56233.000000 118.00000 \n```\n:::\n:::\n\n\n\n\n### (d) Use the `pd.plotting.scatter_matrix()` function to produce a scatterplot matrix of the first columns `[Top10perc, Apps, Enroll]`. 
Recall that you can reference a list C of columns of a data frame `A` using `A[C]`.\n\n\n\n::: {.cell}\n\n```{.python .cell-code}\n#fig, ax = subplots(figsize=(8, 8))\npd.plotting.scatter_matrix(college[['Top10perc','Apps','Enroll']])\n```\n\n::: {.cell-output .cell-output-stdout}\n```\narray([[<AxesSubplot: xlabel='Top10perc', ylabel='Top10perc'>,\n <AxesSubplot: xlabel='Apps', ylabel='Top10perc'>,\n <AxesSubplot: xlabel='Enroll', ylabel='Top10perc'>],\n [<AxesSubplot: xlabel='Top10perc', ylabel='Apps'>,\n <AxesSubplot: xlabel='Apps', ylabel='Apps'>,\n <AxesSubplot: xlabel='Enroll', ylabel='Apps'>],\n [<AxesSubplot: xlabel='Top10perc', ylabel='Enroll'>,\n <AxesSubplot: xlabel='Apps', ylabel='Enroll'>,\n <AxesSubplot: xlabel='Enroll', ylabel='Enroll'>]], dtype=object)\n```\n:::\n\n```{.python .cell-code}\n#plt.show()\n```\n\n::: {.cell-output-display}\n![](02_exercises_files/figure-html/unnamed-chunk-8-1.png){width=672}\n:::\n:::\n\n\n\n\n### (e) Use the boxplot() method of college to produce side-by-side boxplots of Outstate versus Private.\n\n\n::: {.cell}\n\n:::\n\n\n\n### (f) Create a new qualitative variable, called Elite, by binning the Top10perc variable into two groups based on whether or not the proportion of students coming from the top 10% of their high school classes exceeds 50%.\n\n\n::: {.cell}\n\n```{.python .cell-code}\ncollege['Elite'] = pd.cut(college['Top10perc'],\n [0,0.5,1],\n labels=['No', 'Yes'])\n```\n:::\n\n\n\n\nUse the `value_counts()` method of `college['Elite']` to see how many elite universities there are. Finally, use the `boxplot()` method again to produce side-by-side boxplots of `Outstate` versus `Elite`.\n\n\n\n::: {.cell}\n\n```{.python .cell-code}\ncollege['Elite'].value_counts()\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nYes 3\nNo 0\nName: Elite, dtype: int64\n```\n:::\n:::\n\n::: {.cell}\n\n:::\n\n\n\n### (g) Use the `plot.hist()` method of `college` to produce some histograms with differing numbers of bins for a few of the quantitative variables. The command `plt.subplots(2, 2)` may be useful: it will divide the plot window into four regions so that four plots can be made simultaneously. By changing the arguments you can divide the screen up in other combinations.\n\n\n::: {.cell}\n\n:::\n\n\n\n### (h) Continue exploring the data, and provide a brief summary of what you discover.\n\n\n::: {.cell}\n\n:::\n\n\n
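\nA few hedged sketches for parts (e)-(g) above (not executed here). Note that the bin edges `[0, 0.5, 1]` used in part (f) treat `Top10perc` as a proportion, but the variable is recorded as a percentage (roughly 1-96), which is why the counts above look off; edges of `[0, 50, 100]` match the intent of the exercise.\n\n```python\n# (e) side-by-side boxplots of Outstate by Private\ncollege.boxplot(column='Outstate', by='Private')\n\n# (f) Top10perc is in percent, so cut at 50, not 0.5\ncollege['Elite'] = pd.cut(college['Top10perc'],\n                          [0, 50, 100],\n                          labels=['No', 'Yes'])\ncollege['Elite'].value_counts()\ncollege.boxplot(column='Outstate', by='Elite')\n\n# (g) histograms with differing numbers of bins\nfig, axes = plt.subplots(2, 2)\ncollege['Apps'].plot.hist(bins=10, ax=axes[0, 0])\ncollege['Outstate'].plot.hist(bins=25, ax=axes[0, 1])\ncollege['PhD'].plot.hist(bins=15, ax=axes[1, 0])\ncollege['Grad.Rate'].plot.hist(bins=50, ax=axes[1, 1])\nplt.show()\n```\n\n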
\n## 9. This exercise involves the `Auto` data set studied in the lab. Make sure that the missing values have been removed from the data.\n\n\n::: {.cell}\n\n```{.python .cell-code}\nAuto = pd.read_csv('ISLP_data/Auto.csv',\n na_values=['?'])\nAuto\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n mpg cylinders displacement horsepower weight acceleration year \\\n0 18.0 8 307.0 130.0 3504 12.0 70 \n1 15.0 8 350.0 165.0 3693 11.5 70 \n2 18.0 8 318.0 150.0 3436 11.0 70 \n3 16.0 8 304.0 150.0 3433 12.0 70 \n4 17.0 8 302.0 140.0 3449 10.5 70 \n.. ... ... ... ... ... ... ... \n392 27.0 4 140.0 86.0 2790 15.6 82 \n393 44.0 4 97.0 52.0 2130 24.6 82 \n394 32.0 4 135.0 84.0 2295 11.6 82 \n395 28.0 4 120.0 79.0 2625 18.6 82 \n396 31.0 4 119.0 82.0 2720 19.4 82 \n\n origin name \n0 1 chevrolet chevelle malibu \n1 1 buick skylark 320 \n2 1 plymouth satellite \n3 1 amc rebel sst \n4 1 ford torino \n.. ... ... \n392 1 ford mustang gl \n393 2 vw pickup \n394 1 dodge rampage \n395 1 ford ranger \n396 1 chevy s-10 \n\n[397 rows x 9 columns]\n```\n:::\n:::\n\n\n\n### (a) Which of the predictors are quantitative, and which are qualitative?\n\nMpg, Displacement, Horsepower, Weight and Acceleration are quantitative. Cylinders, Year, Origin, and Name are qualitative.\n\n\n::: {.cell}\n\n```{.python .cell-code}\nAuto.info()\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 397 entries, 0 to 396\nData columns (total 9 columns):\n # Column Non-Null Count Dtype \n--- ------ -------------- ----- \n 0 mpg 397 non-null float64\n 1 cylinders 397 non-null int64 \n 2 displacement 397 non-null float64\n 3 horsepower 392 non-null float64\n 4 weight 397 non-null int64 \n 5 acceleration 397 non-null float64\n 6 year 397 non-null int64 \n 7 origin 397 non-null int64 \n 8 name 397 non-null object \ndtypes: float64(4), int64(4), object(1)\nmemory usage: 28.0+ KB\n```\n:::\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\nAuto['cylinders'] = Auto['cylinders'].astype('object') \nAuto['cylinders']\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n0 8\n1 8\n2 8\n3 8\n4 8\n ..\n392 4\n393 4\n394 4\n395 4\n396 4\nName: cylinders, Length: 397, dtype: object\n```\n:::\n:::\n\n\n\n### (b) What is the range of each quantitative predictor? You can answer this using the `min()` and `max()` methods in `numpy`. \n\n\n::: {.cell}\n\n```{.python .cell-code}\nmpg_min = Auto['mpg'].min( )\nmpg_max = Auto['mpg'].max( )\n\nprint('The min and max miles per gallon are', (mpg_min, mpg_max))\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nThe min and max miles per gallon are (9.0, 46.6)\n```\n:::\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\ndsp_min = Auto['displacement'].min( )\ndsp_max = Auto['displacement'].max( )\n\nprint('The min and max displacement are', (dsp_min, dsp_max))\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nThe min and max displacement are (68.0, 455.0)\n```\n:::\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\nhpwr_min = Auto['horsepower'].min( )\nhpwr_max = Auto['horsepower'].max( )\n\nprint('The min and max horsepower are', (hpwr_min, hpwr_max))\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nThe min and max horsepower are (46.0, 230.0)\n```\n:::\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\nwt_min = Auto['weight'].min( )\nwt_max = Auto['weight'].max( )\n\nprint('The min and max weights are', (wt_min, wt_max))\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nThe min and max weights are (1613, 5140)\n```\n:::\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\nacc_min = Auto['acceleration'].min( )\nacc_max = Auto['acceleration'].max( )\n\nprint('The min and max accelerations are', (acc_min, acc_max))\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nThe min and max accelerations are (8.0, 24.8)\n```\n:::\n:::\n\n\n### (c) What is the mean and standard deviation of each quantitative predictor?\n\n\n\n::: {.cell}\n\n```{.python .cell-code}\nmpg_mean = Auto['mpg'].mean( )\nmpg_sd = Auto['mpg'].std( )\n\nprint('The mean and standard deviation of miles per gallon are', mpg_mean,'and', mpg_sd)\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nThe mean and standard deviation of miles per gallon are 23.515869017632248 and 7.825803928946562\n```\n:::\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\ndsp_mean = Auto['displacement'].mean( )\ndsp_sd = Auto['displacement'].std( )\n\nprint('The mean and standard deviation of displacement are', dsp_mean,'and', dsp_sd)\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nThe mean and standard deviation of displacement are 193.53274559193954 and 104.37958329992945\n```\n:::\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\nhpwr_mean = Auto['horsepower'].mean( )\nhpwr_sd = Auto['horsepower'].std( )\n\nprint('The mean and standard deviation of horsepower are',
hpwr_mean,'and', hpwr_sd)\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nThe mean and standard deviation of horsepower are 104.46938775510205 and 38.49115993282855\n```\n:::\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\nwt_mean = Auto['weight'].mean( )\nwt_sd = Auto['weight'].std( )\n\nprint('The mean and standard deviation of weight are', wt_mean,'and', wt_sd)\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nThe mean and standard deviation of weight are 2970.2619647355164 and 847.9041194897246\n```\n:::\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\nacc_mean = Auto['acceleration'].mean( )\nacc_sd = Auto['acceleration'].std( )\n\nprint('The mean and standard deviation of acceleration are', acc_mean,'and', acc_sd)\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nThe mean and standard deviation of acceleration are 15.555667506297214 and 2.7499952929761515\n```\n:::\n:::\n\n\n\n### (d) Now remove the 10th through 85th observations. What is the range, mean, and standard deviation of each predictor in the subset of the data that remains?\n\n\n::: {.cell}\n\n```{.python .cell-code}\nAuto_new = Auto.drop(Auto.index[10:85])\nAuto_new\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n mpg cylinders displacement horsepower weight acceleration year \\\n0 18.0 8 307.0 130.0 3504 12.0 70 \n1 15.0 8 350.0 165.0 3693 11.5 70 \n2 18.0 8 318.0 150.0 3436 11.0 70 \n3 16.0 8 304.0 150.0 3433 12.0 70 \n4 17.0 8 302.0 140.0 3449 10.5 70 \n.. ... ... ... ... ... ... ... \n392 27.0 4 140.0 86.0 2790 15.6 82 \n393 44.0 4 97.0 52.0 2130 24.6 82 \n394 32.0 4 135.0 84.0 2295 11.6 82 \n395 28.0 4 120.0 79.0 2625 18.6 82 \n396 31.0 4 119.0 82.0 2720 19.4 82 \n\n origin name \n0 1 chevrolet chevelle malibu \n1 1 buick skylark 320 \n2 1 plymouth satellite \n3 1 amc rebel sst \n4 1 ford torino \n.. ... ... 
\n392 1 ford mustang gl \n393 2 vw pickup \n394 1 dodge rampage \n395 1 ford ranger \n396 1 chevy s-10 \n\n[322 rows x 9 columns]\n```\n:::\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\nmpg_min = Auto_new['mpg'].min( )\nmpg_max = Auto_new['mpg'].max( )\n\nprint('The min and max miles per gallon of the subsetted data are', (mpg_min, mpg_max))\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nThe min and max miles per gallon of the subsetted data are (11.0, 46.6)\n```\n:::\n\n```{.python .cell-code}\nmpg_mean = Auto_new['mpg'].mean( )\nmpg_sd = Auto_new['mpg'].std( )\n\nprint('The mean and standard deviation of miles per gallon of the subsetted data are', mpg_mean,'and', mpg_sd)\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nThe mean and standard deviation of miles per gallon of the subsetted data are 24.40931677018633 and 7.913357147165568\n```\n:::\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\ndsp_min = Auto_new['displacement'].min( )\ndsp_max = Auto_new['displacement'].max( )\n\nprint('The min and max displacement of the subsetted data are', (dsp_min, dsp_max))\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nThe min and max displacement of the subsetted data are (68.0, 455.0)\n```\n:::\n\n```{.python .cell-code}\ndsp_mean = Auto_new['displacement'].mean( )\ndsp_sd = Auto_new['displacement'].std( )\n\nprint('The mean and standard deviation of displacement of the subsetted data are', dsp_mean,'and', dsp_sd)\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nThe mean and standard deviation of displacement of the subsetted data are 187.6801242236025 and 100.12092459330134\n```\n:::\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\nhpwr_min = Auto_new['horsepower'].min( )\nhpwr_max = Auto_new['horsepower'].max( )\n\nprint('The min and max horsepower of the subsetted data are', (hpwr_min, hpwr_max))\n```\n\n```{.python .cell-code}\nhpwr_mean = Auto_new['horsepower'].mean( )\nhpwr_sd = Auto_new['horsepower'].std( )\n\nprint('The mean and standard deviation of horsepower of the subsetted data are', hpwr_mean,'and', hpwr_sd)\n```\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\nwt_min = Auto_new['weight'].min( )\nwt_max = Auto_new['weight'].max( )\n\nprint('The min and max weights of the subsetted data are', (wt_min, wt_max))\n```\n\n```{.python .cell-code}\nwt_mean = Auto_new['weight'].mean( )\nwt_sd = Auto_new['weight'].std( )\n\nprint('The mean and standard deviation of weight of the subsetted data are', wt_mean,'and', wt_sd)\n```\n:::\n\n::: {.cell}\n\n```{.python .cell-code}\nacc_min = Auto_new['acceleration'].min( )\nacc_max = Auto_new['acceleration'].max( )\n\nprint('The min and max accelerations of the subsetted data are', (acc_min, acc_max))\n```\n\n```{.python .cell-code}\nacc_mean = Auto_new['acceleration'].mean( )\nacc_sd = Auto_new['acceleration'].std( )\n\nprint('The mean and standard deviation of acceleration of the subsetted data are', acc_mean,'and', acc_sd)\n```\n:::\n\n\n
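\nAs a compact cross-check for parts (b)-(d), the cell-per-column approach above can be collapsed into a single `agg` call (a sketch using the `Auto` and `Auto_new` frames defined above; not executed here):\n\n```python\nquant = ['mpg', 'displacement', 'horsepower', 'weight', 'acceleration']\n\n# range, mean, and standard deviation of every quantitative column at once\nAuto[quant].agg(['min', 'max', 'mean', 'std'])\nAuto_new[quant].agg(['min', 'max', 'mean', 'std'])\n```\n\n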
### (e) Using the full data set, investigate the predictors graphically, using scatterplots or other tools of your choice. Create some plots highlighting the relationships among the predictors. Comment on your findings.\n\n\n\n\n### (f) Suppose that we wish to predict gas mileage (`mpg`) on the basis of the other variables. Do your plots suggest that any of the other variables might be useful in predicting `mpg`? Justify your answer.\n\n\n\n## 10. This exercise involves the `Boston` housing data set.\n\n### (a) To begin, load in the `Boston` data set, which is part of the `ISLP` library.\n\n\n\n\n::: {.cell}\n\n```{.python .cell-code}\nBoston = load_data(\"Boston\")\nBoston\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n crim zn indus chas nox rm age dis rad tax \\\n0 0.00632 18.0 2.31 0 0.538 6.575 65.2 4.0900 1 296 \n1 0.02731 0.0 7.07 0 0.469 6.421 78.9 4.9671 2 242 \n2 0.02729 0.0 7.07 0 0.469 7.185 61.1 4.9671 2 242 \n3 0.03237 0.0 2.18 0 0.458 6.998 45.8 6.0622 3 222 \n4 0.06905 0.0 2.18 0 0.458 7.147 54.2 6.0622 3 222 \n.. ... ... ... ... ... ... ... ... ... ... \n501 0.06263 0.0 11.93 0 0.573 6.593 69.1 2.4786 1 273 \n502 0.04527 0.0 11.93 0 0.573 6.120 76.7 2.2875 1 273 \n503 0.06076 0.0 11.93 0 0.573 6.976 91.0 2.1675 1 273 \n504 0.10959 0.0 11.93 0 0.573 6.794 89.3 2.3889 1 273 \n505 0.04741 0.0 11.93 0 0.573 6.030 80.8 2.5050 1 273 \n\n ptratio lstat medv \n0 15.3 4.98 24.0 \n1 17.8 9.14 21.6 \n2 17.8 4.03 34.7 \n3 18.7 2.94 33.4 \n4 18.7 5.33 36.2 \n.. ... ... ... \n501 21.0 9.67 22.4 \n502 21.0 9.08 20.6 \n503 21.0 5.64 23.9 \n504 21.0 6.48 22.0 \n505 21.0 7.88 11.9 \n\n[506 rows x 13 columns]\n```\n:::\n:::\n\n\n\n### (b) How many rows are in this data set? How many columns? What do the rows and columns represent?\n\n\n\n### (c) Make some pairwise scatterplots of the predictors (columns) in this data set. Describe your findings.\n\n\n\n### (d) Are any of the predictors associated with per capita crime rate? If so, explain the relationship.\n\n\n### (e) Do any of the suburbs of Boston appear to have particularly high crime rates? Tax rates? Pupil-teacher ratios? Comment on the range of each predictor.\n\n\n\n### (f) How many of the suburbs in this data set bound the Charles river?\n\n\n### (g) What is the median pupil-teacher ratio among the towns in this data set?\n\n\n\n\n### (h) Which suburb of Boston has the lowest median value of owner-occupied homes? What are the values of the other predictors for that suburb, and how do those values compare to the overall ranges for those predictors? Comment on your findings.\n\n\n\n\n### (i) In this data set, how many of the suburbs average more than seven rooms per dwelling? More than eight rooms per dwelling? Comment on the suburbs that average more than eight rooms per dwelling.
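\nThe remaining open-ended parts of exercises 9 and 10 can be started with sketches like these (assuming the `Auto` and `Boston` frames loaded above; not executed here):\n\n```python\n# 9 (e)/(f): pairwise views; weight, horsepower, and displacement all\n# move inversely with mpg, so they look useful for predicting it\npd.plotting.scatter_matrix(Auto[['mpg', 'weight', 'horsepower', 'displacement']])\n\n# 10 (b): rows are the 506 suburbs, columns the 13 recorded variables\nBoston.shape\n\n# 10 (f): suburbs bounding the Charles river (chas == 1)\n(Boston['chas'] == 1).sum()\n\n# 10 (g): median pupil-teacher ratio\nBoston['ptratio'].median()\n\n# 10 (h): suburb with the lowest median home value, with all its predictors\nBoston.loc[Boston['medv'].idxmin()]\n\n# 10 (i): suburbs averaging more than seven / more than eight rooms\n(Boston['rm'] > 7).sum()\n(Boston['rm'] > 8).sum()\n```\n\n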
\n\n\n\n\n\n", + "supporting": [ + "02_exercises_files" + ], + "filters": [ + "rmarkdown/pagebreak.lua" + ], + "includes": {}, + "engineDependencies": {}, + "preserve": {}, + "postProcess": true + } +} \ No newline at end of file diff --git a/_freeze/02_exercises/figure-html/unnamed-chunk-8-1.png b/_freeze/02_exercises/figure-html/unnamed-chunk-8-1.png new file mode 100644 index 0000000..bc0fb4e Binary files /dev/null and b/_freeze/02_exercises/figure-html/unnamed-chunk-8-1.png differ diff --git a/_freeze/03_notes/execute-results/html.json b/_freeze/03_notes/execute-results/html.json new file mode 100644 index 0000000..de65636 --- /dev/null +++ b/_freeze/03_notes/execute-results/html.json @@ -0,0 +1,11 @@ +{ + "hash": "1157b4b6a66df780fe4afff52e109232", + "result": { + "markdown": "# Data Structures and Sequences\n\n## Tuples\n\n![](https://pynative.com/wp-content/uploads/2021/02/python-tuple.jpg)\n\nA tuple is a fixed-length, immutable sequence of Python objects which, once assigned, cannot be changed. The easiest way to create one is with a comma-separated sequence of values wrapped in parentheses:\n\n::: {.cell execution_count=1}\n``` {.python .cell-code}\ntup = (4, 5, 6)\ntup\n```\n\n::: {.cell-output .cell-output-display execution_count=1}\n```\n(4, 5, 6)\n```\n:::\n:::\n\n\nIn many contexts, the parentheses can be omitted\n\n::: {.cell execution_count=2}\n``` {.python .cell-code}\ntup = 4, 5, 6\ntup\n```\n\n::: {.cell-output .cell-output-display execution_count=2}\n```\n(4, 5, 6)\n```\n:::\n:::\n\n\nYou can convert any sequence or iterator to a tuple by invoking\n\n::: {.cell execution_count=3}\n``` {.python .cell-code}\ntuple([4,0,2])\n\ntup = tuple('string')\n\ntup\n```\n\n::: {.cell-output .cell-output-display execution_count=3}\n```\n('s', 't', 'r', 'i', 'n', 'g')\n```\n:::\n:::\n\n\nElements can be accessed with square brackets [] \n\nNote the zero indexing\n\n::: {.cell execution_count=4}\n``` {.python .cell-code}\ntup[0]\n```\n\n::: {.cell-output .cell-output-display execution_count=4}\n```\n's'\n```\n:::\n:::\n\n\nTuples of tuples\n\n::: {.cell execution_count=5}\n``` {.python .cell-code}\nnested_tup = (4,5,6),(7,8)\n\nnested_tup\n```\n\n::: {.cell-output .cell-output-display execution_count=5}\n```\n((4, 5, 6), (7, 8))\n```\n:::\n:::\n\n\n::: {.cell execution_count=6}\n``` {.python .cell-code}\nnested_tup[0]\n```\n\n::: {.cell-output .cell-output-display execution_count=6}\n```\n(4, 5, 6)\n```\n:::\n:::\n\n\n::: {.cell execution_count=7}\n``` {.python .cell-code}\nnested_tup[1]\n```\n\n::: {.cell-output .cell-output-display execution_count=7}\n```\n(7, 8)\n```\n:::\n:::\n\n\nWhile the objects stored in a tuple may be mutable themselves, once the tuple is created it’s not possible to modify which object is stored in each slot:\n\n::: {.cell execution_count=8}\n``` {.python .cell-code}\ntup = tuple(['foo', [1, 2], True])\n\ntup[2]\n```\n\n::: {.cell-output .cell-output-display execution_count=8}\n```\nTrue\n```\n:::\n:::\n\n\n```{{python}}\n\ntup[2] = False\n\n```\n\n````\nTypeError Traceback (most recent call last)\nInput In [9], in <cell line: 1>()\n----> 1 tup[2] = False\n\nTypeError: 'tuple' object does not support item assignment\n````\n\nIf an object inside a tuple is mutable, such as a list, you can modify it in place\n\n::: {.cell execution_count=9}\n``` {.python .cell-code}\ntup[1].append(3)\n\ntup\n```\n\n::: {.cell-output .cell-output-display 
execution_count=9}\n```\n('foo', [1, 2, 3], True)\n```\n:::\n:::\n\n\nYou can concatenate tuples using the + operator to produce longer tuples:\n\n::: {.cell execution_count=10}\n``` {.python .cell-code}\n(4, None, 'foo') + (6, 0) + ('bar',)\n```\n\n::: {.cell-output .cell-output-display execution_count=10}\n```\n(4, None, 'foo', 6, 0, 'bar')\n```\n:::\n:::\n\n\n### Unpacking tuples\n\nIf you try to assign to a tuple-like expression of variables, Python will attempt to unpack the value on the righthand side of the equals sign:\n\n::: {.cell execution_count=11}\n``` {.python .cell-code}\ntup = (4, 5, 6)\ntup\n```\n\n::: {.cell-output .cell-output-display execution_count=11}\n```\n(4, 5, 6)\n```\n:::\n:::\n\n\n::: {.cell execution_count=12}\n``` {.python .cell-code}\na, b, c = tup\n\nc\n```\n\n::: {.cell-output .cell-output-display execution_count=12}\n```\n6\n```\n:::\n:::\n\n\nEven sequences with nested tuples can be unpacked:\n\n::: {.cell execution_count=13}\n``` {.python .cell-code}\ntup = 4, 5, (6,7)\n\na, b, (c, d) = tup\n\nd\n```\n\n::: {.cell-output .cell-output-display execution_count=13}\n```\n7\n```\n:::\n:::\n\n\nTo easily swap variable names\n\n::: {.cell execution_count=14}\n``` {.python .cell-code}\na, b = 1, 4\n\na\n```\n\n::: {.cell-output .cell-output-display execution_count=14}\n```\n1\n```\n:::\n:::\n\n\n::: {.cell execution_count=15}\n``` {.python .cell-code}\nb\n```\n\n::: {.cell-output .cell-output-display execution_count=15}\n```\n4\n```\n:::\n:::\n\n\n::: {.cell execution_count=16}\n``` {.python .cell-code}\nb, a = a, b\n\na\n```\n\n::: {.cell-output .cell-output-display execution_count=16}\n```\n4\n```\n:::\n:::\n\n\n::: {.cell execution_count=17}\n``` {.python .cell-code}\nb\n```\n\n::: {.cell-output .cell-output-display execution_count=17}\n```\n1\n```\n:::\n:::\n\n\nA common use of variable unpacking is iterating over sequences of tuples or lists\n\n::: {.cell execution_count=18}\n``` {.python .cell-code}\nseq = [(1, 2, 3), (4, 5, 6), (7, 8, 9)]\n\nseq\n```\n\n::: {.cell-output .cell-output-display execution_count=18}\n```\n[(1, 2, 3), (4, 5, 6), (7, 8, 9)]\n```\n:::\n:::\n\n\n::: {.cell execution_count=19}\n``` {.python .cell-code}\nfor a, b, c in seq:\n print(f'a={a}, b={b}, c={c}')\n```\n\n::: {.cell-output .cell-output-stdout}\n```\na=1, b=2, c=3\na=4, b=5, c=6\na=7, b=8, c=9\n```\n:::\n:::\n\n\n`*rest` syntax for plucking elements\n\n::: {.cell execution_count=20}\n``` {.python .cell-code}\nvalues = 1,2,3,4,5\n\na, b, *rest = values\n\nrest\n```\n\n::: {.cell-output .cell-output-display execution_count=20}\n```\n[3, 4, 5]\n```\n:::\n:::\n\n\n As a matter of convention, many Python programmers will use the underscore (_) for unwanted variables:\n\n::: {.cell execution_count=21}\n``` {.python .cell-code}\na, b, *_ = values\n```\n:::\n\n\n### Tuple methods\n\nSince the size and contents of a tuple cannot be modified, it is very light on instance methods. A particularly useful one (also available on lists) is `count`\n\n::: {.cell execution_count=22}\n``` {.python .cell-code}\na = (1,2,2,2,2,3,4,5,7,8,9)\n\na.count(2)\n```\n\n::: {.cell-output .cell-output-display execution_count=22}\n```\n4\n```\n:::\n:::\n\n\n## List\n\n![](https://pynative.com/wp-content/uploads/2021/03/python-list.jpg)\n\nIn contrast with tuples, lists are variable length and their contents can be modified in place.\n\nLists are mutable. 
\n\nLists use `[]` square brackets or the `list` function\n\n::: {.cell execution_count=23}\n``` {.python .cell-code}\na_list = [2, 3, 7, None]\n\ntup = (\"foo\", \"bar\", \"baz\")\n\nb_list = list(tup)\n\nb_list\n```\n\n::: {.cell-output .cell-output-display execution_count=23}\n```\n['foo', 'bar', 'baz']\n```\n:::\n:::\n\n\n::: {.cell execution_count=24}\n``` {.python .cell-code}\nb_list[1] = \"peekaboo\"\n\nb_list\n```\n\n::: {.cell-output .cell-output-display execution_count=24}\n```\n['foo', 'peekaboo', 'baz']\n```\n:::\n:::\n\n\nLists and tuples are semantically similar (though tuples cannot be modified) and can be used interchangeably in many functions.\n\n::: {.cell execution_count=25}\n``` {.python .cell-code}\ngen = range(10)\n\ngen\n```\n\n::: {.cell-output .cell-output-display execution_count=25}\n```\nrange(0, 10)\n```\n:::\n:::\n\n\n::: {.cell execution_count=26}\n``` {.python .cell-code}\nlist(gen)\n```\n\n::: {.cell-output .cell-output-display execution_count=26}\n```\n[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]\n```\n:::\n:::\n\n\n### Adding and removing list elements\n\nthe `append` method\n\n::: {.cell execution_count=27}\n``` {.python .cell-code}\nb_list.append(\"dwarf\")\n\nb_list\n```\n\n::: {.cell-output .cell-output-display execution_count=27}\n```\n['foo', 'peekaboo', 'baz', 'dwarf']\n```\n:::\n:::\n\n\nthe `insert` method\n\n::: {.cell execution_count=28}\n``` {.python .cell-code}\nb_list.insert(1, \"red\")\n\nb_list\n```\n\n::: {.cell-output .cell-output-display execution_count=28}\n```\n['foo', 'red', 'peekaboo', 'baz', 'dwarf']\n```\n:::\n:::\n\n\n`insert` is computationally more expensive than `append`\n\nthe `pop` method, the inverse of `insert`\n\n::: {.cell execution_count=29}\n``` {.python .cell-code}\nb_list.pop(2)\n```\n\n::: {.cell-output .cell-output-display execution_count=29}\n```\n'peekaboo'\n```\n:::\n:::\n\n\n::: {.cell execution_count=30}\n``` {.python .cell-code}\nb_list\n```\n\n::: {.cell-output .cell-output-display execution_count=30}\n```\n['foo', 'red', 'baz', 'dwarf']\n```\n:::\n:::\n\n\nthe `remove` method\n\n::: {.cell execution_count=31}\n``` {.python .cell-code}\nb_list.append(\"foo\")\n\nb_list\n```\n\n::: {.cell-output .cell-output-display execution_count=31}\n```\n['foo', 'red', 'baz', 'dwarf', 'foo']\n```\n:::\n:::\n\n\n::: {.cell execution_count=32}\n``` {.python .cell-code}\nb_list.remove(\"foo\")\n\nb_list\n```\n\n::: {.cell-output .cell-output-display execution_count=32}\n```\n['red', 'baz', 'dwarf', 'foo']\n```\n:::\n:::\n\n\nCheck if a list contains a value using the `in` keyword:\n\n::: {.cell execution_count=33}\n``` {.python .cell-code}\n\"dwarf\" in b_list\n```\n\n::: {.cell-output .cell-output-display execution_count=33}\n```\nTrue\n```\n:::\n:::\n\n\nThe keyword `not` can be used to negate an `in`\n\n::: {.cell execution_count=34}\n``` {.python .cell-code}\n\"dwarf\" not in b_list\n```\n\n::: {.cell-output .cell-output-display execution_count=34}\n```\nFalse\n```\n:::\n:::\n\n\n### Concatenating and combining lists\n\nas with tuples, use `+` to concatenate\n\n::: {.cell execution_count=35}\n``` {.python .cell-code}\n[4, None, \"foo\"] + [7, 8, (2, 3)]\n```\n\n::: {.cell-output .cell-output-display execution_count=35}\n```\n[4, None, 'foo', 7, 8, (2, 3)]\n```\n:::\n:::\n\n\nthe `extend` method\n\n::: {.cell execution_count=36}\n``` {.python .cell-code}\nx = [4, None, \"foo\"]\n\nx.extend([7,8,(2,3)])\n\nx\n```\n\n::: {.cell-output .cell-output-display execution_count=36}\n```\n[4, None, 'foo', 7, 8, (2, 
3)]\n```\n:::\n:::\n\n\nlist concatenation by addition is an expensive operation\n\nusing `extend` is preferable\n\n```{{python}}\neverything = []\nfor chunk in list_of_lists:\n everything.extend(chunk)\n\n```\n\nis generally faster than\n\n```{{python}}\n\neverything = []\nfor chunk in list_of_lists:\n everything = everything + chunk\n\n```\n\n### Sorting\n\nthe `sort` method\n\n::: {.cell execution_count=37}\n``` {.python .cell-code}\na = [7, 2, 5, 1, 3]\n\na.sort()\n\na\n```\n\n::: {.cell-output .cell-output-display execution_count=37}\n```\n[1, 2, 3, 5, 7]\n```\n:::\n:::\n\n\n`sort` options\n\n::: {.cell execution_count=38}\n``` {.python .cell-code}\nb = [\"saw\", \"small\", \"He\", \"foxes\", \"six\"]\n\nb.sort(key = len)\n\nb\n```\n\n::: {.cell-output .cell-output-display execution_count=38}\n```\n['He', 'saw', 'six', 'small', 'foxes']\n```\n:::\n:::\n\n\n### Slicing\n\nSlicing semantics takes a bit of getting used to, especially if you’re coming from R or MATLAB.\n\nusing the indexing operator `[]`\n\n::: {.cell execution_count=39}\n``` {.python .cell-code}\nseq = [7, 2, 3, 7, 5, 6, 0, 1]\n\nseq[3:5]\n```\n\n::: {.cell-output .cell-output-display execution_count=39}\n```\n[7, 5]\n```\n:::\n:::\n\n\nalso assigned with a sequence\n\n::: {.cell execution_count=40}\n``` {.python .cell-code}\nseq[3:5] = [6,3]\n\nseq\n```\n\n::: {.cell-output .cell-output-display execution_count=40}\n```\n[7, 2, 3, 6, 3, 6, 0, 1]\n```\n:::\n:::\n\n\nEither the `start` or `stop` can be omitted\n\n::: {.cell execution_count=41}\n``` {.python .cell-code}\nseq[:5]\n```\n\n::: {.cell-output .cell-output-display execution_count=41}\n```\n[7, 2, 3, 6, 3]\n```\n:::\n:::\n\n\n::: {.cell execution_count=42}\n``` {.python .cell-code}\nseq[3:]\n```\n\n::: {.cell-output .cell-output-display execution_count=42}\n```\n[6, 3, 6, 0, 1]\n```\n:::\n:::\n\n\nNegative indices slice the sequence relative to the end:\n\n::: {.cell execution_count=43}\n``` {.python .cell-code}\nseq[-4:]\n```\n\n::: {.cell-output .cell-output-display execution_count=43}\n```\n[3, 6, 0, 1]\n```\n:::\n:::\n\n\nA step can also be used after a second colon to, say, take every other element:\n\n::: {.cell execution_count=44}\n``` {.python .cell-code}\nseq[::2]\n```\n\n::: {.cell-output .cell-output-display execution_count=44}\n```\n[7, 3, 3, 0]\n```\n:::\n:::\n\n\nA clever use of this is to pass -1, which has the useful effect of reversing a list or tuple:\n\n::: {.cell execution_count=45}\n``` {.python .cell-code}\nseq[::-1]\n```\n\n::: {.cell-output .cell-output-display execution_count=45}\n```\n[1, 0, 6, 3, 6, 3, 2, 7]\n```\n:::\n:::\n\n\n## Dictionary\n\n![](https://pynative.com/wp-content/uploads/2021/02/dictionaries-in-python.jpg)\n\nThe dictionary or dict may be the most important built-in Python data structure. 
\n\nOne approach for creating a dictionary is to use curly braces {} and colons to separate keys and values:\n\n::: {.cell execution_count=46}\n``` {.python .cell-code}\nempty_dict = {}\n\nd1 = {\"a\": \"some value\", \"b\": [1, 2, 3, 4]}\n\nd1\n```\n\n::: {.cell-output .cell-output-display execution_count=46}\n```\n{'a': 'some value', 'b': [1, 2, 3, 4]}\n```\n:::\n:::\n\n\naccess, insert, or set elements \n\n::: {.cell execution_count=47}\n``` {.python .cell-code}\nd1[7] = \"an integer\"\n\nd1\n```\n\n::: {.cell-output .cell-output-display execution_count=47}\n```\n{'a': 'some value', 'b': [1, 2, 3, 4], 7: 'an integer'}\n```\n:::\n:::\n\n\nand as before\n\n::: {.cell execution_count=48}\n``` {.python .cell-code}\n\"b\" in d1\n```\n\n::: {.cell-output .cell-output-display execution_count=48}\n```\nTrue\n```\n:::\n:::\n\n\nthe `del` and `pop` methods\n\n::: {.cell execution_count=49}\n``` {.python .cell-code}\ndel d1[7]\n\nd1\n```\n\n::: {.cell-output .cell-output-display execution_count=49}\n```\n{'a': 'some value', 'b': [1, 2, 3, 4]}\n```\n:::\n:::\n\n\n::: {.cell execution_count=50}\n``` {.python .cell-code}\nret = d1.pop(\"a\")\n\nret\n```\n\n::: {.cell-output .cell-output-display execution_count=50}\n```\n'some value'\n```\n:::\n:::\n\n\nThe `keys` and `values` methods\n\n::: {.cell execution_count=51}\n``` {.python .cell-code}\nlist(d1.keys())\n```\n\n::: {.cell-output .cell-output-display execution_count=51}\n```\n['b']\n```\n:::\n:::\n\n\n::: {.cell execution_count=52}\n``` {.python .cell-code}\nlist(d1.values())\n```\n\n::: {.cell-output .cell-output-display execution_count=52}\n```\n[[1, 2, 3, 4]]\n```\n:::\n:::\n\n\nthe `items` method\n\n::: {.cell execution_count=53}\n``` {.python .cell-code}\nlist(d1.items())\n```\n\n::: {.cell-output .cell-output-display execution_count=53}\n```\n[('b', [1, 2, 3, 4])]\n```\n:::\n:::\n\n\n the update method to merge one dictionary into another\n\n::: {.cell execution_count=54}\n``` {.python .cell-code}\nd1.update({\"b\": \"foo\", \"c\": 12})\n\nd1\n```\n\n::: {.cell-output .cell-output-display execution_count=54}\n```\n{'b': 'foo', 'c': 12}\n```\n:::\n:::\n\n\n ### Creating dictionaries from sequences\n\n::: {.cell execution_count=55}\n``` {.python .cell-code}\nlist(range(5))\n```\n\n::: {.cell-output .cell-output-display execution_count=55}\n```\n[0, 1, 2, 3, 4]\n```\n:::\n:::\n\n\n::: {.cell execution_count=56}\n``` {.python .cell-code}\ntuples = zip(range(5), reversed(range(5)))\n\ntuples\n\nmapping = dict(tuples)\n\nmapping\n```\n\n::: {.cell-output .cell-output-display execution_count=56}\n```\n{0: 4, 1: 3, 2: 2, 3: 1, 4: 0}\n```\n:::\n:::\n\n\n### Default values\n\nimagine categorizing a list of words by their first letters as a dictionary of lists\n\n::: {.cell execution_count=57}\n``` {.python .cell-code}\nwords = [\"apple\", \"bat\", \"bar\", \"atom\", \"book\"]\n\nby_letter = {}\n\nfor word in words:\n letter = word[0]\n if letter not in by_letter:\n by_letter[letter] = [word]\n else:\n by_letter[letter].append(word)\n\nby_letter\n```\n\n::: {.cell-output .cell-output-display execution_count=57}\n```\n{'a': ['apple', 'atom'], 'b': ['bat', 'bar', 'book']}\n```\n:::\n:::\n\n\nThe `setdefault` dictionary method can be used to simplify this workflow. 
The preceding for loop can be rewritten as:\n\n::: {.cell execution_count=58}\n``` {.python .cell-code}\nby_letter = {}\n\nfor word in words:\n letter = word[0]\n by_letter.setdefault(letter, []).append(word)\n\nby_letter\n```\n\n::: {.cell-output .cell-output-display execution_count=58}\n```\n{'a': ['apple', 'atom'], 'b': ['bat', 'bar', 'book']}\n```\n:::\n:::\n\n\nThe built-in `collections`module has a useful class, `defaultdict`, which makes this even easier.\n\n::: {.cell execution_count=59}\n``` {.python .cell-code}\nfrom collections import defaultdict\n\nby_letter = defaultdict(list)\n\nfor word in words:\n by_letter[word[0]].append(word)\n\nby_letter\n```\n\n::: {.cell-output .cell-output-display execution_count=59}\n```\ndefaultdict(list, {'a': ['apple', 'atom'], 'b': ['bat', 'bar', 'book']})\n```\n:::\n:::\n\n\n### Valid dictionary key types\n\nkeys generally have to be immutable objects like scalars or tuples for *hashability*\n\nTo use a list as a key, one option is to convert it to a tuple, which can be hashed as long as its elements also can be:\n\n::: {.cell execution_count=60}\n``` {.python .cell-code}\nd = {}\n\nd[tuple([1,2,3])] = 5\n\nd\n```\n\n::: {.cell-output .cell-output-display execution_count=60}\n```\n{(1, 2, 3): 5}\n```\n:::\n:::\n\n\n## Set\n\n![](https://pynative.com/wp-content/uploads/2021/03/python-sets.jpg)\n\ncan be created in two ways: via the `set` function or via a `set literal` with curly braces:\n\n::: {.cell execution_count=61}\n``` {.python .cell-code}\nset([2, 2, 2, 1, 3, 3])\n\n{2,2,1,3,3}\n```\n\n::: {.cell-output .cell-output-display execution_count=61}\n```\n{1, 2, 3}\n```\n:::\n:::\n\n\nSets support mathematical set operations like union, intersection, difference, and symmetric difference.\n\nThe `union` of these two sets:\n\n::: {.cell execution_count=62}\n``` {.python .cell-code}\na = {1, 2, 3, 4, 5}\n\nb = {3, 4, 5, 6, 7, 8}\n\na.union(b)\n\na | b\n```\n\n::: {.cell-output .cell-output-display execution_count=62}\n```\n{1, 2, 3, 4, 5, 6, 7, 8}\n```\n:::\n:::\n\n\nThe `&`operator or the `intersection` method\n\n::: {.cell execution_count=63}\n``` {.python .cell-code}\na.intersection(b)\n\na & b\n```\n\n::: {.cell-output .cell-output-display execution_count=63}\n```\n{3, 4, 5}\n```\n:::\n:::\n\n\n[A table of commonly used `set` methods](https://wesmckinney.com/book/python-builtin.html#tbl-table_set_operations)\n\nAll of the logical set operations have in-place counterparts, which enable you to replace the contents of the set on the left side of the operation with the result. 
For very large sets, this may be more efficient\n\n::: {.cell execution_count=64}\n``` {.python .cell-code}\nc = a.copy()\n\nc |= b\n\nc\n```\n\n::: {.cell-output .cell-output-display execution_count=64}\n```\n{1, 2, 3, 4, 5, 6, 7, 8}\n```\n:::\n:::\n\n\n::: {.cell execution_count=65}\n``` {.python .cell-code}\nd = a.copy()\n\nd &= b\n\nd\n```\n\n::: {.cell-output .cell-output-display execution_count=65}\n```\n{3, 4, 5}\n```\n:::\n:::\n\n\nset elements generally must be immutable, and they must be hashable\n\nyou can convert them to tuples\n\nYou can also check if a set is a subset of (is contained in) or a superset of (contains all elements of) another set\n\n::: {.cell execution_count=66}\n``` {.python .cell-code}\na_set = {1, 2, 3, 4, 5}\n\n{1, 2, 3}.issubset(a_set)\n```\n\n::: {.cell-output .cell-output-display execution_count=66}\n```\nTrue\n```\n:::\n:::\n\n\n::: {.cell execution_count=67}\n``` {.python .cell-code}\na_set.issuperset({1, 2, 3})\n```\n\n::: {.cell-output .cell-output-display execution_count=67}\n```\nTrue\n```\n:::\n:::\n\n\n## Built-In Sequence Functions\n\n### enumerate\n\n`enumerate` returns a sequence of (i, value) tuples\n\n### sorted\n\n`sorted` returns a new sorted list \n\n::: {.cell execution_count=68}\n``` {.python .cell-code}\nsorted([7,1,2,9,3,6,5,0,22])\n```\n\n::: {.cell-output .cell-output-display execution_count=68}\n```\n[0, 1, 2, 3, 5, 6, 7, 9, 22]\n```\n:::\n:::\n\n\n### zip\n\n`zip` “pairs” up the elements of a number of lists, tuples, or other sequences to create a list of tuples\n\n::: {.cell execution_count=69}\n``` {.python .cell-code}\nseq1 = [\"foo\", \"bar\", \"baz\"]\n\nseq2 = [\"one\", \"two\", \"three\"]\n\nzipped = zip(seq1, seq2)\n\nlist(zipped)\n```\n\n::: {.cell-output .cell-output-display execution_count=69}\n```\n[('foo', 'one'), ('bar', 'two'), ('baz', 'three')]\n```\n:::\n:::\n\n\n`zip` can take an arbitrary number of sequences, and the number of elements it produces is determined by the shortest sequence\n\n::: {.cell execution_count=70}\n``` {.python .cell-code}\nseq3 = [False, True]\n\nlist(zip(seq1, seq2, seq3))\n```\n\n::: {.cell-output .cell-output-display execution_count=70}\n```\n[('foo', 'one', False), ('bar', 'two', True)]\n```\n:::\n:::\n\n\nA common use of `zip` is simultaneously iterating over multiple sequences, possibly also combined with `enumerate`\n\n::: {.cell execution_count=71}\n``` {.python .cell-code}\nfor index, (a, b) in enumerate(zip(seq1, seq2)):\n print(f\"{index}: {a}, {b}\")\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n0: foo, one\n1: bar, two\n2: baz, three\n```\n:::\n:::\n\n\n`reversed` iterates over the elements of a sequence in reverse order\n\n::: {.cell execution_count=72}\n``` {.python .cell-code}\nlist(reversed(range(10)))\n```\n\n::: {.cell-output .cell-output-display execution_count=72}\n```\n[9, 8, 7, 6, 5, 4, 3, 2, 1, 0]\n```\n:::\n:::\n\n\n## List, Set, and Dictionary Comprehensions\n\n```\n[expr for value in collection if condition]\n```\n\nFor example, given a list of strings, we could filter out strings with length 2 or less and convert them to uppercase like this\n\n::: {.cell execution_count=73}\n``` {.python .cell-code}\nstrings = [\"a\", \"as\", \"bat\", \"car\", \"dove\", \"python\"]\n\n[x.upper() for x in strings if len(x) > 2]\n```\n\n::: {.cell-output .cell-output-display execution_count=73}\n```\n['BAT', 'CAR', 'DOVE', 'PYTHON']\n```\n:::\n:::\n\n\nA dictionary comprehension looks like this\n\n```\ndict_comp = {key-expr: value-expr for value in collection\n if 
condition}\n```\n\nSuppose we wanted a set containing just the lengths of the strings contained in the collection\n\n::: {.cell execution_count=74}\n``` {.python .cell-code}\nunique_lengths = {len(x) for x in strings}\n\nunique_lengths\n```\n\n::: {.cell-output .cell-output-display execution_count=74}\n```\n{1, 2, 3, 4, 6}\n```\n:::\n:::\n\n\nwe could create a lookup map of these strings for their locations in the list\n\n::: {.cell execution_count=75}\n``` {.python .cell-code}\nloc_mapping = {value: index for index, value in enumerate(strings)}\n\nloc_mapping\n```\n\n::: {.cell-output .cell-output-display execution_count=75}\n```\n{'a': 0, 'as': 1, 'bat': 2, 'car': 3, 'dove': 4, 'python': 5}\n```\n:::\n:::\n\n\n## Nested list comprehensions\n\nSuppose we have a list of lists containing some English and Spanish names. We want to get a single list containing all names with two or more a’s in them\n\n::: {.cell execution_count=76}\n``` {.python .cell-code}\nall_data = [[\"John\", \"Emily\", \"Michael\", \"Mary\", \"Steven\"],\n [\"Maria\", \"Juan\", \"Javier\", \"Natalia\", \"Pilar\"]]\n\nresult = [name for names in all_data for name in names\n if name.count(\"a\") >= 2]\n\nresult\n```\n\n::: {.cell-output .cell-output-display execution_count=76}\n```\n['Maria', 'Natalia']\n```\n:::\n:::\n\n\nHere is another example where we “flatten” a list of tuples of integers into a simple list of integers\n\n::: {.cell execution_count=77}\n``` {.python .cell-code}\nsome_tuples = [(1, 2, 3), (4, 5, 6), (7, 8, 9)]\n\nflattened = [x for tup in some_tuples for x in tup]\n\nflattened\n```\n\n::: {.cell-output .cell-output-display execution_count=77}\n```\n[1, 2, 3, 4, 5, 6, 7, 8, 9]\n```\n:::\n:::\n\n\n# Functions\n\n![](https://miro.medium.com/max/1200/1*ZegxhR33NdeVRpBPYXnYYQ.gif)\n\n`Functions` are the primary and most important method of code organization and reuse in Python.\n\nthey use the `def` keyword\n\nEach function can have positional arguments and keyword arguments. Keyword arguments are most commonly used to specify default values or optional arguments. Here we will define a function with an optional z argument with the default value 1.5\n\n::: {.cell execution_count=78}\n``` {.python .cell-code}\ndef my_function(x, y, z=1.5):\n return (x + y) * z \n\nmy_function(4,25)\n```\n\n::: {.cell-output .cell-output-display execution_count=78}\n```\n43.5\n```\n:::\n:::\n\n\nThe main restriction on function arguments is that the keyword arguments must follow the positional arguments\n\n## Namespaces, Scope, and Local Functions\n\nA more descriptive name describing a variable scope in Python is a namespace.\n\nConsider the following function\n\n::: {.cell execution_count=79}\n``` {.python .cell-code}\na = []\n\ndef func():\n for i in range(5):\n a.append(i)\n```\n:::\n\n\nBecause `a` is defined outside the function, each call to `func()` appends five more elements to that same outer list, so the results accumulate across calls:\n\n::: {.cell execution_count=80}\n``` {.python .cell-code}\nfunc()\n\nfunc()\n\na\n```\n\n::: {.cell-output .cell-output-display execution_count=80}\n```\n[0, 1, 2, 3, 4, 0, 1, 2, 3, 4]\n```\n:::\n:::\n\n
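\nFor contrast, here is a minimal sketch (hypothetical `func_local`, not from the original notes) of the local-scope variant: when the list is assigned *inside* the function, it lives only in that call's local namespace and is discarded when the function returns.\n\n```{{python}}\ndef func_local():\n    a = []              # local to this call\n    for i in range(5):\n        a.append(i)\n    return a            # must be returned to survive the call\n\nfunc_local()            # [0, 1, 2, 3, 4]\nfunc_local()            # a fresh list each time, no accumulation\n```\n\n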
\n\n## Returning Multiple Values\n\nWhat’s happening here is that the function is actually just returning one object, a tuple, which is then being unpacked into the result variables.\n\n::: {.cell execution_count=81}\n``` {.python .cell-code}\ndef f():\n a = 5\n b = 6\n c = 7\n return a, b, c\n\na, b, c = f()\n\na\n```\n\n::: {.cell-output .cell-output-display execution_count=81}\n```\n5\n```\n:::\n:::\n\n\n## Functions are Objects\n\nSuppose we were doing some data cleaning and needed to apply a bunch of transformations to the following list of strings:\n\n::: {.cell execution_count=82}\n``` {.python .cell-code}\nstates = [\" Alabama \", \"Georgia!\", \"Georgia\", \"georgia\", \"FlOrIda\",\n \"south carolina##\", \"West virginia?\"]\n\nimport re\n\ndef clean_strings(strings):\n result = []\n for value in strings:\n value = value.strip()\n value = re.sub(\"[!#?]\", \"\", value)\n value = value.title()\n result.append(value)\n return result\n\nclean_strings(states)\n```\n\n::: {.cell-output .cell-output-display execution_count=82}\n```\n['Alabama',\n 'Georgia',\n 'Georgia',\n 'Georgia',\n 'Florida',\n 'South Carolina',\n 'West Virginia']\n```\n:::\n:::\n\n\nAnother approach\n\n::: {.cell execution_count=83}\n``` {.python .cell-code}\ndef remove_punctuation(value):\n return re.sub(\"[!#?]\", \"\", value)\n\nclean_ops = [str.strip, remove_punctuation, str.title]\n\ndef clean_strings(strings, ops):\n result = []\n for value in strings:\n for func in ops:\n value = func(value)\n result.append(value)\n return result\n\nclean_strings(states, clean_ops)\n```\n\n::: {.cell-output .cell-output-display execution_count=83}\n```\n['Alabama',\n 'Georgia',\n 'Georgia',\n 'Georgia',\n 'Florida',\n 'South Carolina',\n 'West Virginia']\n```\n:::\n:::\n\n\nYou can use functions as arguments to other functions like the built-in `map` function\n\n::: {.cell execution_count=84}\n``` {.python .cell-code}\nfor x in map(remove_punctuation, states):\n print(x)\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n Alabama \nGeorgia\nGeorgia\ngeorgia\nFlOrIda\nsouth carolina\nWest virginia\n```\n:::\n:::\n\n\n## Anonymous Lambda Functions\n\na way of writing functions consisting of a single statement\n\nsuppose you wanted to sort a collection of strings by the number of distinct letters in each string\n\n::: {.cell execution_count=85}\n``` {.python .cell-code}\nstrings = [\"foo\", \"card\", \"bar\", \"aaaaaaa\", \"ababdo\"]\n\nstrings.sort(key=lambda x: len(set(x)))\n\nstrings\n```\n\n::: {.cell-output .cell-output-display execution_count=85}\n```\n['aaaaaaa', 'foo', 'bar', 'card', 'ababdo']\n```\n:::\n:::\n\n\n# Generators\n\nMany objects in Python support iteration, such as over objects in a list or lines in a file. \n\n::: {.cell execution_count=86}\n``` {.python .cell-code}\nsome_dict = {\"a\": 1, \"b\": 2, \"c\": 3}\n\nfor key in some_dict:\n print(key)\n```\n\n::: {.cell-output .cell-output-stdout}\n```\na\nb\nc\n```\n:::\n:::\n\n\nMost methods expecting a list or list-like object will also accept any iterable object. This includes built-in methods such as `min`, `max`, and `sum`, and type constructors like `list` and `tuple`\n\nA `generator` is a convenient way, similar to writing a normal function, to construct a new iterable object. 
Whereas normal functions execute and return a single result at a time, generators can return a sequence of multiple values by pausing and resuming execution each time the generator is used. To create a generator, use the yield keyword instead of return in a function\n\n::: {.cell execution_count=87}\n``` {.python .cell-code}\ndef squares(n=10):\n print(f\"Generating squares from 1 to {n ** 2}\")\n for i in range(1, n + 1):\n yield i ** 2\n\ngen = squares()\n\nfor x in gen:\n print(x, end=\" \")\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nGenerating squares from 1 to 100\n1 4 9 16 25 36 49 64 81 100 \n```\n:::\n:::\n\n\n> Since generators produce output one element at a time versus an entire list all at once, this can help your program use less memory.\n\n## Generator expressions\n\nThis is a generator analogue to list, dictionary, and set comprehensions. To create one, enclose what would otherwise be a list comprehension within parentheses instead of brackets:\n\n::: {.cell execution_count=88}\n``` {.python .cell-code}\ngen = (x ** 2 for x in range(100))\n\ngen\n```\n\n::: {.cell-output .cell-output-display execution_count=88}\n```\n<generator object <genexpr> at 0x7fa541620c80>\n```\n:::\n:::\n\n\nGenerator expressions can be used instead of list comprehensions as function arguments in some cases:\n\n::: {.cell execution_count=89}\n``` {.python .cell-code}\nsum(x ** 2 for x in range(100))\n```\n\n::: {.cell-output .cell-output-display execution_count=89}\n```\n328350\n```\n:::\n:::\n\n\n::: {.cell execution_count=90}\n``` {.python .cell-code}\ndict((i, i ** 2) for i in range(5))\n```\n\n::: {.cell-output .cell-output-display execution_count=90}\n```\n{0: 0, 1: 1, 2: 4, 3: 9, 4: 16}\n```\n:::\n:::\n\n\n## itertools module\n\nThe `itertools` module has a collection of generators for many common data algorithms.\n\n`groupby` takes any sequence and a function, grouping consecutive elements in the sequence by return value of the function\n\n::: {.cell execution_count=91}\n``` {.python .cell-code}\nimport itertools\n\ndef first_letter(x):\n return x[0]\n\nnames = [\"Alan\", \"Adam\", \"Jackie\", \"Lily\", \"Katie\", \"Molly\"]\n\nfor letter, names in itertools.groupby(names, first_letter):\n print(letter, list(names))\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nA ['Alan', 'Adam']\nJ ['Jackie']\nL ['Lily']\nK ['Katie']\nM ['Molly']\n```\n:::\n:::\n\n\n[Table of other itertools functions](https://wesmckinney.com/book/python-builtin.html#tbl-table_itertools)\n\n
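Beyond `groupby`, two `itertools` generators that come up constantly are `chain` and `combinations`; a quick sketch (standard-library calls, not executed here):\n\n```{{python}}\nimport itertools\n\n# chain lazily concatenates any number of iterables\nlist(itertools.chain([1, 2], (3, 4)))             # [1, 2, 3, 4]\n\n# combinations yields all k-length subsequences\nlist(itertools.combinations([\"a\", \"b\", \"c\"], 2))  # [('a', 'b'), ('a', 'c'), ('b', 'c')]\n```\n\n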
To do that, write the exception type after except:\n\n::: {.cell execution_count=94}\n``` {.python .cell-code}\ndef attempt_float(x):\n try:\n return float(x)\n except ValueError:\n return x\n```\n:::\n\n\n::: {.cell execution_count=95}\n``` {.python .cell-code}\nattempt_float((1, 2))\n```\n:::\n\n\n```\n---------------------------------------------------------------------------\nTypeError Traceback (most recent call last)\nd:\\packages\\bookclub-py4da\\03_notes.qmd in ()\n----> 1001 attempt_float((1, 2))\n\nInput In [114], in attempt_float(x)\n 1 def attempt_float(x):\n 2 try:\n----> 3 return float(x)\n 4 except ValueError:\n 5 return x\n\nTypeError: float() argument must be a string or a real number, not 'tuple'\n\n```\n\nYou can catch multiple exception types by writing a tuple of exception types instead (the parentheses are required):\n\n::: {.cell execution_count=96}\n``` {.python .cell-code}\ndef attempt_float(x):\n try:\n return float(x)\n except (TypeError, ValueError):\n return x\n\nattempt_float((1, 2))\n```\n\n::: {.cell-output .cell-output-display execution_count=95}\n```\n(1, 2)\n```\n:::\n:::\n\n\nIn some cases, you may not want to suppress an exception, but you want some code to be executed regardless of whether or not the code in the try block succeeds. To do this, use `finally`:\n\n::: {.cell execution_count=97}\n``` {.python .cell-code}\nf = open(path, mode=\"w\")\n\ntry:\n write_to_file(f)\nfinally:\n f.close()\n```\n:::\n\n\nHere, the file object f will always get closed. \n\nyou can have code that executes only if the try: block succeeds using else:\n\n::: {.cell execution_count=98}\n``` {.python .cell-code}\nf = open(path, mode=\"w\")\n\ntry:\n write_to_file(f)\nexcept:\n print(\"Failed\")\nelse:\n print(\"Succeeded\")\nfinally:\n f.close()\n```\n:::\n\n\n## Exceptions in IPython\n\nIf an exception is raised while you are %run-ing a script or executing any statement, IPython will by default print a full call stack trace. Having additional context by itself is a big advantage over the standard Python interpreter\n\n# Files and the Operating System\n\nTo open a file for reading or writing, use the built-in open function with either a relative or absolute file path and an optional file encoding.\n\nWe can then treat the file object f like a list and iterate over the lines\n\n::: {.cell execution_count=99}\n``` {.python .cell-code}\npath = \"examples/segismundo.txt\"\n\nf = open(path, encoding=\"utf-8\")\n\nlines = [x.rstrip() for x in open(path, encoding=\"utf-8\")]\n\nlines\n```\n:::\n\n\nWhen you use open to create file objects, it is recommended to close the file\n\n::: {.cell execution_count=100}\n``` {.python .cell-code}\nf.close()\n```\n:::\n\n\nsome of the most commonly used methods are `read`, `seek`, and `tell`.\n\n`read(10)` returns 10 characters from the file\n\nthe `read` method advances the file object position by the number of bytes read\n\n`tell()` gives you the current position in the file\n\nTo get consistent behavior across platforms, it is best to pass an encoding (such as `encoding=\"utf-8\"`)\n\n`seek(3)` changes the file position to the indicated byte \n\nTo write text to a file, you can use the file’s `write` or `writelines` methods\n\n## Byte and Unicode with Files\n\nThe default behavior for Python files (whether readable or writable) is text mode, which means that you intend to work with Python strings (i.e., Unicode). 
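\n\nA minimal sketch tying these file methods together (a hedged example — it assumes the `examples/segismundo.txt` file from above exists and is UTF-8 encoded):\n\n```python\n# open in text mode with an explicit encoding for consistent cross-platform behavior\nwith open(\"examples/segismundo.txt\", encoding=\"utf-8\") as f:\n    first10 = f.read(10)   # read 10 characters; advances the file position\n    print(f.tell())        # current position (an opaque offset in text mode)\n    f.seek(0)              # jump back to the start of the file\n    print(f.read(10) == first10)  # True: re-reads the same 10 characters\n```\n\nUsing a `with` block closes the file automatically when it exits, so no explicit `f.close()` is needed.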
\n\n", + "supporting": [ + "03_notes_files" + ], + "filters": [], + "includes": {} + } +} \ No newline at end of file diff --git a/_freeze/04_main/execute-results/html.json b/_freeze/04_main/execute-results/html.json new file mode 100644 index 0000000..3077477 --- /dev/null +++ b/_freeze/04_main/execute-results/html.json @@ -0,0 +1,11 @@ +{ + "hash": "ab028e5031ac1b8cc0fee96b85595cff", + "result": { + "markdown": "# 4. NumPy Basics: Arrays and Vectorized Computation\n\n## Learning Objectives\n\n- Learn about NumPy, a package for numerical computing in Python\n- Use NumPy for array-based data: operations, algorithms\n\n## Import NumPy\n\n::: {.cell execution_count=1}\n``` {.python .cell-code}\nimport numpy as np # Recommended standard NumPy convention \n```\n:::\n\n\n## Array-based operations\n* A fast, flexible container for large datasets in Python\n* Stores multiple items of the same type together\n* Can perform operations on whole blocks of data with similar syntax\n\n![Image of an array with 10 length and the first index, 8th element, and indicies denoted by text](https://media.geeksforgeeks.org/wp-content/uploads/CommonArticleDesign1-min.png)\n\n::: {.panel-tabset}\n\n## Create an array\n\n::: {.cell execution_count=2}\n``` {.python .cell-code}\narr = np.array([[1.5, -0.1, 3], [0, -3, 6.5]])\narr\n```\n\n::: {.cell-output .cell-output-display execution_count=2}\n```\narray([[ 1.5, -0.1, 3. ],\n [ 0. , -3. , 6.5]])\n```\n:::\n:::\n\n\n## Perform operation\nAll of the elements have been multiplied by 10.\n\n::: {.cell execution_count=3}\n``` {.python .cell-code}\narr * 10\n```\n\n::: {.cell-output .cell-output-display execution_count=3}\n```\narray([[ 15., -1., 30.],\n [ 0., -30., 65.]])\n```\n:::\n:::\n\n\n:::\n\n* Every array has a `shape` indicating the size of each dimension\n* and a `dtype`, an object describing the data type of the array\n\n::: {.panel-tabset}\n\n## Shape\n\n::: {.cell execution_count=4}\n``` {.python .cell-code}\narr.shape\n```\n\n::: {.cell-output .cell-output-display execution_count=4}\n```\n(2, 3)\n```\n:::\n:::\n\n\n## dtype\n\n::: {.cell execution_count=5}\n``` {.python .cell-code}\narr.dtype\n```\n\n::: {.cell-output .cell-output-display execution_count=5}\n```\ndtype('float64')\n```\n:::\n:::\n\n\n:::\n\n### ndarray\n\n* Generic one/multi-dimensional container where all elements are the same type\n* Created using `numpy.array` function\n\n::: {.panel-tabset}\n## 1D\n\n::: {.cell execution_count=6}\n``` {.python .cell-code}\ndata1 = [6, 7.5, 8, 0, 1]\narr1 = np.array(data1)\narr1\n```\n\n::: {.cell-output .cell-output-display execution_count=6}\n```\narray([6. , 7.5, 8. , 0. , 1. 
])\n```\n:::\n:::\n\n\n::: {.cell execution_count=7}\n``` {.python .cell-code}\nprint(arr1.ndim)\nprint(arr1.shape)\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n1\n(5,)\n```\n:::\n:::\n\n\n## Multi-dimensional\n\n::: {.cell execution_count=8}\n``` {.python .cell-code}\ndata2 = [[1, 2, 3, 4], [5, 6, 7, 8]]\narr2 = np.array(data2)\narr2\n```\n\n::: {.cell-output .cell-output-display execution_count=8}\n```\narray([[1, 2, 3, 4],\n [5, 6, 7, 8]])\n```\n:::\n:::\n\n\n::: {.cell execution_count=9}\n``` {.python .cell-code}\nprint(arr2.ndim)\nprint(arr2.shape)\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n2\n(2, 4)\n```\n:::\n:::\n\n\n:::\n\n#### Special array creation\n\n* `numpy.zeros` creates an array of zeros with a given length or shape\n* `numpy.ones` creates an array of ones with a given length or shape\n* `numpy.empty` creates an array without initialized values\n* `numpy.arange` creates a range\n* Pass a tuple for the shape to create a higher dimensional array\n\n::: {.panel-tabset}\n\n## Zeros\n\n::: {.cell execution_count=10}\n``` {.python .cell-code}\nnp.zeros(10)\n```\n\n::: {.cell-output .cell-output-display execution_count=10}\n```\narray([0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])\n```\n:::\n:::\n\n\n## Multi-dimensional\n\n::: {.cell execution_count=11}\n``` {.python .cell-code}\nnp.zeros((3, 6))\n```\n\n::: {.cell-output .cell-output-display execution_count=11}\n```\narray([[0., 0., 0., 0., 0., 0.],\n [0., 0., 0., 0., 0., 0.],\n [0., 0., 0., 0., 0., 0.]])\n```\n:::\n:::\n\n\n:::\n\n::: {.column-margin}\n`numpy.empty` does not return an array of zeros, though it may look like it.\n\n::: {.cell execution_count=12}\n``` {.python .cell-code}\nnp.empty(1)\n```\n\n::: {.cell-output .cell-output-display execution_count=12}\n```\narray([0.])\n```\n:::\n:::\n\n\n:::\n\n[Wes provides a table of array creation functions in the book.](https://wesmckinney.com/book/numpy-basics.html#tbl-table_array_ctor)\n\n#### Data types for ndarrays\n\n* Unless explicitly specified, `numpy.array` tries to infer a good data type for the array that it creates. 
\n* Data type is stored in a special `dtype` metadata object.\n* Can be explict or converted (cast)\n* It is important to care about the general kind of data you’re dealing with.\n\n::: {.panel-tabset}\n## Inferred dtype\n\n::: {.cell execution_count=13}\n``` {.python .cell-code}\narr1.dtype\n```\n\n::: {.cell-output .cell-output-display execution_count=13}\n```\ndtype('float64')\n```\n:::\n:::\n\n\n## Explicit dtype\n\n::: {.cell execution_count=14}\n``` {.python .cell-code}\narr2 = np.array([1, 2, 3], dtype=np.int32)\narr2.dtype\n```\n\n::: {.cell-output .cell-output-display execution_count=14}\n```\ndtype('int32')\n```\n:::\n:::\n\n\n## Cast dtype\n\n::: {.cell execution_count=15}\n``` {.python .cell-code}\nfloat_arr = arr1.astype(np.float64)\nfloat_arr.dtype\n```\n\n::: {.cell-output .cell-output-display execution_count=15}\n```\ndtype('float64')\n```\n:::\n:::\n\n\n## Cast dtype using another array\n\n::: {.cell execution_count=16}\n``` {.python .cell-code}\nint_array = arr1.astype(arr2.dtype)\nint_array.dtype\n```\n\n::: {.cell-output .cell-output-display execution_count=16}\n```\ndtype('int32')\n```\n:::\n:::\n\n\n:::\n\n::: {.column-margin}\nCalling `astype` always creates a new array (a copy of the data), even if the new data type is the same as the old data type.\n:::\n\n[Wes provides a table of supported data types in the book.](https://wesmckinney.com/book/numpy-basics.html#tbl-table_array_dtypes)\n\n## Arithmetic with NumPy Arrays\n\n::: {.panel-tabset}\n## Vectorization\n\nBatch operations on data without `for` loops\n\n::: {.cell execution_count=17}\n``` {.python .cell-code}\narr = np.array([[1., 2., 3.], [4., 5., 6.]])\narr * arr\n```\n\n::: {.cell-output .cell-output-display execution_count=17}\n```\narray([[ 1., 4., 9.],\n [16., 25., 36.]])\n```\n:::\n:::\n\n\n## Arithmetic operations with scalars \n\nPropagate the scalar argument to each element in the array\n\n::: {.cell execution_count=18}\n``` {.python .cell-code}\n1 / arr\n```\n\n::: {.cell-output .cell-output-display execution_count=18}\n```\narray([[1. , 0.5 , 0.33333333],\n [0.25 , 0.2 , 0.16666667]])\n```\n:::\n:::\n\n\n## Comparisons between arrays\n\nof the same size yield boolean arrays\n\n::: {.cell execution_count=19}\n``` {.python .cell-code}\narr2 = np.array([[0., 4., 1.], [7., 2., 12.]])\n\narr2 > arr\n```\n\n::: {.cell-output .cell-output-display execution_count=19}\n```\narray([[False, True, False],\n [ True, False, True]])\n```\n:::\n:::\n\n\n:::\n\n## Basic Indexing and Slicing\n\n* select a subset of your data or individual elements\n\n::: {.cell execution_count=20}\n``` {.python .cell-code}\narr = np.arange(10)\narr\n```\n\n::: {.cell-output .cell-output-display execution_count=20}\n```\narray([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])\n```\n:::\n:::\n\n\n::: {.column-margin}\nArray views are on the original data. Data is not copied, and any modifications to the view will be reflected in the source array. 
If you want a copy of a slice of an ndarray instead of a view, you will need to explicitly copy the array—for example, `arr[5:8].copy()`.\n:::\n\n::: {.panel-tabset}\n\n## select the sixth element\n\n::: {.cell execution_count=21}\n``` {.python .cell-code}\narr[5]\n```\n\n::: {.cell-output .cell-output-display execution_count=21}\n```\n5\n```\n:::\n:::\n\n\n## select sixth through eighth\n\n::: {.cell execution_count=22}\n``` {.python .cell-code}\narr[5:8]\n```\n\n::: {.cell-output .cell-output-display execution_count=22}\n```\narray([5, 6, 7])\n```\n:::\n:::\n\n\n## broadcast data\n\n::: {.cell execution_count=23}\n``` {.python .cell-code}\narr[5:8] = 12\n```\n:::\n\n\n:::\n\nExample of \"not copied data\"\n\n**Original**\n\n::: {.cell execution_count=24}\n``` {.python .cell-code}\narr_slice = arr[5:8]\narr\n```\n\n::: {.cell-output .cell-output-display execution_count=24}\n```\narray([ 0, 1, 2, 3, 4, 12, 12, 12, 8, 9])\n```\n:::\n:::\n\n\n**Change values in new array**\n\nNotice that arr is now changed.\n\n::: {.cell execution_count=25}\n``` {.python .cell-code}\narr_slice[1] = 123\narr\n```\n\n::: {.cell-output .cell-output-display execution_count=25}\n```\narray([ 0, 1, 2, 3, 4, 12, 123, 12, 8, 9])\n```\n:::\n:::\n\n\n**Change all values in an array**\n\nThis is done with bare slice `[:]`:\n\n::: {.cell execution_count=26}\n``` {.python .cell-code}\narr_slice[:] = 64\narr_slice\n```\n\n::: {.cell-output .cell-output-display execution_count=26}\n```\narray([64, 64, 64])\n```\n:::\n:::\n\n\nHigher dimensional arrays have 1D arrays at each index:\n\n::: {.cell execution_count=27}\n``` {.python .cell-code}\narr2d = np.array([[1,2,3], [4,5,6], [7,8,9]])\narr2d\n```\n\n::: {.cell-output .cell-output-display execution_count=27}\n```\narray([[1, 2, 3],\n [4, 5, 6],\n [7, 8, 9]])\n```\n:::\n:::\n\n\nTo slice, can pass a comma-separated list to select individual elements:\n\n::: {.cell execution_count=28}\n``` {.python .cell-code}\narr2d[0][2]\n```\n\n::: {.cell-output .cell-output-display execution_count=28}\n```\n3\n```\n:::\n:::\n\n\n![](https://media.geeksforgeeks.org/wp-content/uploads/Numpy1.jpg)\n\nOmitting indicies will reduce number of dimensions:\n\n::: {.cell execution_count=29}\n``` {.python .cell-code}\narr2d[0]\n```\n\n::: {.cell-output .cell-output-display execution_count=29}\n```\narray([1, 2, 3])\n```\n:::\n:::\n\n\nCan assign scalar values or arrays:\n\n::: {.cell execution_count=30}\n``` {.python .cell-code}\narr2d[0] = 9\narr2d\n```\n\n::: {.cell-output .cell-output-display execution_count=30}\n```\narray([[9, 9, 9],\n [4, 5, 6],\n [7, 8, 9]])\n```\n:::\n:::\n\n\nOr create an array of the indices. 
This is like indexing in two steps:\n\n::: {.cell execution_count=31}\n``` {.python .cell-code}\narr2d = np.array([[1,2,3], [4,5,6], [7,8,9]])\narr2d[1,0]\n```\n\n::: {.cell-output .cell-output-display execution_count=31}\n```\n4\n```\n:::\n:::\n\n\n### Indexing with slices\n\nndarrays can be sliced with the same syntax as Python lists:\n\n::: {.cell execution_count=32}\n``` {.python .cell-code}\narr = np.arange(10)\n\narr[1:6]\n```\n\n::: {.cell-output .cell-output-display execution_count=32}\n```\narray([1, 2, 3, 4, 5])\n```\n:::\n:::\n\n\nThis slices a range of elements (\"select the first row of `arr2d`\"):\n\n::: {.cell execution_count=33}\n``` {.python .cell-code}\n# arr2d[row, column]\narr2d[:1]\n```\n\n::: {.cell-output .cell-output-display execution_count=33}\n```\narray([[1, 2, 3]])\n```\n:::\n:::\n\n\nCan pass multiple indicies:\n\n::: {.cell execution_count=34}\n``` {.python .cell-code}\narr2d[:3, :1] # colons keep the dimensions\n# arr2d[0:3, 0] # does not keep the dimensions\n```\n\n::: {.cell-output .cell-output-display execution_count=34}\n```\narray([[1],\n [4],\n [7]])\n```\n:::\n:::\n\n\n## Boolean Indexing\n\n::: {.cell execution_count=35}\n``` {.python .cell-code}\nnames = np.array([\"Bob\", \"Joe\", \"Will\", \"Bob\", \"Will\", \"Joe\", \"Joe\"])\nnames\n```\n\n::: {.cell-output .cell-output-display execution_count=35}\n```\narray(['Bob', 'Joe', 'Will', 'Bob', 'Will', 'Joe', 'Joe'], dtype=' 0, 2, -2)\n```\n\n::: {.cell-output .cell-output-display execution_count=69}\n```\narray([[-2, -2, 2, 2],\n [ 2, 2, -2, -2],\n [-2, 2, 2, -2],\n [-2, 2, 2, -2]])\n```\n:::\n:::\n\n\n::: {.cell execution_count=70}\n``` {.python .cell-code}\n# set only positive to 2\nnp.where(arr > 0,2,arr)\n```\n\n::: {.cell-output .cell-output-display execution_count=70}\n```\narray([[-1.34360107, -0.08168759, 2. , 2. ],\n [ 2. , 2. , -0.95898831, -1.20938829],\n [-1.41229201, 2. , 2. , -0.65876032],\n [-1.22867499, 2. , 2. 
, -0.13081169]])\n```\n:::\n:::\n\n\n## Mathematical and Statistical Methods\n\nUse \"aggregations\" like `sum`, `mean`, and `std`\n\n* If using NumPy, must pass the array you want to aggregate as the first argument\n\n::: {.cell execution_count=71}\n``` {.python .cell-code}\narr = rng.standard_normal((5, 4))\n\narr.mean()\n```\n\n::: {.cell-output .cell-output-display execution_count=71}\n```\n0.06622379901441691\n```\n:::\n:::\n\n\n::: {.cell execution_count=72}\n``` {.python .cell-code}\nnp.mean(arr)\n```\n\n::: {.cell-output .cell-output-display execution_count=72}\n```\n0.06622379901441691\n```\n:::\n:::\n\n\nCan use `axis` to specify which axis to compute the statistic over\n\n:::{.panel-tabset}\n\n## \"compute across the columns\"\n\n::: {.cell execution_count=73}\n``` {.python .cell-code}\narr.mean(axis=1)\n```\n\n::: {.cell-output .cell-output-display execution_count=73}\n```\narray([ 0.00066383, 0.40377331, 0.44452789, -0.36983452, -0.14801151])\n```\n:::\n:::\n\n\n## \"compute across the rows\"\n\n::: {.cell execution_count=74}\n``` {.python .cell-code}\narr.mean(axis=0)\n```\n\n::: {.cell-output .cell-output-display execution_count=74}\n```\narray([ 0.54494867, -0.10500845, 0.15080113, -0.32584615])\n```\n:::\n:::\n\n\n:::\n\nOther methods like cumsum and cumprod do not aggregate, instead producing an array of the intermediate results:\n\n::: {.cell execution_count=75}\n``` {.python .cell-code}\narr.cumsum()\n```\n\n::: {.cell-output .cell-output-display execution_count=75}\n```\narray([1.26998312e+00, 1.17702066e+00, 1.11086977e+00, 2.65530664e-03,\n 1.38612157e-01, 1.48568992e+00, 1.54683394e+00, 1.61774854e+00,\n 2.05140308e+00, 2.32888674e+00, 2.85913913e+00, 3.39586010e+00,\n 4.01421011e+00, 3.21919265e+00, 3.51922360e+00, 1.91652201e+00,\n 2.18332084e+00, 9.21697056e-01, 8.50426250e-01, 1.32447598e+00])\n```\n:::\n:::\n\n\nIn multidimensional arrays, accumulation functions like cumsum compute along the indicated axis:\n\n:::{.panel-tabset}\n\n## \"compute across the columns\"\n\n::: {.cell execution_count=76}\n``` {.python .cell-code}\narr.cumsum(axis=1)\n```\n\n::: {.cell-output .cell-output-display execution_count=76}\n```\narray([[ 1.26998312, 1.17702066, 1.11086977, 0.00265531],\n [ 0.13595685, 1.48303461, 1.54417864, 1.61509324],\n [ 0.43365454, 0.7111382 , 1.24139058, 1.77811155],\n [ 0.61835001, -0.17666744, 0.1233635 , -1.47933809],\n [ 0.26679883, -0.99482495, -1.06609576, -0.59204603]])\n```\n:::\n:::\n\n\n## \"compute across the rows\"\n\n::: {.cell execution_count=77}\n``` {.python .cell-code}\narr.cumsum(axis=0)\n```\n\n::: {.cell-output .cell-output-display execution_count=77}\n```\narray([[ 1.26998312, -0.09296246, -0.06615089, -1.10821447],\n [ 1.40593997, 1.25411531, -0.00500687, -1.03729987],\n [ 1.83959451, 1.53159897, 0.52524552, -0.5005789 ],\n [ 2.45794452, 0.73658151, 0.82527646, -2.10328049],\n [ 2.72474335, -0.52504227, 0.75400566, -1.62923076]])\n```\n:::\n:::\n\n\n:::\n\n## Methods for Boolean Arrays\n\nBoolean values are coerced to 1 (`True`) and 0 (`False`) in the preceding methods. 
Thus, sum is often used as a means of counting True values in a boolean array:\n\n::: {.cell execution_count=78}\n``` {.python .cell-code}\n(arr > 0).sum() # Number of positive values\n```\n\n::: {.cell-output .cell-output-display execution_count=78}\n```\n13\n```\n:::\n:::\n\n\n`any` tests whether one or more values in an array is True, while `all` checks if every value is True:\n\n::: {.cell execution_count=79}\n``` {.python .cell-code}\nbools = np.array([False, False, True, False])\nbools.any()\n```\n\n::: {.cell-output .cell-output-display execution_count=79}\n```\nTrue\n```\n:::\n:::\n\n\n## Sorting\n\nNumPy arrays can be sorted in place with the `sort` method:\n\n::: {.cell execution_count=80}\n``` {.python .cell-code}\narr = rng.standard_normal(6)\narr.sort()\narr\n```\n\n::: {.cell-output .cell-output-display execution_count=80}\n```\narray([-1.64041784, -1.15452958, -0.85725882, -0.41485376, 0.0977165 ,\n 0.68828179])\n```\n:::\n:::\n\n\nCan sort multidimensional section by providing an axis:\n\n::: {.cell execution_count=81}\n``` {.python .cell-code}\narr = rng.standard_normal((5, 3))\n```\n:::\n\n\n:::{.panel-tabset}\n## \"compute across the columns\"\n\n::: {.cell execution_count=82}\n``` {.python .cell-code}\narr.cumsum(axis=1)\n```\n\n::: {.cell-output .cell-output-display execution_count=82}\n```\narray([[ 0.65045239, -0.73790756, -1.64529002],\n [-1.09542531, -1.08827961, -0.55391971],\n [-1.06580785, -1.24728059, 0.37467121],\n [-0.31739195, -1.13320691, -0.7466279 ],\n [-0.22363893, -0.92532973, -2.72104291]])\n```\n:::\n:::\n\n\n## \"compute across the rows\"\n\n::: {.cell execution_count=83}\n``` {.python .cell-code}\narr.cumsum(axis=0)\n```\n\n::: {.cell-output .cell-output-display execution_count=83}\n```\narray([[ 0.65045239, -1.38835995, -0.90738246],\n [-0.44497292, -1.38121426, -0.37302255],\n [-1.51078076, -1.562687 , 1.24892924],\n [-1.82817271, -2.37850196, 1.63550826],\n [-2.05181164, -3.08019277, -0.16020491]])\n```\n:::\n:::\n\n\n:::\n\nThe top-level method `numpy.sort` returns a sorted copy of an array (like the Python built-in function `sorted`) instead of modifying the array in place:\n\n::: {.cell execution_count=84}\n``` {.python .cell-code}\narr2 = np.array([5, -10, 7, 1, 0, -3])\nsorted_arr2 = np.sort(arr2)\nsorted_arr2\n```\n\n::: {.cell-output .cell-output-display execution_count=84}\n```\narray([-10, -3, 0, 1, 5, 7])\n```\n:::\n:::\n\n\n## Unique and Other Set Logic\n\n`numpy.unique` returns the sorted unique values in an array:\n\n::: {.cell execution_count=85}\n``` {.python .cell-code}\nnp.unique(names)\n```\n\n::: {.cell-output .cell-output-display execution_count=85}\n```\narray(['Bob', 'Joe', 'Will'], dtype='()\n----> 1 tup[2] = False\n\nTypeError: 'tuple' object does not support item assignment\nTypeError: 'tuple' object does not support item assignment\n````\n\nIf an object inside a tuple is mutable, such as a list, you can modify it in place\n\n::: {.cell execution_count=9}\n``` {.python .cell-code}\ntup[1].append(3)\n\ntup\n```\n\n::: {.cell-output .cell-output-display execution_count=9}\n```\n('foo', [1, 2, 3], True)\n```\n:::\n:::\n\n\nYou can concatenate tuples using the + operator to produce longer tuples:\n\n::: {.cell execution_count=10}\n``` {.python .cell-code}\n(4, None, 'foo') + (6, 0) + ('bar',)\n```\n\n::: {.cell-output .cell-output-display execution_count=10}\n```\n(4, None, 'foo', 6, 0, 'bar')\n```\n:::\n:::\n\n\n### Unpacking tuples\n\nIf you try to assign to a tuple-like expression of variables, Python will attempt to 
unpack the value on the righthand side of the equals sign:\n\n::: {.cell execution_count=11}\n``` {.python .cell-code}\ntup = (4, 5, 6)\ntup\n```\n\n::: {.cell-output .cell-output-display execution_count=11}\n```\n(4, 5, 6)\n```\n:::\n:::\n\n\n::: {.cell execution_count=12}\n``` {.python .cell-code}\na, b, c = tup\n\nc\n```\n\n::: {.cell-output .cell-output-display execution_count=12}\n```\n6\n```\n:::\n:::\n\n\nEven sequences with nested tuples can be unpacked:\n\n::: {.cell execution_count=13}\n``` {.python .cell-code}\ntup = 4, 5, (6,7)\n\na, b, (c, d) = tup\n\nd\n```\n\n::: {.cell-output .cell-output-display execution_count=13}\n```\n7\n```\n:::\n:::\n\n\nTo easily swap variable names\n\n::: {.cell execution_count=14}\n``` {.python .cell-code}\na, b = 1, 4\n\na\n```\n\n::: {.cell-output .cell-output-display execution_count=14}\n```\n1\n```\n:::\n:::\n\n\n::: {.cell execution_count=15}\n``` {.python .cell-code}\nb\n```\n\n::: {.cell-output .cell-output-display execution_count=15}\n```\n4\n```\n:::\n:::\n\n\n::: {.cell execution_count=16}\n``` {.python .cell-code}\nb, a = a, b\n\na\n```\n\n::: {.cell-output .cell-output-display execution_count=16}\n```\n4\n```\n:::\n:::\n\n\n::: {.cell execution_count=17}\n``` {.python .cell-code}\nb\n```\n\n::: {.cell-output .cell-output-display execution_count=17}\n```\n1\n```\n:::\n:::\n\n\nA common use of variable unpacking is iterating over sequences of tuples or lists\n\n::: {.cell execution_count=18}\n``` {.python .cell-code}\nseq = [(1, 2, 3), (4, 5, 6), (7, 8, 9)]\n\nseq\n```\n\n::: {.cell-output .cell-output-display execution_count=18}\n```\n[(1, 2, 3), (4, 5, 6), (7, 8, 9)]\n```\n:::\n:::\n\n\n::: {.cell execution_count=19}\n``` {.python .cell-code}\nfor a, b, c in seq:\n print(f'a={a}, b={b}, c={c}')\n```\n\n::: {.cell-output .cell-output-stdout}\n```\na=1, b=2, c=3\na=4, b=5, c=6\na=7, b=8, c=9\n```\n:::\n:::\n\n\n`*rest` syntax for plucking elements\n\n::: {.cell execution_count=20}\n``` {.python .cell-code}\nvalues = 1,2,3,4,5\n\na, b, *rest = values\n\nrest\n```\n\n::: {.cell-output .cell-output-display execution_count=20}\n```\n[3, 4, 5]\n```\n:::\n:::\n\n\n As a matter of convention, many Python programmers will use the underscore (_) for unwanted variables:\n\n::: {.cell execution_count=21}\n``` {.python .cell-code}\na, b, *_ = values\n```\n:::\n\n\n### Tuple methods\n\nSince the size and contents of a tuple cannot be modified, it is very light on instance methods. A particularly useful one (also available on lists) is `count`\n\n::: {.cell execution_count=22}\n``` {.python .cell-code}\na = (1,2,2,2,2,3,4,5,7,8,9)\n\na.count(2)\n```\n\n::: {.cell-output .cell-output-display execution_count=22}\n```\n4\n```\n:::\n:::\n\n\n## List\n\n![](https://pynative.com/wp-content/uploads/2021/03/python-list.jpg)\n\nIn contrast with tuples, lists are variable length and their contents can be modified in place.\n\nLists are mutable. 
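\n\nA tiny sketch of what mutability means in practice (the `nums` and `tup` objects here are made up for illustration):\n\n```python\nnums = [1, 2, 3]\nbefore = id(nums)\nnums.append(4)              # modifies the same list object in place\nprint(id(nums) == before)   # True: still the very same object\n\ntup = (1, 2, 3)\nprint(tup + (4,))           # tuples cannot change; this builds a brand-new tuple\n```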
\n\nLists use `[]` square brackts or the `list` function\n\n::: {.cell execution_count=23}\n``` {.python .cell-code}\na_list = [2, 3, 7, None]\n\ntup = (\"foo\", \"bar\", \"baz\")\n\nb_list = list(tup)\n\nb_list\n```\n\n::: {.cell-output .cell-output-display execution_count=23}\n```\n['foo', 'bar', 'baz']\n```\n:::\n:::\n\n\n::: {.cell execution_count=24}\n``` {.python .cell-code}\nb_list[1] = \"peekaboo\"\n\nb_list\n```\n\n::: {.cell-output .cell-output-display execution_count=24}\n```\n['foo', 'peekaboo', 'baz']\n```\n:::\n:::\n\n\nLists and tuples are semantically similar (though tuples cannot be modified) and can be used interchangeably in many functions.\n\n::: {.cell execution_count=25}\n``` {.python .cell-code}\ngen = range(10)\n\ngen\n```\n\n::: {.cell-output .cell-output-display execution_count=25}\n```\nrange(0, 10)\n```\n:::\n:::\n\n\n::: {.cell execution_count=26}\n``` {.python .cell-code}\nlist(gen)\n```\n\n::: {.cell-output .cell-output-display execution_count=26}\n```\n[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]\n```\n:::\n:::\n\n\n### Adding and removing list elements\n\nthe `append` method\n\n::: {.cell execution_count=27}\n``` {.python .cell-code}\nb_list.append(\"dwarf\")\n\nb_list\n```\n\n::: {.cell-output .cell-output-display execution_count=27}\n```\n['foo', 'peekaboo', 'baz', 'dwarf']\n```\n:::\n:::\n\n\nthe `insert` method\n\n::: {.cell execution_count=28}\n``` {.python .cell-code}\nb_list.insert(1, \"red\")\n\nb_list\n```\n\n::: {.cell-output .cell-output-display execution_count=28}\n```\n['foo', 'red', 'peekaboo', 'baz', 'dwarf']\n```\n:::\n:::\n\n\n`insert` is computationally more expensive than `append`\n\nthe `pop` method, the inverse of `insert`\n\n::: {.cell execution_count=29}\n``` {.python .cell-code}\nb_list.pop(2)\n```\n\n::: {.cell-output .cell-output-display execution_count=29}\n```\n'peekaboo'\n```\n:::\n:::\n\n\n::: {.cell execution_count=30}\n``` {.python .cell-code}\nb_list\n```\n\n::: {.cell-output .cell-output-display execution_count=30}\n```\n['foo', 'red', 'baz', 'dwarf']\n```\n:::\n:::\n\n\nthe `remove` method\n\n::: {.cell execution_count=31}\n``` {.python .cell-code}\nb_list.append(\"foo\")\n\nb_list\n```\n\n::: {.cell-output .cell-output-display execution_count=31}\n```\n['foo', 'red', 'baz', 'dwarf', 'foo']\n```\n:::\n:::\n\n\n::: {.cell execution_count=32}\n``` {.python .cell-code}\nb_list.remove(\"foo\")\n\nb_list\n```\n\n::: {.cell-output .cell-output-display execution_count=32}\n```\n['red', 'baz', 'dwarf', 'foo']\n```\n:::\n:::\n\n\nCheck if a list contains a value using the `in` keyword:\n\n::: {.cell execution_count=33}\n``` {.python .cell-code}\n\"dwarf\" in b_list\n```\n\n::: {.cell-output .cell-output-display execution_count=33}\n```\nTrue\n```\n:::\n:::\n\n\nThe keyword `not` can be used to negate an `in`\n\n::: {.cell execution_count=34}\n``` {.python .cell-code}\n\"dwarf\" not in b_list\n```\n\n::: {.cell-output .cell-output-display execution_count=34}\n```\nFalse\n```\n:::\n:::\n\n\n### Concatenating and combining lists\n\nsimilar with tuples, use `+` to concatenate\n\n::: {.cell execution_count=35}\n``` {.python .cell-code}\n[4, None, \"foo\"] + [7, 8, (2, 3)]\n```\n\n::: {.cell-output .cell-output-display execution_count=35}\n```\n[4, None, 'foo', 7, 8, (2, 3)]\n```\n:::\n:::\n\n\nthe `extend` method\n\n::: {.cell execution_count=36}\n``` {.python .cell-code}\nx = [4, None, \"foo\"]\n\nx.extend([7,8,(2,3)])\n\nx\n```\n\n::: {.cell-output .cell-output-display execution_count=36}\n```\n[4, None, 'foo', 7, 8, (2, 
3)]\n```\n:::\n:::\n\n\nlist concatenation by addition is an expensive operation\n\nusing `extend` is preferable\n\n```{{python}}\neverything = []\nfor chunk in list_of_lists:\n everything.extend(chunk)\n\n```\n\nis generally faster than\n\n```{{python}}\n\neverything = []\nfor chunk in list_of_lists:\n everything = everything + chunk\n\n```\n\n### Sorting\n\nthe `sort` method\n\n::: {.cell execution_count=37}\n``` {.python .cell-code}\na = [7, 2, 5, 1, 3]\n\na.sort()\n\na\n```\n\n::: {.cell-output .cell-output-display execution_count=37}\n```\n[1, 2, 3, 5, 7]\n```\n:::\n:::\n\n\n`sort` options\n\n::: {.cell execution_count=38}\n``` {.python .cell-code}\nb = [\"saw\", \"small\", \"He\", \"foxes\", \"six\"]\n\nb.sort(key = len)\n\nb\n```\n\n::: {.cell-output .cell-output-display execution_count=38}\n```\n['He', 'saw', 'six', 'small', 'foxes']\n```\n:::\n:::\n\n\n### Slicing\n\nSlicing semantics takes a bit of getting used to, especially if you’re coming from R or MATLAB.\n\nusing the indexing operator `[]`\n\n::: {.cell execution_count=39}\n``` {.python .cell-code}\nseq = [7, 2, 3, 7, 5, 6, 0, 1]\n\nseq[3:5]\n```\n\n::: {.cell-output .cell-output-display execution_count=39}\n```\n[7, 5]\n```\n:::\n:::\n\n\nalso assigned with a sequence\n\n::: {.cell execution_count=40}\n``` {.python .cell-code}\nseq[3:5] = [6,3]\n\nseq\n```\n\n::: {.cell-output .cell-output-display execution_count=40}\n```\n[7, 2, 3, 6, 3, 6, 0, 1]\n```\n:::\n:::\n\n\nEither the `start` or `stop` can be omitted\n\n::: {.cell execution_count=41}\n``` {.python .cell-code}\nseq[:5]\n```\n\n::: {.cell-output .cell-output-display execution_count=41}\n```\n[7, 2, 3, 6, 3]\n```\n:::\n:::\n\n\n::: {.cell execution_count=42}\n``` {.python .cell-code}\nseq[3:]\n```\n\n::: {.cell-output .cell-output-display execution_count=42}\n```\n[6, 3, 6, 0, 1]\n```\n:::\n:::\n\n\nNegative indices slice the sequence relative to the end:\n\n::: {.cell execution_count=43}\n``` {.python .cell-code}\nseq[-4:]\n```\n\n::: {.cell-output .cell-output-display execution_count=43}\n```\n[3, 6, 0, 1]\n```\n:::\n:::\n\n\nA step can also be used after a second colon to, say, take every other element:\n\n::: {.cell execution_count=44}\n``` {.python .cell-code}\nseq[::2]\n```\n\n::: {.cell-output .cell-output-display execution_count=44}\n```\n[7, 3, 3, 0]\n```\n:::\n:::\n\n\nA clever use of this is to pass -1, which has the useful effect of reversing a list or tuple:\n\n::: {.cell execution_count=45}\n``` {.python .cell-code}\nseq[::-1]\n```\n\n::: {.cell-output .cell-output-display execution_count=45}\n```\n[1, 0, 6, 3, 6, 3, 2, 7]\n```\n:::\n:::\n\n\n## Dictionary\n\n![](https://pynative.com/wp-content/uploads/2021/02/dictionaries-in-python.jpg)\n\nThe dictionary or dict may be the most important built-in Python data structure. 
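\n\nOne reason why: dictionaries make lookup-and-update patterns cheap. A quick hedged preview before the details below (a hypothetical word count; `get` returns a default when a key is missing):\n\n```python\ncounts = {}\nfor word in [\"apple\", \"bat\", \"apple\"]:\n    counts[word] = counts.get(word, 0) + 1  # 0 is used the first time a word appears\nprint(counts)  # {'apple': 2, 'bat': 1}\n```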
\n\nOne approach for creating a dictionary is to use curly braces {} and colons to separate keys and values:\n\n::: {.cell execution_count=46}\n``` {.python .cell-code}\nempty_dict = {}\n\nd1 = {\"a\": \"some value\", \"b\": [1, 2, 3, 4]}\n\nd1\n```\n\n::: {.cell-output .cell-output-display execution_count=46}\n```\n{'a': 'some value', 'b': [1, 2, 3, 4]}\n```\n:::\n:::\n\n\naccess, insert, or set elements \n\n::: {.cell execution_count=47}\n``` {.python .cell-code}\nd1[7] = \"an integer\"\n\nd1\n```\n\n::: {.cell-output .cell-output-display execution_count=47}\n```\n{'a': 'some value', 'b': [1, 2, 3, 4], 7: 'an integer'}\n```\n:::\n:::\n\n\nand as before\n\n::: {.cell execution_count=48}\n``` {.python .cell-code}\n\"b\" in d1\n```\n\n::: {.cell-output .cell-output-display execution_count=48}\n```\nTrue\n```\n:::\n:::\n\n\nthe `del` and `pop` methods\n\n::: {.cell execution_count=49}\n``` {.python .cell-code}\ndel d1[7]\n\nd1\n```\n\n::: {.cell-output .cell-output-display execution_count=49}\n```\n{'a': 'some value', 'b': [1, 2, 3, 4]}\n```\n:::\n:::\n\n\n::: {.cell execution_count=50}\n``` {.python .cell-code}\nret = d1.pop(\"a\")\n\nret\n```\n\n::: {.cell-output .cell-output-display execution_count=50}\n```\n'some value'\n```\n:::\n:::\n\n\nThe `keys` and `values` methods\n\n::: {.cell execution_count=51}\n``` {.python .cell-code}\nlist(d1.keys())\n```\n\n::: {.cell-output .cell-output-display execution_count=51}\n```\n['b']\n```\n:::\n:::\n\n\n::: {.cell execution_count=52}\n``` {.python .cell-code}\nlist(d1.values())\n```\n\n::: {.cell-output .cell-output-display execution_count=52}\n```\n[[1, 2, 3, 4]]\n```\n:::\n:::\n\n\nthe `items` method\n\n::: {.cell execution_count=53}\n``` {.python .cell-code}\nlist(d1.items())\n```\n\n::: {.cell-output .cell-output-display execution_count=53}\n```\n[('b', [1, 2, 3, 4])]\n```\n:::\n:::\n\n\n the update method to merge one dictionary into another\n\n::: {.cell execution_count=54}\n``` {.python .cell-code}\nd1.update({\"b\": \"foo\", \"c\": 12})\n\nd1\n```\n\n::: {.cell-output .cell-output-display execution_count=54}\n```\n{'b': 'foo', 'c': 12}\n```\n:::\n:::\n\n\n ### Creating dictionaries from sequences\n\n::: {.cell execution_count=55}\n``` {.python .cell-code}\nlist(range(5))\n```\n\n::: {.cell-output .cell-output-display execution_count=55}\n```\n[0, 1, 2, 3, 4]\n```\n:::\n:::\n\n\n::: {.cell execution_count=56}\n``` {.python .cell-code}\ntuples = zip(range(5), reversed(range(5)))\n\ntuples\n\nmapping = dict(tuples)\n\nmapping\n```\n\n::: {.cell-output .cell-output-display execution_count=56}\n```\n{0: 4, 1: 3, 2: 2, 3: 1, 4: 0}\n```\n:::\n:::\n\n\n### Default values\n\nimagine categorizing a list of words by their first letters as a dictionary of lists\n\n::: {.cell execution_count=57}\n``` {.python .cell-code}\nwords = [\"apple\", \"bat\", \"bar\", \"atom\", \"book\"]\n\nby_letter = {}\n\nfor word in words:\n letter = word[0]\n if letter not in by_letter:\n by_letter[letter] = [word]\n else:\n by_letter[letter].append(word)\n\nby_letter\n```\n\n::: {.cell-output .cell-output-display execution_count=57}\n```\n{'a': ['apple', 'atom'], 'b': ['bat', 'bar', 'book']}\n```\n:::\n:::\n\n\nThe `setdefault` dictionary method can be used to simplify this workflow. 
The preceding for loop can be rewritten as:\n\n::: {.cell execution_count=58}\n``` {.python .cell-code}\nby_letter = {}\n\nfor word in words:\n letter = word[0]\n by_letter.setdefault(letter, []).append(word)\n\nby_letter\n```\n\n::: {.cell-output .cell-output-display execution_count=58}\n```\n{'a': ['apple', 'atom'], 'b': ['bat', 'bar', 'book']}\n```\n:::\n:::\n\n\nThe built-in `collections`module has a useful class, `defaultdict`, which makes this even easier.\n\n::: {.cell execution_count=59}\n``` {.python .cell-code}\nfrom collections import defaultdict\n\nby_letter = defaultdict(list)\n\nfor word in words:\n by_letter[word[0]].append(word)\n\nby_letter\n```\n\n::: {.cell-output .cell-output-display execution_count=59}\n```\ndefaultdict(list, {'a': ['apple', 'atom'], 'b': ['bat', 'bar', 'book']})\n```\n:::\n:::\n\n\n### Valid dictionary key types\n\nkeys generally have to be immutable objects like scalars or tuples for *hashability*\n\nTo use a list as a key, one option is to convert it to a tuple, which can be hashed as long as its elements also can be:\n\n::: {.cell execution_count=60}\n``` {.python .cell-code}\nd = {}\n\nd[tuple([1,2,3])] = 5\n\nd\n```\n\n::: {.cell-output .cell-output-display execution_count=60}\n```\n{(1, 2, 3): 5}\n```\n:::\n:::\n\n\n## Set\n\n![](https://pynative.com/wp-content/uploads/2021/03/python-sets.jpg)\n\ncan be created in two ways: via the `set` function or via a `set literal` with curly braces:\n\n::: {.cell execution_count=61}\n``` {.python .cell-code}\nset([2, 2, 2, 1, 3, 3])\n\n{2,2,1,3,3}\n```\n\n::: {.cell-output .cell-output-display execution_count=61}\n```\n{1, 2, 3}\n```\n:::\n:::\n\n\nSets support mathematical set operations like union, intersection, difference, and symmetric difference.\n\nThe `union` of these two sets:\n\n::: {.cell execution_count=62}\n``` {.python .cell-code}\na = {1, 2, 3, 4, 5}\n\nb = {3, 4, 5, 6, 7, 8}\n\na.union(b)\n\na | b\n```\n\n::: {.cell-output .cell-output-display execution_count=62}\n```\n{1, 2, 3, 4, 5, 6, 7, 8}\n```\n:::\n:::\n\n\nThe `&`operator or the `intersection` method\n\n::: {.cell execution_count=63}\n``` {.python .cell-code}\na.intersection(b)\n\na & b\n```\n\n::: {.cell-output .cell-output-display execution_count=63}\n```\n{3, 4, 5}\n```\n:::\n:::\n\n\n[A table of commonly used `set` methods](https://wesmckinney.com/book/python-builtin.html#tbl-table_set_operations)\n\nAll of the logical set operations have in-place counterparts, which enable you to replace the contents of the set on the left side of the operation with the result. 
For very large sets, this may be more efficient\n\n::: {.cell execution_count=64}\n``` {.python .cell-code}\nc = a.copy()\n\nc |= b\n\nc\n```\n\n::: {.cell-output .cell-output-display execution_count=64}\n```\n{1, 2, 3, 4, 5, 6, 7, 8}\n```\n:::\n:::\n\n\n::: {.cell execution_count=65}\n``` {.python .cell-code}\nd = a.copy()\n\nd &= b\n\nd\n```\n\n::: {.cell-output .cell-output-display execution_count=65}\n```\n{3, 4, 5}\n```\n:::\n:::\n\n\nset elements generally must be immutable, and they must be hashable\n\nyou can convert them to tuples\n\nYou can also check if a set is a subset of (is contained in) or a superset of (contains all elements of) another set\n\n::: {.cell execution_count=66}\n``` {.python .cell-code}\na_set = {1, 2, 3, 4, 5}\n\n{1, 2, 3}.issubset(a_set)\n```\n\n::: {.cell-output .cell-output-display execution_count=66}\n```\nTrue\n```\n:::\n:::\n\n\n::: {.cell execution_count=67}\n``` {.python .cell-code}\na_set.issuperset({1, 2, 3})\n```\n\n::: {.cell-output .cell-output-display execution_count=67}\n```\nTrue\n```\n:::\n:::\n\n\n## Built-In Sequence Functions\n\n### enumerate\n\n`enumerate` returns a sequence of (i, value) tuples\n\n### sorted\n\n`sorted` returns a new sorted list \n\n::: {.cell execution_count=68}\n``` {.python .cell-code}\nsorted([7,1,2,9,3,6,5,0,22])\n```\n\n::: {.cell-output .cell-output-display execution_count=68}\n```\n[0, 1, 2, 3, 5, 6, 7, 9, 22]\n```\n:::\n:::\n\n\n### zip\n\n`zip` “pairs” up the elements of a number of lists, tuples, or other sequences to create a list of tuples\n\n::: {.cell execution_count=69}\n``` {.python .cell-code}\nseq1 = [\"foo\", \"bar\", \"baz\"]\n\nseq2 = [\"one\", \"two\", \"three\"]\n\nzipped = zip(seq1, seq2)\n\nlist(zipped)\n```\n\n::: {.cell-output .cell-output-display execution_count=69}\n```\n[('foo', 'one'), ('bar', 'two'), ('baz', 'three')]\n```\n:::\n:::\n\n\n`zip` can take an arbitrary number of sequences, and the number of elements it produces is determined by the shortest sequence\n\n::: {.cell execution_count=70}\n``` {.python .cell-code}\nseq3 = [False, True]\n\nlist(zip(seq1, seq2, seq3))\n```\n\n::: {.cell-output .cell-output-display execution_count=70}\n```\n[('foo', 'one', False), ('bar', 'two', True)]\n```\n:::\n:::\n\n\nA common use of `zip` is simultaneously iterating over multiple sequences, possibly also combined with `enumerate`\n\n::: {.cell execution_count=71}\n``` {.python .cell-code}\nfor index, (a, b) in enumerate(zip(seq1, seq2)):\n print(f\"{index}: {a}, {b}\")\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n0: foo, one\n1: bar, two\n2: baz, three\n```\n:::\n:::\n\n\n`reversed` iterates over the elements of a sequence in reverse order\n\n::: {.cell execution_count=72}\n``` {.python .cell-code}\nlist(reversed(range(10)))\n```\n\n::: {.cell-output .cell-output-display execution_count=72}\n```\n[9, 8, 7, 6, 5, 4, 3, 2, 1, 0]\n```\n:::\n:::\n\n\n## List, Set, and Dictionary Comprehensions\n\n```\n[expr for value in collection if condition]\n```\n\nFor example, given a list of strings, we could filter out strings with length 2 or less and convert them to uppercase like this\n\n::: {.cell execution_count=73}\n``` {.python .cell-code}\nstrings = [\"a\", \"as\", \"bat\", \"car\", \"dove\", \"python\"]\n\n[x.upper() for x in strings if len(x) > 2]\n```\n\n::: {.cell-output .cell-output-display execution_count=73}\n```\n['BAT', 'CAR', 'DOVE', 'PYTHON']\n```\n:::\n:::\n\n\nA dictionary comprehension looks like this\n\n```\ndict_comp = {key-expr: value-expr for value in collection\n if 
condition}\n```\n\nSuppose we wanted a set containing just the lengths of the strings contained in the collection\n\n::: {.cell execution_count=74}\n``` {.python .cell-code}\nunique_lengths = {len(x) for x in strings}\n\nunique_lengths\n```\n\n::: {.cell-output .cell-output-display execution_count=74}\n```\n{1, 2, 3, 4, 6}\n```\n:::\n:::\n\n\nwe could create a lookup map of these strings for their locations in the list\n\n::: {.cell execution_count=75}\n``` {.python .cell-code}\nloc_mapping = {value: index for index, value in enumerate(strings)}\n\nloc_mapping\n```\n\n::: {.cell-output .cell-output-display execution_count=75}\n```\n{'a': 0, 'as': 1, 'bat': 2, 'car': 3, 'dove': 4, 'python': 5}\n```\n:::\n:::\n\n\n## Nested list comprehensions\n\nSuppose we have a list of lists containing some English and Spanish names. We want to get a single list containing all names with two or more a’s in them\n\n::: {.cell execution_count=76}\n``` {.python .cell-code}\nall_data = [[\"John\", \"Emily\", \"Michael\", \"Mary\", \"Steven\"],\n [\"Maria\", \"Juan\", \"Javier\", \"Natalia\", \"Pilar\"]]\n\nresult = [name for names in all_data for name in names\n if name.count(\"a\") >= 2]\n\nresult\n```\n\n::: {.cell-output .cell-output-display execution_count=76}\n```\n['Maria', 'Natalia']\n```\n:::\n:::\n\n\nHere is another example where we “flatten” a list of tuples of integers into a simple list of integers\n\n::: {.cell execution_count=77}\n``` {.python .cell-code}\nsome_tuples = [(1, 2, 3), (4, 5, 6), (7, 8, 9)]\n\nflattened = [x for tup in some_tuples for x in tup]\n\nflattened\n```\n\n::: {.cell-output .cell-output-display execution_count=77}\n```\n[1, 2, 3, 4, 5, 6, 7, 8, 9]\n```\n:::\n:::\n\n\n# Functions\n\n![](https://miro.medium.com/max/1200/1*ZegxhR33NdeVRpBPYXnYYQ.gif)\n\n`Functions` are the primary and most important method of code organization and reuse in Python.\n\nthey use the `def` keyword\n\nEach function can have positional arguments and keyword arguments. Keyword arguments are most commonly used to specify default values or optional arguments. Here we will define a function with an optional z argument with the default value 1.5\n\n::: {.cell execution_count=78}\n``` {.python .cell-code}\ndef my_function(x, y, z=1.5):\n return (x + y) * z \n\nmy_function(4,25)\n```\n\n::: {.cell-output .cell-output-display execution_count=78}\n```\n43.5\n```\n:::\n:::\n\n\nThe main restriction on function arguments is that the keyword arguments must follow the positional arguments\n\n## Namespaces, Scope, and Local Functions\n\nA more descriptive name describing a variable scope in Python is a namespace.\n\nConsider the following function\n\n::: {.cell execution_count=79}\n``` {.python .cell-code}\na = []\n\ndef func():\n for i in range(5):\n a.append(i)\n```\n:::\n\n\nHere `a` is defined in the enclosing (module) scope rather than inside `func`, so each call to `func()` appends five more elements to the same list, and the values accumulate across calls. 
\n\n::: {.cell execution_count=80}\n``` {.python .cell-code}\nfunc()\n\nfunc()\n\na\n```\n\n::: {.cell-output .cell-output-display execution_count=80}\n```\n[0, 1, 2, 3, 4, 0, 1, 2, 3, 4]\n```\n:::\n:::\n\n\n## Returning Multiple Values\n\nWhat’s happening here is that the function is actually just returning one object, a tuple, which is then being unpacked into the result variables.\n\n::: {.cell execution_count=81}\n``` {.python .cell-code}\ndef f():\n a = 5\n b = 6\n c = 7\n return a, b, c\n\na, b, c = f()\n\na\n```\n\n::: {.cell-output .cell-output-display execution_count=81}\n```\n5\n```\n:::\n:::\n\n\n## Functions are Objects\n\n Suppose we were doing some data cleaning and needed to apply a bunch of transformations to the following list of strings:\n\n::: {.cell execution_count=82}\n``` {.python .cell-code}\nstates = [\" Alabama \", \"Georgia!\", \"Georgia\", \"georgia\", \"FlOrIda\",\n \"south carolina##\", \"West virginia?\"]\n\nimport re\n\ndef clean_strings(strings):\n result = []\n for value in strings:\n value = value.strip()\n value = re.sub(\"[!#?]\", \"\", value)\n value = value.title()\n result.append(value)\n return result\n\nclean_strings(states)\n```\n\n::: {.cell-output .cell-output-display execution_count=82}\n```\n['Alabama',\n 'Georgia',\n 'Georgia',\n 'Georgia',\n 'Florida',\n 'South Carolina',\n 'West Virginia']\n```\n:::\n:::\n\n\nAnother approach\n\n::: {.cell execution_count=83}\n``` {.python .cell-code}\ndef remove_punctuation(value):\n return re.sub(\"[!#?]\", \"\", value)\n\nclean_ops = [str.strip, remove_punctuation, str.title]\n\ndef clean_strings(strings, ops):\n result = []\n for value in strings:\n for func in ops:\n value = func(value)\n result.append(value)\n return result\n\nclean_strings(states, clean_ops)\n```\n\n::: {.cell-output .cell-output-display execution_count=83}\n```\n['Alabama',\n 'Georgia',\n 'Georgia',\n 'Georgia',\n 'Florida',\n 'South Carolina',\n 'West Virginia']\n```\n:::\n:::\n\n\nYou can use functions as arguments to other functions like the built-in `map` function\n\n::: {.cell execution_count=84}\n``` {.python .cell-code}\nfor x in map(remove_punctuation, states):\n print(x)\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n Alabama \nGeorgia\nGeorgia\ngeorgia\nFlOrIda\nsouth carolina\nWest virginia\n```\n:::\n:::\n\n\n## Anonymous Lambda Functions\n\n a way of writing functions consisting of a single statement\n\nsuppose you wanted to sort a collection of strings by the number of distinct letters in each string\n\n::: {.cell execution_count=85}\n``` {.python .cell-code}\nstrings = [\"foo\", \"card\", \"bar\", \"aaaaaaa\", \"ababdo\"]\n\nstrings.sort(key=lambda x: len(set(x)))\n\nstrings\n```\n\n::: {.cell-output .cell-output-display execution_count=85}\n```\n['aaaaaaa', 'foo', 'bar', 'card', 'ababdo']\n```\n:::\n:::\n\n\n# Generators\n\nMany objects in Python support iteration, such as over objects in a list or lines in a file. \n\n::: {.cell execution_count=86}\n``` {.python .cell-code}\nsome_dict = {\"a\": 1, \"b\": 2, \"c\": 3}\n\nfor key in some_dict:\n print(key)\n```\n\n::: {.cell-output .cell-output-stdout}\n```\na\nb\nc\n```\n:::\n:::\n\n\nMost methods expecting a list or list-like object will also accept any iterable object. This includes built-in methods such as `min`, `max`, and `sum`, and type constructors like `list` and `tuple`\n\nA `generator` is a convenient way, similar to writing a normal function, to construct a new iterable object. 
Whereas normal functions execute and return a single result at a time, generators can return a sequence of multiple values by pausing and resuming execution each time the generator is used. To create a generator, use the yield keyword instead of return in a function\n\n::: {.cell execution_count=87}\n``` {.python .cell-code}\ndef squares(n=10):\n print(f\"Generating squares from 1 to {n ** 2}\")\n for i in range(1, n + 1):\n yield i ** 2\n\ngen = squares()\n\nfor x in gen:\n print(x, end=\" \")\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nGenerating squares from 1 to 100\n1 4 9 16 25 36 49 64 81 100 \n```\n:::\n:::\n\n\n> Since generators produce output one element at a time versus an entire list all at once, this can help your program use less memory.\n\n## Generator expressions\n\n This is a generator analogue to list, dictionary, and set comprehensions. To create one, enclose what would otherwise be a list comprehension within parentheses instead of brackets:\n\n::: {.cell execution_count=88}\n``` {.python .cell-code}\ngen = (x ** 2 for x in range(100))\n\ngen\n```\n\n::: {.cell-output .cell-output-display execution_count=88}\n```\n<generator object <genexpr> at 0x7fa65d4cdba0>\n```\n:::\n:::\n\n\nGenerator expressions can be used instead of list comprehensions as function arguments in some cases:\n\n::: {.cell execution_count=89}\n``` {.python .cell-code}\nsum(x ** 2 for x in range(100))\n```\n\n::: {.cell-output .cell-output-display execution_count=89}\n```\n328350\n```\n:::\n:::\n\n\n::: {.cell execution_count=90}\n``` {.python .cell-code}\ndict((i, i ** 2) for i in range(5))\n```\n\n::: {.cell-output .cell-output-display execution_count=90}\n```\n{0: 0, 1: 1, 2: 4, 3: 9, 4: 16}\n```\n:::\n:::\n\n\n## itertools module\n\nThe `itertools` module has a collection of generators for many common data algorithms.\n\n`groupby` takes any sequence and a function, grouping consecutive elements in the sequence by return value of the function\n\n::: {.cell execution_count=91}\n``` {.python .cell-code}\nimport itertools\n\ndef first_letter(x):\n return x[0]\n\nnames = [\"Alan\", \"Adam\", \"Jackie\", \"Lily\", \"Katie\", \"Molly\"]\n\nfor letter, names in itertools.groupby(names, first_letter):\n print(letter, list(names))\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nA ['Alan', 'Adam']\nJ ['Jackie']\nL ['Lily']\nK ['Katie']\nM ['Molly']\n```\n:::\n:::\n\n\n[Table of other itertools functions](https://wesmckinney.com/book/python-builtin.html#tbl-table_itertools)\n\n# Errors and Exception Handling\n\nHandling errors or exceptions gracefully is an important part of building robust programs\n\n::: {.cell execution_count=92}\n``` {.python .cell-code}\ndef attempt_float(x):\n try:\n return float(x)\n except:\n return x\n\nattempt_float(\"1.2345\")\n```\n\n::: {.cell-output .cell-output-display execution_count=92}\n```\n1.2345\n```\n:::\n:::\n\n\n::: {.cell execution_count=93}\n``` {.python .cell-code}\nattempt_float(\"something\")\n```\n\n::: {.cell-output .cell-output-display execution_count=93}\n```\n'something'\n```\n:::\n:::\n\n\nYou might want to suppress only ValueError, since a TypeError (the input was not a string or numeric value) might indicate a legitimate bug in your program. 
To do that, write the exception type after except:\n\n::: {.cell execution_count=94}\n``` {.python .cell-code}\ndef attempt_float(x):\n try:\n return float(x)\n except ValueError:\n return x\n```\n:::\n\n\n::: {.cell execution_count=95}\n``` {.python .cell-code}\nattempt_float((1, 2))\n```\n:::\n\n\n```\n---------------------------------------------------------------------------\nTypeError Traceback (most recent call last)\nd:\\packages\\bookclub-py4da\\03_notes.qmd in ()\n----> 1001 attempt_float((1, 2))\n\nInput In [114], in attempt_float(x)\n 1 def attempt_float(x):\n 2 try:\n----> 3 return float(x)\n 4 except ValueError:\n 5 return x\n\nTypeError: float() argument must be a string or a real number, not 'tuple'\n\n```\n\nYou can catch multiple exception types by writing a tuple of exception types instead (the parentheses are required):\n\n::: {.cell execution_count=96}\n``` {.python .cell-code}\ndef attempt_float(x):\n try:\n return float(x)\n except (TypeError, ValueError):\n return x\n\nattempt_float((1, 2))\n```\n\n::: {.cell-output .cell-output-display execution_count=95}\n```\n(1, 2)\n```\n:::\n:::\n\n\nIn some cases, you may not want to suppress an exception, but you want some code to be executed regardless of whether or not the code in the try block succeeds. To do this, use `finally`:\n\n::: {.cell execution_count=97}\n``` {.python .cell-code}\nf = open(path, mode=\"w\")\n\ntry:\n write_to_file(f)\nfinally:\n f.close()\n```\n:::\n\n\nHere, the file object f will always get closed. \n\nyou can have code that executes only if the try: block succeeds using else:\n\n::: {.cell execution_count=98}\n``` {.python .cell-code}\nf = open(path, mode=\"w\")\n\ntry:\n write_to_file(f)\nexcept:\n print(\"Failed\")\nelse:\n print(\"Succeeded\")\nfinally:\n f.close()\n```\n:::\n\n\n## Exceptions in IPython\n\nIf an exception is raised while you are %run-ing a script or executing any statement, IPython will by default print a full call stack trace. Having additional context by itself is a big advantage over the standard Python interpreter\n\n# Files and the Operating System\n\nTo open a file for reading or writing, use the built-in open function with either a relative or absolute file path and an optional file encoding.\n\nWe can then treat the file object f like a list and iterate over the lines\n\n::: {.cell execution_count=99}\n``` {.python .cell-code}\npath = \"examples/segismundo.txt\"\n\nf = open(path, encoding=\"utf-8\")\n\nlines = [x.rstrip() for x in open(path, encoding=\"utf-8\")]\n\nlines\n```\n:::\n\n\nWhen you use open to create file objects, it is recommended to close the file\n\n::: {.cell execution_count=100}\n``` {.python .cell-code}\nf.close()\n```\n:::\n\n\nsome of the most commonly used methods are `read`, `seek`, and `tell`.\n\n`read(10)` returns 10 characters from the file\n\nthe `read` method advances the file object position by the number of bytes read\n\n`tell()` gives you the current position in the file\n\nTo get consistent behavior across platforms, it is best to pass an encoding (such as `encoding=\"utf-8\"`)\n\n`seek(3)` changes the file position to the indicated byte \n\nTo write text to a file, you can use the file’s `write` or `writelines` methods\n\n## Byte and Unicode with Files\n\nThe default behavior for Python files (whether readable or writable) is text mode, which means that you intend to work with Python strings (i.e., Unicode). 
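\n\nA short sketch of the contrast with binary mode (assuming the same UTF-8 `examples/segismundo.txt` file as above):\n\n```python\nwith open(\"examples/segismundo.txt\", mode=\"rb\") as f:  # \"rb\" = read in binary mode\n    data = f.read(10)      # a bytes object: 10 raw bytes, no decoding applied\n\nprint(data)                # a bytes value with a b'...' prefix\ndata.decode(\"utf-8\")       # may raise UnicodeDecodeError if byte 10 splits a character\n```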
\n\n", + "supporting": [ + "04_notes_files" + ], + "filters": [], + "includes": {} + } +} \ No newline at end of file diff --git a/_freeze/05_notes/execute-results/html.json b/_freeze/05_notes/execute-results/html.json new file mode 100644 index 0000000..6392e28 --- /dev/null +++ b/_freeze/05_notes/execute-results/html.json @@ -0,0 +1,15 @@ +{ + "hash": "5e5df9c6dc72ddbbaebddceea2a8ec3f", + "result": { + "markdown": "# Notes {.unnumbered}\n\n\n## Introduction\n\n:::{.callout-note}\nThis is a long chapter, these notes are intended as a tour of main ideas! \n:::\n\n![Panda bus tour!](images/pandabus.jpg){width=300px}\n\n* Pandas is a major tool in Python data analysis\n\n* Works with Numpy, adding support for tabular / heterogenous data\n\n## Import conventions: \n\n::: {.cell execution_count=1}\n``` {.python .cell-code}\nimport numpy as np\nimport pandas as pd\n```\n:::\n\n\n## Panda's primary data structures\n\n* Series: One dimensional object containing a sequence of values of the same type.\n\n* DataFrame: Tabular data, similar (and inspired by) R dataframe.\n\n* Other structures will be introduced as they arise, e.g. Index and Groupby objects.\n\n### Series\n\n::: {.cell execution_count=2}\n``` {.python .cell-code}\nobj = pd.Series([4,7,-4,3], index = [\"A\",\"B\",\"C\",\"D\"])\nobj\n```\n\n::: {.cell-output .cell-output-display execution_count=2}\n```\nA 4\nB 7\nC -4\nD 3\ndtype: int64\n```\n:::\n:::\n\n\nThe `index` is optional, if not specified it will default to 0 through N-1 \n\n#### Selection\n\nSelect elements or sub-Series by labels, sets of labels, boolean arrays ...\n\n::: {.cell execution_count=3}\n``` {.python .cell-code}\nobj['A']\n```\n\n::: {.cell-output .cell-output-display execution_count=3}\n```\n4\n```\n:::\n:::\n\n\n::: {.cell execution_count=4}\n``` {.python .cell-code}\nobj[['A','C']]\n```\n\n::: {.cell-output .cell-output-display execution_count=4}\n```\nA 4\nC -4\ndtype: int64\n```\n:::\n:::\n\n\n::: {.cell execution_count=5}\n``` {.python .cell-code}\nobj[obj > 3]\n```\n\n::: {.cell-output .cell-output-display execution_count=5}\n```\nA 4\nB 7\ndtype: int64\n```\n:::\n:::\n\n\n#### Other things you can do\n\n* Numpy functions and Numpy-like operations work as expected:\n\n::: {.cell execution_count=6}\n``` {.python .cell-code}\nobj*3\n```\n\n::: {.cell-output .cell-output-display execution_count=6}\n```\nA 12\nB 21\nC -12\nD 9\ndtype: int64\n```\n:::\n:::\n\n\n::: {.cell execution_count=7}\n``` {.python .cell-code}\nnp.exp(obj)\n```\n\n::: {.cell-output .cell-output-display execution_count=7}\n```\nA 54.598150\nB 1096.633158\nC 0.018316\nD 20.085537\ndtype: float64\n```\n:::\n:::\n\n\n* Series can be created from and converted to a dictionary\n\n::: {.cell execution_count=8}\n``` {.python .cell-code}\nobj.to_dict()\n```\n\n::: {.cell-output .cell-output-display execution_count=8}\n```\n{'A': 4, 'B': 7, 'C': -4, 'D': 3}\n```\n:::\n:::\n\n\n* Series can be converted to numpy array:\n\n::: {.cell execution_count=9}\n``` {.python .cell-code}\nobj.to_numpy()\n```\n\n::: {.cell-output .cell-output-display execution_count=9}\n```\narray([ 4, 7, -4, 3])\n```\n:::\n:::\n\n\n### DataFrame\n\n* Represents table of data\n\n* Has row index *index* and column index *column*\n\n* Common way to create is from a dictionary, but see *Table 5.1* for more!\n\n::: {.cell execution_count=10}\n``` {.python .cell-code}\ntest = pd.DataFrame({\"cars\":['Chevy','Ford','Dodge','BMW'],'MPG':[14,15,16,12], 'Year':[1979, 1980, 2001, 2020]})\ntest\n```\n\n::: {.cell-output .cell-output-display 
execution_count=10}\n```{=html}\n
    cars  MPG  Year
0  Chevy   14  1979
1   Ford   15  1980
2  Dodge   16  2001
3    BMW   12  2020
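As a taste of the other constructors in *Table 5.1*, a quick sketch (the data here is mine, purely illustrative):

```python
import numpy as np
import pandas as pd

# From a list of dicts: keys become columns, missing keys become NaN
pd.DataFrame([{"cars": "Chevy", "MPG": 14},
              {"cars": "Ford", "MPG": 15, "Year": 1980}])

# From a 2D numpy array, supplying the column and row labels yourself
pd.DataFrame(np.arange(6).reshape(3, 2),
             columns=["a", "b"],
             index=["x", "y", "z"])
```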
\n```\n:::\n:::\n\n\n* If you want a non-default index, it can be specified just like with Series.\n\n* `head(n)` / `tail(n)` - return the first / last n rows, 5 by default\n\n#### Selecting \n\n* Can retrieve columns or sets of columns by using `obj[...]`:\n\n::: {.cell execution_count=11}\n``` {.python .cell-code}\ntest['cars']\n```\n\n::: {.cell-output .cell-output-display execution_count=11}\n```\n0 Chevy\n1 Ford\n2 Dodge\n3 BMW\nName: cars, dtype: object\n```\n:::\n:::\n\n\nNote that we got a `Series` here.\n\n::: {.cell execution_count=12}\n``` {.python .cell-code}\ntest[['cars','MPG']]\n```\n\n::: {.cell-output .cell-output-display execution_count=12}\n```{=html}\n
    cars  MPG
0  Chevy   14
1   Ford   15
2  Dodge   16
3    BMW   12
\n```\n:::\n:::\n\n\n* Dot notation can also be used (`test.cars`) as long as the column names are valid identifiers\n\n* *Rows* can be retrieved with `iloc[...]` and `loc[...]`:\n\n - `loc` retrieves by index label\n\n - `iloc` retrieves by integer position (see the sketch after the next example)\n\n#### Modifying / Creating Columns\n\n* Columns can be modified (and created) by assignment:\n\n::: {.cell execution_count=13}\n``` {.python .cell-code}\ntest['MPG^2'] = test['MPG']**2\ntest\n```\n\n::: {.cell-output .cell-output-display execution_count=13}\n```{=html}\n
    cars  MPG  Year  MPG^2
0  Chevy   14  1979    196
1   Ford   15  1980    225
2  Dodge   16  2001    256
3    BMW   12  2020    144
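A quick sketch of the two row accessors flagged above; with the default `RangeIndex`, label and position happen to coincide:

```python
test.loc[0]                      # row whose index *label* is 0 (returned as a Series)
test.iloc[0]                     # row in *position* 0, the same row here
test.loc[0:2, ["cars", "MPG"]]   # note: label slices include the endpoint
```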
\n```\n:::\n:::\n\n\n* The `del` keyword can be used to drop columns destructively; the `drop` method can be used to do so non-destructively (see the sketch after the next example)\n\n\n### Index object\n\n* Index objects are used for holding axis labels and other metadata\n\n::: {.cell execution_count=14}\n``` {.python .cell-code}\ntest.index\n```\n\n::: {.cell-output .cell-output-display execution_count=14}\n```\nRangeIndex(start=0, stop=4, step=1)\n```\n:::\n:::\n\n\n* Can change the index, in this case replacing the default:\n\n::: {.cell execution_count=15}\n``` {.python .cell-code}\n# Create index from one of the columns\ntest.index = test['cars']\n\n# remove 'cars' column since it now serves as the index\ntest = test.drop('cars', axis = \"columns\") # or axis = 1\ntest\n```\n\n::: {.cell-output .cell-output-display execution_count=15}\n```{=html}\n
       MPG  Year  MPG^2
cars
Chevy   14  1979    196
Ford    15  1980    225
Dodge   16  2001    256
BMW     12  2020    144
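To make the `del` / `drop` contrast above concrete, a small sketch (working on a copy so `test` is left alone):

```python
tmp = test.copy()
del tmp["MPG^2"]                    # destructive: mutates tmp in place
test.drop("MPG^2", axis="columns")  # non-destructive: returns a new frame, test unchanged
```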
\n```\n:::\n:::\n\n\n* Note the `axis` keyword argument above; many DataFrame methods take it.\n\n* Above I changed a column into an index. Often you want to go the other way; this can be done with `reset_index`:\n\n::: {.cell execution_count=16}\n``` {.python .cell-code}\ntest.reset_index() # Note this doesn't actually change test\n```\n\n::: {.cell-output .cell-output-display execution_count=16}\n```{=html}\n
    cars  MPG  Year  MPG^2
0  Chevy   14  1979    196
1   Ford   15  1980    225
2  Dodge   16  2001    256
3    BMW   12  2020    144
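An aside, not from the chapter: `set_index` does in one step what the assignment-plus-`drop` dance above did by hand, so the round trip is simply:

```python
test.reset_index().set_index("cars")   # back to the car-indexed frame
```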
\n```\n:::\n:::\n\n\n* Columns are an index as well:\n\n::: {.cell execution_count=17}\n``` {.python .cell-code}\ntest.columns\n```\n\n::: {.cell-output .cell-output-display execution_count=17}\n```\nIndex(['MPG', 'Year', 'MPG^2'], dtype='object')\n```\n:::\n:::\n\n\n* Indexes act like immutable sets; see *Table 5.2* in the book for Index methods and properties\n\n## Essential Functionality\n\n### Reindexing and dropping\n\n* `reindex` creates a *new* object with the values arranged according to the new index. Missing values are used if necessary, or you can use optional fill methods. You can use `iloc` and `loc` to reindex as well.\n\n::: {.cell execution_count=18}\n``` {.python .cell-code}\ns = pd.Series([1,2,3,4,5], index = list(\"abcde\"))\ns2 = s.reindex(list(\"abcfu\")) # not a song by GAYLE \ns2\n```\n\n::: {.cell-output .cell-output-display execution_count=18}\n```\na 1.0\nb 2.0\nc 3.0\nf NaN\nu NaN\ndtype: float64\n```\n:::\n:::\n\n\n* Missing values can be tested for with the `isna` or `notna` methods\n\n::: {.cell execution_count=19}\n``` {.python .cell-code}\npd.isna(s2)\n```\n\n::: {.cell-output .cell-output-display execution_count=19}\n```\na False\nb False\nc False\nf True\nu True\ndtype: bool\n```\n:::\n:::\n\n\n* `drop`, illustrated above, can drop rows or columns. In addition to using `axis` you can use `columns` or `index`. Again these make copies.\n\n::: {.cell execution_count=20}\n``` {.python .cell-code}\ntest.drop(columns = 'MPG')\n```\n\n::: {.cell-output .cell-output-display execution_count=20}\n```{=html}\n
       Year  MPG^2
cars
Chevy  1979    196
Ford   1980    225
Dodge  2001    256
BMW    2020    144
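One of the optional fill methods mentioned under `reindex` above, sketched on toy data: `method="ffill"` carries the last valid value forward into the new labels:

```python
s3 = pd.Series(["blue", "purple", "yellow"], index=[0, 2, 4])
s3.reindex(np.arange(6), method="ffill")   # labels 1, 3, 5 take the preceding value
```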
\n```\n:::\n:::\n\n\n::: {.cell execution_count=21}\n``` {.python .cell-code}\ntest.drop(index = ['Ford', 'BMW'])\n```\n\n::: {.cell-output .cell-output-display execution_count=21}\n```{=html}\n
       MPG  Year  MPG^2
cars
Chevy   14  1979    196
Dodge   16  2001    256
\n```\n:::\n:::\n\n\n### Indexing, Selection and Filtering\n\n#### Series\n* For Series, indexing is similar to Numpy, except you can use the index labels as well as integers.\n\n::: {.cell execution_count=22}\n``` {.python .cell-code}\nobj = pd.Series(np.arange(4.), index=[\"a\", \"b\", \"c\", \"d\"])\nobj[0:3]\n```\n\n::: {.cell-output .cell-output-display execution_count=22}\n```\na 0.0\nb 1.0\nc 2.0\ndtype: float64\n```\n:::\n:::\n\n\n::: {.cell execution_count=23}\n``` {.python .cell-code}\nobj['a':'c']\n```\n\n::: {.cell-output .cell-output-display execution_count=23}\n```\na 0.0\nb 1.0\nc 2.0\ndtype: float64\n```\n:::\n:::\n\n\n::: {.cell execution_count=24}\n``` {.python .cell-code}\nobj[obj<2]\n```\n\n::: {.cell-output .cell-output-display execution_count=24}\n```\na 0.0\nb 1.0\ndtype: float64\n```\n:::\n:::\n\n\n::: {.cell execution_count=25}\n``` {.python .cell-code}\nobj[['a','d']]\n```\n\n::: {.cell-output .cell-output-display execution_count=25}\n```\na 0.0\nd 3.0\ndtype: float64\n```\n:::\n:::\n\n\n* *However*, the preferred way is to use `loc` for selection by *index* and `iloc` for selection by position. This avoids ambiguity when the `index` itself consists of integers. Note also that slicing with labels (`obj['a':'c']`) includes the endpoint.\n\n::: {.cell execution_count=26}\n``` {.python .cell-code}\nobj.loc[['a','d']]\n```\n\n::: {.cell-output .cell-output-display execution_count=26}\n```\na 0.0\nd 3.0\ndtype: float64\n```\n:::\n:::\n\n\n::: {.cell execution_count=27}\n``` {.python .cell-code}\nobj.iloc[1]\n```\n\n::: {.cell-output .cell-output-display execution_count=27}\n```\n1.0\n```\n:::\n:::\n\n\n:::{.callout-note}\nNote if a range or a set of indexes is used, a Series is returned. If a single item is requested, you get just that item.\n:::\n\n#### DataFrame\n\n* Selecting with `df[...]` for a DataFrame retrieves one or more columns as we have seen; if you select a single column you get a Series\n\n* There are some special cases: indexing with a boolean selects *rows*, as does selecting with a slice:\n\n::: {.cell execution_count=28}\n``` {.python .cell-code}\ntest[0:1]\n```\n\n::: {.cell-output .cell-output-display execution_count=28}\n```{=html}\n
       MPG  Year  MPG^2
cars
Chevy   14  1979    196
\n```\n:::\n:::\n\n\n::: {.cell execution_count=29}\n``` {.python .cell-code}\ntest[test['MPG'] < 15]\n```\n\n::: {.cell-output .cell-output-display execution_count=29}\n```{=html}\n
       MPG  Year  MPG^2
cars
Chevy   14  1979    196
BMW     12  2020    144
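Boolean selection extends to compound conditions; a quick sketch (the parentheses matter, since `&` and `|` bind tighter than comparisons):

```python
test[(test["MPG"] < 15) & (test["Year"] > 2000)]   # both conditions hold
test[(test["MPG"] < 15) | (test["Year"] > 2000)]   # either condition holds
```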
\n```\n:::\n:::\n\n\n* `iloc` and `loc` can be used to select rows as illustrated before, but can also be used to select columns or subsets of rows/columns\n\n::: {.cell execution_count=30}\n``` {.python .cell-code}\ntest.loc[:,['Year','MPG']]\n```\n\n::: {.cell-output .cell-output-display execution_count=30}\n```{=html}\n
       Year  MPG
cars
Chevy  1979   14
Ford   1980   15
Dodge  2001   16
BMW    2020   12
\n```\n:::\n:::\n\n\n::: {.cell execution_count=31}\n``` {.python .cell-code}\ntest.loc['Ford','MPG']\n```\n\n::: {.cell-output .cell-output-display execution_count=31}\n```\n15\n```\n:::\n:::\n\n\n* These work with slices and booleans as well! The following says \"give me all the rows with MPG more than 15, and the columns starting from Year\"\n\n::: {.cell execution_count=32}\n``` {.python .cell-code}\ntest.loc[test['MPG'] > 15, 'Year':]\n```\n\n::: {.cell-output .cell-output-display execution_count=32}\n```{=html}\n
       Year  MPG^2
cars
Dodge  2001    256
\n```\n:::\n:::\n\n\n* Indexing options are fully illustrated in the book and *Table 5.4*\n\n* Be careful with *chained indexing*:\n\n::: {.cell execution_count=33}\n``` {.python .cell-code}\ntest[test['MPG'] > 15].loc[:,'MPG'] = 18\n```\n\n::: {.cell-output .cell-output-stderr}\n```\n/var/folders/6f/7n78v7cn1z5_hyt_zyfljtr00000gn/T/ipykernel_12698/2484144822.py:1: SettingWithCopyWarning:\n\n\nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n\n```\n:::\n:::\n\n\nHere we are assigning to a 'slice', which is probably not what is intended. You will get a warning and a recommendation to fix it by doing the assignment with a single `loc`:\n\n::: {.cell execution_count=34}\n``` {.python .cell-code}\ntest.loc[test['MPG'] > 15, 'MPG'] = 18\ntest\n```\n\n::: {.cell-output .cell-output-display execution_count=34}\n```{=html}\n
       MPG  Year  MPG^2
cars
Chevy   14  1979    196
Ford    15  1980    225
Dodge   18  2001    256
BMW     12  2020    144
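If what you actually want is an independent subset to modify, the usual idiom (a sketch, consistent with the documentation linked in the warning) is an explicit copy:

```python
subset = test[test["MPG"] > 15].copy()   # decoupled from test
subset.loc[:, "MPG"] = 20                # no warning; test is untouched
```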
\n```\n:::\n:::\n\n\n:::{.callout-tip}\n### Rule of Thumb\n\nAvoid chained indexing when doing assignments\n:::\n\n### Arithmetic and Data Alignment\n\n* Pandas can make it simpler to work with objects that have different indexes, usually 'doing the right thing':\n\n::: {.cell execution_count=35}\n``` {.python .cell-code}\ns1 = pd.Series([7.3, -2.5, 3.4, 1.5], index=[\"a\", \"c\", \"d\", \"e\"])\ns2 = pd.Series([-2.1, 3.6, -1.5, 4, 3.1], index=[\"a\", \"c\", \"e\", \"f\", \"g\"])\ns1+s2\n```\n\n::: {.cell-output .cell-output-display execution_count=35}\n```\na 5.2\nc 1.1\nd NaN\ne 0.0\nf NaN\ng NaN\ndtype: float64\n```\n:::\n:::\n\n\n* Fill values can be specified by using the arithmetic methods:\n\n::: {.cell execution_count=36}\n``` {.python .cell-code}\ns1.add(s2, fill_value = 0)\n```\n\n::: {.cell-output .cell-output-display execution_count=36}\n```\na 5.2\nc 1.1\nd 3.4\ne 0.0\nf 4.0\ng 3.1\ndtype: float64\n```\n:::\n:::\n\n\n* See *Table 5.5* for a list of these methods.\n\n* You can also do arithmetic between *DataFrame*s and *Series* in a way that is similar to Numpy.\n\n### Function Application and Mapping\n\n* Numpy *ufuncs* also work with Pandas objects.\n\n::: {.cell execution_count=37}\n``` {.python .cell-code}\nframe = pd.DataFrame(np.random.standard_normal((4, 3)),\n columns=list(\"bde\"),\n index=[\"Utah\", \"Ohio\", \"Texas\", \"Oregon\"])\nframe\n```\n\n::: {.cell-output .cell-output-display execution_count=37}\n```{=html}\n
               b         d         e
Utah   -0.365805 -1.641456  0.243295
Ohio   -0.024996 -0.913214  0.129740
Texas   0.109753  0.451045  1.036501
Oregon  0.293884  0.208416  1.075689
\n```\n:::\n:::\n\n\n::: {.cell execution_count=38}\n``` {.python .cell-code}\nnp.abs(frame)\n```\n\n::: {.cell-output .cell-output-display execution_count=38}\n```{=html}\n
               b         d         e
Utah    0.365805  1.641456  0.243295
Ohio    0.024996  0.913214  0.129740
Texas   0.109753  0.451045  1.036501
Oregon  0.293884  0.208416  1.075689
\n```\n:::\n:::\n\n\n* `apply` can be used to apply a function that operates on 1D arrays to each column or row:\n\n::: {.cell execution_count=39}\n``` {.python .cell-code}\nframe.apply(np.max, axis = 'rows') # 'axis' is optional here, the default is rows\n```\n\n::: {.cell-output .cell-output-display execution_count=39}\n```\nb 0.293884\nd 0.451045\ne 1.075689\ndtype: float64\n```\n:::\n:::\n\n\nApplying across columns is common, especially to combine different columns in some way:\n\n::: {.cell execution_count=40}\n``` {.python .cell-code}\nframe['max'] = frame.apply(np.max, axis = 'columns')\nframe\n```\n\n::: {.cell-output .cell-output-display execution_count=40}\n```{=html}\n
               b         d         e       max
Utah   -0.365805 -1.641456  0.243295  0.243295
Ohio   -0.024996 -0.913214  0.129740  0.129740
Texas   0.109753  0.451045  1.036501  1.036501
Oregon  0.293884  0.208416  1.075689  1.075689
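`apply` is not limited to Numpy functions; any function mapping a Series to a scalar (or even to a Series) works. A sketch with a custom spread function, restricted to the original numeric columns:

```python
def spread(x):
    return x.max() - x.min()

frame[["b", "d", "e"]].apply(spread)                  # one value per column
frame[["b", "d", "e"]].apply(spread, axis="columns")  # one value per row
```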
\n```\n:::\n:::\n\n\n* Many more examples of this in the book.\n\n\n### Sorting and Ranking\n\n* `sort_index` will sort by the index (on either axis for a *DataFrame*)\n* `sort_values` is used to sort by values or a particular column\n\n::: {.cell execution_count=41}\n``` {.python .cell-code}\ntest.sort_values('MPG')\n```\n\n::: {.cell-output .cell-output-display execution_count=41}\n```{=html}\n
       MPG  Year  MPG^2
cars
BMW     12  2020    144
Chevy   14  1979    196
Ford    15  1980    225
Dodge   18  2001    256
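And the `sort_index` counterpart mentioned above, sketched on the same frame:

```python
test.sort_index()                # rows ordered by car name
test.sort_index(axis="columns")  # columns in alphabetical order
```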
\n```\n:::\n:::\n\n\n* `rank` will assign ranks from one through the number of data points.\n\n\n## Summarizing and Computing Descriptive Statistics\n\n::: {.cell execution_count=42}\n``` {.python .cell-code}\ndf = pd.DataFrame([[1.4, np.nan], [7.1, -4.5],\n [np.nan, np.nan], [0.75, -1.3]],\n index=[\"a\", \"b\", \"c\", \"d\"],\n columns=[\"one\", \"two\"])\ndf\n```\n\n::: {.cell-output .cell-output-display execution_count=42}\n```{=html}\n
    one  two
a  1.40  NaN
b  7.10 -4.5
c   NaN  NaN
d  0.75 -1.3
\n```\n:::\n:::\n\n\nSome examples:\n\n\nSum down each column (the default; note NaNs are skipped):\n\n::: {.cell execution_count=43}\n``` {.python .cell-code}\ndf.sum()\n```\n\n::: {.cell-output .cell-output-display execution_count=43}\n```\none 9.25\ntwo -5.80\ndtype: float64\n```\n:::\n:::\n\n\nSum across each row:\n\n::: {.cell execution_count=44}\n``` {.python .cell-code}\n# sum across the columns of each row\ndf.sum(axis=\"columns\")\n```\n\n::: {.cell-output .cell-output-display execution_count=44}\n```\na 1.40\nb 2.60\nc 0.00\nd -0.55\ndtype: float64\n```\n:::\n:::\n\n\nExtremely useful is `describe`:\n\n::: {.cell execution_count=45}\n``` {.python .cell-code}\ndf.describe()\n```\n\n::: {.cell-output .cell-output-display execution_count=45}\n```{=html}\n
            one       two
count  3.000000  2.000000
mean   3.083333 -2.900000
std    3.493685  2.262742
min    0.750000 -4.500000
25%    1.075000 -3.700000
50%    1.400000 -2.900000
75%    4.250000 -2.100000
max    7.100000 -1.300000
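A few more of the built-in summary methods, sketched on the same `df` (like `sum`, they skip NaN unless told otherwise):

```python
df.mean(axis="columns")                # row means, NaNs skipped
df.mean(axis="columns", skipna=False)  # NaN wherever a row is incomplete
df.idxmax()                            # index label of each column's maximum
df.cumsum()                            # cumulative sums down each column
```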
\n```\n:::\n:::\n\n\n**The book chapter contains *many* more examples and a full list of summary statistics and related methods.**\n\n## Summary\n\n* Pandas' primary data structures:\n\n - Series\n\n - DataFrame\n\n* Many ways to access and transform these objects. Key ones are:\n\n - `[]` : access element(s) of a `Series` or column(s) of a `DataFrame`\n\n - `loc[r, c]` : access a row / column / cell by the `index`.\n\n - `iloc[i, j]` : access a row / column / cell by the integer position.\n\n* [Online reference.](https://pandas.pydata.org/docs/reference/index.html)\n\n:::{.callout-tip}\n## Suggestion\nWork through the chapter's code and try stuff!\n:::\n\n## References\n\n* [Chapter's code.](https://nbviewer.org/github/pydata/pydata-book/blob/3rd-edition/ch05.ipynb)\n\n* [Pandas reference.](https://pandas.pydata.org/docs/reference/index.html)\n\n## Next Chapter\n\n* Loading and writing data sets!\n\n",
+  "supporting": [
+    "05_notes_files"
+  ],
+  "filters": [],
+  "includes": {
+    "include-in-header": [
+      "\n\n\n"
+    ]
+  }
+}
\ No newline at end of file
diff --git a/_quarto.yml b/_quarto.yml
index d973995..71e52cd 100644
--- a/_quarto.yml
+++ b/_quarto.yml
@@ -19,6 +19,62 @@ book:
     chapters:
       - 02_notes.qmd
       - 02_video.qmd
+      - 02_exercises.qmd
+  - part: 03_main.qmd
+    chapters:
+      - 03_notes.qmd
+      - 03_video.qmd
+      - 03_exercises.qmd
+  - part: 04_main.qmd
+    chapters:
+      - 04_notes.qmd
+      - 04_video.qmd
+      - 04_exercises.qmd
+  - part: 05_main.qmd
+    chapters:
+      - 05_notes.qmd
+      - 05_video.qmd
+      - 05_exercises.qmd
+  - part: 06_main.qmd
+    chapters:
+      - 06_notes.qmd
+      - 06_video.qmd
+      - 06_exercises.qmd
+  - part: 07_main.qmd
+    chapters:
+      - 07_notes.qmd
+      - 07_video.qmd
+      - 07_exercises.qmd
+  - part: 08_main.qmd
+    chapters:
+      - 08_notes.qmd
+      - 08_video.qmd
+      - 08_exercises.qmd
+  - part: 09_main.qmd
+    chapters:
+      - 09_notes.qmd
+      - 09_video.qmd
+      - 09_exercises.qmd
+  - part: 10_main.qmd
+    chapters:
+      - 10_notes.qmd
+      - 10_video.qmd
+      - 10_exercises.qmd
+  - part: 11_main.qmd
+    chapters:
+      - 11_notes.qmd
+      - 11_video.qmd
+      - 11_exercises.qmd
+  - part: 12_main.qmd
+    chapters:
+      - 12_notes.qmd
+      - 12_video.qmd
+      - 12_exercises.qmd
+  - part: 13_main.qmd
+    chapters:
+      - 13_notes.qmd
+      - 13_video.qmd
+      - 13_exercises.qmd
 #  - part: examples.md
 #    chapters:
 #      - example_quarto.qmd
diff --git a/some_array.npy b/some_array.npy
index a95c6a6..a3ff5af 100644
Binary files a/some_array.npy and b/some_array.npy differ