Under construction: 👷 ...
A first approach to different machine learning models using the Sklearn Python library, following the CRISP-DM methodology.
As a first approach to prediction tasks, the first Jupyter Notebook compares the performance of three basic models: K-Nearest Neighbours, Linear Regression, and Polynomial Regression. The objective is to compare two important measures in the model evaluation process, the MSE and the R² score (coefficient of determination), while manually varying the hyperparameters of each model.
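As an illustration of this comparison, here is a minimal sketch (not the notebook's actual code) that fits the three models on synthetic data, assuming arbitrary hyperparameter choices such as k=5 neighbours and a degree-3 polynomial:

```python
# Sketch: comparing KNN, Linear, and Polynomial Regression by MSE and R^2.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(42)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.2, size=200)  # noisy nonlinear target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

models = {
    "KNN (k=5)": KNeighborsRegressor(n_neighbors=5),
    "Linear": LinearRegression(),
    "Polynomial (deg=3)": make_pipeline(PolynomialFeatures(degree=3),
                                        LinearRegression()),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    print(f"{name}: MSE={mean_squared_error(y_test, y_pred):.3f}, "
          f"R^2={r2_score(y_test, y_pred):.3f}")
```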
The second step introduces the concept of regularization through the Ridge and Lasso models, which add an alpha penalty term on the regression's weights. Ridge uses L2 regularization, penalizing the squared magnitude of the coefficients; Lasso, on the other hand, uses L1 regularization, applying an absolute-value penalty to the coefficients in the loss function. It is important to recall that Lasso is useful for feature selection, since its L1 penalty can shrink some coefficients exactly to zero.
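The following sketch illustrates that difference on an assumed synthetic dataset (the alpha values are arbitrary): for the same penalty strength, Lasso zeroes out some coefficients while Ridge only shrinks them.

```python
# Sketch: L2 (Ridge) vs L1 (Lasso) regularization on synthetic data.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge, Lasso

X, y = make_regression(n_samples=200, n_features=10, n_informative=3,
                       noise=10.0, random_state=0)

for alpha in (0.1, 1.0, 10.0):
    ridge = Ridge(alpha=alpha).fit(X, y)
    lasso = Lasso(alpha=alpha).fit(X, y)
    # Lasso's absolute-value penalty drives some weights exactly to zero,
    # effectively selecting features; Ridge's squared penalty only shrinks them.
    print(f"alpha={alpha}: Lasso zero coefs = {np.sum(lasso.coef_ == 0)}/10, "
          f"Ridge zero coefs = {np.sum(ridge.coef_ == 0)}/10")
```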
Finally, we implement a Logistic Regression model to accomplish a binary classification task, showing the classic precision/recall trade-off that appears when training and evaluating the model.
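A minimal sketch of that trade-off, assuming a synthetic dataset and a few arbitrary decision thresholds: raising the threshold of a fitted Logistic Regression typically increases precision at the cost of recall.

```python
# Sketch: precision/recall trade-off by moving the decision threshold.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
proba = clf.predict_proba(X_test)[:, 1]  # probability of the positive class

for threshold in (0.3, 0.5, 0.7):
    y_pred = (proba >= threshold).astype(int)
    # A higher threshold makes positive predictions more conservative:
    # precision tends to rise while recall tends to fall.
    print(f"threshold={threshold}: "
          f"precision={precision_score(y_test, y_pred):.2f}, "
          f"recall={recall_score(y_test, y_pred):.2f}")
```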
Under construction: 👷 ...