ML_models

In construction: 👷 ...
A first approach to different machine learning models using the Sklearn (scikit-learn) Python library, following the CRISP-DM methodology.

1. KNN vs Linear Reg vs Polynomial Reg:

As a first approach to prediction tasks, this Jupyter Notebook compares the performance of three basic models: K-Nearest Neighbours, Linear Regression and Polynomial Regression. The objective was to compare two important measures of the model evaluation process, the MSE and the R^2 score (coefficient of determination), while varying the hyperparameters of each model. A minimal sketch of this comparison is shown below.
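
The following sketch illustrates the kind of comparison described above; the synthetic data and the chosen hyperparameters (k=5, degree=3) are assumptions for illustration, not the notebook's actual setup.

```python
# Sketch: comparing KNN, linear and polynomial regression on synthetic data,
# reporting MSE and R^2 for each model.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=200)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "KNN (k=5)": KNeighborsRegressor(n_neighbors=5),
    "Linear": LinearRegression(),
    "Polynomial (deg=3)": make_pipeline(PolynomialFeatures(degree=3), LinearRegression()),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    print(f"{name}: MSE={mean_squared_error(y_test, y_pred):.3f}, "
          f"R^2={r2_score(y_test, y_pred):.3f}")
```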

2. (Regularization: Ridge vs Lasso) + (Logistic Regression):

The 2nd step was to introduce the concept of regularization using the Ridge and Lasso models, which add a penalty term (controlled by the alpha hyperparameter) on the regression weights. Ridge uses L2 regularization, penalizing the squared magnitude of the coefficients, while Lasso uses L1 regularization, applying an absolute-value penalty to the coefficients in the loss function. It is important to recall that Lasso can be used for feature selection, since the L1 penalty can shrink some coefficients exactly to zero. A minimal sketch of both models follows.
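
The sketch below shows the Ridge/Lasso contrast on an assumed synthetic regression problem (not the notebook's dataset), including how Lasso zeroes out coefficients as alpha grows.

```python
# Sketch: Ridge (L2) vs Lasso (L1) regularization for different alpha values.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge, Lasso

X, y = make_regression(n_samples=200, n_features=10, n_informative=3,
                       noise=10.0, random_state=0)

for alpha in (0.1, 1.0, 10.0):
    ridge = Ridge(alpha=alpha).fit(X, y)
    lasso = Lasso(alpha=alpha).fit(X, y)
    print(f"alpha={alpha}")
    print("  Ridge coefs:", np.round(ridge.coef_, 2))
    print("  Lasso coefs:", np.round(lasso.coef_, 2),
          f"({np.sum(lasso.coef_ == 0)} set to zero)")
```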

At the end we implement a Logistic Regression model to accomplish a binary classification task, illustrating the classic precision/recall trade-off during model training (see the sketch after this paragraph).
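
A minimal sketch of this idea, assuming a synthetic imbalanced dataset and using the decision threshold to expose the precision/recall trade-off (the notebook's actual data and approach may differ):

```python
# Sketch: logistic regression for binary classification; moving the decision
# threshold trades precision against recall.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score

X, y = make_classification(n_samples=500, n_features=10, weights=[0.7, 0.3],
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
probs = clf.predict_proba(X_test)[:, 1]

for threshold in (0.3, 0.5, 0.7):
    y_pred = (probs >= threshold).astype(int)
    print(f"threshold={threshold}: "
          f"precision={precision_score(y_test, y_pred):.2f}, "
          f"recall={recall_score(y_test, y_pred):.2f}")
```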

3. SVM with unbalanced data:

In construction: 👷 ...

4. Naive Bayes (Bernoulli & Multinomial) for article classification:

In construction: 👷 ...
