Metrics

A collection of diverse metrics for analysing the performance of Machine Learning and Deep Learning models. The package provides functions for Classification, Regression, Natural Language Processing, Computer Vision, and Ranking models, along with utilities for better user support.

Introduction

Installation

To install Metrics.jl, enter the following at the Julia REPL:

] add Metrics

or

using Pkg
Pkg.add("Metrics")

Examples

using Metrics

# Binary accuracy with the default threshold of 0.5
acc = Metrics.binary_accuracy(y_pred, y_true)

# Complete stats, including confusion matrix, accuracy, precision, recall, F1 score, etc.
# Here y_pred holds the predicted values and y_true the one-hot-encoded ground truth.
Metrics.report_stats(y_pred, y_true)
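For instance, here is a minimal runnable sketch of the binary_accuracy call above; the sample values for y_pred and y_true are illustrative only:

using Metrics

# Illustrative predicted probabilities and binary ground-truth labels
y_pred = [0.2, 0.8, 0.7, 0.3, 0.9]
y_true = [0.0, 1.0, 1.0, 1.0, 1.0]

# Predictions are thresholded at 0.5 by default, so 4 of the 5
# predictions above match the ground truth
acc = Metrics.binary_accuracy(y_pred, y_true)
println("Accuracy: ", acc)  # expected 0.8 with the default threshold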

More information

For more details about the package and its functions, check out the documentation. If you have any questions, you can tag me (@Adarsh Kumar) on Julia's Slack, or open an issue on GitHub.

References

  1. https://github.com/caseykneale/ChemometricsTools.jl
  2. https://github.com/tensorflow/nmt/blob/master/nmt/scripts/bleu.py#L56
  3. https://github.com/JuliaML/MLMetrics.jl
