Package: distillML
Type: Package
Title: Model Distillation and Interpretability Methods for Machine Learning Models
Version: 0.1.0.14
Authors@R: c(
person("Brian", "Cho", role = "aut"),
person("Theo", "Saarinen", role = c("aut","cre"), email = "[email protected]"),
person("Jasjeet", "Sekhon", role = "aut"),
person("Simon", "Walter", role = "aut")
)
Maintainer: Theo Saarinen <[email protected]>
BugReports: https://github.com/forestry-labs/distillML/issues
URL: https://github.com/forestry-labs/distillML
Description: Provides several methods for model distillation and interpretability
    for general black-box machine learning models and treatment effect estimation
    methods. For details on the algorithms implemented, see the documentation by
    Brian Cho, Theo F. Saarinen, Jasjeet S. Sekhon, and Simon Walter at
    <https://forestry-labs.github.io/distillML/index.html>.
License: GPL (>= 3)
Encoding: UTF-8
Imports:
ggplot2,
glmnet,
Rforestry,
dplyr,
R6 (>= 2.0),
checkmate,
purrr,
tidyr,
data.table,
mltools,
gridExtra
Suggests:
testthat,
knitr,
rmarkdown,
mvtnorm
Collate:
'predictor.R'
'interpret.R'
'distiller.R'
'plotter.R'
'surrogate.R'
RoxygenNote: 7.2.3
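
For orientation, a minimal installation sketch in R based only on the fields above; it assumes the package is published on CRAN under the declared name, and uses the repository from the URL field for the development version:

```r
# Install the released version from CRAN (assuming the package is
# published there under the name declared in this DESCRIPTION):
install.packages("distillML")

# Or install the development version from the GitHub repository
# listed in the URL field (requires the 'remotes' package):
# remotes::install_github("forestry-labs/distillML")

# Load the package; its dependencies (ggplot2, glmnet, Rforestry,
# and the others listed under Imports) are installed automatically.
library(distillML)
```

Functionality corresponding to the files listed under Collate (predictor, interpreter, distiller, plotter, surrogate) is documented at the package site linked in the Description field.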