Improved FIL documentation, put FIL on the front page.
canonizer committed Jan 23, 2020
1 parent 564916e commit 9694913
Showing 2 changed files with 15 additions and 13 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -83,6 +83,7 @@ repo](https://github.com/rapidsai/notebooks-contrib).
 | | Stochastic Gradient Descent (SGD), Coordinate Descent (CD), and Quasi-Newton (QN) (including L-BFGS and OWL-QN) solvers for linear models | |
 | **Nonlinear Models for Regression or Classification** | Random Forest (RF) Classification | Experimental multi-node multi-GPU via Dask |
 | | Random Forest (RF) Regression | Experimental multi-node multi-GPU via Dask |
+| | Inference for decision tree-based models | Forest Inference Library (FIL) |
 | | K-Nearest Neighbors (KNN) | Multi-node multi-GPU via Dask, uses [Faiss](https://github.com/facebookresearch/faiss) for Nearest Neighbors Query. |
 | | K-Nearest Neighbors (KNN) Classification | |
 | | K-Nearest Neighbors (KNN) Regression | |
27 changes: 14 additions & 13 deletions python/cuml/fil/fil.pyx
@@ -316,29 +316,30 @@ cdef class ForestInference_impl():


 class ForestInference(Base):
-    """
-    ForestInference provides GPU-accelerated inference (prediction)
+    """ForestInference provides GPU-accelerated inference (prediction)
     for random forest and boosted decision tree models.
     This module does not support training models. Rather, users should
     train a model in another package and save it in a
     treelite-compatible format. (See https://github.com/dmlc/treelite)
-    Currently, LightGBM and XGBoost GBDT and random forest models are
-    supported.
+    Currently, LightGBM, XGBoost and SKLearn GBDT and random forest models
+    are supported.
-    Users typically create a ForestInference object by loading a
-    saved model file with ForestInference.load. The resulting object
+    Users typically create a ForestInference object by loading a saved model
+    file with ForestInference.load. It is also possible to create it from an
+    SKLearn model using ForestInference.load_from_sklearn. The resulting object
     provides a `predict` method for carrying out inference.
     **Known limitations**:
     * Trees are represented as complete binary trees, so a tree of depth k
       will be stored in (2**k) - 1 nodes. This will be less space-efficient
       for sparse trees.
     * While treelite supports additional formats, only XGBoost and LightGBM
       are tested in FIL currently.
-    * LightGBM categorical features are not supported
     * A single row of data should fit into the shared memory of a thread block,
       which means that more than 12288 features are not supported.
+    * From sklearn.ensemble, only
+      {RandomForest,GradientBoosting}{Classifier,Regressor} models are
+      supported; other sklearn.ensemble models are currently not supported.
+    * Importing large SKLearn models can be slow, as it is done in Python.
+    * LightGBM categorical features are not supported.
     * Inference uses a dense matrix format, which is efficient for many
-      problems but will be suboptimal for sparse datasets.
+      problems but can be suboptimal for sparse datasets.
     * Only binary classification and regression are supported.
     Parameters
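Two of the limitations in the docstring above are easy to quantify in plain Python. This is only an illustration of the arithmetic; the `fil_node_slots` helper is hypothetical, not part of cuML, and the reading of 49152 bytes as the familiar 48 KiB per-block shared-memory budget on NVIDIA GPUs is my inference, not stated in the diff.

```python
# Quantifying two limitations from the FIL docstring:
# 1. Trees are stored as complete binary trees, so a tree of
#    depth k always occupies (2**k) - 1 node slots, however
#    sparse the trained tree actually is.
# 2. A single row must fit in a thread block's shared memory,
#    hence the 12288-feature cap: 12288 float32 values take
#    12288 * 4 = 49152 bytes (48 KiB).

def fil_node_slots(depth: int) -> int:
    """Node slots allocated for a tree of the given depth (hypothetical helper)."""
    return 2 ** depth - 1

print(fil_node_slots(10))   # 1023 slots, even for a very sparse depth-10 tree
print(12288 * 4)            # 49152 bytes of float32 features per row
```

The exponential growth in `fil_node_slots` is why deep but sparse trees are called out as space-inefficient: doubling the depth squares the storage, regardless of how many nodes the real tree has.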
