Commit

Update documentation
thanhtung09t2 committed Jun 29, 2022
1 parent 5fcc08e commit 23a0273
Showing 2 changed files with 4 additions and 4 deletions.
4 changes: 2 additions & 2 deletions docs/user/features.rst
@@ -54,8 +54,8 @@ Ability to directly process missing data
Learning algorithms for the general fuzzy min-max neural network supported by the library can classify inputs with missing
data directly, without the need to replace or impute missing values as in other classifiers.
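This behaviour can be illustrated with a small NumPy sketch (a simplified illustration under assumed membership formulas, not the library's actual API): the fuzzy membership of a sample in a hyperbox is computed only over the features that are observed, so NaN entries need no imputation.

```python
import numpy as np

def gfmm_membership(x, v, w, gamma=1.0):
    """Fuzzy membership of sample x in the hyperbox [v, w].

    Features of x that are NaN (missing) are skipped entirely, so no
    imputation is needed -- a simplified sketch of how GFMM-style
    learners can classify incomplete inputs directly.
    """
    observed = ~np.isnan(x)
    xo, vo, wo = x[observed], v[observed], w[observed]
    # Per-dimension ramp memberships against the box bounds,
    # aggregated with a minimum (illustrative GFMM-style formula).
    lower = 1 - np.maximum(0, np.minimum(1, gamma * (vo - xo)))
    upper = 1 - np.maximum(0, np.minimum(1, gamma * (xo - wo)))
    return float(np.min(np.minimum(lower, upper)))

x = np.array([0.3, np.nan, 0.7])   # second feature is missing
v = np.array([0.2, 0.4, 0.6])      # hyperbox minimum point
w = np.array([0.5, 0.6, 0.9])      # hyperbox maximum point
print(gfmm_membership(x, v, w))    # 1.0: inside the box on all observed dims
```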

-Continual learning of new classes in an incremental manner
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+Continual learning ability of new classes
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Incremental learning algorithms of hyperbox-based models in the **hyperbox-brain** library can grow and accommodate new
classes of data without retraining the whole classifier. These algorithms can generate new hyperboxes
to represent clusters of new data with potentially new class labels, both during the normal training procedure and at
operating time after training has finished.
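The incremental behaviour described above can be sketched in plain NumPy (a toy model, not the hyperbox-brain API): when a training sample carries a class label the model has never seen, a new point hyperbox is created for it on the spot, while known-class samples simply expand an existing box.

```python
import numpy as np

class TinyIncrementalHyperboxes:
    """Minimal online hyperbox learner: one box per class, grown on the fly."""

    def __init__(self):
        self.v = {}  # class label -> minimum point of its hyperbox
        self.w = {}  # class label -> maximum point of its hyperbox

    def partial_fit(self, x, y):
        x = np.asarray(x, dtype=float)
        if y not in self.v:
            # Unseen class: create a new point hyperbox -- no retraining
            # of the boxes already learned for other classes.
            self.v[y] = x.copy()
            self.w[y] = x.copy()
        else:
            # Known class: expand the existing box to cover the new sample.
            self.v[y] = np.minimum(self.v[y], x)
            self.w[y] = np.maximum(self.w[y], x)
        return self

model = TinyIncrementalHyperboxes()
model.partial_fit([0.1, 0.2], "A").partial_fit([0.3, 0.1], "A")
model.partial_fit([0.8, 0.9], "B")   # new class accommodated mid-stream
print(sorted(model.v))               # ['A', 'B']
```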
4 changes: 2 additions & 2 deletions index.html
@@ -139,7 +139,7 @@
<div class="keyfeatures-box-content keyfeatures-underline">
<p></p>
<div class="keyfeatures-box-title">Learning from both labelled and unlabelled data</div>
-<div class="keyfeatures-box-text">One of the exciting features of learning algorithms for the general fuzzy min-max neural network(GFMMNN) is the capability of creating classification boundaries among known classes and clustering data and representing them as hyperboxes in the case that labels are not available. Unlabelled hyperboxes is then possible to be labelled on the basis of the evidence of next incoming input samples. As a result, the GFMMNN models have the ability to learn from the mixed labelled and unlabelled datasets in a native way.
+<div class="keyfeatures-box-text">One of the exciting features of learning algorithms for the general fuzzy min-max neural network (GFMMNN) is the capability of creating classification boundaries among known classes and of clustering and representing data as hyperboxes when labels are not available. Unlabelled hyperboxes can then be labelled based on the evidence of subsequent incoming input samples. As a result, GFMMNN models can natively learn from mixed labelled and unlabelled datasets.
</div>
<p></p>
</div>
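The labelling of unlabelled hyperboxes described in the feature box above can be sketched as follows (an illustrative toy, not the GFMMNN implementation): a box clustered from unlabelled data keeps a placeholder class until a labelled sample falling inside it supplies the evidence for its class.

```python
import numpy as np

UNLABELLED = None

def contains(v, w, x):
    """True if point x lies inside the hyperbox [v, w]."""
    return bool(np.all(v <= x) and np.all(x <= w))

def assign_label(boxes, x, y):
    """Label the first unlabelled hyperbox that contains labelled sample (x, y)."""
    for box in boxes:
        if box["label"] is UNLABELLED and contains(box["v"], box["w"], x):
            box["label"] = y   # evidence from an incoming labelled sample
            return box
    return None

# Two boxes clustered from unlabelled data.
boxes = [
    {"v": np.array([0.0, 0.0]), "w": np.array([0.4, 0.4]), "label": UNLABELLED},
    {"v": np.array([0.6, 0.6]), "w": np.array([1.0, 1.0]), "label": UNLABELLED},
]
assign_label(boxes, np.array([0.7, 0.8]), "positive")
print([b["label"] for b in boxes])   # [None, 'positive']
```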
@@ -153,7 +153,7 @@
<div class="keyfeatures-box-content keyfeatures-underline">
<p></p>
<div class="keyfeatures-box-title">Continual learning of new classes in an incremental manner</div>
-<div class="keyfeatures-box-text">Incremental learning algorithms of hyperbox-based models in the **hyperbox-brain** library can grow and accommodate new classes of data without retraining the whole classifier. Incremental learning algorithms themselves can generate new hyperboxes to represent clusters of new data with potentially new class labels both in the middle of normal training procedure and in the operating time where training has been finished. This property is a key feature for smart life-long learning systems.
+<div class="keyfeatures-box-text">Incremental learning algorithms of hyperbox-based models in the hyperbox-brain library can grow and accommodate new classes of data without retraining the whole classifier. These algorithms can generate new hyperboxes to represent clusters of new data with potentially new class labels, both during the normal training procedure and at operating time after training has finished. This property is a key feature for smart life-long learning systems.
</div>
<p></p>
</div>
