From b681e81a5046366c5edeebeceffcf3d15a20db8c Mon Sep 17 00:00:00 2001
From: Baptiste Caramiaux
Date: Wed, 20 May 2015 16:24:40 +0100
Subject: [PATCH] Update README.md

update API in readme
---
 README.md | 41 ++++++++++++++++++++++++++++++++++++++++-
 1 file changed, 40 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index ac176d3..2610a85 100755
--- a/README.md
+++ b/README.md
@@ -22,7 +22,46 @@ Compiling both Max/MSP and PureData objects: make sure that in ofxGVFTypes.h you
 API
 ---
-To be updated !!!...
+Declare an ofxGVF object and an ofxGVFGesture object (to store the currently performed gesture):
+```
+ofxGVF *gvf;
+ofxGVFGesture currentGesture;
+```
+
+Build GVF with the default configuration and parameters:
+```
+gvf = new ofxGVF();
+```
+
+Perform *Learning* from a matrix of type `vector<vector<float>>` called `data` by first filling the object `currentGesture` and then adding this gesture as a template:
+```
+currentGesture.clear(); // just in case!
+
+for (vector<vector<float> >::iterator frame = data.begin(); frame != data.end(); ++frame)
+    currentGesture.addObservation(*frame);
+
+gvf->addGestureTemplate(currentGesture);
+```
+
+Perform *Testing* from a matrix of type `vector<vector<float>>` called `data` by first changing the GVF state to Following, clearing the object `currentGesture`, and updating GVF for each sample:
+```
+gvf->setState(ofxGVF::STATE_FOLLOWING);
+
+currentGesture.clear();
+
+for (vector<vector<float> >::iterator frame = data.begin(); frame != data.end(); ++frame){
+    currentGesture.addObservation(*frame); // in case we want to plot or analyse gesture data
+    gvf->update(currentGesture.getLastObservation());
+}
+```
+
+To get the recognition and adaptation results, call the following methods inside the `for` loop above:
+```
+float phase = gvf->getOutcomes().estimations[0].alignment;
+float speed = gvf->getOutcomes().estimations[0].dynamics[0];
+```
+
+An example is available in the folder _libtests/_