
How to add a new metric

Aldo edited this page Mar 13, 2017 · 8 revisions


  1. Adding a new metric
    1. Generate the metric
    2. Customize the metric class
    3. Enable and configure the new metric
    4. Recalculate scores for the new metric

The first step to add a new metric is to execute the following task:

bundle exec rake generate:metric["metricName","evModelNames","moduleName"]

The task takes the following parameters:

| Param | Description |
| --- | --- |
| metricName | The name of the metric. It can be any alphanumeric string. Mandatory. |
| evModelNames | The names of the evaluation models (separated by commas) that the metric is based on. Usually only one model needs to be specified, but metrics based on several evaluation models are also allowed. Mandatory. |
| moduleName | Optional. Allows you to specify the name of the internal Rails class that will handle the metric. By default, it is derived from the metricName param. |

Thus, to add a new metric called "My SUS" based on the SUS evaluation model, we would execute the following:

bundle exec rake generate:metric["My SUS","SUS"]

The output of the task should be something like this:

Generating new metric
The model was created in /home/user/workspace/LOEP/app/models/metrics/my_sus.rb
The metric was succesfully generated
{"created_at":"2016-04-10T16:49:36Z","id":18,"name":"My SUS","updated_at":"2016-04-10T16:49:36Z"}

After executing the generate:metric task, a new file containing the metric class is automatically generated in the app/models/metrics folder.

For instance, after executing bundle exec rake generate:metric["My SUS","SUS"] the following file is automatically generated:

app/models/metrics/my_sus.rb

Next, we need to customize this file (i.e. the metric class). The default content of the file (app/models/metrics/my_sus.rb in this guide's example) should look like this:

# encoding: utf-8

class Metrics::My_sus < Metric
  #this is for Metrics with type=My_sus
  #Override methods here

  def self.getLoScore(evData)
  end
end

The following methods are mandatory for the metric to work on the LOEP platform:

| Method | Description |
| --- | --- |
| self.getLoScore(evData) | Receives as a parameter all the evaluation data for a Learning Object and must return the score for that Learning Object, as a float value on a [0,10] scale, computed from the evaluation data according to the metric. |

The evData param is a hash object where each key is the name of an evaluation model of the metric. In the case of the "My SUS" metric of this example, the hash contains only one key, with the SUS value. Thus, evData["SUS"] contains all the evaluation data belonging to the SUS evaluation model. This data is returned as a hash with two keys: :evaluations, an array with all the evaluations of the Learning Object performed with the evaluation model, and :items, an array with the average value that each item of the evaluation model obtained for this Learning Object.
Next is an example of the evData param for a Learning Object that received two evaluations with the SUS evaluation model:

{"SUS"=>
  {:evaluations=>
    [<Evaluations::Sus id: 3034, user_id: 1, assignment_id: nil, lo_id: 2022, evmethod_id: 6, type: "Evaluations::Sus", completed_at: "2016-04-10 17:13:00", item1: 4, item2: 4, item3: 4, item4: 2, item5: 4, item6: 2, item7: 4, item8: 2, item9: 5, item10: 1, ...>, 
     <Evaluations::Sus id: 3035, user_id: 1, assignment_id: nil, lo_id: 2022, evmethod_id: 6, type: "Evaluations::Sus", completed_at: "2016-04-10 17:25:55", item1: 5, item2: 2, item3: 5, item4: 2, item5: 4, item6: 2, item7: 4, item8: 3, item9: 3, item10: 1,...>],
   :items=>[4.5, 3.0, 4.5, 2.0, 4.0, 2.0, 4.0, 2.5, 4.0, 1.0]}}
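To illustrate how the :items data can be consumed, the following self-contained sketch averages the item means and linearly rescales the result to [0,10]. It assumes the evaluation model's items use a [1,5] scale (as SUS items do in the example above); score_from_items is a hypothetical helper written for this guide, not part of LOEP:

```ruby
# Hypothetical helper: average the per-item means from evData and
# rescale the result from the model's own scale (assumed [1,5]) to [0,10].
def score_from_items(ev_data, model_name, scale = [1, 5])
  items = ev_data[model_name][:items]
  mean = items.sum / items.length.to_f
  10 * (mean - scale[0]) / (scale[1] - scale[0]).to_f
end

# The :items array from the evData example above
ev_data = { "SUS" => { items: [4.5, 3.0, 4.5, 2.0, 4.0, 2.0, 4.0, 2.5, 4.0, 1.0] } }
score_from_items(ev_data, "SUS")  # => 5.375
```

With these item means (mean 3.15 on a [1,5] scale), the rescaled score is 10 × (3.15 − 1) / 4 = 5.375, which matches what the Arithmetic Mean metric described below would produce.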

For instance, the following implementation of the self.getLoScore(evData) method returns the average value of all items as the final score:

def self.getLoScore(evData)
  loScore = 0

  evMethods = self.getInstance.evmethods
  evMethods.each do |evmethod|
    loScoreForEvMethod = 0
    # Evaluation data for this particular evaluation model (use a new
    # variable; reassigning evData would break metrics based on several models)
    evMethodData = evData[evmethod.name]
    items = evMethodData[:items]
    itemWeight = 1/items.length.to_f
    scale = evmethod.module.constantize.getScale
    items.each do |iScore|
      loScoreForEvMethod += (iScore - scale[0]) * itemWeight
    end
    # Rescale the averaged value from the model's own scale to [0,10]
    loScoreForEvMethod = 10/(scale[1] - scale[0]).to_f * loScoreForEvMethod
    loScore += loScoreForEvMethod
  end

  loScore = loScore/evMethods.length
  return loScore
end

This common metric (the Arithmetic Mean metric) is provided in the Metrics::AM class (app/models/metrics/am.rb). We can reuse it for our own metric without copying it, in the following way:

# encoding: utf-8

class Metrics::My_sus < Metrics::AM
  #this is for Metrics with type=My_sus

  def self.getLoScore(evData)
    super
  end
end

This way, the metric "My SUS" will calculate the score of a Learning Object as the average value of all items of the SUS evaluation model.
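If a plain average is not appropriate, self.getLoScore(evData) can instead be overridden with custom logic rather than delegating to Metrics::AM. As a sketch of one such policy, the following self-contained snippet computes a weighted average of the item means before rescaling to [0,10]; weighted_score and the example weights are hypothetical illustrations for this guide, not part of LOEP:

```ruby
# Hypothetical weighted scoring: each item mean gets its own weight.
# Assumes item scores on a [1,5] scale, rescaled to [0,10] at the end.
def weighted_score(items, weights, scale = [1, 5])
  raise ArgumentError, "weights must sum to 1" unless (weights.sum - 1.0).abs < 1e-9
  mean = items.zip(weights).sum { |item, w| item * w }
  10 * (mean - scale[0]) / (scale[1] - scale[0]).to_f
end

items   = [4.5, 3.0, 4.5, 2.0, 4.0, 2.0, 4.0, 2.5, 4.0, 1.0]  # :items from evData
weights = [0.2, 0.1, 0.2, 0.05, 0.1, 0.05, 0.1, 0.05, 0.1, 0.05]
weighted_score(items, weights)
```

Inside a real metric class, this kind of logic would live in the self.getLoScore(evData) override, reading the item means from evData as shown earlier.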

There are more examples of metric implementations in the app/models/metrics folder.

Finally, we need to enable the new metric in the config/application_config.yml file. Check the Setting up a LOEP instance: The application_config.yml file page for more details about this file.

If the name of the metric is "My SUS", we should add this name to the "metrics" key in the following way:

#Copy this file to application_config.yml
#Configure here all the variables of your LOEP instance.

development:
  domain: "localhost:8080"
  [...]
  metrics: ["LORI Arithmetic Mean","LORI WAM CW",...,"My SUS"]
  [...]

At this point, whenever a new evaluation of the corresponding evaluation model is performed for a Learning Object, a score based on the new metric will be automatically calculated and displayed in the LOEP platform. For metrics based on several evaluation models, one evaluation of each evaluation model is required. In the example of this guide, a new score based on the "My SUS" metric will be calculated after each SUS evaluation is performed.

Nevertheless, scores for Learning Objects that were evaluated before the metric was defined and enabled are not available. We can calculate (or recalculate) these scores with the following task:

bundle exec rake db:populate:scores["metricName",""]

So, for our example the task will be:

bundle exec rake db:populate:scores["My SUS",""]

This task will recalculate (or generate) all scores for the metric "My SUS".