AutonomyCLTrust

This is a research project conducted by Jiahe Pan at the University of Melbourne, Australia, under the supervision of Jonathan Eden, Denny Oetomo and Wafa Johal. We use a shared-control teleoperated trajectory-tracking setup to investigate the relationship between robot autonomy and the human operator's cognitive load and trust. The primary trajectory-tracking task uses the Franka Emika robot arm and the Novint Falcon haptic device, and Tobii eye trackers provide one of the cognitive load measures. Experiments were conducted with 24 participants.

ROS2 Workspace

A laptop with Ubuntu 22.04 and ROS2 (Humble) installed is required. The ros2_ws workspace contains the following two ROS packages:

  • ros2_package
  • tutorial_interfaces
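
The workspace can then be built with the standard colcon workflow (a minimal sketch assuming the usual ROS2 Humble setup, run from the repository root):

cd ros2_ws
colcon build
source install/setup.bash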

ros2_package

This package contains the code for the primary task, including nodes that receive state information from and send control commands to the robot, data-logging scripts, and the RViz rendering of the task scene. Specifically, it contains the following sub-folders:

| Folder | Description |
| --- | --- |
| /data_logging/csv_logs | Contains the raw data (.csv format) collected from all participants, including a header file for each participant with the calculated task performances for each trial condition. |
| /launch | Contains ROS launch files to run the nodes defined in the /src folder, including launching the controller with both the Gazebo simulator and the real robot, and starting the RViz rendering of the task. |
| /ros2_package | Contains package files, including useful functions to generate the trajectories, parameters to run experiments, and the definition of the DataLogger Python class. |
| /scripts | Contains the definition of the TrajRecorder Python class, used for receiving and saving control commands and robot poses into temporary data structures before logging the data to csv files using a DataLogger instance. |
| /src | Contains C++ source code for the ROS nodes, including class definitions of the GazeboController and RealController for controlling the robot in simulation and the real world respectively, the PositionTalker for reading the position of the Falcon joystick, and the MarkerPublisher for publishing visualization markers into the RViz rendering. |
| /urdf | Contains an auto-generated URDF file of the Franka Emika robot arm. |
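
The nodes can then be started via the launch files, for example (the launch-file name real.launch.py is hypothetical; see the /launch folder for the actual file names):

ros2 launch ros2_package real.launch.py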

tutorial_interfaces

This package contains custom ROS message and service definitions. Specifically, two custom msg interfaces (in the /msg directory) are defined for communication and data logging:

| Msg | Description |
| --- | --- |
| Falconpos.msg | A simple definition of a 3D coordinate in Euclidean space. Attributes: x, y, z |
| PosInfo.msg | A definition of the state vector of the system at a given timestamp. Attributes: ref_position[], human_position[], robot_position[], tcp_position[], time_from_start |
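
As an illustration, a minimal ROS2 Python node subscribing to PosInfo might look like the following (the topic name 'pos_info' is an assumption; check the publishers in /src for the actual topic):

import rclpy
from rclpy.node import Node
from tutorial_interfaces.msg import PosInfo   # custom message from this package

class PosInfoListener(Node):
    def __init__(self):
        super().__init__('pos_info_listener')
        # Topic name 'pos_info' is an assumption -- see the publishers in /src
        self.create_subscription(PosInfo, 'pos_info', self.callback, 10)

    def callback(self, msg):
        # Print the elapsed time and the recorded TCP position
        self.get_logger().info(f'{msg.time_from_start:.2f}s: tcp = {list(msg.tcp_position)}')

def main():
    rclpy.init()
    rclpy.spin(PosInfoListener())

if __name__ == '__main__':
    main()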

Eye-Tracking

The implementation runs on a laptop with Windows 11 installed.

To use the Tobii eye-tracker, first install the required SDK from PyPI:

pip install tobii-research

Then, it can be imported into Python as:

import tobii_research as tr

To receive data from the eye-tracker, first define a callback to handle each gaze sample, then connect and subscribe to the gaze data stream (note that the callback takes the gaze data dictionary as its only argument):

# Basic definition of the callback function
def gaze_data_callback(gaze_data):
    # Gaze points on the display area, for each eye
    left_eye_gaze = gaze_data['left_gaze_point_on_display_area']
    right_eye_gaze = gaze_data['right_gaze_point_on_display_area']

    # Pupil diameters (mm), used as a cognitive load measure
    left_pupil_diameter = gaze_data['left_pupil_diameter']
    right_pupil_diameter = gaze_data['right_pupil_diameter']

found_eyetrackers = tr.find_all_eyetrackers()
my_eyetracker = found_eyetrackers[0]   # use the first eye-tracker found
my_eyetracker.subscribe_to(tr.EYETRACKER_GAZE_DATA,
                           gaze_data_callback,
                           as_dictionary=True)
To disconnect:

my_eyetracker.unsubscribe_from(tr.EYETRACKER_GAZE_DATA, gaze_data_callback)

For more implementation details, please refer to rhythm_method.py, located in the /experiment/secondary task/ directory.


Tapping Task

The implementation runs on a laptop with Windows 11 installed.

In addition to pupil diameter, the dual-task method is adopted in this project to capture cognitive load objectively. Specifically, "The Rhythm Method" is employed as the secondary task, which involves tapping rhythmically at a given tempo.

The rhythm files are in .wav format and can be generated with any audio software (e.g. GarageBand on iOS devices).

The implementation of the method in this project uses the following Python libraries:

from keyboard import read_key       # to record key taps on the keyboard
from playsound import playsound     # to play the rhythm for participants' reference

which can be installed via PyPI:

pip install keyboard playsound
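
A minimal sketch of the tapping loop is shown below (the rhythm file name and the number of recorded taps are illustrative assumptions, and keyboard.read_event is used here to filter for key presses; the actual implementation is in rhythm_method.py):

import time
from keyboard import read_event     # to record key taps on the keyboard
from playsound import playsound     # to play the rhythm for participants' reference

# Play the reference rhythm first (file name is a placeholder)
playsound('rhythm_reference.wav')

# Record the timestamps of successive spacebar presses
tap_times = []
while len(tap_times) < 20:                      # 20 taps is an arbitrary choice
    event = read_event()                        # blocks until the next key event
    if event.event_type == 'down' and event.name == 'space':
        tap_times.append(time.time())

# Inter-tap intervals (s), to be compared against the reference tempo
intervals = [b - a for a, b in zip(tap_times, tap_times[1:])]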

For more implementation details, please refer to rhythm_method.py, located in the /experiment/secondary task/ directory.


Dataframes

The data logged during each experimental session are written to .csv files. These include both the main measures of interest and the demographic information collected at the start of each session. The final processed .csv files are all located in the /experiment/grouped_dataframes/ directory. Their descriptions and file names are summarized below:

| Measure | Description | Filename |
| --- | --- | --- |
| Trajectory Tracking Error | RMSE (cm) between the reference and recorded trajectories of each trial | grouped_traj_err.csv |
| Rhythm Tapping Error | Normalized percentage error (%) of the inter-tap interval lengths, relative to the participant's baseline | grouped_tapping_err.csv |
| Pupil Diameter | Pupil diameter (mm) for the left and right eyes, and averaged across them | grouped_pupil.csv |
| Perceived Autonomy | Participants' perceived level of robot autonomy, rated on a 10-point Likert scale | grouped_p_auto.csv |
| Perceived Trust | Participants' self-reported trust, rated on a 10-point Likert scale | grouped_p_trust.csv |
| NASA-TLX | Self-reported cognitive load across all 6 aspects of the NASA-TLX questionnaire | grouped_tlx.csv |
| MDMT | Self-reported trust across all 8 dimensions of the MDMT questionnaire | grouped_mdmt.csv |
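
For example, each processed dataframe can be loaded with pandas (a sketch assuming the path is relative to the repository root):

import pandas as pd

# Load the processed trajectory-tracking-error dataframe
traj_err = pd.read_csv('experiment/grouped_dataframes/grouped_traj_err.csv')
print(traj_err.head())   # inspect the first few rows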

Data Analysis

The data analysis was performed in RStudio, leveraging existing libraries in the R programming language. All R scripts used are located in the /analysis/R_scripts/ directory, which has the following three sub-folders:

  • indiv_measures: Individual analysis of autonomy's effect on each of the measures using ANOVAs (see the illustrative sketch after this list)
  • interactions: Analysis of correlations and interaction effects between the measures and autonomy conditions
  • learning_effect: Identification of possible learning effects for the repeated measures within each round using Linear Mixed Models
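
As an illustration only (the repository's actual analysis is in R), a comparable repeated-measures ANOVA could be run in Python with the pingouin library; the column names below are hypothetical and should be adjusted to the actual dataframe layout:

import pandas as pd
import pingouin as pg   # assumption: installed via pip install pingouin

df = pd.read_csv('experiment/grouped_dataframes/grouped_traj_err.csv')
# Column names 'traj_err', 'autonomy' and 'participant' are hypothetical
aov = pg.rm_anova(data=df, dv='traj_err', within='autonomy', subject='participant')
print(aov)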

Plots are also generated in R, with the plotting code embedded in the above scripts. The preliminary plots are located in the /analysis/plots/ directory, and the final plots used in the paper (in .jpg format) can be found in the /analysis/pdf_plots/pdf_to_jpg/ directory.


Paper and Citation Info

The manuscript has been submitted to RAL for review. Meanwhile, please check out the preprint version on arXiv. To cite it in your publications, please use:

@misc{pan2024exploring,
      title={Exploring the Effects of Shared Autonomy on Cognitive Load and Trust in Human-Robot Interaction}, 
      author={Jiahe Pan and Jonathan Eden and Denny Oetomo and Wafa Johal},
      year={2024},
      eprint={2402.02758},
      archivePrefix={arXiv},
      primaryClass={cs.RO}
}
