
ada_meal_scenario

A set of scripts for a meal serving scenario using Ada.

generate_saved_trajectories.py

If the configuration of any fixed point (e.g., servingConfiguration) is changed, this script must be rerun to cache new trajectories.
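One possible invocation, sketched under the assumption that the script is run via rosrun from a sourced catkin workspace (the exact entry point may differ in your setup):

    source devel/setup.bash
    rosrun ada_meal_scenario generate_saved_trajectories.py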

runBiteServing.py (Marshmallow demo):

Note: each of the following commands runs in its own terminal, and every terminal must first source devel/setup.bash, for example as shown below.
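For example (the workspace path below is a placeholder; use your own workspace location):

    cd ~/catkin_ws          # example path
    source devel/setup.bash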

  1. Set up a roscore

    roscore
  2. Make sure the Kinova USB joystick is plugged in, then launch it

    roslaunch ada_launch kinova_joystick.launch
  3. Open rviz

    rosrun rviz rviz
  4. Start the ADA controller (if running on real robot)

    roslaunch ada_launch default.launch
  5. Make sure the depth camera is attached and powered on, then launch it

    roslaunch openni2_launch openni2.launch

Note: the "Unsupported color video mode" error printed in red can be safely ignored.

At this point, the camera feed should be visible in rviz (subscribe to the /camera/depth/image topic). A scripted check of the depth stream is sketched after these instructions.

  6. Start the morsel detector

    cd src/morsel/detector/
    python biteserver.py structureio_settings.json verbose.json
  7. Start the demo

    rosrun ada_meal_scenario runBiteServing.py --real --viewer=interactivemarker

At this point, a GUI should appear that lets you select the operation mode. Select a mode (e.g., shared autonomy) and an input device (e.g., the Kinova USB joystick), then press the "start next trial" button.
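If the camera feed from step 5 does not appear in rviz, the depth stream can also be checked from a script. A minimal sketch, assuming rospy and sensor_msgs are available in the sourced workspace (the node name here is arbitrary):

    #!/usr/bin/env python
    # Prints the resolution and encoding of the first depth frame received on
    # /camera/depth/image, then shuts down.
    import rospy
    from sensor_msgs.msg import Image

    def on_depth_image(msg):
        rospy.loginfo('depth image: %dx%d (%s)', msg.width, msg.height, msg.encoding)
        rospy.signal_shutdown('received one depth frame')

    if __name__ == '__main__':
        rospy.init_node('depth_feed_check', anonymous=True)
        rospy.Subscriber('/camera/depth/image', Image, on_depth_image)
        rospy.spin()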
