Project ID: 1800649 · Supervisor: Prof Y.K. Demiris · Room: 1014 · Email: [email protected]
In this project, you will design and implement algorithms that allow a user to interact with a humanoid robot (Baxter, iCub, or ABB YuMi, all available in the Personal Robotics Laboratory) to perform joint tasks, for example sorting objects on a table (or other ideas that you might contribute). Using the Microsoft HoloLens augmented reality platform, you will examine how to enable rich interaction with the robot, focusing on how the HoloLens can visually explain the robot's capabilities and internal knowledge (for example, whether an object is within the robot's reach, or that the robot does not know the function of certain objects) by overlaying this information onto the human's visual field.
Students should have very strong programming skills (C/C++/Python) and a strong interest in machine learning, computer vision, and virtual/augmented reality. You will be using multiple SDKs to program the robots and the MS HoloLens across different operating systems (both Linux and Windows), so you will need to be an experienced software developer to take on this project safely.
# Re-distil the final report PDF with Ghostscript (writes apl115.pdf):
gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dNOPAUSE -dQUIET -dBATCH -sOutputFile=apl115.pdf FinalReport/main.pdf

# Count the words in the report text:
pdftotext main.pdf - | wc -w
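For convenience, the two steps above could be wrapped in a single helper script; the following is a minimal sketch, assuming it is run from the repository root (the script name check_report.sh is hypothetical, not part of the original workflow):

#!/bin/sh
# Hypothetical helper (check_report.sh): re-distil the report PDF with
# Ghostscript, then print a word count of the report's extracted text.
set -e
gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dNOPAUSE -dQUIET -dBATCH \
   -sOutputFile=apl115.pdf FinalReport/main.pdf
pdftotext FinalReport/main.pdf - | wc -w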