Personal Robots

Projects in personal robots are where we learn about the different technologies associated with robotics, including microcontrollers, communication, sensors, actuators, localisation, mapping, navigation and machine intelligence.


Robot Navigation with Eddie Robot and ROS

This is a BSc final year project. In this project, we implemented the ROS navigation stack on the Eddie robot platform, replacing the original controller with a Raspberry Pi and a laptop. We learned to set up the Robot Operating System (ROS), implement the navigation stack, and configure and write the transformation, odometry and motor control nodes specific to the Eddie robot.
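The sketch below shows a minimal odometry and transform node of the kind written for this project. It is an illustration only: the wheel separation value and the get_wheel_distances() helper are assumptions standing in for the Eddie-specific encoder interface.

    #!/usr/bin/env python
    # Minimal sketch of an odometry + tf publisher node. Frame ids follow the
    # usual ROS convention (odom -> base_link); the wheel base and the
    # get_wheel_distances() helper are illustrative assumptions.
    import rospy
    import tf
    from math import sin, cos
    from nav_msgs.msg import Odometry
    from geometry_msgs.msg import Quaternion

    WHEEL_BASE = 0.39  # metres, assumed value for the Eddie platform

    def get_wheel_distances():
        """Hypothetical helper: distance travelled by each wheel since the last call."""
        return 0.0, 0.0

    def main():
        rospy.init_node('eddie_odometry')
        odom_pub = rospy.Publisher('odom', Odometry, queue_size=10)
        tf_broadcaster = tf.TransformBroadcaster()
        x = y = theta = 0.0
        rate = rospy.Rate(10)
        last_time = rospy.Time.now()

        while not rospy.is_shutdown():
            now = rospy.Time.now()
            d_left, d_right = get_wheel_distances()
            d_centre = (d_left + d_right) / 2.0
            d_theta = (d_right - d_left) / WHEEL_BASE

            # Dead-reckoning update of the robot pose.
            x += d_centre * cos(theta + d_theta / 2.0)
            y += d_centre * sin(theta + d_theta / 2.0)
            theta += d_theta

            quat = tf.transformations.quaternion_from_euler(0, 0, theta)

            # Broadcast the odom -> base_link transform for the navigation stack.
            tf_broadcaster.sendTransform((x, y, 0.0), quat, now, 'base_link', 'odom')

            # Publish the same pose as a nav_msgs/Odometry message.
            odom = Odometry()
            odom.header.stamp = now
            odom.header.frame_id = 'odom'
            odom.child_frame_id = 'base_link'
            odom.pose.pose.position.x = x
            odom.pose.pose.position.y = y
            odom.pose.pose.orientation = Quaternion(*quat)
            dt = (now - last_time).to_sec()
            if dt > 0:
                odom.twist.twist.linear.x = d_centre / dt
                odom.twist.twist.angular.z = d_theta / dt
            odom_pub.publish(odom)

            last_time = now
            rate.sleep()

    if __name__ == '__main__':
        main()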

The Eddie robot will serve as the mobile platform for our other high-level applications and research work on personal robots. The platform is able to autonomously build a map of its surroundings, localize itself within the map and navigate safely without bumping into obstacles.
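For higher-level applications, a navigation goal is sent to the navigation stack through the standard move_base action interface. The sketch below illustrates this; the node name, goal coordinates and the 'map' frame are illustrative assumptions, not values taken from the project.

    #!/usr/bin/env python
    # Minimal sketch of sending a goal to the ROS navigation stack (move_base).
    import rospy
    import actionlib
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    def send_goal(x, y):
        client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
        client.wait_for_server()

        goal = MoveBaseGoal()
        goal.target_pose.header.frame_id = 'map'
        goal.target_pose.header.stamp = rospy.Time.now()
        goal.target_pose.pose.position.x = x
        goal.target_pose.pose.position.y = y
        goal.target_pose.pose.orientation.w = 1.0  # face along the map x-axis

        client.send_goal(goal)
        client.wait_for_result()
        return client.get_state()

    if __name__ == '__main__':
        rospy.init_node('send_nav_goal')
        send_goal(2.0, 1.0)  # example target: 2 m forward, 1 m left of the map origin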

We have added a tall body to the Eddie robot to mount a Kinect sensor on it. The Kinect provides visual perception for the robot and also serves as the laser scan input to the navigation stack. The controller of the Eddie robot has been replaced with a Raspberry Pi and a laptop, which communicate over a direct LAN connection. Most of the ROS nodes run on the laptop, while the Raspberry Pi runs the nodes dealing with the drive train (motors and odometry).
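On the drive-train side, the Raspberry Pi node turns the velocity commands from the navigation stack into wheel speeds. The sketch below shows the general idea using differential-drive kinematics; the wheel separation and the set_wheel_speeds() helper are assumptions standing in for the Eddie-specific motor driver.

    #!/usr/bin/env python
    # Sketch of a motor control node: subscribe to cmd_vel and split the
    # commanded velocities into left/right wheel speeds.
    import rospy
    from geometry_msgs.msg import Twist

    WHEEL_BASE = 0.39  # metres, assumed value for the Eddie platform

    def set_wheel_speeds(left, right):
        """Hypothetical helper that talks to the motor driver."""
        rospy.logdebug('left=%.3f m/s right=%.3f m/s', left, right)

    def cmd_vel_callback(msg):
        # Differential-drive kinematics: combine linear and angular velocity
        # into individual wheel speeds.
        left = msg.linear.x - msg.angular.z * WHEEL_BASE / 2.0
        right = msg.linear.x + msg.angular.z * WHEEL_BASE / 2.0
        set_wheel_speeds(left, right)

    if __name__ == '__main__':
        rospy.init_node('eddie_motor_control')
        rospy.Subscriber('cmd_vel', Twist, cmd_vel_callback)
        rospy.spin()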

People

Data/Code

Publications

Videos

Note: The video has been sped up by about 2.5×. The scene is outside the Laboratory of Robotics and Intelligent System (Robolab V3) at Universiti Brunei Darussalam (UBD).


Robot Navigation with iRobot Create Robot

This is a BSc final year project. We implemented Simultaneous Localization and Mapping (SLAM) on an iRobot Create robot, the Wanderer-v1. The implementation is written in Python. In this project, we learned the basic concepts of SLAM and implemented the various algorithms involved.

SLAM was implemented from scratch, including the sensor model, the motion model and particle-filter localization; the overall approach follows FastSLAM. In addition, A* search was implemented for path planning. The robot is able to navigate to a given location while performing SLAM and avoiding obstacles.
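The sketch below outlines a single particle-filter update of the kind used in the localization step, assuming NumPy. The motion_model() and measurement_likelihood() functions stand in for the motion and sensor models mentioned above, and the noise values are illustrative, not the project's actual parameters.

    # Minimal sketch of one predict/correct/resample cycle of a particle filter.
    import numpy as np

    def motion_model(pose, control, noise=(0.02, 0.02, 0.01)):
        """Sample a new pose (x, y, theta) given an odometry increment (dx, dy, dtheta)."""
        dx, dy, dtheta = control
        x, y, theta = pose
        x += dx + np.random.normal(0, noise[0])
        y += dy + np.random.normal(0, noise[1])
        theta += dtheta + np.random.normal(0, noise[2])
        return np.array([x, y, theta])

    def measurement_likelihood(pose, scan, grid_map):
        """Hypothetical sensor model: how well the laser scan fits the map at this pose."""
        return 1.0  # placeholder; a real model compares scan endpoints with the map

    def particle_filter_step(particles, weights, control, scan, grid_map):
        # 1. Prediction: propagate each particle through the motion model.
        particles = np.array([motion_model(p, control) for p in particles])
        # 2. Correction: weight each particle by the sensor model.
        weights = np.array([measurement_likelihood(p, scan, grid_map) for p in particles])
        weights /= weights.sum()
        # 3. Resampling: draw particles in proportion to their weights.
        idx = np.random.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
        return particles, weights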

The robot is equipped with a Kinect sensor to provide the laser scan, and odometry comes from the iRobot Create. No other sensors are used at the moment.

People

  • Muhammad Amirul Akmal Haji Menjeni
  • Ong Wee Hong
  • Pg Dr Haji NorJaidi bin Pg Hj Tuah

Data/Code

Publications

Videos

Note: The video has been sped up by about 3×. The scene is outside the Laboratory of Robotics and Intelligent System (Robolab V3) at Universiti Brunei Darussalam (UBD).


Robot Control Over Internet

This is a BSc final year project. We learned to remotely control a LEGO Mindstorms robot over the internet to provide remote surveillance with live camera feedback.
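As a rough illustration of the general idea (not the project's actual implementation), the sketch below shows a small TCP server that receives text commands from a remote client and forwards them to a hypothetical drive() helper; the port number is arbitrary, and the LEGO Mindstorms interface and the live camera stream are not shown.

    # Illustrative sketch only: a minimal TCP command relay for remote robot control.
    import socket

    def drive(command):
        """Hypothetical helper that forwards a command to the robot."""
        print('executing:', command)

    def serve(port=9000):
        server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(('', port))
        server.listen(1)
        conn, addr = server.accept()
        print('client connected from', addr)
        buffer = b''
        while True:
            data = conn.recv(1024)
            if not data:
                break
            buffer += data
            # Commands are newline-terminated, e.g. 'forward', 'stop'.
            while b'\n' in buffer:
                line, buffer = buffer.split(b'\n', 1)
                drive(line.decode().strip())
        conn.close()

    if __name__ == '__main__':
        serve()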

People

  • Kenneth Chiang-Yu Chin
  • Ong Wee Hong
  • Dr Seyed Mohamed Buhari

Publications