Personal Robots

Projects in personal robots are where we learn about the different technologies associated with robotics, including microcontrollers, communication, sensors, actuators, perception, localisation, mapping, navigation and machine intelligence.


AR-based Indoor Navigator

“Global” positioning in indoor environments has been challenging due to the absence of a standard and reliable positioning system: satellite GPS (Global Positioning System) is not usable indoors. We are studying existing work on indoor positioning systems with the aim of developing one that we can use for autonomous mobile robot navigation.

As a start, we are looking at fiducial markers. Our current work is not yet deployed on a mobile robot. Instead, we are developing an Augmented Reality (AR)-based indoor navigator mobile app that guides a user from any location to a desired destination inside a building. The ultimate goal is to apply the indoor positioning technique to mobile robots. The project started with an AR-based outdoor navigator application for the UBD campus, which we are now extending with indoor navigation capability.
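The core idea behind fiducial-marker positioning can be sketched in a few lines: each marker's pose in the building map is known in advance, and a detector (e.g. OpenCV's ArUco module) estimates the camera's pose relative to the marker it sees; composing the two transforms yields the user's global pose. The 2D version below is an illustrative sketch, not our app's actual code:

```python
import math

def locate_camera(marker_x, marker_y, marker_theta, rel_x, rel_y, rel_theta):
    """Compose a marker's known map pose with the camera pose measured
    relative to that marker (2D), returning the camera's pose in the map.

    (marker_x, marker_y, marker_theta): marker pose in the building map.
    (rel_x, rel_y, rel_theta): camera pose in the marker's frame, as a
    marker detector would estimate it.
    """
    cos_t, sin_t = math.cos(marker_theta), math.sin(marker_theta)
    cam_x = marker_x + cos_t * rel_x - sin_t * rel_y
    cam_y = marker_y + sin_t * rel_x + cos_t * rel_y
    return cam_x, cam_y, marker_theta + rel_theta
```

For example, a camera observed 1 m along the facing axis of a marker placed at (5, 2) and oriented at 90° is located at (5, 3) in the map.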

People

  • Lim Huey Theeng
  • Nur Afifah Ilyana binti Ilham
  • Ong Wee Hong
  • Owais Ahmed Malik

Data/Code

Publications

Videos


Application of Deep Learning in Visual SLAM

This is Nazrul’s MSc project. Nazrul is exploring the potential of the latest deep learning networks to improve the performance of Visual SLAM. His current focus is on improving feature detection and matching.
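As context for the matching step, the classical baseline that learned descriptors plug into is brute-force nearest-neighbour matching with Lowe's ratio test. The sketch below (illustrative only, with plain Python lists standing in for real descriptor arrays) shows the idea:

```python
def match_descriptors(desc_a, desc_b, ratio=0.8):
    """Brute-force nearest-neighbour matching with Lowe's ratio test.

    desc_a, desc_b: lists of equal-length feature vectors (learned
    descriptors from a network, or classical ones such as ORB/SIFT).
    Returns (i, j) index pairs where a descriptor's best match is
    clearly better than its second best, rejecting ambiguous matches.
    """
    def dist2(u, v):                      # squared Euclidean distance
        return sum((a - b) ** 2 for a, b in zip(u, v))

    matches = []
    for i, da in enumerate(desc_a):
        ranked = sorted((dist2(da, db), j) for j, db in enumerate(desc_b))
        # Ratio test on squared distances, hence ratio**2.
        if len(ranked) >= 2 and ranked[0][0] < (ratio ** 2) * ranked[1][0]:
            matches.append((i, ranked[0][1]))
    return matches
```

Replacing the descriptor (and possibly the matcher itself) with a learned network is exactly where deep learning can improve a Visual SLAM front end.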

People

  • Muhammad Nazrul Fitri Bin Hj Ismail
  • Owais Ahmed Malik (Main Supervisor)
  • Ong Wee Hong

Data/Code

Publications

Videos


Application of Reinforcement Learning in Robot Navigation

This is Hafiq’s MSc project. In this research, we explore the use of Reinforcement Learning (RL), especially deep reinforcement learning, in robot navigation. The ultimate goal is to improve the local planning of the navigation system through RL, targeting performance in crowded or complex environments where existing local planning techniques are not efficient.
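The value-learning idea behind deep-RL planners can be shown on a toy problem. The sketch below is tabular Q-learning on a small grid with one obstacle (not our robot stack, and the grid, rewards and hyperparameters are illustrative choices): the agent learns which action is best in each cell to reach the goal.

```python
import random

GRID, GOAL, WALL = 4, (3, 3), (1, 1)
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # up, down, left, right

def step(state, a):
    """Apply action a; bumping a wall or the obstacle keeps the agent put."""
    nxt = (state[0] + ACTIONS[a][0], state[1] + ACTIONS[a][1])
    if not (0 <= nxt[0] < GRID and 0 <= nxt[1] < GRID) or nxt == WALL:
        return state, -1.0                    # bump penalty
    return nxt, (10.0 if nxt == GOAL else -0.1)

def train(episodes=2000, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning with an epsilon-greedy behaviour policy."""
    random.seed(seed)
    Q = {(r, c): [0.0] * 4 for r in range(GRID) for c in range(GRID)}
    for _ in range(episodes):
        s = (0, 0)
        for _ in range(50):
            a = (random.randrange(4) if random.random() < eps
                 else max(range(4), key=lambda i: Q[s][i]))
            s2, r = step(s, a)
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
            if s == GOAL:
                break
    return Q
```

A deep-RL local planner replaces the table `Q` with a neural network over sensor inputs, which is what makes the approach scale to crowded, continuous environments.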

People

Data/Code

Publications

Videos


Robot Navigation with Eddie Robot and ROS

This is a BSc final year project. In this project, we implemented the ROS navigation stack on the Eddie robot platform, replacing its controller with a Raspberry Pi and a laptop. We learned to set up the Robot Operating System (ROS), implement the navigation stack, and configure and write the transformation, odometry and motor control nodes specific to the Eddie robot.

The Eddie robot will serve as the mobile platform for our other high-level applications and research work on personal robots. The platform is able to autonomously build a map of its surroundings, localize itself within the map, and navigate around safely without bumping into obstacles.

We have added a tall body to the Eddie robot to mount a Kinect sensor, which provides the robot's visual perception and also serves as the laserscan input to the navigation stack. The controller of the Eddie robot has been replaced with a Raspberry Pi and a laptop, and the two computers communicate over a direct LAN connection. Most of the ROS nodes run on the laptop, while the Raspberry Pi runs the nodes dealing with the drive train (motors and odometry).
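The odometry node mentioned above boils down to the standard dead-reckoning update for a differential-drive base: integrate wheel encoder ticks into a pose estimate. The sketch below is illustrative; the default tick counts, wheel radius and wheel base are placeholder values, not Eddie's actual calibration:

```python
import math

def update_pose(x, y, theta, ticks_l, ticks_r,
                ticks_per_rev=36, wheel_radius=0.0762, wheel_base=0.39):
    """One dead-reckoning step for a differential-drive base.

    (x, y, theta): current pose estimate; ticks_l/ticks_r: encoder ticks
    since the last update. Default parameters are placeholders, not the
    Eddie robot's real specs. Returns the new (x, y, theta).
    """
    m_per_tick = 2 * math.pi * wheel_radius / ticks_per_rev
    d_l, d_r = ticks_l * m_per_tick, ticks_r * m_per_tick
    d = (d_l + d_r) / 2.0                    # distance moved by the centre
    d_theta = (d_r - d_l) / wheel_base       # change in heading
    # Integrate along the mid-heading for a better small-arc approximation.
    x += d * math.cos(theta + d_theta / 2.0)
    y += d * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta
```

In a ROS node, this pose would be published as an odometry message and broadcast as the odom-to-base transform that the navigation stack expects.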

People

Data/Code

Publications

Videos

Note: The video has been sped up by about 2.5×. The scene is outside the Laboratory of Robotics and Intelligent System (Robolab V3) at Universiti Brunei Darussalam (UBD).


Robot Navigation with iRobot Create Robot

This is a BSc final year project. We implemented Simultaneous Localization and Mapping (SLAM) on an iRobot Create robot, the Wanderer-v1. The implementation is in Python. In this project, we learned the basic concepts of SLAM and implemented the various algorithms involved.

SLAM was implemented from scratch, including the sensor model, the motion model and localization using a particle filter; the implementation is a FastSLAM. In addition, A* search has been implemented for path planning. The robot is able to navigate to a given location while performing SLAM and avoiding obstacles.
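The path-planning step can be illustrated with a minimal A* over an occupancy grid. This is a sketch of the standard algorithm, not the Wanderer-v1's actual code (the grid representation and 4-connectivity are illustrative choices):

```python
import heapq
from itertools import count

def astar(grid, start, goal):
    """A* search over a 4-connected occupancy grid (0 = free, 1 = obstacle).
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    tie = count()                  # tiebreaker so the heap never compares cells
    frontier = [(h(start), 0, next(tie), start, None)]
    came_from = {}
    best_g = {start: 0}
    while frontier:
        _, g, _, cur, parent = heapq.heappop(frontier)
        if cur in came_from:
            continue               # already expanded with an equal or better cost
        came_from[cur] = parent
        if cur == goal:            # walk parents back to reconstruct the path
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0
                    and g + 1 < best_g.get(nxt, float("inf"))):
                best_g[nxt] = g + 1
                heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, next(tie), nxt, cur))
    return None
```

Because the Manhattan heuristic never overestimates the remaining cost on a 4-connected grid, the first path found is a shortest one.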

The robot is equipped with a Kinect sensor to provide the laserscan. The odometry is from the iRobot Create. No other sensors are used at the moment.

People

  • Muhammad Amirul Akmal Haji Menjeni
  • Ong Wee Hong
  • Pg Dr Haji NorJaidi bin Pg Hj Tuah

Data/Code

Publications

Videos

Note: The video has been sped up by about 3×. The scene is outside the Laboratory of Robotics and Intelligent System (Robolab V3) at Universiti Brunei Darussalam (UBD).


Robot Control Over Internet

This is a BSc final year project. We learned to remotely control a LEGO Mindstorms robot over the internet to provide remote surveillance with live camera feedback.
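The teleoperation idea can be sketched as a small TCP command loop. This is illustrative only, not the project's actual code: the server acknowledges each newline-terminated drive command, whereas a real setup would translate commands into LEGO Mindstorms motor calls and stream camera frames back on a separate channel.

```python
import socket
import threading

def command_server(host="127.0.0.1", port=0):
    """Start a one-connection command server; returns the bound port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)

    def serve():
        conn, _ = srv.accept()
        with conn:
            buf = b""
            while True:
                data = conn.recv(1024)
                if not data:
                    break
                buf += data
                while b"\n" in buf:
                    cmd, buf = buf.split(b"\n", 1)
                    # A real robot would act on cmd.decode() here
                    # (e.g. drive motors), then acknowledge.
                    conn.sendall(b"ACK " + cmd + b"\n")
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()[1]

def send_command(port, cmd, host="127.0.0.1"):
    """Send one command and return the server's acknowledgement."""
    with socket.create_connection((host, port)) as c:
        c.sendall(cmd.encode() + b"\n")
        return c.makefile().readline().strip()
```

Over the internet, the same loop would sit behind the robot's public address, with the live camera feed handled by a separate streaming connection.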

People

  • Kenneth Chiang-Yu Chin
  • Ong Wee Hong
  • Dr Seyed Mohamed Buhari

Publications