ECE497 SLAM via ROS
Team members: Elias White
Executive Summary
In autonomous navigation, understanding the robot's surrounding environment, as well as its position within that environment, is of paramount importance. This project attempts to leverage the open-source work behind simultaneous localization and mapping (SLAM) algorithms, running them on the BeagleBoard-xM, to develop a 3-D model of the world surrounding the board as it moves through space. In general, the more (quality) sensor data a SLAM algorithm receives, the better its results; for now, however, a camera will be the only sensor, although a gyroscope may be incorporated later. A primary objective of this project is to test the feasibility of using the BeagleBoard-xM as the "brain" of an autonomous quad-copter.
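To make the goal concrete, the following is a minimal sketch (not this project's code) of the core step a camera-only SLAM front-end performs on every frame: track features between consecutive images and recover the relative camera motion, which the full system would then use to extend its 3-D map. It assumes OpenCV 3 or later and a known camera matrix; all values and names here are illustrative.

```python
# Illustrative sketch of one frame-to-frame step in monocular visual SLAM:
# match features between two frames and recover the relative camera motion.
# Assumes OpenCV >= 3; the intrinsics below are placeholder values.
import cv2
import numpy as np

K = np.array([[525.0, 0.0, 320.0],   # assumed pinhole intrinsics (fx, 0, cx)
              [0.0, 525.0, 240.0],   # (0, fy, cy)
              [0.0, 0.0, 1.0]])

orb = cv2.ORB_create(1000)                          # feature detector/descriptor
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def relative_motion(prev_gray, curr_gray):
    """Estimate rotation R and unit-scale translation t between two frames."""
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    # The essential matrix plus a cheirality check give camera motion up to scale;
    # a full SLAM system would also triangulate landmarks and optimize the map.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t
```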
Installation Instructions
Step-by-step instructions for installing the project on the SPEd2 image will go here, including:
- A link to the project's GitHub repository (in the form https://github.com/MarkAYoder/gitLearn).
- Any additional packages installed via opkg.
- Any kernel modifications.
- Links for obtaining any extra hardware that is required.
User Instructions
Once everything is installed, details on how to use the program will go here; if the user manual grows long, it will be linked instead.
Highlights
There are currently no highlights yet, but this video gives an idea of what I would like to do, although the quality of their results is much higher than I expect to achieve.
Theory of Operation
This section will give a high-level overview of the software structure: how camera frames are captured (for example, whether a GStreamer pipeline is used, with a diagram of that pipeline), which ROS nodes or tasks are running, what each one does, and how they interact.
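As a concrete but purely illustrative placeholder for that overview, the sketch below shows one plausible layout under ROS: a camera-capture node (for example, gscam wrapping a GStreamer pipeline) publishes sensor_msgs/Image messages, and a SLAM node subscribes to them and republishes the estimated camera pose. The node and topic names are assumptions, not the final design.

```python
#!/usr/bin/env python
# Hypothetical sketch of the SLAM-side ROS node; topic/node names are assumptions.
import rospy
from sensor_msgs.msg import Image
from geometry_msgs.msg import PoseStamped

class SlamNode(object):
    def __init__(self):
        # Camera frames from a capture node (e.g. gscam publishing /camera/image_raw).
        self.image_sub = rospy.Subscriber('/camera/image_raw', Image, self.on_image)
        # Estimated camera pose for downstream consumers (e.g. a flight controller).
        self.pose_pub = rospy.Publisher('/slam/pose', PoseStamped)

    def on_image(self, msg):
        # Placeholder: a real implementation would hand the frame to the SLAM
        # library and publish the resulting estimate; here we only forward an
        # empty pose stamped with the image's header.
        pose = PoseStamped()
        pose.header = msg.header
        self.pose_pub.publish(pose)

if __name__ == '__main__':
    rospy.init_node('slam_node')
    SlamNode()
    rospy.spin()
```

Keeping capture and SLAM in separate nodes would let the camera pipeline and the SLAM computation be developed and restarted independently on the board.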
Work Breakdown
Since this is a one-person team, I will be the only one working on this project.
Future Work
Suggestions for additional things that could be done with this project will go here (for example, incorporating the gyroscope mentioned above as a second sensor).