ECE497 SLAM via ROS

From eLinux.org

[[Category:ECE497 |Project]]

Team members: [[user:whiteer|Elias White]]

== Grading Template ==

I'm using the following template to grade.  Each slot is 10 points.
0 = Missing, 5=OK, 10=Wow!

<pre style="color:red">
05 Executive Summary
00 Installation Instructions
00 User Instructions
00 Highlights
00 Theory of Operation
00 Work Breakdown
00 Future Work
00 Conclusions
00 Demo
00 Late

Comments: Let's talk about this.

Score:  5/100
</pre>
  
 
== Executive Summary ==

<!--
Give two sentence intro to the project.
-->

<span style="color:red">(This is a big project for one person.  Let's talk about your progress today.)</span>

In autonomous navigation, understanding the robot's surrounding environment, as well as its position in that environment, is of paramount importance.  This project attempts to leverage open-source simultaneous localization and mapping (SLAM) algorithms and use them, together with the BeagleBoard-xM, to develop a 3-D model of the world surrounding the board as it moves through space.  The more (quality) sensory data a SLAM algorithm uses, the better its results, but at this time a camera will be the only sensor, although there is the possibility of incorporating a gyroscope.  A primary objective of this project is to test the feasibility of using the BeagleBoard-xM as the "brain" for an autonomous quad-copter.
  
<!--
Give two sentences telling what works.
-->

An embedded version of Ubuntu has been successfully installed, and ROS has been installed on top of it.  While I make my final SLAM algorithm decision, nothing else works yet.
  
 
<!--
Give two sentences telling what isn't working.
-->

Building the world model and localizing myself in it.
  
 
<!--
The sentence count is approximate and only to give an idea of the expected length.
-->

With this project I hope to provide an example for my aerial-robotics team members to follow, and a starting point that leads them to increasingly awesome aerial robotics projects.
  
 
== Installation Instructions ==

<!--
Give step by step instructions on how to install your project on the SPEd2 image.

* Include kernel mods.
* If there is extra hardware needed, include links to where it can be obtained.
-->

There are a few fairly large installs (embedded Ubuntu, ROS) required.  I'll clean up the procedure and post it here when it is at its most painless.

<span style="color:red">(Do you have a git repository?)</span>
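
Until the cleaned-up procedure lands here, a rough sketch of the ROS side of the install, assuming the Fuerte-era instructions for Ubuntu 12.04 ("precise") from ros.org; repository path, key URL, and package names should be verified against whatever image you are actually running:

```shell
# Add the ros.org package repository for Ubuntu 12.04 "precise";
# the repository path and key URL are from the Fuerte-era instructions.
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu precise main" > /etc/apt/sources.list.d/ros-latest.list'
wget http://packages.ros.org/ros.key -O - | sudo apt-key add -

# A bare-bones ROS keeps the footprint small on the -xM;
# swap in ros-fuerte-desktop-full if you want the GUI tools.
sudo apt-get update
sudo apt-get install ros-fuerte-ros-base

# Make the ROS environment available in every new shell
echo "source /opt/ros/fuerte/setup.bash" >> ~/.bashrc
source /opt/ros/fuerte/setup.bash
```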
  
 
== User Instructions ==

<!--
Once everything is installed, how do you use the program?  Give details here, so if you have a long user manual, link to it here.
-->

Pending...
  
 
== Highlights ==

While there are currently no highlights, this [http://www.youtube.com/watch?feature=player_embedded&v=IMSozUpFFkU video] provides an idea of what I would like to do, although the quality of their results is much higher than I expect to achieve.

<span style="color:red">(How much of what's in the video comes with ROS?)</span>
  
== Theory of Operation ==

<!--
Give a high level overview of the structure of your software.  Are you using GStreamer?  Show a diagram of the pipeline.  Are you running multiple tasks?  Show what they do and how they interact.
-->

The operating system running on the -xM is an embedded version of Ubuntu 12.04.  ROS (Robot Operating System) is installed on top of it, providing hardware abstraction, device drivers, libraries, et cetera, that simplify control of the robot platform.  OpenCV is integrated with ROS and is responsible for interpreting the data provided by the camera and constructing an accurate representation of the world.  While I haven't yet made my final decision on which SLAM algorithm to use, I am leaning towards [http://www.openslam.org/gmapping.html GMapping].
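
For a flavor of what GMapping maintains internally: it is a particle-filter SLAM method whose map is an occupancy grid, where each cell accumulates evidence of being occupied.  The sketch below is a minimal, self-contained illustration of the standard log-odds cell update, not GMapping's actual code, and the sensor-model constants are made up:

```python
import math

# Log-odds occupancy grid update: each cell stores log(p / (1 - p)).
# The sensor-model constants below are illustrative, not GMapping's.
L_OCC = math.log(0.7 / 0.3)    # evidence when a beam ends in the cell
L_FREE = math.log(0.3 / 0.7)   # evidence when a beam passes through it

def update_cell(logodds, occupied):
    """Fold one range measurement into a cell's log-odds value."""
    return logodds + (L_OCC if occupied else L_FREE)

def probability(logodds):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(logodds))

# A cell observed 'occupied' three times becomes confidently occupied
l = 0.0                        # prior log-odds 0, i.e. p = 0.5
for _ in range(3):
    l = update_cell(l, True)
print(probability(l))          # climbs above 0.9
```

Each laser return would mark the hit cell occupied and the cells along the beam free; the particle filter on top of this scores candidate robot poses by how well new scans match the grid.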
  
 
== Work Breakdown ==

<!--
List the major tasks in your project and who did what.

Also list here what doesn't work yet and when you think it will be finished and who is finishing it.
-->

As a one-person group, I'll be the only one working on this project.
  
 
== Future Work ==

<!--
Suggest additional things that could be done with this project.
-->

In order to improve performance one could bolster the sensory profile of the platform.  Useful sensors include:

#Laser scanning range-finder
#IMU (Inertial measurement unit)
#Digital compass
#GPS

The incorporation of the data gathered from these sensors will improve the robot's model of the world and decrease the uncertainty it has about its location in the model.
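
As a small numeric illustration of that claim, here is the textbook precision-weighted fusion of two independent Gaussian estimates of the same quantity (the numbers are made up and not tied to any particular sensor above); the fused variance is always smaller than either input's:

```python
def fuse(mean_a, var_a, mean_b, var_b):
    """Combine two independent Gaussian estimates of the same quantity.

    Standard precision-weighted average: precisions (1/variance) add,
    so the fused variance is smaller than either input variance.
    """
    var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    mean = var * (mean_a / var_a + mean_b / var_b)
    return mean, var

# Camera-only position estimate (units and values are illustrative)
cam = (2.0, 1.0)       # mean 2.0 m, variance 1.0 m^2
# Add a hypothetical laser range-finder reading of the same position
laser = (2.2, 0.25)    # more precise, so it dominates the fusion

mean, var = fuse(*cam, *laser)
print(mean, var)       # 2.16 0.2 -- pulled toward the laser,
                       # with variance below both inputs
```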

== Conclusions ==

<!--
Give some concluding thoughts about the project.  Suggest some future additions that could make it even more interesting.
-->

Nothing yet.

Revision as of 16:26, 5 November 2012
