ECE497 Project: Kinect Project

Team members: Yifei Li, Guanqun Wang

Objectives
Our goal is to port a game application to the BeagleBoard and use Microsoft's Kinect to play it. We will first make the Kinect recognizable to the BeagleBoard, and then adapt the application to the board. We have not yet decided which game to use, because performance may be greatly influenced by the operating system we choose and by the size of the game, so the second part of the project is to test which option works best for us. Finally, we will do everything we can to make it work.

Update: We previously wanted to build our own driver for the Kinect and develop our game on Android, since there is a lot of open-source code we could reference. We found a sample game that runs on Android and successfully tested it on the BeagleBoard with the Kinect, but it turned out to be quite difficult to develop our own Kinect driver. At Prof. Yoder's suggestion, we finally decided to use the Omek Beckon 2.4 SDK on the BeagleBoard as the Kinect interface for our application, and we chose Qt Creator as the cross-compilation tool for programming and debugging. Because of the limitation of the image (we could only use the image Omek provided; see the Omek Forum for details, you may need to register first), we could not transfer all the drivers and data to the image we used before, and we could not develop GUI programs with Qt, since the Omek image only supports command-line mode. We therefore chose to develop the GUI within the source code itself. We searched online and found the open-source code for a game called Q-ball, which we are going to play using the Kinect.

Executive Summary
1. Burn the Image Package and Boot the System

The detailed instructions are on page 7 of the developer's guide from Omek.

When I tried to run mkcard.sh to create new partitions, I got some errors.

One of the errors is that the shell script assumes the first partition of device "mmc" is "mmc1", but it is "mmcp1" in my Ubuntu. I changed the shell script to fix this but still got some errors about the partition specs.

Finally I found a trick that lets me use the shell script directly: if I use an external USB card reader to read and write the microSD card, everything just works. I think the author of the shell script had a different version of Linux, or only tested the script on a desktop with an external card reader.

2. Run the Sample Demos

After burning the microSD card successfully, we can boot from it and see a simplified Angstrom without a GUI. All demos ran almost as expected, but:

(1). The program froze sometimes, even when we were just typing a command. We think it may be because of the speed of the microSD card; we will get a faster card and try again.

(2). The tracking for legs does not work very well. We think it may be because we did not have enough space for the Kinect (when you play a Kinect game on the Xbox, it tells you a large space is required for tracking).

3. Compile the Sample Code on Host

Detailed instructions are provided on page 26 of the developer's guide from Omek. Following them, we downloaded and installed the Linux ARM cross toolchain. We also installed the zlib library and the cross compiler for Qt, but we had some problems when compiling Qt. The error message said make could not find the cross compiler. I used the export command to set the path of the cross compiler we used in class before (some time later I noticed this may cause problems), and a new error occurred saying "undefined reference to clock_gettime". I googled it and found it involves one library, so I modified the makefile to include the library when compiling, but it still gave the same error.

Then I went through the instructions carefully and found that they actually ask us to install two different toolchains for the same purpose. The difference between the two toolchains is the version of Qt: one is 4.6.3 and the other is 4.6.2. Since our source code is for version 4.6.2, I wanted to try the 4.6.2 toolchain to see if it made any difference.

The instructions also ask us to install two different versions of Qt Creator, one in the PDF file and one in the blog. We preferred the version in the PDF file, since the instructions should have been written against that version.

After struggling for a very long time, I decided to clean everything up and start over. I followed the instructions step by step, skipping one step, and it turned out everything works perfectly now. This Omek Forum link may help you with the version problem.

Some conclusions for compiling Qt:

(1). I reinstalled Ubuntu when starting over. It turned out you need to install the g++ compiler before getting started (type "sudo apt-get install g++"). Other than that, you don't need to do anything not mentioned in the developer's guide and the blog it quotes.

(2). The instructions from Omek are a little redundant with the blog they quote, and not totally right (at least they don't all work on my laptop).

(3). The "developer's guide" PDF file and the blog it quotes both tell the developer to install a toolchain for cross-compiling, but different versions: 4.6.3 in the PDF file and 4.6.2 in the blog. The 4.6.3 toolchain works perfectly for compiling Qt 4.6.2.

(4). You don't have to install Qt Creator before compiling Qt. The blog says you'd better download Qt Creator from the Nokia website so that you get the latest version, but in my experience that isn't really necessary. You can get a perfectly good working version by simply running "sudo apt-get install qtcreator".

(5). In the instructions for building the QtTracking Sample, step 3.b tells you to remove the existing qmake step and make your own custom step, but I always got an error when the build process reached the custom step. I tried just leaving the existing qmake step in place, and everything worked! So if you run into trouble with the custom step, try using the default qmake ("existing qmake" in the instructions from Omek).

4. Build Q-ball without Kinect Support Using Qt Creator

After setting up the development environment, we first build the game without Kinect support to see if it works. The source code is already written with Qt, so we didn't meet any problems here. The code ran successfully on our BeagleBoard.

5. Build Q-ball with Kinect Support Using Qt Creator

Lastly and most importantly, we add Kinect support to Q-ball. The source code is written in C++, but neither of us had any experience with it, so the process of programming and debugging was quite a struggle. We finally made it work: the BeagleBoard recognizes our left and right scroll gestures, but the speed is very slow. We may continue with this and make it faster if time allows.

Installation Instructions

 * Github links: Github, including the Omek image, developer's guide, Omek sample code, and Kinect-supported Q-ball source code
 * Omek Beckon SDK Development tools
 * Qtcreator

1. If you haven't installed git, see the instructions here: EBC Exercise 07. If you have, clone our repository to your host:

    host$ git clone git@github.com:wangg/ECE497.git
    host$ cd ECE497/

2. Insert your SD card. We recommend using an external USB card reader, which avoids strange mistakes.

    host$ sudo bash mkcard.sh /media/sdX beagle-omek

WARNING: If you mistakenly use your main OS device name, this script can erase your entire OS.

3. Connect your BeagleBoard, insert your newly made SD card, and plug in the power supply, Kinect, mouse, and keyboard. Log in as root, with no password.

4. Now we run the demo programs Omek gave us to see if everything works.

    cd ECE497/Beckon-SDK-2.4.16236/bin

NOTICE: The code has been modified for Q-ball; you need to untar the Beckon-SDK-2.4.16236.tar.gz file first in order to run the samples.

To run the Qt Tracking Sample with live tracking using camera input:

    sh ./tracking.sh

To run a prerecorded camera sequence (OpenNI sequence file, .oni), which you can record using the tools provided with the PC version of OpenNI:

    sh ./tracking.sh -seq path/to/directory/containing/the/sequence/file/

Note: the path points to the directory containing the sequence, not to the sequence file itself.

To run the TrackingViewer3D Sample with live tracking using camera input:

    sh ./tracking3d.sh -gest -gest ...

To run the GestureDemo Sample:

    sh ./gestures.sh -gest -gest ...

The supported list of gestures:

 * _rightClick, _leftClick
 * _rightScrollRight, _rightScrollLeft
 * _rightScrollUp, _rightScrollDown
 * _leftScrollRight, _leftScrollLeft
 * _leftScrollUp, _leftScrollDown

5. Now we move to the host side and install the toolchain and Qt Creator. There is a very good blog that shows every step you need to do; the link is here: blog. We are using version 4.6.2; you can download the file here.

Highlights
1. Our project enables the BeagleBoard to recognize the Kinect and play a game with it using gesture recognition.

2. We used Qt Creator with the embedded version of Qt to create, build, and debug the program.

3. We cross-compiled the program on our host, which made it much easier to debug and program.

4. There may be various compilation errors when building the project. Aside from problems in the code itself, we think they may be caused by the build configuration. We finally chose to use the project configuration of TrackingDemo (use that project's settings and import the sources and header files into it), since it is the closest to what we want to develop, and that just works.

Theory of Operation
The whole point is to use the Kinect interface provided by Omek to control the movement of the bat. The original control is achieved through Qt key events such as Qt::Key_Left, which set flags that are read in PlayGameState::updateScene as follows:

    // move
    if( mainWindow->isKeyDownLeft ) {
        bat_has_target = false;
        itemBat->moveBy(-bat_sp_x_c, 0);
        if( itemBat->pos().x() < 0 ) {
            itemBat->setPos(0, itemBat->pos().y());
        }
    }
    else if( mainWindow->isKeyDownRight ) {
        bat_has_target = false;
        itemBat->moveBy(bat_sp_x_c, 0);
        if( itemBat->pos().x() > scene->width() - bat_w_c ) {
            itemBat->setPos(scene->width() - bat_w_c, itemBat->pos().y());
        }
    }
    else if( bat_has_target ) {
        int bat_centre_x = itemBat->pos().x() + bat_w_c/2;
        int dist = abs(bat_centre_x - bat_target_x);
        if( dist <= bat_sp_x_c ) {
            itemBat->setPos(bat_target_x - bat_w_c/2, itemBat->pos().y());
            bat_has_target = false;
        }
        else {
            int dir = bat_target_x > bat_centre_x ? 1 : -1;
            itemBat->moveBy(dir * bat_sp_x_c, 0);
        }
    }

To replace the condition checks in the if statements above, we first need to initialize the Omek Kinect sensor. Omek gives us a good example of how to register and enable gestures, and we use that directly. The question is whether we should initialize in main or in a class. We need the init code to get the value of pGesture, which gives us the gestureName used to determine which gesture was detected, so we cannot separate them. If we put the init code in main, we will not be able to run it repeatedly. The best way is to put it in a class and use the following code to call into the class periodically.

    QTimer timer;
    QObject::connect(&timer, SIGNAL(timeout()), &w, SLOT(updateScene()));
    timer.start(time_per_frame_c);

Note: We found a good explanation of this code. See here:

At this point, we tried two different ways of coding it. Guanqun defined the Kinect init function in a header file called GestureDemo.h and called it within PlayGameState::updateScene, so every time_per_frame_c period it goes into the init function and detects which gestures we make. Yifei instead put the init function into a new function, PlayGameState::updateKinect, and added another timer, timer1_per_frame_c, so that updateKinect and updateScene would both run repeatedly. The first approach worked!

We still want to be able to use the keyboard while playing, so we didn't change anything in MainWindow::keyPressEvent and MainWindow::keyReleaseEvent:

    void MainWindow::keyPressEvent(QKeyEvent *event)
    {
        //qDebug("keyPressEvent");
        if( event->key() == Qt::Key_Left ) {
            this->keydown_left = true;
            //qDebug("left");
        }
        else if( event->key() == Qt::Key_Right ) {
            this->keydown_right = true;
            //qDebug("right");
        }
        else if( event->key() == Qt::Key_Escape ) {
            gamestate->userQuit();
        }
        else if( event->key() == Qt::Key_P ) {
            gamestate->userPause();
        }
        else if( event->key() == Qt::Key_Return ) {
            QPushButton *button = qobject_cast< QPushButton* >(qApp->focusWidget());
            if (button) {
                button->click();
            }
        }
        else {
            QMainWindow::keyPressEvent(event);
        }
    }

    void MainWindow::keyReleaseEvent(QKeyEvent *event)
    {
        //qDebug("keyReleaseEvent");
        if( event->key() == Qt::Key_Left ) {
            this->keydown_left = false;
        }
        else if( event->key() == Qt::Key_Right ) {
            this->keydown_right = false;
        }
        else {
            QMainWindow::keyReleaseEvent(event);
        }
    }

Now, let's take a look at how the Kinect sensor is initialized:

This creates the sensor, either from a prerecorded sequence or from the camera:

    if (a == 0) {
        // create the sensor
        if (sequencePath) {
            // from a sequence
            pSensor = IMotionSensor::createSequenceSensor(sequencePath);
        }
        else {
            // from a camera
            pSensor = IMotionSensor::createCameraSensor(true);
        }

        if (pSensor == NULL) {
            cerr << "Error, failed creating sensor." << endl;
            ret = 1;
            goto end;
        }
    }

This initializes the tracking algorithm; OMK_SUCCESS and setTrackingOptions are defined in the header files Omek provides:

    // initialize the tracking algorithm
    if (pSensor->setTrackingOptions(TRACK_ALL) != OMK_SUCCESS) {
        cerr << "Error, failed to set tracking options." << endl;
        ret = 1;
        goto end;
    }

This enables the gestures that we want to use; only enabled gestures can be fired:

    // enable the selected gestures
    numberOfGestures = 1;
    for (int i = 0; i < numberOfGestures; i++) {
        if (pSensor->enableGesture("_rightScrollRight") != OMK_SUCCESS) {
            cerr << "Error, unrecognized gesture: " << "_rightScrollRight" << endl;
            ret = 1;
            goto end;
        }
        if (pSensor->enableGesture("_leftScrollLeft") != OMK_SUCCESS) {
            cerr << "Error, unrecognized gesture: " << "_leftScrollLeft" << endl;
            ret = 1;
            goto end;
        }
        /*if (pSensor->enableGesture("_rightClick") != OMK_SUCCESS) {
            cerr << "Error, unrecognized gesture: " << "_rightClick" << endl;
            ret = 1;
            goto end;
        }*/
    }

We want to rescan the gestures again and again to control the bar, so this code is used:

    // run the main loop as long as there are frames to process
    while (pSensor->isAlive() && g_run_gestures) {
        // handle the current frame only if we have successfully processed a new image
        bool bHasNewImage = false;
        if ((pSensor->processNextImage(true, bHasNewImage) == OMK_SUCCESS) && bHasNewImage) {
            processedFrames++;
            while (pSensor->hasMoreGestures()) {
                const IFiredEvent* pFiredEvent = pSensor->popNextGesture();
                std::stringstream text;
                text << endl << "Gesture (" << (pFiredEvent->getName() != NULL ? pFiredEvent->getName() : "")
                     << ") fired in frame " << processedFrames;
                cout << text.str().c_str() << endl;
                std::string gestureName = pFiredEvent->getName();
                if (gestureName == "_rightScrollRight") { haha = 1; }
                else if (gestureName == "_leftScrollLeft") { haha1 = 1; }
                //else if (gestureName == "_rightClick") { haha1 = 0; haha = 0; }

                pSensor->releaseGesture(pFiredEvent);
                g_run_gestures = 0;
            }
        }
    }

Notice that two while loops are used: the outer one checks whether the sensor is still in use, and the inner one checks whether more gestures have been fired.

Work Breakdown
1. Installed the Beckon SDK development tools on both the Linux PC host and the BeagleBoard; successfully tested the sample code and discussed the interface functions. ---completed on February 5th by Yifei Li and Guanqun Wang