
ECE497 Project: Kinect Project

From eLinux.org
Revision as of 19:55, 24 February 2012


Team members: Yifei Li, Guanqun Wang

Grading

Update your Objectives to reflect the current objectives. Presently it captures the history of your thinking rather than the current state. The Executive Summary isn't one. See my template for what should be in the summary.

Objectives

Our goal is to port a specific game application to the BeagleBoard and use Microsoft's Kinect to play it. We would first make the Kinect recognizable to the BeagleBoard, and then try to adapt the application to the BeagleBoard. We haven't decided which game to use, because performance may be greatly influenced by the operating system we choose and by the size of the game. So the second part would be to test which option is best for us. Finally, we would try everything to make it work.

Update: We previously wanted to build our own driver for the Kinect and develop a specific game on Android, since there is a lot of open-source code that can be referenced. We got a sample game that could run on Android and successfully tested it on the BeagleBoard with the Kinect, but it seemed quite difficult to develop our own Kinect driver. So, at Prof. Yoder's suggestion, we decided to use the Omek Beckon 2.4 SDK on the BeagleBoard, which comes with a Kinect driver, to develop our application, and we chose Qt Creator as the cross-compilation tool for programming and debugging. Because of the limitations of the image (we could only use the image Omek provided; see the Omek Forum for details, you may need to register first), we could neither transfer all the drivers and data to the image we had been using, nor develop GUI programs with Qt, since the Omek image only supports command-line mode. We therefore chose to implement the GUI within the source code itself. We searched online and found the open-source code for a game called Q-ball (thanks to the original author of this code!). Now we are going to rebuild it and use the Kinect to play it.

Executive Summary

This isn't an executive summary. See my template for what should be in the summary.

1. Burn the Image Package and Boot the System

The detailed instructions are on page 7 of the developer's guide from Omek. (Give a link to this so it's easy to find.)

When I tried to run mkcard.sh to create new partitions, I got some errors.

One of the errors is that the shell script assumes the first partition of the "mmc" device is "mmc1", but it is "mmcp1" on my Ubuntu. I changed the shell script to fix this but still got some errors about the partition specs.

Finally I found a workaround that lets the shell script be used directly: if I use an external USB card reader to read and write the microSD card, everything just works. I suspect the author of the shell script had a different version of Linux, or only tested the script on a desktop with an external card reader.

2. Run the Sample Demos

After burning the microSD card successfully, we can boot from it into a stripped-down Angstrom without a GUI. All demos ran almost as expected, but:

(1) The program froze sometimes, even when we were just typing commands. We think it may be because of the speed of the microSD card; we will get a faster card and try again.

(2) The tracking for legs doesn't work very well; we think it may be because we didn't have enough space in front of the Kinect. (When you play a Kinect game on the Xbox, it tells you that a large space is required for tracking.)

3. Compile the Sample Code on Host

Detailed instructions are provided on page 26 of the developer's guide from Omek. Following them, we downloaded and installed the Linux ARM cross-toolchain. We also installed the zlib library and cross-compiled it for Qt. But we had some problems when compiling Qt: the error message said make could not find the cross-compiler. I used the export command to set the path to the cross-compiler we had used in class before (some time later I noticed this may cause problems), and a new error occurred: "undefined reference to clock_gettime". I googled it and found it involves one library, so I modified the makefile to include that library when compiling, but it still gave the same error.

Then I went through the instructions carefully and found that they actually ask us to install two different toolchains for the same purpose. The difference between the two is the Qt version: one is 4.6.3 and the other is 4.6.2. Since our source code targets version 4.6.2, I wanted to try the 4.6.2 toolchain to see if it made any difference.

The instructions also ask us to install two different versions of Qt Creator, one in the PDF file and one in the blog; we preferred the version in the PDF file, since the instructions should have been written against that version.

After struggling for a very long time, I decided to clean everything up and start over. I followed the instructions step by step but skipped one step, and now everything works perfectly. The link Omek Forum may help you with the version problem.

Some conclusion for compiling Qt:

(1). I reinstalled Ubuntu when starting over. It turned out you need to install the g++ compiler before getting started (type "sudo apt-get install g++"). Other than that, you don't need to do anything not mentioned in the developer's guide and the blog it quotes.

(2). The instructions from Omek are somewhat redundant with the blog they quote, and not totally right (at least they don't all work on my laptop).

(3). The "developer's guide" PDF and the blog it quotes both tell the developer to install a toolchain for cross-compiling (but different versions: 4.6.3 in the PDF and 4.6.2 in the blog). The 4.6.3 toolchain works perfectly for compiling Qt 4.6.2.

(4). You don't have to install Qt Creator before compiling Qt. The blog says you'd better download Qt Creator from the Nokia website so that you get the latest version, but in my experience that isn't really necessary. You can get a perfectly good working version by simply running "sudo apt-get install qtcreator".

(5). In the instructions for building the QtTracking sample, step 3.b tells you to remove the existing qmake step and make your own custom step, but I always got an error when the build reached the custom step. I tried leaving the existing qmake step in place instead, and everything worked! So if the custom build step gives you trouble, just try the default qmake ("existing qmake" in the instructions from Omek).

4. Build Q-ball without Kinect support using Qt Creator

After setting up the development environment, we first build the game without Kinect support to see if it works. The source code is already written for Qt, so we didn't meet any problems here. The code runs successfully on our BeagleBoard, and it is pretty fast!

5. Build Q-ball with Kinect support using Qt Creator

Lastly and most importantly, we add Kinect support to Q-ball. Details can be found in the "Theory of Operation" section of this wiki page. The source code is written in C++, and neither of us had any experience with it, so programming and debugging were quite a struggle. However, we finally made it work: the BeagleBoard recognizes our left and right scroll gestures. We have already set the fastest scan period for the gestures, but the response is still very slow. We may work on making it faster if more time is available.

Installation Instructions

  • Github links: Github, including the Omek image, developer guide, Omek sample code, and Kinect-enabled Q-ball source code (the file "new" is the final version we use)
  • Omek Beckon SDK Development tools
  • Qtcreator

1. If you haven't installed git, see the instructions at EBC Exercise 07. Once you have, clone our repository to your host.

host$ git clone git@github.com:wangg/ECE497.git
host$ cd ECE497/

2. Insert your SD card. We recommend using an external USB card reader, which avoids strange errors.

sudo bash mkcard.sh /media/sdX beagle-omek

WARNING: If you mistakenly use the main OS device name this script can erase your entire OS.

3. Connect your BeagleBoard, insert your newly made SD card, and plug in the power supply, Kinect, mouse, and keyboard. The login user is root, with no password.

4. Now we want to run the demo program Omek gave us to see if everything works.

cd ECE497/Beckon-SDK-2.4.16236/bin

NOTICE: The code you find in sample has been modified for Q-ball; you need the original sample code from the tar file we provide. Untar the Beckon-SDK-2.4.16236.tar.gz file first in order to run the sample.

To run Qt Tracking Sample in live tracking using camera input:

# sh ./tracking.sh

Alternatively, you can record a camera sequence using the tools provided with the PC version of OpenNI. To run a prerecorded camera sequence (OpenNI sequence file, .oni):

# sh ./tracking.sh -seq path/to/directory/containing/the/sequence/file/

Note: the path points to a directory containing the sequence, not to the sequence file itself.

To run TrackingViewer3D Sample in live tracking using camera input:

# sh ./tracking3d.sh -gest <gesture name> -gest ...

To run GestureDemo Sample, you need to run

# sh ./gestures.sh -gest <gesture name> -gest ...

The supported gestures are:

  • _rightClick, _leftClick
  • _rightScrollRight, _rightScrollLeft
  • _rightScrollUp, _rightScrollDown
  • _leftScrollRight, _leftScrollLeft
  • _leftScrollUp, _leftScrollDown


5. Now we move to the host side and install the toolchain and Qt Creator. There is a very good blog that shows every step you need; the link is here: blog. We are using the 4.6.3 version; you can download the file here. The developer guide is also a good reference.

If your host is newly installed, install g++ first.

sudo apt-get install g++

Install the Linux (Angstrom distro) ARM cross-toolchain: extract the downloaded archive into your system root directory:

sudo tar -xvjf angstrom-2011.03-i686-linux-armv7a-linux-gnueabi-toolchain-qte-4.6.3.tar.bz2 -C /

Next, install the Qt embedded SDK for Angstrom/BeagleBoard. First install the zlib library:

sudo apt-get install zlib1g-dev

You need to create a new qmake.conf. Run the following line to make a new mkspecs directory for the BeagleBoard processor.

cp -R [DownloadDirectory]/mkspecs/qws/linux-arm-g++/ [DownloadDirectory]/mkspecs/qws/linux-DM3730-g++/

Edit the qmake.conf file found in [DownloadDirectory]/mkspecs/qws/linux-DM3730-g++/ to look like the following.

#
# qmake configuration for building with arm-linux-g++
#

include(../../common/g++.conf)
include(../../common/linux.conf)
include(../../common/qws.conf)

# modifications to g++.conf
#Toolchain

#Compiler Flags to take advantage of the ARM architecture
QMAKE_CFLAGS_RELEASE =   -O3 -march=armv7-a -mtune=cortex-a8 -mfpu=neon -mfloat-abi=softfp
QMAKE_CXXFLAGS_RELEASE = -O3 -march=armv7-a -mtune=cortex-a8 -mfpu=neon -mfloat-abi=softfp

QMAKE_CC = /usr/local/angstrom/arm/arm-angstrom-linux-gnueabi/bin/gcc
QMAKE_CXX = /usr/local/angstrom/arm/arm-angstrom-linux-gnueabi/bin/g++
QMAKE_LINK = /usr/local/angstrom/arm/arm-angstrom-linux-gnueabi/bin/g++
QMAKE_LINK_SHLIB = /usr/local/angstrom/arm/arm-angstrom-linux-gnueabi/bin/g++

# modifications to linux.conf
QMAKE_AR = /usr/local/angstrom/arm/arm-angstrom-linux-gnueabi/bin/ar cqs
QMAKE_OBJCOPY = /usr/local/angstrom/arm/arm-angstrom-linux-gnueabi/bin/objcopy
QMAKE_STRIP = /usr/local/angstrom/arm/arm-angstrom-linux-gnueabi/bin/strip

load(qt_config)

Now run the configure command [this can take many minutes]:

./configure -opensource -confirm-license -prefix /opt/qt-arm -no-qt3support -embedded arm -little-endian -xplatform qws/linux-DM3730-g++ -qtlibinfix E

then

sudo make install

This will take a very long time!

When successful, you are ready to build the project. Unlike the developer's guide, we use the default build configuration. Simply open our project and build it, and you will find the binary executable in the Bin directory. Copy the whole directory to the Beagle and run:

./QtTracking-SDK -qws


User Instructions

Our game currently supports only two gestures: right-hand scroll left and right-hand scroll right. More gestures could be enabled, but that would increase the complexity of the code and might limit the speed. Because we can't make the bar respond fast enough, we stopped the movement of the ball so that the response to our gestures is visible. If you want to enable the ball's movement, un-comment the line "//itemBall->moveBy(ball_sp_x, ball_sp_y);" under "//move ball".


Highlights

1. Our project enables the BeagleBoard to recognize the Kinect and to play a game with it using gesture recognition.

2. We used Qt Creator and the embedded version of Qt to create, build, and debug the program.

3. We cross-compiled the program on our host, which made it much easier to debug and program.

4. There may be various compilation errors when building the project. Apart from problems in the code itself, we think these may be caused by the build configuration. We finally chose to use the project configuration of TrackingDemo (use its project settings and import the sources and header files into it), since it is much more similar to what we want to develop, and it just works.


Theory of Operation

The whole point is to use the Kinect interface provided by Omek to control the movement of the bar. The original control is achieved through the Qt key event "Qt::Key_Left", which sets a value that is used in PlayGameState::updateScene as follows:

 // move
    if( mainWindow->isKeyDownLeft() ) {
        bat_has_target = false;
        itemBat->moveBy(-bat_sp_x_c, 0);
        if( itemBat->pos().x() < 0 ) {
            itemBat->setPos(0, itemBat->pos().y());
        }
    }
    else if( mainWindow->isKeyDownRight() ) {
        bat_has_target = false;
        itemBat->moveBy(bat_sp_x_c, 0);
        if( itemBat->pos().x() > scene->width() - bat_w_c ) {
            itemBat->setPos(scene->width() - bat_w_c, itemBat->pos().y());
        }
    }
    else if( bat_has_target ) {
        int bat_centre_x = itemBat->pos().x() + bat_w_c/2;
        int dist = abs(bat_centre_x - bat_target_x);
        if( dist <= bat_sp_x_c ) {
            itemBat->setPos(bat_target_x - bat_w_c/2, itemBat->pos().y());
            bat_has_target = false;
        }
        else {
            int dir = bat_target_x > bat_centre_x ? 1 : -1;
            itemBat->moveBy(dir * bat_sp_x_c, 0);
        }
    }

In order to replace the condition in the if() statements shown above, we first need to initialize the Omek Kinect sensor. Omek gives us a good example of how to register and enable gestures, and we use it directly. The question is whether we should initialize in main() or in a class. We need the init code to obtain the value of pGesture, whose gestureName tells us which gesture was detected, so we can't separate them! If we put the init in main(), we won't be able to process gestures repeatedly. The best way is to put it in a class and use the following code to call the class periodically:

 QTimer timer;
    QObject::connect(&timer, SIGNAL(timeout()), &w, SLOT(updateScene()));
    timer.start(time_per_frame_c);

Note: We found a good explanation of this code. See here:[1]

At this point, we tried two different approaches. Guanqun defined the Kinect init function in a header file called GestureDemo.h and called it within PlayGameState::updateScene, so every time_per_frame_c period it enters the init function and detects which gesture we made. Yifei put the init function into a new member function, PlayGameState::updateKinect, and added another timer, timer1_per_frame_c, so that updateKinect and updateScene would both run repeatedly. The first approach worked!
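The end result is that a fired gesture simply plays the role of a key press inside updateScene. Here is a minimal, Qt-free sketch of that idea; the Bat struct and its field names are hypothetical stand-ins for itemBat, bat_w_c, bat_sp_x_c, and scene->width() in the real code:

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical, Qt-free stand-in for the bat movement in PlayGameState::updateScene.
// A fired gesture flag plays the same role as mainWindow->isKeyDownLeft()/Right().
struct Bat {
    int x;           // left edge of the bat (itemBat->pos().x() in the game)
    int width;       // bat_w_c in the game code
    int step;        // bat_sp_x_c in the game code
    int sceneWidth;  // scene->width()

    // Move one step per frame in response to the latest gesture and clamp
    // to the scene edges, mirroring the if/else chain in updateScene.
    void update(bool scrollLeft, bool scrollRight) {
        if (scrollLeft)
            x = std::max(0, x - step);
        else if (scrollRight)
            x = std::min(sceneWidth - width, x + step);
    }
};
```

The real code differs only in that the flags are set by the Omek gesture loop and the movement goes through itemBat->moveBy().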

We still want to be able to use the keyboard while playing, so we didn't change anything in MainWindow::keyPressEvent and MainWindow::keyReleaseEvent.

void MainWindow::keyPressEvent(QKeyEvent *event) {
    //qDebug("keyPressEvent");
    if( event->key() == Qt::Key_Left ) {
        this->keydown_left = true;
        //qDebug("left");
    }
    else if( event->key() == Qt::Key_Right ) {
        this->keydown_right = true;
        //qDebug("right");
    }
    else if( event->key() == Qt::Key_Escape ) {
        gamestate->userQuit();
    }
    else if( event->key() == Qt::Key_P ) {
        gamestate->userPause();
    }
    else if( event->key() == Qt::Key_Return ) {
        QPushButton *button = qobject_cast< QPushButton* >(qApp->focusWidget());
        if (button) {
            button->click();
        }
    }
    else {
        QMainWindow::keyPressEvent(event);
    }
}

void MainWindow::keyReleaseEvent(QKeyEvent *event) {
    //qDebug("keyReleaseEvent");
    if( event->key() == Qt::Key_Left ) {
        this->keydown_left = false;
    }
    else if( event->key() == Qt::Key_Right ) {
        this->keydown_right = false;
    }
    else {
        QMainWindow::keyReleaseEvent(event);
    }
}

Now, let's take a look at how the Kinect sensor is initialized.

This creates the sensor:

    if (a==0){
        // create the sensor
        if (sequencePath)
        {
                // from a sequence
                pSensor = IMotionSensor::createSequenceSensor(sequencePath);
        }
        else
        {
                // from a camera
                pSensor = IMotionSensor::createCameraSensor(true);
        }

        if(pSensor == NULL)
        {
                cerr << "Error, failed creating sensor." << endl;
                ret = 1;
                goto end;
        }

This initializes the tracking algorithm; OMK_SUCCESS and setTrackingOptions are defined in the header files Omek provides.

    // initialize the tracking algorithm
        if(pSensor->setTrackingOptions(TRACK_ALL) != OMK_SUCCESS)
        {
                cerr << "Error, failed to set tracking options." << endl;
                ret = 1;
                goto end;
        }

This enables the gestures we want to use; only enabled gestures can be detected:

  // enable the selected gestures
        numberOfGestures=1;
        for(int i = 0 ; i < numberOfGestures; i++)
        {
                if(pSensor->enableGesture("_rightScrollRight") != OMK_SUCCESS)
                {
                        cerr << "Error, unrecognized gesture: " << "_rightScrollRight" << endl;
                        ret = 1;
                        goto end;
                }
                if(pSensor->enableGesture("_leftScrollLeft") != OMK_SUCCESS)
                {
                        cerr << "Error, unrecognized gesture: " << "_leftScrollLeft" << endl;
                        ret = 1;
                        goto end;
                }
                /*if(pSensor->enableGesture("_rightClick") != OMK_SUCCESS)
{
cerr << "Error, unrecognized gesture: " << "_rightScrollClick" << endl;
ret = 1;
goto end;
}*/
        }

We want to rescan for gestures again and again in order to control the bar, so this code is used:

  // run the main loop as long as there are frames to process
                while (sensor->isAlive() && g_run_gestures)
                {
                        // Handle the current frame only if we have successfully processed a new image
                        bool bHasNewImage = false;
                        if ((pSensor->processNextImage(true, bHasNewImage) == OMK_SUCCESS) && bHasNewImage)
                        {
                                processedFrames++;
                                while (pSensor->hasMoreGestures())
                                {
                                        const IFiredEvent* pFiredEvent = pSensor->popNextGesture();
                                        std::stringstream text;
                                        text << endl << "Gesture (" << (pFiredEvent->getName()!=NULL?pFiredEvent->getName():"") << ") fired in frame " << processedFrames;
                                        cout << text.str().c_str() << endl;
                                        std::string gestureName = pFiredEvent->getName();
                                        if (gestureName == "_rightScrollRight") {haha=1;}
                                        else if (gestureName == "_leftScrollLeft") {haha1=1;}
                                        //else if (gestureName == "_rightClick") {haha1=0;haha=0;}


                                        pSensor->releaseGesture(pFiredEvent);
                                        g_run_gestures = 0;
                                }
                        }

Notice that two while() loops are used: the first checks whether the sensor is still alive, and the second checks whether more gestures have been fired.
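When this loop is driven from updateScene rather than a standalone main(), each call should only drain the gestures queued for the current frame and turn them into flags. A sketch of that pattern, using a mock sensor in place of the Omek IMotionSensor (only hasMoreGestures() and popNextGesture() mirror the real calls in the listing above; MockSensor, pollGestures, and the parameter names are illustrative):

```cpp
#include <cassert>
#include <deque>
#include <string>

// MockSensor imitates only the shape of the Omek sensor's gesture queue;
// hasMoreGestures() and popNextGesture() mirror the real calls, everything
// else here is an illustrative stand-in for the SDK.
struct MockSensor {
    std::deque<std::string> fired;  // gestures fired in the current frame
    bool hasMoreGestures() const { return !fired.empty(); }
    std::string popNextGesture() {
        std::string g = fired.front();
        fired.pop_front();
        return g;
    }
};

// Drain every gesture queued for this frame and translate the relevant ones
// into the two flags that updateScene consumes (haha/haha1 in our code).
void pollGestures(MockSensor &sensor, bool &scrollRight, bool &scrollLeft) {
    while (sensor.hasMoreGestures()) {  // the inner while() from the listing
        std::string name = sensor.popNextGesture();
        if (name == "_rightScrollRight")
            scrollRight = true;
        else if (name == "_leftScrollLeft")
            scrollLeft = true;
    }
}
```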

Work Breakdown

1. Installed the Beckon SDK development tools on both the Linux PC host and the BeagleBoard; successfully tested the sample code and discussed the interface functions.

--completed on February 5th by Guanqun Wang and Yifei Li -10 hours

2. Successfully built the sample code using Qt Creator

--completed on February 13th by Guanqun Wang and Yifei Li -26 hours

3. Successfully added Kinect support to Q-ball

--completed on February 19th by Guanqun Wang and Yifei Li -40 hours

Total: 66 hours

Conclusions

This project adds Kinect support to an open-source C++ game using the Omek Beckon SDK. In the end our gestures can be recognized through the Kinect, but the response is very slow, and the game does not run as fluently as expected. We may delve into this if more time is available. From this project, we came to understand the whole cross-development process from a system point of view. We also learned how to use Qt Creator and how to set up a cross-compilation environment, and, with Omek's help, how to use the Kinect interface. Since neither of us had any experience with C++, we also gained some practice with it. All in all, this project was very interesting, and it gave us an idea of how an embedded system can be developed.