Jetson/Tutorials/Walking Follower Robot

From eLinux.org
Revision as of 19:29, 1 June 2015 by Shervin.emami (talk | contribs) (Added info about SCOL robot)

Building the SCOL "Super-Computer-On-Legs" robot

NVIDIA SCOL robot

This tutorial explains the basics of building your own 2-legged walking robot that walks towards the nearest person it sees. It uses GPU-accelerated face detection with the onboard Jetson TK1 and webcam, and controls the motors & sensors using the onboard Arduino microcontroller + motor controller, as shown in the video on YouTube.

If you have already built a robot like this, you can then follow the next tutorial: Using the SCOL Robot.


Description

The SCOL robot ("Super-Computer-On-Legs") is fully autonomous, including the networking system that I didn't show in the YouTube video. Roughly 1 minute after you turn on power for the Jetson TK1, the Arduino and the motors, the high-level C++ robot code starts. It simply uses OpenCV CUDA-accelerated face detection to look for faces, and if it detects a face it starts walking left, right or straight, towards that face. So it is a really simple high-level algorithm, mostly just using the OpenCV face detector sample code. Instead of serial UART I used 2 GPIO signals to communicate with the Arduino microcontroller (sending 2 binary signals to encode left, right, straight or stop), but it would probably have been easier to use UART instead. It uses a 3-cell LiPo battery pack for the Jetson TK1 + Arduino + motor controller, and a separate 4-cell NiMH battery pack for the motors, to reduce the amount of electrical noise from the motors reaching the Jetson TK1 or Arduino systems, since that can cause a LOT of problems.
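The face-following behaviour above can be sketched as a small decision function. The one-third screen thresholds and the names below are illustrative assumptions, not the original code:

```cpp
#include <cassert>

// The four commands sent to the Arduino over the 2 GPIO signals.
enum class Command { Stop, Left, Right, Straight };

// Pick a walking direction from the horizontal position of the
// detected face. faceCenterX is the x coordinate of the face's
// centre in the camera frame; frameWidth is the frame width in
// pixels. The one-third thresholds are an assumption.
Command chooseCommand(bool faceDetected, int faceCenterX, int frameWidth) {
    if (!faceDetected)
        return Command::Stop;          // no face seen: stand still
    if (faceCenterX < frameWidth / 3)
        return Command::Left;          // face in the left third
    if (faceCenterX > 2 * frameWidth / 3)
        return Command::Right;         // face in the right third
    return Command::Straight;          // face roughly centred
}
```

In the real robot the face rectangle would come from the OpenCV face detector, and the chosen command would be written out as 2 binary GPIO levels.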


I added CUDA-accelerated video stabilization into the Jetson TK1 code because I thought it would be necessary for onboard vision of a walking robot, but it turned out that the face tracking behaviour worked just as well without video stabilization, and since the video stabilization uses a lot of battery power, I eventually removed it.


The total power draw for the Jetson TK1 is around 3 Watts when idle and around 7 Watts when doing continuous face detection. The face detection can be switched between plain CPU code and CUDA-accelerated code that is roughly 5x faster (i.e. 5 FPS instead of 1 FPS on a full 720p image) while using about the same amount of battery power.
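Since the CUDA path is roughly 5x faster at about the same power draw, it also uses roughly 5x less battery energy per processed frame. A quick way to check this, using the 7 Watt figure from the measurements above:

```cpp
#include <cassert>
#include <cmath>

// Energy spent per processed frame = power draw / frame rate.
// At 7 W: the CPU path (1 FPS) costs 7 J/frame, while the CUDA
// path (5 FPS) costs only 1.4 J/frame.
double joulesPerFrame(double watts, double fps) {
    return watts / fps;
}
```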


The Jetson TK1 also checks roughly once every 5 seconds to see if the USB Wifi dongle is attached, and if so it connects automatically to the specific ad-hoc Wifi network I set up on my Android tablet. If the Wifi is connected and it detects that my tablet has opened a VNC remote desktop / X graphical environment, it stops the robot application and lets the user run the graphical robot application manually from the GUI, which streams the processed video output (the camera feed with the detected faces overlaid on it). Unfortunately, the refresh rate for this VNC session is typically just 2 to 5 FPS over Wifi, so I didn't show it in the demo video.
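The periodic dongle check can be sketched as a simple filesystem test: on Linux, a network interface such as wlan0 appears under /sys/class/net once its driver loads. The interface name and the rest of the logic here are assumptions for illustration:

```cpp
#include <chrono>
#include <filesystem>
#include <string>
#include <thread>

// Returns true if the given network interface currently exists.
// On Linux, interfaces show up as entries under /sys/class/net.
// The sysNet parameter exists only to make the check testable.
bool interfacePresent(const std::string& iface,
                      const std::string& sysNet = "/sys/class/net") {
    return std::filesystem::exists(sysNet + "/" + iface);
}

// Poll roughly every 5 seconds until the dongle shows up, as
// described above (illustrative sketch only).
void waitForDongle() {
    while (!interfacePresent("wlan0")) {
        std::this_thread::sleep_for(std::chrono::seconds(5));
    }
    // ... bring up the ad-hoc Wifi and check for a VNC/X session ...
}
```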


Hardware

  • a LynxMotion "BRAT" walking robot kit with BotBoarduino and Servo Upgrade. Note that BotBoarduino is an Arduino clone combined with a built-in servo motor controller. You could just as easily use any Arduino board and plug in a servo motor controller shield.
  • a Jetson TK1
  • a 3S LiPo battery pack for the Jetson and BotBoarduino microcontroller
  • a 4-cell NiMH battery pack for the motors
  • a USB webcam
  • a USB Wifi dongle (optional)


Source-code

The code on the Jetson TK1 basically just combines several sample programs from OpenCV:


The code on the Arduino is basically just the sample code for the walking robot kit I bought, but modified to move the legs left, right or straight, or to stop, based on the 2 GPIO inputs. It also has the ability to stand itself up if it falls forwards or backwards, but due to the extra weight of the Jetson TK1 + 2 battery packs, it doesn't always work reliably, so I didn't show it in the demo video.
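On the Arduino side, the two GPIO inputs decode into one of the four gaits. A plain-C++ sketch of that decoding is below; the bit assignment is an illustrative assumption (the real Arduino sketch would read the pins with digitalRead() instead of taking booleans):

```cpp
#include <cassert>
#include <string>

// Decode the two GPIO input levels sent by the Jetson TK1 into a
// gait name. The bit assignment here is an assumption — the
// original mapping was not documented.
std::string decodeGait(bool pinA, bool pinB) {
    if (!pinA && !pinB) return "stop";
    if (!pinA &&  pinB) return "left";
    if ( pinA && !pinB) return "right";
    return "straight";
}
```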


To set up the SCOL robot with a new Jetson TK1 computer

- flash L4T onto it with a kernel that has the Wifi driver (e.g. rtl8712u) built in. The firmware is placed at /lib/firmware.
- reboot the device.
- find the IP address of the device.
- get access to the Jetson TK1 from my PC:
    export JETSON=172.17.162.41
    # Mount the whole filesystem remotely at ~/REMOTE
    mkdir ~/REMOTE
    sshfs ubuntu@$JETSON:/home/ubuntu ~/REMOTE
    # Make the console more useful
    cp ~/Desktop/SHERVS_crucial.bashrc ~/REMOTE/.bashrc
    # Open a console
    ssh -X ubuntu@$JETSON
- install CUDA toolkit
- install OpenCV4Tegra toolkit
- install some useful tools
    sudo apt-get install cmake locate tmux
    # Update the "locate" database in the background, since it often comes in handy!
    sudo updatedb &
- install SCOL_Robot code:
    on PC:
        cp -fav ~/SCOL_Robot/ON_ROBOT/* ~/REMOTE/SCOL_Robot/
- get SCOL_Robot code to run automatically on bootup:
    on PC:
        cp -fav /nv/work/SCOL_Robot/ON_ROBOT_SETUP/* ~/REMOTE/Downloads/
    on device:
        cd ~/Downloads
        sudo chown root:root rc.local run_*
        sudo chmod a+rx rc.local run_*
        sudo mv rc.local /etc/
        sudo mv run_* /etc/init.d/
        # Get the robot to start walking soon after bootup
        sudo ln -s /etc/init.d/run_SCOL_Robot /etc/rc2.d/S99run_SCOL_Robot
        # Get a terminal window to show up if an Android tablet with an X server is accessing the device through Wifi:
        sudo ln -s /etc/init.d/run_background_xterm /etc/rc2.d/S99run_background_xterm
- build the SCOL_Robot code on the device:
    cd SCOL_Robot
    mkdir build
    cd build
    cmake ..
    make -j4
    cd ..
- try running the SCOL_Robot code interactively to test it:
    sudo ./run_SCOL_Robot_GUI
- set up Wifi to connect automatically on SCOL robot bootup, so that I can connect to the robot over Wifi from a tablet even without a GUI.
    Add this to the bottom of "/etc/network/interfaces" with root permissions:
        # Get the SCOL Robot's 802.11n Wifi USB dongle to work on bootup,
        # even before the graphical desktop.
        # Uses wpasupplicant to provide WPA-PSK or WPA2-PSK encryption.
        # Ran "wpa_passphrase NunaWifi" to generate the PSK (encrypted password).
        #allow-hotplug wlan0
        auto wlan0
        iface wlan0 inet dhcp
                wpa-ssid NunaWifi
                wpa-psk aj39sfk9dfj3hffd8sd9f8sdf7a9dff8ghklej4ha7jsfjfl3
- reduce the time it waits on bootup if Wifi is not working:
    edit /etc/init/failsafe.conf by disabling the 30 second and 59 second sleeps.
- allow any user to edit the SCOL_Robot folder
    cd ~
    sudo chmod a+rw -R SCOL_Robot

Using the SCOL robot and maintaining it

After you've built a robot similar to above, you can read the info at "http://elinux.org/Jetson/Tutorials/Using_the_SCOL_Robot" to see how to use the SCOL robot and maintain it.