BeagleBoard/bb blue api v2

From eLinux.org
< BeagleBoard
Revision as of 01:32, 26 March 2017 by Kiran.l13 (talk | contribs) (Description)


ProposalTemplate

{{#ev:youtube|Jl3sUq2WwcY||right|BeagleLogic}}

A short summary of the idea will go here.

Student: Student
Mentors: Jason Kridner
Code: https://github.com/BeaglePilot
Wiki: http://elinux.org/BeagleBoard/GSoC/ProposalTemplate
GSoC: GSoC entry

Status

This project is currently just a proposal.

Proposal

Please complete the requirements listed on the ideas page and fill out this template.

About you

IRC: Freenode IRC nickname
Github: Github account
School: School name
Country: Country
Primary language (We have mentors who speak multiple languages): Language
Typical work hours (We have mentors in various time zones): 8AM-5PM US Eastern
Previous GSoC participation: Provide list of URLs for previous participation or tell us why you want to participate here. No previous experience required.

About your project

Project name: Super Awesome Project

Description

Following is the complete set of project goals and their challenges, which I plan to deliver by the end of the tenure:

Stereo camera support for the Blue: I will add stereo camera support to the BB Blue and implement an optical flow algorithm to determine the planar velocities.
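As a rough sketch of the optical-flow goal (the function names, the block-matching search, and the pinhole ground-plane model are my own illustrative assumptions, not an existing API), the flow displacement between two frames can be estimated by minimising the sum of absolute differences, and then converted to a planar velocity given the camera focal length and height above the ground:

```python
def estimate_shift(prev_row, curr_row, max_shift=3):
    """Estimate a 1-D pixel displacement between two image rows by
    minimising the mean sum of absolute differences over candidate shifts."""
    best, best_cost = 0, float("inf")
    n = len(prev_row)
    for s in range(-max_shift, max_shift + 1):
        pairs = [(prev_row[i], curr_row[i + s])
                 for i in range(n) if 0 <= i + s < n]
        cost = sum(abs(p - c) for p, c in pairs) / len(pairs)
        if cost < best_cost:
            best, best_cost = s, cost
    return best

def planar_velocity(dx_px, dy_px, focal_px, height_m, dt_s):
    """Convert an optical-flow displacement (pixels per frame) into a
    planar velocity (m/s) using the pinhole-camera ground-plane model:
    v = (pixel shift / focal length) * height / frame interval."""
    vx = (dx_px / focal_px) * height_m / dt_s
    vy = (dy_px / focal_px) * height_m / dt_s
    return vx, vy
```

In a real implementation the 2-D search would run over many image patches and the per-patch shifts would be averaged or filtered, but the velocity conversion step is the same.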

Porting userspace drivers to the kernel: The very first step of this project is to develop kernel drivers for the devices/modules on the BB Blue. Presently, most of the kernel drivers for the onboard modules are available in the 4.1 kernel. I have already configured the kernel driver for the MPU-9250, which is a combination of the MPU-6050 and the AKM8963.
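To illustrate the kind of work a driver (or the userspace code consuming it) does with the MPU-9250, here is a small sketch of converting raw register bytes into physical units. The 16384 LSB/g sensitivity corresponds to the ±2 g full-scale setting in the MPU-9250 datasheet; the function names are my own, not from any existing driver:

```python
ACCEL_LSB_PER_G = 16384.0   # MPU-9250 sensitivity at the +/-2 g full-scale setting
G = 9.80665                 # standard gravity, m/s^2

def twos_complement(hi, lo):
    """Combine two register bytes (high, low) into a signed 16-bit value,
    as a driver would when reading the MPU-9250 output data registers."""
    val = (hi << 8) | lo
    return val - 0x10000 if val & 0x8000 else val

def raw_accel_to_ms2(raw):
    """Convert a signed 16-bit raw accelerometer sample to m/s^2."""
    return raw / ACCEL_LSB_PER_G * G
```

For example, a raw reading of 16384 at this full-scale setting corresponds to exactly 1 g.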

Kernel driver support for applications: Once the kernel drivers are created for the peripherals, they will be added to the applications requiring them. Some of these applications are: Ardupilot, ROS, and Cloud.

Ardupilot support for BB Blue: By the end of this project, BB Blue support for Ardupilot will be developed, which essentially involves implementing the AP_HAL APIs using the BB Blue APIs. MAVLink will also be integrated with the APIs, which would enable the board to communicate with other MAVLink-supported applications.

Standalone ROS for BB Blue: The APIs will also be used for interfacing the onboard devices with the ROS middleware by creating the custom packages listed below:

bbb_io: ROS node to read BB Blue button input and output control signals to the LEDs.
bbb_dcmotor: ROS package that launches a node to control a DC motor connected to the BB Blue.
bbb_servo: ROS package that launches a node to control an ESC connected to the BB Blue servo port.
bbb_encoders: ROS node which publishes the encoder data from the BB Blue to a topic.
bbb_imu: BeagleBone Blue ROS package to publish the InvenSense MPU-9250 data to a topic.
bbb_baro: ROS package to publish the BMP-280 barometer data to a topic.

The following ROS packages will also be tested on the BB Blue to check the connectivity:

mavros: ROS interface for MAVLink.
roscopter: ROS interface for ArduCopter using MAVLink 1.0. Using this package, a quadcopter can be controlled directly from ROS, overriding the RC commands.
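As a sketch of the arithmetic the bbb_servo and bbb_encoders nodes would perform internally (the function names and the 1000-2000 µs pulse range are illustrative assumptions based on the common RC/ESC convention, not the actual node API):

```python
import math

def esc_pulse_us(throttle, min_us=1000, max_us=2000):
    """Map a normalised throttle in [0, 1] to an ESC pulse width in
    microseconds, following the standard 1000-2000 us RC convention."""
    throttle = max(0.0, min(1.0, throttle))  # clamp out-of-range input
    return min_us + throttle * (max_us - min_us)

def encoder_rad_per_s(delta_ticks, ticks_per_rev, dt_s):
    """Convert an encoder tick delta over an interval dt_s into an
    angular velocity in radians per second."""
    return (delta_ticks / ticks_per_rev) * 2.0 * math.pi / dt_s
```

The bbb_servo node would write the computed pulse width to the servo port each control cycle, while bbb_encoders would publish the computed angular velocity (or raw tick count) to its topic.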

Documentation and examples: I will provide extensive and accurate documentation for whatever I build in this project. Functional documentation for the APIs will be written with Doxygen; code documentation will be kept as comments in the source files.

Timeline

Provide a development timeline with a milestone each of the 11 weeks. (A realistic timeline is critical to our selection process.)

2017-06-06: Milestone #1
2017-06-13: Milestone #2
2017-06-20: Milestone #3
2017-06-27: Milestone #4
2017-07-04: Milestone #5
2017-07-11: Milestone #6
2017-07-18: Milestone #7
2017-07-25: Milestone #8
2017-08-01: Milestone #9
2017-08-08: Milestone #10
2017-08-15: Milestone #11

Experience and approach

In 5-15 sentences, convince us you will be able to successfully complete your project in the timeline you have described.

I am a fourth-year undergraduate student studying in India. Besides a keen interest in networking, robotics, and systems-related courses, I also enjoy hacking on embedded electronics. I would like to work on an open-source project this summer because it is interesting, and contributing to a project is fun and exciting. I have not worked much on open source before, but I have some idea of how the open-source community works, and it fascinates me.

Accurate and Augmented Localization and Mapping for Indoor Quadcopters: In this project, a state-estimation system was developed for quadcopters operating in indoor environments that enables the quadcopter to localize itself on a globally scaled map reconstructed by the system. To estimate the pose and the global map, we use ORB-SLAM fused with onboard metric sensors, along with a 2D LIDAR mounted on the quadcopter, which helps in robust tracking and scale estimation. [github]

Enhancing ORB-SLAM using IMU and Sonar: Increased the accuracy and robustness of ORB-SLAM by integrating an Extended Kalman Filter (EKF) that fuses the IMU and sonar measurements. The scale of the map is estimated by a closed-form Maximum Likelihood approach. [github]

Semi-Autonomous Quadcopter for Person Following: Developed an IBVS-based robotic system, implemented on the Parrot AR.Drone, capable of following a person or any moving object while simultaneously measuring the localized coordinates of the quadcopter on a scaled map. [github]

API Support for BeagleBone Blue: Created easy-to-use APIs for the BeagleBone Blue. With these APIs, applications can be ported directly onto the board. This project was a collaboration between BeagleBoard.org and the University of California, San Diego as part of Google Summer of Code 2016. [github]

Intelligent Parking System for an Autonomous Robot: Using a BeagleBone Black as the onboard computer, the robot finds the parking set-point by matching features against a template image using SURF descriptors, and directs the actuators connected to the PRU (Programmable Real-time Unit). https://github.com/kiran4399/parking_system_cv

I plan my work carefully and follow a routine so that the planned work is completed within the given time. I always sketch out priorities and place priority management above time management. My motto is: "Hard work beats talent when talent doesn't work hard!" I strongly feel that striving to know something is the best way to learn it. I can commit around 50-55 hours a week without any competing obligations. I also hope to gain a lot of learning experience throughout the program and come closer to the open-source world.

Contingency

What will you do if you get stuck on your project and your mentor isn’t around?

Benefit

kiran4399: I have read about and watched a video on how the BB Blue can be used to make robotics education much more accessible to students. Students interested in applying high-level concepts like localization, mapping, pose estimation, and position control would benefit from these APIs. Hobby enthusiasts interested in building different kinds of robots will also benefit from this project.

Suggestions

Is there anything else we should have asked you?