BeagleBoard/GSoC/2023 Projects

Links

 * BeagleBoard.org GSoC page: http://bbb.io/gsoc
 * Status reports: http://bbb.io/gsocml
 * Live chat: http://bbb.io/gsocchat
 * Google GSoC site: https://summerofcode.withgoogle.com/
 * YouTube Playlist BeagleBoard.org GSoC 2023: TBD

Weekly reports

 * 1) All weekly reports will be sent to the mailing list (as that is our primary support venue outside of live chat) on a single thread (to avoid e-mail thrash).
 * 2) They must be sent on Monday to allow mentors to respond ahead of Wednesday's IRC meeting, where all blockers will be discussed live.
 * 3) They must include the following sections:
   * Accomplishments
   * Resolutions to blockers
   * On-going blockers
   * Plans for the next week

Replace GBridge
The main goals of this project are as follows:
 * 1) Introduce a platform driver that facilitates communication between the AM62 and the CC1352 on BeaglePlay over UART.
 * 2) Move SVC and APBridge roles from GBridge into CC1352 firmware.
 * 3) Eliminate GBridge and other components that are no longer necessary.
 * 4) Add a tutorial to the documentation to use the new setup.
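The messages the new platform driver carries are Greybus operations. As a point of reference, a minimal Python sketch of packing and unpacking the 8-byte Greybus operation message header (field layout as defined in the mainline `greybus_protocols.h`; the operation values used below are purely illustrative):

```python
import struct

# Greybus operation message header, little-endian:
#   u16 size (total message length, including this 8-byte header)
#   u16 operation_id, u8 type, u8 result, 2 pad bytes
HDR = struct.Struct("<HHBBxx")
GB_RESPONSE_FLAG = 0x80  # set in 'type' to mark a response message

def pack_operation(op_id, op_type, payload=b"", result=0):
    """Build a raw Greybus operation message."""
    return HDR.pack(HDR.size + len(payload), op_id, op_type, result) + payload

def unpack_operation(data):
    """Split a raw message back into (op_id, type, result, payload)."""
    size, op_id, op_type, result = HDR.unpack_from(data)
    return op_id, op_type, result, data[HDR.size:size]
```

A round trip, e.g. `unpack_operation(pack_operation(1, 0x02, b"\x01"))`, recovers the original fields; the driver's job is to move such messages across the UART link intact.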

Milestones

 * 1) Introductory Video ✅
 * 2) Introductory Blog Post ✅
 * 3) Run old GBridge setup on BeaglePlay and BeagleConnect Freedom ✅
 * 4) Write a Hello World Linux Driver for BeaglePlay. ✅
 * 5) Write a Hello World Zephyr Module for CC1352. ✅
 * 6) Implement sending data from AM62 over UART on Linux Driver. ✅
 * 7) Implement receiving data from AM62 over UART on Zephyr Module. ✅
 * 8) Implement receiving data from CC1352 over UART on Linux Driver. ✅
 * 9) Implement sending data from CC1352 over UART on Zephyr Module. ✅
 * 10) Establish a socket connection with the BeagleConnect Freedom node. ✅
 * 11) Implement HDLC communication between the Linux driver and the BeaglePlay CC1352.
 * 12) Implement sending/receiving Greybus-specific events and messages over UART.
 * 13) Add tests for the Linux driver.
 * 14) Submit the initial Linux driver to the mailing list.
 * 15) Implement the SVC role in the Zephyr module.
 * 16) Implement the APBridge role in the Zephyr module.
 * 17) Test the complete new BeagleConnect setup with all the parts.
 * 18) Add documentation about the new setup.
 * 19) Get the Linux driver merged upstream.
 * 20) Final YouTube Video.
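The HDLC communication in milestone 11 boils down to byte-stuffed framing over the UART. A minimal Python sketch of the escaping scheme (flag 0x7E, escape 0x7D, XOR with 0x20); note that full HDLC also carries address/control fields and a frame checksum, which are omitted here:

```python
FLAG, ESC, XOR = 0x7E, 0x7D, 0x20

def hdlc_frame(payload: bytes) -> bytes:
    """Wrap payload in FLAG bytes, escaping any FLAG/ESC occurrences."""
    out = bytearray([FLAG])
    for b in payload:
        if b in (FLAG, ESC):
            out += bytes([ESC, b ^ XOR])
        else:
            out.append(b)
    out.append(FLAG)
    return bytes(out)

def hdlc_unframe(frame: bytes) -> bytes:
    """Reverse the framing: strip the FLAGs and undo escapes."""
    out, esc = bytearray(), False
    for b in frame[1:-1]:
        if esc:
            out.append(b ^ XOR)
            esc = False
        elif b == ESC:
            esc = True
        else:
            out.append(b)
    return bytes(out)
```

Byte stuffing is what lets arbitrary Greybus payloads (which may themselves contain 0x7E) share the serial line unambiguously.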

Zephyr on R5/M4F (K3)
The goals of the project:
 * 1) Add Zephyr RTOS support for the Cortex-R5 processor cores, loaded via remoteproc from the A72 core running Linux.
 * 2) Add support for a few peripherals (interrupts, GPIO, UART, timers) on the TDA4VM SoC.

Milestones

 * 1) Introductory Video.
 * 2) Create the essential board directory, defconfig, and devicetree bindings for the board.
 * 3) Add pinctrl support
 * 4) UART controller support for TDA4VM in Zephyr.
 * 5) Bring up Shell through UART in polling mode in Zephyr.
 * 6) Understand the VIM interrupt programming model, with the AM263x SDK as a reference.
 * 7) VIM interrupt controller support for the Cortex-R5F.
 * 8) UART in interrupt mode.
 * 9) Understand the timer programming model in TDA4VM.
 * 10) Timers and system tick support in Zephyr.
 * 11) Understand the GPIO programming model in TDA4VM.
 * 12) GPIO controller support in Zephyr.
 * 13) Upstream the developed features to Zephyr.
 * 14) Final Project Video

OpenGLES Acceleration for DL
The goals of the project:
 * 1) Accelerate the inference performance of DL models.
 * 2) Identify and optimise the deep learning layers within the darknet CNN framework.
 * 3) Write optimised shader programs that can perform the necessary computations in parallel.
 * 4) Evaluate the performance of the accelerated darknet framework by comparing it with the original CPU-based implementation.

Milestones

 * 1) Introductory Video.
 * 2) Build and run darknet on the host (laptop).
 * 3) Benchmark the performance of the pre-existing model on the laptop.
 * 4) Cross-compile darknet for 64-bit architecture (BeagleBone AI-64).
 * 5) Study the TIDL implementation.
 * 6) Benchmark the performance of the framework on the BeagleBone AI-64.
 * 7) Add some simple layers to the framework.
 * 8) Add support for these layers and measure the performance benefits.
 * 9) Implement OpenGLES support to offload specific layer computations to the GPU.
 * 10) Cross-compile the modified darknet.
 * 11) Test the GPU-accelerated darknet and benchmark its performance.
 * 12) Analyse the benchmark results and optimise further.
 * 13) Final Project Video
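The reason convolution layers map well onto OpenGLES fragment shaders is that every output element depends only on its own input window, so all of them can be computed independently. A minimal Python sketch of valid 2-D cross-correlation (the operation darknet calls convolution) that makes this per-element independence explicit:

```python
def conv2d(image, kernel):
    """Valid 2-D cross-correlation over nested lists.
    Each out[y][x] reads only its own window of the input,
    which is exactly the per-fragment independence a GLES
    shader exploits to compute all outputs in parallel."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = ih - kh + 1, iw - kw + 1
    out = [[0] * ow for _ in range(oh)]
    for y in range(oh):
        for x in range(ow):
            out[y][x] = sum(
                image[y + j][x + i] * kernel[j][i]
                for j in range(kh) for i in range(kw)
            )
    return out
```

In the GPU version, the two outer loops disappear: each fragment invocation computes one `out[y][x]`, and the inner reduction becomes texture fetches and multiply-adds inside the shader.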

Building an LLVM Backend for PRU
The goals of the project:
 * 1) Add LLVM backend support for the PRU.
 * 2) Upstream the backend for PRU.

Milestones

 * 1) Introductory Video. ✅
 * 2) Become thorough with the PRU architecture and instruction set. ✅
 * 3) Run pru-gcc examples on the board and study the generated assembly. ✅
 * 4) Implement the Cpu0 tutorial backend. ✅
 * 5) Add the register set and register classes. ✅
 * 6) Add the Instruction set. ✅
 * 7) Describe the Target Machine for PRU.
 * 8) Describe the Target Registration.
 * 9) Construct the SelectionDAG.
 * 10) Add the Instruction Selector.
 * 11) Add the Assembly printer.
 * 12) Add the Machine code Emitter (JIT support).
 * 13) Testing and Debugging on various boards.
 * 14) Upstream the backend.
 * 15) Final Project Video.
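The SelectionDAG and instruction-selector milestones amount to pattern-matching IR nodes down to target instructions. A toy Python sketch of the idea for PRU-style `LDI`/`ADD` output; the register-allocation scheme here is a naive counter and the textual format is illustrative only, not what the real TableGen-driven backend emits:

```python
def select(node, dest="r0"):
    """Lower a tiny expression tree to PRU-style instructions.
    Nodes are ("const", n) or ("add", lhs, rhs); each node kind
    matches one instruction pattern, as in SelectionDAG ISel."""
    instrs = []
    tmp = ("r%d" % i for i in range(1, 32))  # naive temp-register supply

    def lower(n, d):
        if n[0] == "const":
            instrs.append(f"LDI {d}, {n[1]}")        # load immediate
        elif n[0] == "add":
            a, b = next(tmp), next(tmp)
            lower(n[1], a)                            # lower operands first
            lower(n[2], b)
            instrs.append(f"ADD {d}, {a}, {b}")       # then the root
        else:
            raise ValueError(f"no pattern for node {n!r}")

    lower(node, dest)
    return instrs
```

In the real backend, these per-node patterns live in TableGen `.td` files and the traversal, legalization, and register allocation are handled by LLVM's common infrastructure; only the patterns and register/instruction descriptions are PRU-specific.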