GSoC '19 Mind Control Project

Summary
=Mind Controlling and Environment Manipulation= The proposed idea uses brain waves (alpha, beta, gamma, and delta) to read a person's brain activity. These are categorised based on data collected from UCI, and attention and meditation levels are calculated from the values of the different alpha, beta, gamma, and delta bands. The conversion of raw EEG data into category-based numerical data runs on an Arduino. If the attention level is greater than the threshold specified by the user, the Arduino communicates with the single-board computer through digital I/O pins that act as a switch, and if the attention level stays high for 5 seconds, it sends a trigger that turns on the camera. The single-board computer runs a text detection and recognition model using Tesseract and OpenCV that detects the text on the appliance or device in front of it. After detecting and recognising the text, the single-board computer sends a command wirelessly over IR to the remote Arduino connected to the main power supply of the appliance or device, allowing you to control the appliance using your brainwaves and your gaze. The camera moves in the direction the user is looking using servo motors.
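The attention-threshold and 5-second dwell logic described above can be sketched in Python. This is a minimal sketch: the 0-100 attention scale, the default threshold of 60, and the class and method names are illustrative assumptions, not taken from the project code.

```python
import time

class AttentionTrigger:
    """Fires once the attention level stays above a threshold for a dwell period.

    Sketch only: the 0-100 attention scale, default threshold, and names
    are illustrative assumptions, not taken from the project code.
    """

    def __init__(self, threshold=60, dwell_seconds=5.0, clock=time.monotonic):
        self.threshold = threshold
        self.dwell = dwell_seconds
        self.clock = clock
        self._above_since = None

    def update(self, attention):
        """Feed one attention sample; return True once the dwell is satisfied."""
        now = self.clock()
        if attention > self.threshold:
            if self._above_since is None:
                self._above_since = now   # attention just crossed the threshold
            if now - self._above_since >= self.dwell:
                self._above_since = None  # fire once, then re-arm
                return True
        else:
            self._above_since = None      # attention dipped: restart the timer
        return False
```

On the device, `update()` would be called with each attention value parsed from the headset, and a `True` result would raise the digital I/O pin that wakes the single-board computer.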

Student: Rashbir Singh
Mentors: Jason Kridner
Code: Github
Wiki: GSoC: GSoC entry

=Status= This project has a ready prototype.

=Prerequisite= The link to the Hello World code will be added here.

=About me=
 * IRC: RashbirSingh
 * Github: Github
 * School: Amity School of Engineering and Technology, Noida
 * Country: India
 * Primary language: Hindi and English
 * Typical work hours: 8AM-10PM IST
 * GSoC participation: This is my first time participating in GSoC. I heard about it from one of my friends, and since I have many projects that use IoT and various ML models, I thought of giving it a try to see if something good comes out of it. I love to learn, and GSoC is the best platform for working with smart people, getting a professional outlook, and learning from those who are the best in their field. I love to work in the IoT and data field, and BeagleBone is perfect for the domain I am looking at. I have been developing IoT projects for the past 3 years and recently started learning data science.
 * Skills: Machine Learning, Data acquiring and data analysis, Data Engineering, Computer vision, Internet of things, Cloud deployment and API generation, circuit designing, Android programming, Natural language processing.
 * Tools (proficient): Git, Linux, RapidMiner.
 * Operating System: Linux, OSX, Windows.
 * Languages: C, C++, Embedded C, C#, Java, Python, Arduino programming.
 * Web front end and backend: HTML, PHP, database and connectivity.
 * Hardware Skills: Raspberry Pi, Arduino, BoltIoT, NodeMCU.

=About your project= Project name: Mind Controlled devices and environment manipulation

Description
This project uses a NeuroSky EEG headset and an Arduino to convert raw brainwave data into an interpretable numerical format (code for this is available on GitHub). The Arduino interfaces with a single-board computer running a Python script that controls the onboard camera, performs text detection and recognition, and transmits commands to another Arduino over IR, wirelessly controlling the appliance just by looking at it and paying attention. It will help people with physical disabilities or speech impairments and the elderly, and has applications in industrial factories.

Previous Work Done
I have previous experience working with EEG headgear and own an EEG headset that I can use for this project. To demonstrate my ability, I would like to share a video of my sample work: a brain-controlled servo based on data collected using an Arduino and NeuroSky [Figure 1].

The software consists of:

 * C++
 * Python
 * Linux bash scripting

Hardware used

 * Microcontrollers such as Arduino
 * Single-board computers such as BeagleBone or Raspberry Pi
 * Infrared LEDs for communication
 * Relays
 * LEDs
 * Camera for text detection
 * Servos
 * Bluetooth module

Methodology


Figure 2. shows the methodology used to develop a device to provide people with the ability to turn appliances ON and OFF just by concentrating on the specific appliance.

Brainwaves are transferred over Bluetooth from the EEG headset to the Arduino microcontroller. The raw brainwaves are then converted to numerical format, and if the concentration exceeds the specified threshold, the Arduino sends a trigger to the Raspberry Pi over GPIOs, which launches a Python script. If the script finds that concentration on a specific object lasts more than 5 seconds, the Raspberry Pi takes pictures of the object, which are then analyzed by OCR running with the help of OpenCV in a Python environment.
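The OCR step can be sketched in Python. This is a sketch only: pytesseract is one common Python wrapper for the Tesseract engine named in this proposal, and the label list, grayscale preprocessing, and function names are illustrative assumptions.

```python
def match_appliance(ocr_text, known_labels=("LED",)):
    """Normalize noisy OCR output and match it against known appliance labels.

    The label list is an illustrative assumption; the prototype only needs "LED".
    """
    cleaned = "".join(c for c in ocr_text.upper() if c.isalnum() or c.isspace())
    words = set(cleaned.split())
    for label in known_labels:
        if label in words:
            return label
    return None

def recognize_from_camera():
    """Capture one frame and OCR it (deployment only: needs OpenCV and pytesseract)."""
    import cv2
    import pytesseract
    cam = cv2.VideoCapture(0)
    ok, frame = cam.read()
    cam.release()
    if not ok:
        return None
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # grayscale helps Tesseract
    return match_appliance(pytesseract.image_to_string(gray))
```

`match_appliance` is kept separate from the camera code so the matching logic can be tested without any hardware attached.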

If the text says LED, the script triggers the relay and supplies power to the LED, turning it on. If the text is not LED, the script runs again, staying in a constant loop.
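The ON/OFF decision described here can be sketched as a small toggle table. The GPIO pin number and the RPi.GPIO wiring are assumptions for illustration, not taken from the proposal.

```python
# Hypothetical appliance-to-GPIO map (BCM numbering); the pin is an assumption.
APPLIANCE_PINS = {"LED": 17}

def next_action(label, states):
    """Decide what to do with a recognized label.

    Returns (pin, new_state) to apply, or None when the text did not match a
    known appliance -- the "run again and stay in a constant loop" branch.
    """
    if label not in APPLIANCE_PINS:
        return None
    new_state = not states.get(label, False)
    states[label] = new_state
    return APPLIANCE_PINS[label], new_state

def apply_action(pin, state):
    """Drive the relay pin (deployment only: RPi.GPIO exists only on the Pi)."""
    import RPi.GPIO as GPIO
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(pin, GPIO.OUT)
    GPIO.output(pin, GPIO.HIGH if state else GPIO.LOW)
```

Keeping the decision in `next_action` means a second recognition of the same label toggles the appliance back off, while unknown text simply returns the loop to the capture step.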

Future scope and areas of implementation
This proposed technology has a very broad area of implementation including
 * Hospitals
 * Visually impaired
 * Physically impaired
 * Speech impaired
 * Industries to control equipment
 * Household to have a full brain controlled environment
 * In the field of education, in order to understand the extent of learning an individual student is making

Detailed Page
Detailed Description PDF can be found here.

=Timeline= Timeline of milestones

2019-05-27: Initiation

 * Creating proper presentable documentation for the selection in GSoC'19.

2019-06-03: Milestone #1

 * Creation of basic documentation.
 * Collecting all the necessary hardware.
 * Creating this milestone video.
 * Discussing and carefully explaining the topic of choice to the mentor.

2019-06-10: Milestone #2

 * Creating virtual circuit diagram for future reference.
 * Understanding the I/O supply required by each sensor and the entire flow of current through the circuit.
 * Understanding schematics of each sensor, microcontroller, single board computer and other hardware.
 * Delivering a circuit diagram to the mentors and recording a YouTube video for this phase.

2019-06-17: Milestone #3

 * Creating Arduino script to interpret EEG and convert raw data into tabular form.
 * Creating the script to store I2C data from serial port to a flat file.
 * Delivering the code along with the generated flat file to the mentor.
 * Recording a YouTube video for this phase and uploading it to keep a constant update.
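The serial-to-flat-file script from this milestone could be sketched in Python as below. The "attention,meditation" line format, the port name, and the baud rate are assumptions about the Arduino sketch, which is not shown in this proposal; pyserial is one common library for the serial side.

```python
import csv

def parse_line(raw):
    """Parse one 'attention,meditation' line into a pair of ints, or None.

    The comma-separated format is an assumption about the Arduino output.
    """
    parts = raw.strip().split(",")
    if len(parts) != 2:
        return None
    try:
        return int(parts[0]), int(parts[1])
    except ValueError:
        return None

def log_to_csv(lines, path):
    """Write parsed samples to a flat CSV file, skipping malformed lines."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["attention", "meditation"])
        for raw in lines:
            row = parse_line(raw)
            if row is not None:
                writer.writerow(row)

def log_from_serial(path, n_samples=100):
    """Read from the Arduino over USB serial and log (deployment only: pyserial)."""
    import serial  # port name and baud rate below are assumptions
    with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
        log_to_csv((port.readline().decode("ascii", "ignore")
                    for _ in range(n_samples)), path)
```

Separating parsing from the serial read keeps the flat-file format testable on any machine, with the hardware-dependent part confined to `log_from_serial`.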

2019-06-24: First evaluation milestone

 * Developing prototype hardware running the code without IR support.

2019-07-01: Milestone #5

 * Adding cloud support and developing API.

2019-07-08: Milestone #6

 * Adding IR support and multiple microcontrollers.

2019-07-15: Milestone #7

 * Compacting the size by removing bulky microcontrollers and adding a single chip to make the device more mobile and easy to install.

2019-07-22: Second evaluation milestone

 * Delivering the project with a compact and optimised version of the device.

2019-08-05: Milestone #9

 * Adding Iris controlled servo for camera.

2019-08-12: Milestone #10

 * Final delivery and quality check.

2019-08-19: Completion

 * Coding completion milestone
 * Completion YouTube video

=Experience= I have 3+ years of experience with microcontrollers and single-board computers and have worked on Linux for more than 2 years. I have also filed several patents and published several papers in this domain. I have won several awards in my university for the most innovative idea and best project idea. I have also worked on very tight deadlines and successfully delivered on my promises, even if that meant spending countless sleepless nights. Please go to my LinkedIn to know more about me.


 * Refer to this to know about project-related experience.


 * Please refer to my CV to know more about my experience.

=Contingency= I will first try to Google the solution and refer to books. I am good at debugging and a self-learner, so if one thing fails I will try to find an alternative way to implement it. For example, one time my WiFi module got burned and I had no other module left, while my whole script was based on the WiFi module and was controlling the room using it. So I changed the implementation from WiFi to I2C over USB and converted the whole implementation from HTML to a command-line interface, as I had to present the project the next day. So I can surely say that I will find my way out, and in the worst-case scenario I will ask a professional I personally know to help me out.

=Benefit= Please refer to the sections above. This project is scalable, can be implemented in various other domains and disciplines, and can make brain computing an integral part of every project.
