I am Corten Singer
Raised in San Diego
UC Berkeley, Class of 2017
B.A. in Computer Science • B.A. in Cognitive Science
Human-Computer Interaction / Assistive Tech / Rapid Prototyping / Arduino DIY
Smart House
March 2017
[
Arduino
Eagle
Electromyography (EMG)
Particle Photon
Fusion 360
]
This project won the Autodesk Design Award at the TOM:Berkeley Make-a-Thon held at UC Berkeley in March 2017. For this project, we worked with our friend Owen Kent, who has very limited ability to physically interact with the world. He currently has a single joystick attached to his wheelchair, through which he uses his computer and other devices that can take a mouse as input. Prior to this Make-a-Thon, we observed a few key problems that we aimed to solve with our invention.
Specifically, Owen previously had to interface with a very buggy RFID system synchronized with a wall-mounted button in order to open his door. The button can open the door from the outside, but he has to time it exactly with his unreliable RFID transmitter, and it often takes him multiple tries to open his door. Moreover, Owen is severely constrained by the mouse mounted on his wheelchair: unless he is seated in it, he has no way to connect with any device. He had been wanting a system that would let him use his tablet while lying in bed.
We created a new joystick USB mouse that allows Owen to talk to the various devices in his home that he previously had no means of controlling (his door, lights, heater, speaker, etc.). This new mouse enables a more efficient interaction with his devices while also promoting bodily movement he previously had no motivation to practice. It was built with an Arduino-like microcontroller, electromyography (EMG) sensors that detect and process slight muscle movements in each of his hands, and a 3-axis accelerometer that detects stomach inflation/deflation. I programmed the device to left-click and drag when Owen flexes his left hand and to right-click when he flexes his right hand. His stomach movement can easily be mapped to small functions, such as taking a screenshot. This works around his existing problematic setup, in which all clicks are sensed in software, which is difficult to operate and much slower to use (he must hover over an object for a set time period, or nudge the joystick in a particular way, in order to click). The EMG clicking mechanism also gives Owen motivation to physically train his existing hand muscles.
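The clicking logic above can be sketched roughly as follows. This is a minimal illustration, not the actual firmware: the threshold value and the action names are hypothetical, and the real device ran on an Arduino-like microcontroller rather than in Python.

```python
# Hypothetical sketch of mapping raw EMG readings to mouse actions.
FLEX_THRESHOLD = 300  # assumed ADC level above which a flex is registered

def classify(left_emg, right_emg, dragging):
    """Map EMG readings from each hand to a mouse action.

    A left-hand flex starts (or continues) a left-click drag;
    a right-hand flex issues a right click; otherwise release.
    """
    left = left_emg > FLEX_THRESHOLD
    right = right_emg > FLEX_THRESHOLD
    if left:
        return "drag" if dragging else "left_down"
    if right:
        return "right_click"
    return "release"
```

Holding the drag state while the left hand stays flexed is what lets a sustained flex behave like holding down a physical mouse button.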
This solution gives Owen a device that not only lets him control his door with ease (not to mention various appliances like his bedroom lights), but lets him do so from his bed! We designed multiple mounts that can be placed near his bed so that his tablet and our mouse can be operated without his ever having to get up. Our invention gives Owen the freedom to interact with his house, especially when opening his door, and the freedom to connect to the world when he is not in his wheelchair. The mouse works with virtually any PC or tablet.
My friend Daniel Stickney, who has cerebral palsy, relies on a motorized wheelchair. Danny also has cortical visual impairment, which makes successful independent wheelchair navigation next to impossible. As such, Danny struggles to explore unfamiliar terrain without supervision for fear that he may crash and put his safety at risk. I joined a team of friends and hackers to invent a solution to his navigation problem. With an array of sensors and actuators, we modified Danny's wheelchair to include a spatially aware feedback system that alerts him when a potentially dangerous obstacle is approaching. Our modifications included sensors that detect when a drop-off (i.e., a step or curb) is in front of the wheelchair, when an obstacle is nearing the rear of the wheelchair, and when the wheelchair is approaching the edge of a ramp. This information is relayed to Danny via both haptic and auditory feedback, which work together to avoid overloading any single sensory modality.
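One way to picture the alerting behavior is as a simple proximity-to-intensity curve. The distances and the 0–1 intensity scale below are illustrative assumptions, not the values we tuned on Danny's chair:

```python
# Hypothetical sketch of scaling haptic alert intensity with obstacle distance.
WARN_CM = 100   # assumed distance at which alerts begin
STOP_CM = 30    # assumed distance at which the alert saturates

def haptic_intensity(distance_cm):
    """Return a vibration intensity in [0.0, 1.0] that grows
    linearly as an obstacle closes from WARN_CM to STOP_CM."""
    if distance_cm >= WARN_CM:
        return 0.0
    if distance_cm <= STOP_CM:
        return 1.0
    return (WARN_CM - distance_cm) / (WARN_CM - STOP_CM)
```

A ramp like this (rather than an on/off buzzer) gives a sense of how fast the obstacle is approaching, which pairs naturally with the auditory channel carrying the obstacle's direction.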
Team
Pierre Karashchuk
•
Stephanie Valencia
•
Oscar Segovia
•
Ryan Sterlich
•
Kelly Peng
Tomás Vega
Corten Singer
SmartWheels
Fall 2016 - Ongoing
[
Eagle
LPKF CircuitPro
Laser Cutter
Raspberry Pi 3
Python
]
My team and I are developing a self-driving, target-following, obstacle-avoiding motorized wheelchair for our friend Stephen Chavez. Using Stephen's research in reverse-engineering his motorized wheelchair's control system, we were able to communicate with and mount sensors onto a wheelchair to enable dynamic navigation. We use a Raspberry Pi 3 Model B with a PiCAN 2 shield to send commands to the wheelchair via the R-net protocol. An iPhone application tracks AprilTags (2D barcodes developed for robotics applications) with its camera and sends data to the Raspberry Pi via UDP. Finally, three ultrasonic sensors attached to the Raspberry Pi detect obstacles.
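The Pi side of the phone-to-Pi link can be sketched like this. The packet format (`"tag_id,x_offset,distance"`), port number, and deadband are assumptions for illustration; the real system translates these decisions into R-net commands over the PiCAN 2 shield.

```python
# Hypothetical sketch of the Pi-side UDP listener for AprilTag tracking data.
import socket

def parse_tag_packet(payload: bytes):
    """Decode an assumed "tag_id,x_offset,distance" UDP payload."""
    tag_id, x_offset, distance = payload.decode().split(",")
    return int(tag_id), float(x_offset), float(distance)

def steering_command(x_offset, deadband=0.1):
    """Turn toward the tracked tag; go straight inside the deadband."""
    if x_offset > deadband:
        return "right"
    if x_offset < -deadband:
        return "left"
    return "forward"

def listen(port=5005):
    """Receive tag packets from the iPhone app and print steering decisions."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        payload, _ = sock.recvfrom(1024)
        tag_id, x_offset, _distance = parse_tag_packet(payload)
        print(tag_id, steering_command(x_offset))
```

In the full system, the steering decision would also be gated by the ultrasonic sensor readings so the chair stops before a detected obstacle.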
Team
Yash Shah
•
James Musk
•
Tomás Vega
•
Corten Singer
ReOrient
Fall 2016
[
Node.js
Wifi-Enabled MCU
3D Printing
Eagle
]
For this final project, my team and I created ReOrient, developed in tandem with UCSF and their recent campaign to reduce and prevent the symptoms of hospital-induced delirium in patients. The UCSF Medical Center's Chief Innovation Officer, Ralph Gonzales, MD, MSPH, has directly expressed interest in further development of the project, as it could significantly reduce the overhead that the nursing staff deals with on a daily basis. The device monitors a patient's environment to identify any irregularities that may be contributing to sleep and sensory deprivation. It also serves as a source of sensory stimulation that is meaningful to the patient, allowing loved ones to send audio messages to the hospital room over the internet.
Team
Philip Brown
•
Levent Beker
•
Nicci Cazares
•
Corten Singer
AliviaRÁ
March 2016
[
Arduino
Eagle
iOS
Python
]
Won first place at UC Berkeley's Hack for Humanity. We developed a smart rehabilitation system that helps people with arthritis improve their joint flexibility and reduce pain.
An iOS app runs the patient through a series of exercises. The glove measures the flex level of every finger and connects to the iPhone app via Bluetooth. The app provides real-time visual and haptic feedback on performance, letting the patient know whether they're doing the exercise correctly. Additionally, when the system senses that the user is having difficulty replicating a hand position, vibration motors embedded in the glove massage the user's joints to loosen muscles and relieve pain. This concept exploits the gate control theory of pain in the human nervous system: signals coding for pain and for vibration travel along the same neural pathway, which has limited bandwidth, so while the glove's vibration motors are active, their signals drown out those of pain.
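The pose-checking step can be sketched as a per-finger comparison. The tolerance value and the raw flex-sensor units here are hypothetical stand-ins for whatever the real glove was calibrated to:

```python
# Hypothetical sketch of per-finger pose checking against a target exercise.
TOLERANCE = 15  # assumed allowed deviation per finger, in raw flex units

def fingers_off_target(readings, target):
    """Return indices of fingers whose flex deviates beyond the tolerance;
    the glove would then vibrate those joints to guide (and soothe) the user."""
    return [i for i, (r, t) in enumerate(zip(readings, target))
            if abs(r - t) > TOLERANCE]
```

An empty result means the hand position matches the exercise; any returned indices select which vibration motors to activate.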
By tapping a button on the app, the patient is able to tag when pain is experienced. At the end of each exercise session, data is relayed to a server on the cloud which then performs data analytics. Optimal exercises that target the patient’s specific positions of pain are computed and suggested to the user.
Moreover, regular progress reports are automatically generated and sent to the patient and their doctor/physical therapist. This includes information on finger flexibility improvement, recurring positions of pain and suggested exercises.
The whole system was developed in 40 hours.
Team
Ghassan Makhoul
•
Dino Digma
•
Alejandro Castillejo
•
Tomás Vega
•
Corten Singer
Hand Shaped Text Entry Device
September 2016
[
Laser Cutter
Processing
Arduino
]
This was an assignment for CS294-84: Interactive Device Design that asked for a functional prototype of a novel electronic text entry device. My device was based on a chorded keyboard technique (inspired by Engelbart) requiring only 5 bits of information (5 buttons, 5 fingers) to capture all 26 letters of the alphabet, with some extra chords reserved for punctuation. I used a RedBear MCU to communicate the signals serially to a computer. Along with the hardware, I designed a graphical user interface (GUI) in the Processing language to give the user visual feedback while composing sentences.
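The chording idea is easy to sketch: five buttons give 2⁵ = 32 patterns, and the 31 non-empty ones are more than enough for 26 letters plus punctuation. The letter assignments below are hypothetical, not the actual layout I used:

```python
# Hypothetical sketch of chord decoding for a 5-button chorded keyboard.
CHORDS = {
    0b00001: "a",
    0b00010: "b",
    0b00011: "c",
    0b00100: "d",
    # ... remaining chords fill out the alphabet and punctuation
}

def decode(buttons):
    """Pack five button states (thumb..pinky) into a chord and look it up."""
    chord = 0
    for bit in buttons:          # e.g. [0, 0, 0, 1, 1]
        chord = (chord << 1) | bit
    return CHORDS.get(chord, "")
```

On the real device, this lookup ran on the host after the RedBear MCU sent each 5-bit pattern over serial.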
Team
Corten Singer
Prototype Ring
February 2016
[
Fusion 360
3D Printer
]
This was my first 3D modeling project, in which I designed a 3-finger ring prototyping tool for portable and wearable electronics. The ring has space for 2 mini breadboards and a pocket to hold a 3.7V LiPo battery to power your circuit.
The particular circuit featured in the photo here is a synesthetic ring that uses a photodiode to sense brightness levels in the environment, and it proportionally translates this intensity through a vibration motor that stimulates the ring wearer. You can now get a sense of how bright it feels.
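The proportional translation is a simple range mapping. The 10-bit ADC and 8-bit PWM ranges below are assumptions (typical of Arduino-class boards), used here only to illustrate the idea:

```python
# Hypothetical sketch of the brightness-to-vibration mapping.
ADC_MAX = 1023   # assumed 10-bit photodiode reading
PWM_MAX = 255    # assumed 8-bit vibration-motor duty cycle

def brightness_to_pwm(adc_reading):
    """Proportionally translate ambient brightness into motor intensity."""
    return adc_reading * PWM_MAX // ADC_MAX
```

Brighter surroundings drive the motor harder, so the wearer literally feels the light level change.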
Team
Corten Singer
Wanderer
Spring 2016
[
Arduino/Flora
Conductive Thread
NeoPixel LEDs
Bluetooth
]
This project was created as the final design provocation for CS294-85: Critical Making. We designed a smart jacket that departs from the usual attention-grabbing applications on today's market. Our goal is to encourage users to embrace the environment around them by providing tailored suggestions of where to explore via our iOS app. Users input their destination along with a list of areas they find personally interesting, and Wanderer then lets them be drawn by the attractions of the terrain, enabling a more immersive navigation experience.
The jacket was made with embedded electronics that receive GPS data from an iOS app via Bluetooth (a Flora BLE microcontroller). It directs users with intuitive NeoPixel wristbands and an on-board wearable electronic compass. With this wearable device, we intend to free users from their phones while still exploiting their data services: users don't need to look down at a screen, since normal peripheral vision is all it takes to perceive the NeoPixel wristbands.
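The core of the wristband guidance is choosing which LED to light given the compass heading and the bearing to the next waypoint. The 8-LED band and degree-based interface below are illustrative assumptions, not the exact hardware layout:

```python
# Hypothetical sketch of picking a wristband LED from heading and bearing.
def led_index(heading_deg, bearing_deg, n_leds=8):
    """Map the bearing relative to the wearer's heading onto one of
    n evenly spaced LEDs around the wristband (index 0 = straight ahead)."""
    relative = (bearing_deg - heading_deg) % 360
    return round(relative / (360 / n_leds)) % n_leds
```

Because the index is relative to the compass heading, the lit LED stays pointed at the destination even as the wearer turns.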
Team
Daniel Goldberg
•
Tina Pai
•
Lydia Gilbert
•
Corten Singer
MindSweeper: Toward Haptic Cognitive Prostheses
Fall 2016 - Ongoing
[
Fusion 360
Eagle
Arduino
OpenCV
C
]
A research project in Human-Computer Interaction at UC Berkeley's Department of EECS. We are investigating an intelligent wearable feedback system capable of offloading cognitive demand during task execution. We use the classic computer game Minesweeper as our task due to its challenging arithmetic nature. Simultaneously, we study the effects of designing novel haptic techniques that stimulate particular mechanoreceptors in the dermis/subdermis. The overall goal is to explore new human-computer interfaces that better utilize our natural sensing capabilities in order to convey information more effectively.