The Next Generation of Brain-Computer Interfaces: Responding Implicitly to Users' Cognitive State

Date: 
Thursday, March 3, 2016 - 10:00
Location: 
TH 331
Presenter: 
Beste Yuksel
Abstract: 
The human and the computer are both complex machines, capable of sophisticated functions, yet there is a very narrow bandwidth of communication between them. A new generation of brain-computer interfaces (BCIs) is currently being developed that can increase this communication bandwidth by passively detecting users' cognitive state and responding appropriately in real time. In my talk, I present two examples using a musical BCI. The first increases learning speed and accuracy in pianists by raising task difficulty as cognitive workload decreases. The second aids users in the difficult task of creativity by adding and removing musical harmonies based on cognitive workload during musical improvisation. I will discuss the broader future implications of this next generation of user interfaces.
Bio: 

Beste Filiz Yuksel is completing her Ph.D. in Human-Computer Interaction at Tufts University, working with Robert Jacob. Her research is on a new generation of brain-computer interfaces (BCIs) that detect and evaluate real-time brain signals, using machine learning classification of functional near-infrared spectroscopy (fNIRS) data to build adaptive user interfaces for the general population. Her interests include user interaction in the areas of learning, creativity, and information visualization. She has also interned at Microsoft Research with Mary Czerwinski, investigating user-virtual agent interactions for the next generation of intelligent personal assistants. Beste received her Master's in Computer Science from University College London, UK, where she built a BCI that used the P300 signal to select physical objects, as well as a hybrid BCI in a fully immersive virtual environment (CAVE). Her work is currently under consideration for a Best Paper Award at ACM CHI 2016.