Fusing and Analyzing Physical and Social Sensors on Smartphones using Web SocialSense

Wednesday, March 28, 2012 - 16:30
TH 331
Thomas Phan, Ph.D. (Samsung Research & Development)

Modern smartphones, including those running iOS, Android, and other mobile operating systems, offer a rich selection of on-board sensors, where
sensor access is typically performed through library API calls provided by the phone's operating system. Because both the smartphone platforms' programming language environments and the sensor libraries may differ in important ways, writing cross-platform software is difficult. In this talk I will explore the viability of implementing sensor acquisition and processing entirely in the Web browser layer with JavaScript and HTML5. In our lab at Samsung R&D, we developed Web SocialSense, a JavaScript framework for writing sensor-driven applications using a graph
topology-based programming paradigm with source, processor, and fusion nodes connected by edges. The resulting system allows programmers to write
personalized, context-aware applications by dynamically fusing time-series signals from physical sensors, such as the accelerometer and geolocation services, as well as from social software sensors, such as social network services and personal information management applications. With this framework we then implemented components for physical activity recognition and social software sensing to drive two context-aware applications, Social Map and ActVertisements.
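To give a flavor of the graph topology-based paradigm described above, the following is a minimal, hypothetical sketch in plain JavaScript. The class names (SourceNode, ProcessorNode, FusionNode) and their methods are illustrative assumptions, not Web SocialSense's actual API; the sensor samples are simulated, whereas in a browser they would arrive via HTML5 callbacks such as the devicemotion event or navigator.geolocation.

```javascript
// A tiny push-based node graph: sources forward raw samples, processors
// transform them, and a fusion node combines the latest sample from each
// of its inputs into one derived, context-aware output.
class Node {
  constructor() { this.downstream = []; }
  connect(node) { this.downstream.push(node); return node; }
  emit(sample) { for (const n of this.downstream) n.receive(sample); }
}

class SourceNode extends Node {
  // Samples are injected from outside (e.g. a sensor callback) and forwarded.
  receive(sample) { this.emit(sample); }
}

class ProcessorNode extends Node {
  constructor(fn) { super(); this.fn = fn; }
  receive(sample) { this.emit(this.fn(sample)); }
}

class FusionNode extends Node {
  constructor(inputCount, fuse) {
    super();
    this.fuse = fuse;
    this.latest = new Array(inputCount).fill(null);
  }
  // Each incoming edge gets its own slot so the fusion function sees
  // one value per upstream signal.
  input(i) { return { receive: (s) => this.receiveAt(i, s) }; }
  receiveAt(i, sample) {
    this.latest[i] = sample;
    if (this.latest.every((s) => s !== null)) this.emit(this.fuse(this.latest));
  }
}

// Wire up a small graph: accelerometer -> magnitude processor -> fusion,
// with a geolocation source feeding the fusion node's second input.
const accel = new SourceNode();
const loc = new SourceNode();
const magnitude = new ProcessorNode(({ x, y, z }) =>
  Math.sqrt(x * x + y * y + z * z));
const fusion = new FusionNode(2, ([mag, place]) =>
  ({ activity: mag > 12 ? "moving" : "still", place }));

accel.connect(magnitude);
magnitude.connect(fusion.input(0));
loc.connect(fusion.input(1));

const out = [];
fusion.connect({ receive: (s) => out.push(s) });

// Simulated samples; in a browser these would come from sensor callbacks.
accel.receive({ x: 0.1, y: 0.2, z: 9.8 });
loc.receive("office");
// out now holds one fused sample: { activity: "still", place: "office" }
```

The design choice worth noting is that the fusion node only emits once every input slot has seen at least one sample, which is one simple way to align asynchronous time-series signals; a production framework would also need windowing and timestamp alignment.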


Thomas Phan is a researcher at Samsung R&D Center in San Jose, CA. He currently conducts exploratory research in smartphone sensing and activity recognition. He received his Ph.D. from UCLA in 2002.