The BehaviorScope project seeks to develop a framework for understanding patterns and behaviors from sensor data and metadata collected by distributed multimodal sensor nodes. Patterns and behaviors (especially of humans) will be parsed by a hierarchy of probabilistic grammars and other mechanisms into a compact and more descriptive semantic form. These higher-level interpretations of the data will supply the network-level cognition needed to deliver services in many everyday applications such as assisted living, workplace safety, security, and entertainment.

The project will use a lightweight camera sensor network as its primary platform and will focus on two types of spatio-temporal data processing. Within each sensor's local field of view, this research will investigate the design of filters for robustly detecting humans as well as their gestures and postures. At a more macroscopic level, collections of sensors will coordinate to detect longer-term patterns of behavior. The expected outcome is a new data interpretation framework that can understand the spatial and temporal aspects of data and respond to them with meaningful services.

To collect real data and to demonstrate the developed concepts in practical applications, this work will use assisted living as the driver application. In this context, the developed sensor network will monitor the behaviors of elders living alone at home to generate daily activity summaries, post warnings and alarms when they engage in dangerous activities, and provide a variety of services that increase the autonomy and independence of these individuals.
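To make the hierarchical-grammar idea concrete, the following is a minimal sketch of how low-level sensor events might be parsed upward through two grammar levels into a high-level activity label. All event names, rules, and probabilities here are illustrative assumptions, not the project's actual grammars; a real system would use richer probabilistic parsing rather than the greedy longest-match used below.

```python
# Toy two-level probabilistic grammar for activity recognition.
# Level 1: raw sensor events -> primitive actions, with rule probabilities.
# (All symbols and weights are made up for illustration.)
PRIMITIVE_RULES = {
    ("enter_kitchen", "open_fridge"): [("fetch_food", 0.8), ("clean_fridge", 0.2)],
    ("stove_on", "stove_off"): [("cook", 0.9), ("check_stove", 0.1)],
    ("sit_table",): [("eat", 0.7), ("read", 0.3)],
}

# Level 2: primitive action sequences -> high-level activity labels.
ACTIVITY_RULES = {
    ("fetch_food", "cook", "eat"): [("prepare_meal", 0.95)],
    ("fetch_food", "eat"): [("snack", 0.9)],
}

def parse_level(tokens, rules):
    """Greedy left-to-right parse: match the longest rule at each position,
    emit the most probable right-hand side, and accumulate probability."""
    out, prob, i = [], 1.0, 0
    while i < len(tokens):
        for length in range(len(tokens) - i, 0, -1):
            key = tuple(tokens[i:i + length])
            if key in rules:
                symbol, p = max(rules[key], key=lambda sp: sp[1])
                out.append(symbol)
                prob *= p
                i += length
                break
        else:
            out.append(tokens[i])  # pass unrecognized tokens up unchanged
            i += 1
    return out, prob

# A stream of low-level events observed in one sensor's field of view.
events = ["enter_kitchen", "open_fridge", "stove_on", "stove_off", "sit_table"]
primitives, p1 = parse_level(events, PRIMITIVE_RULES)
activities, p2 = parse_level(primitives, ACTIVITY_RULES)
print(primitives)   # ['fetch_food', 'cook', 'eat']
print(activities)   # ['prepare_meal']
```

Each parsing pass compresses the token stream into a shorter, more semantic one, which is the sense in which the grammar hierarchy turns raw sensor data into a compact descriptive form; the accumulated probability lets competing interpretations be ranked.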
The BehaviorScope Project: Sensory Grammars for Sensor Networks is a two-year, $150K grant.