Distributed Information Systems Laboratory LSIR

Middleware for Processing Phone Sensor Data

Project Details


Laboratory: LSIR | Semester / Master project | Status: Completed




Description:
In recent times, mobile phones have been riding the wave of Moore's Law, with rapid improvements in processing power, embedded sensors, storage capacity, and network data rates. The mobile phones of today have evolved from mere phones into full-fledged computing, sensing, and communication devices. It is thus hardly surprising that over 5 billion people globally have access to mobile phones. These advances in mobile phone technology, coupled with their ubiquity, have paved the way for an exciting new paradigm for accomplishing large-scale sensing, known in the literature as participatory sensing (Burke et al., 2006; Campbell et al., 2006). The key idea behind participatory sensing is to empower ordinary citizens to collect and share sensed data from their surrounding environments using their mobile phones.

Mobile phones, though not built specifically for sensing, can in fact readily function as sophisticated sensors. The cameras on mobile phones can be used as video and image sensors. The microphone, when not in use for voice conversations, can double as an acoustic sensor. The embedded GPS receiver can provide location information. Other embedded sensors such as gyroscopes, accelerometers, and proximity sensors can collectively be used to estimate useful contextual information (e.g., whether the user is walking or riding a bicycle). Further, additional sensors such as air pollution or biometric sensors can easily be interfaced with the phone via Bluetooth or wired connections. These sensors also provide useful contextual information about the surrounding environment and the activities undertaken by the individuals carrying the phone.

In this project, we seek to develop a system for collecting multi-modal sensory data from mobile phones carried by people in naturalistic settings. Such a sensor-rich data set will provide useful insights into the activities and environmental characteristics of the users as they go about their day-to-day lives. In particular, the following sensors will be considered for the first stage of implementation: (i) accelerometer and gyroscope; (ii) sound, for which a variety of features need to be investigated (cf. SoundSense); (iii) anonymized call logs and SMS records, including the length of text messages and the duration of voice calls; (iv) cell IDs; (v) light sensor; and (vi) proximity sensor. Efficient sampling methods will be investigated to ensure that the phone battery lasts an entire day. In the second stage of the implementation, additional sensors such as GPS logs, images taken with the camera, Bluetooth encounter logs, and WiFi access point logs will be considered.
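One common approach to making the battery last a full day is duty-cycled sampling: each sensor is powered only for a short active window within each sampling period rather than continuously. The sketch below is purely illustrative (the class and method names are assumptions, not part of the project's design) and shows the basic scheduling arithmetic in plain Java:

```java
// Illustrative sketch of duty-cycled sampling: a sensor is read for a short
// active window inside each period instead of continuously, trading temporal
// resolution for battery life. Names (DutyCycle, shouldSample) are hypothetical.
public class DutyCycle {
    private final long periodMs;   // length of one sampling cycle
    private final long activeMs;   // portion of the cycle spent sampling

    public DutyCycle(long periodMs, long activeMs) {
        if (activeMs > periodMs) {
            throw new IllegalArgumentException("active window exceeds period");
        }
        this.periodMs = periodMs;
        this.activeMs = activeMs;
    }

    /** True if the sensor should be powered at elapsed time t (ms since start). */
    public boolean shouldSample(long tMs) {
        return (tMs % periodMs) < activeMs;
    }

    /** Fraction of time the sensor is on — a rough proxy for its battery cost. */
    public double dutyFraction() {
        return (double) activeMs / periodMs;
    }
}
```

For example, sampling the microphone for 5 seconds out of every minute keeps it powered only about 8% of the time; the right period/window trade-off per sensor is exactly what the project would need to investigate.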

An Android-based app will be implemented for this data collection campaign. The app will have a UI that allows the user to turn the application on and off and to selectively switch off individual sensors such as the microphone. An optional mechanism for annotating the activities undertaken will also be available.
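The opt-out switches described above can be modeled as a small piece of state consulted before every sensor read. The following is a minimal sketch under assumed names (SensorSwitches, isActive); the actual app would persist these preferences and wire them to the UI:

```java
import java.util.EnumSet;

// Hypothetical model of the app's privacy switches: the user can pause
// collection globally or disable individual sensors such as the microphone.
public class SensorSwitches {
    public enum Sensor { ACCELEROMETER, GYROSCOPE, MICROPHONE, LIGHT, PROXIMITY, CELL_ID }

    private boolean collectionOn = true;
    private final EnumSet<Sensor> disabled = EnumSet.noneOf(Sensor.class);

    public void setCollection(boolean on) { collectionOn = on; }
    public void disable(Sensor s) { disabled.add(s); }
    public void enable(Sensor s) { disabled.remove(s); }

    /** A sensor is sampled only if collection is on and it has not been opted out. */
    public boolean isActive(Sensor s) {
        return collectionOn && !disabled.contains(s);
    }
}
```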

The system will broadly consist of three modules: (i) a sensing module for data collection; (ii) a data upload module for opportunistically uploading the sensor data logs whenever the phone connects to the user's preferred WiFi network (e.g., the home or office network); and (iii) a back-end database based on the GSN system at EPFL.
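The core decision in the upload module (ii) is when it is safe to flush buffered logs. A minimal sketch of that policy in plain Java is shown below; the class name and method signature are assumptions for illustration, not the project's actual interface:

```java
import java.util.Set;

// Illustrative upload policy: buffered sensor logs are flushed only when the
// phone is connected to one of the user's preferred WiFi networks, so that
// no cellular data is consumed. Names (UploadPolicy, shouldUpload) are hypothetical.
public class UploadPolicy {
    private final Set<String> preferredSsids;

    public UploadPolicy(Set<String> preferredSsids) {
        this.preferredSsids = preferredSsids;
    }

    /** Decide whether buffered logs may be uploaded in the current connectivity state. */
    public boolean shouldUpload(boolean wifiConnected, String currentSsid, long bufferedBytes) {
        if (!wifiConnected || currentSsid == null) return false; // never upload over cellular
        if (bufferedBytes == 0) return false;                    // nothing to send
        return preferredSsids.contains(currentSsid);             // e.g., home or office network
    }
}
```

On Android, the connectivity state would come from the platform's connectivity APIs; keeping the decision in a small pure class like this makes it easy to unit-test without a device.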

A large-scale data collection campaign will be conducted by asking volunteers in three countries (Australia, India, and Switzerland) to install the application on their mobile phones. The data will later be analyzed to uncover interesting insights into the ability of phone sensors to accurately detect user context.

More potential projects will be added soon. Interested students are kindly asked to contact Zhixian Yan.

Requirements:
Java, SQL
Optional: Android development, machine learning tools (libsvm, Weka)

Site:
   
Contact: Zhixian Yan