OmniSense: A Collaborative Sensing Framework for User Context Recognition Using Mobile Phones
HotMobile '10, Annapolis, MD
- Computer and Systems Architecture,
- Computer Engineering,
- Data Storage Systems,
- Digital Circuits,
- Digital Communications and Networking,
- Electrical and Computer Engineering,
- Hardware Systems,
- Other Computer Engineering,
- Other Electrical and Computer Engineering,
- Systems and Communications and
- VLSI and Circuits, Embedded and Hardware Systems
Date of Original Version: 1-1-2010
Abstract or Description: Context information, including a user’s locations and activities, is indispensable for context-aware applications such as targeted advertising and disaster response. Inferring user context from sensor data is intrinsically challenging because of the semantic gap between low-level signals and high-level human activities. Implementation on mobile phones adds further challenges arising from resource limitations. Most existing work focuses on context recognition using a single mobile phone; collaboration among multiple phones has received little attention, and recognition accuracy remains susceptible to phone position and changes in the ambient environment. Simply putting a phone in one’s pocket can muffle the microphone and render the camera useless. Furthermore, the naïve statistical learning methods used in prior work are insufficient to model the relationship between locations and activities.
Citation Information: Heng-Tze Cheng, Senaka Buthpitiya, Feng-Tso Sun and Martin L. Griss. "OmniSense: A Collaborative Sensing Framework for User Context Recognition Using Mobile Phones" HotMobile '10, Annapolis, MD (2010)
Available at: http://works.bepress.com/martin_griss/6/