- Ease of use
- User preferences and sensitivities
- Behavior models
- Privacy options in multi-user communicative applications
Home Exercise Monitoring: Experience Sharing in Social Contexts
This project includes the design of a vision-based exercise
monitoring system and user acceptance studies on visual representation in a social network setting.
Aiming to use the technology to promote well-being by making
exercise sessions enjoyable experiences, the project employs techniques for real-time
measurement of user exercise routines, provides instructions and feedback to the user, and offers options for experience sharing or
group gaming with peers in a virtual community. The use of avatars is
explored as a means of representing the user's exercise movements or
appearance, and the system employs user-centric approaches in visual
processing, behavior modeling through accumulated history data, and user
feedback for learning preferences. User acceptance surveys are
conducted in this project to explore avatar preferences across different user groups.
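As a toy illustration of how user preferences might be learned from accumulated feedback history, the sketch below keeps a running score per avatar style from the user's reactions. The style names and the smoothing update are illustrative assumptions, not the system's actual design.

```python
class AvatarPreferenceModel:
    """Toy sketch: learn which avatar style a user prefers from
    accumulated session feedback. Style names and the update rule
    are illustrative assumptions."""

    def __init__(self, styles):
        self.scores = {s: 0.0 for s in styles}
        self.alpha = 0.3  # smoothing factor for the running preference score

    def record_feedback(self, style, liked):
        # Exponential moving average of binary (liked / not liked) feedback.
        reward = 1.0 if liked else 0.0
        self.scores[style] += self.alpha * (reward - self.scores[style])

    def preferred_style(self):
        # The style with the highest running score is offered by default.
        return max(self.scores, key=self.scores.get)

model = AvatarPreferenceModel(["realistic", "cartoon", "silhouette"])
for _ in range(5):
    model.record_feedback("cartoon", liked=True)
model.record_feedback("realistic", liked=False)
```

Here `model.preferred_style()` returns `"cartoon"`, since repeated positive feedback raises its running score above the others.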
Reinforcement Learning for Ambient Intelligent Atmospheric Lighting and Music
While technological advances in sensing, processing, and networking of information offer a solid foundation for systems built on interactive data and decision exchange with the user, one influential aspect of the adoption of such systems by the public has remained less explored. Existing concepts are mostly built on one of two alternatives: either relying fully on the system's intelligence to make service decisions for the user, or constantly querying the user to approve each action the system proposes. These approaches represent the two extremes of assigning authority at the actuation stage, and both face steep user acceptance challenges for a simple reason: users' preferences are as diverse as the users themselves. A one-size-fits-all solution built on a set of fixed decision-making rules is too rigid to cater to the styles of different users, while requiring the user to memorize and act on service control choices throughout the day, or to answer numerous queries whenever a service change is proposed, introduces annoying interruptions in the user's daily life. An adaptive user-centric approach offers a workable middle ground.
In this project, our goal is to create a user-centric methodology for adapting system services based on accumulated knowledge of the user's preferences. These preferences are learned through the user's explicit or implicit feedback to the system whenever the user opts to react to the provided service. As a result, the system adapts to provide the most satisfactory background music and ambient lighting to the user.
In our approach, the system gains intelligence by observing the user, interacting with the user, and exploring the user's interests via mutual discovery mechanisms. Context is inferred not just from generic data such as time of day, but also by detecting user-specific situations such as location, activity, or event.
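The learn-from-reaction loop described above can be sketched as a simple epsilon-greedy bandit that learns which lighting/music setting the user prefers in each inferred context, treating accept/override reactions as rewards. This is an assumption-laden illustration, not the project's published algorithm; the action names and simulated user are made up.

```python
import random
from collections import defaultdict

# Illustrative sketch (not the project's published algorithm): an
# epsilon-greedy bandit over combined light/music settings, with the
# user's accept (1.0) or override (0.0) reaction as the reward signal.

ACTIONS = ["warm_light_jazz", "bright_light_pop", "dim_light_ambient"]

class ServiceSelector:
    def __init__(self, epsilon=0.2):
        self.epsilon = epsilon
        self.value = defaultdict(float)  # running reward estimate per (context, action)
        self.count = defaultdict(int)

    def choose(self, context):
        # Occasionally explore a random setting; otherwise exploit the best known one.
        if random.random() < self.epsilon:
            return random.choice(ACTIONS)
        return self.best(context)

    def best(self, context):
        return max(ACTIONS, key=lambda a: self.value[(context, a)])

    def update(self, context, action, reward):
        key = (context, action)
        self.count[key] += 1
        # Incremental mean of the rewards observed for this context/action pair.
        self.value[key] += (reward - self.value[key]) / self.count[key]

random.seed(0)
selector = ServiceSelector()
# Hypothetical user who accepts dim ambient light in an evening-reading context.
for _ in range(300):
    action = selector.choose("evening_reading")
    reward = 1.0 if action == "dim_light_ambient" else 0.0
    selector.update("evening_reading", action, reward)
```

After a few hundred interactions, `selector.best("evening_reading")` settles on the setting the user consistently accepts, while the epsilon term keeps probing alternatives so the system can track shifting preferences.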
User-centric Speaker Assistance System
Presentation is an effective form of communication practiced
every day and everywhere. In most cases, however, the
speaker has no access to feedback on his or her own presentation
with which to evaluate how effectively information was communicated.
Without such feedback, it is hard for
the speaker to identify weaknesses in his or her presentation
skills. In this project, we develop a computer vision-based system
that learns the speaker's preferences on presentation style and
evaluates the effectiveness of his or her presentations. This
serves as feedback to help the speaker improve presentation
skills. The system is user-centric in that the
effectiveness/scoring function is learned from the user's own reviews and
scoring, so the system's evaluation metric adapts to the user's
preferences on presentation style. Habits and traits such as walking, use of hands, distribution of gaze among the audience, and gazing at the screen or the ceiling, along with other body language and gestures, are examples of the practices considered.
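A scoring function of this kind could be sketched, purely as an illustration, as a linear ranker trained on the user's own pairwise comparisons of recorded talks. The feature names (fractions of talk time) and the perceptron-style update below are assumptions, not the published method.

```python
# Illustrative sketch: learn a linear effectiveness score from the user's
# own pairwise rankings of talks. Features and update rule are assumptions.

FEATURES = ["gaze_at_audience", "hand_gestures", "gaze_at_screen", "pacing"]

def score(w, x):
    # Linear effectiveness score of a talk's feature vector.
    return sum(wi * xi for wi, xi in zip(w, x))

def train_ranker(pairs, epochs=20, lr=0.1):
    """pairs: list of (better, worse) feature vectors, as ranked by the user."""
    w = [0.0] * len(FEATURES)
    for _ in range(epochs):
        for better, worse in pairs:
            # Perceptron-style update on the difference vector whenever the
            # current weights rank the pair incorrectly.
            if score(w, better) <= score(w, worse):
                for i in range(len(w)):
                    w[i] += lr * (better[i] - worse[i])
    return w

# Hypothetical user judgments: talks with more audience gaze and fewer
# glances at the screen were ranked as more effective.
pairs = [
    ([0.7, 0.5, 0.1, 0.2], [0.3, 0.5, 0.5, 0.2]),
    ([0.6, 0.6, 0.2, 0.3], [0.4, 0.2, 0.6, 0.3]),
]
w = train_ranker(pairs)
good_talk = [0.8, 0.5, 0.1, 0.2]
bad_talk = [0.2, 0.5, 0.7, 0.2]
```

Because the weights are fit only to this user's rankings, the learned metric reflects that user's notion of an effective presentation rather than a fixed rubric.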
User-centric Environment Discovery with Camera Networks in Smart Homes
C. Wu and H. Aghajan,
IEEE Trans. on Systems, Man, and Cybernetics Part A, 2010.
Exercising at home: Real-time interaction and experience sharing using avatars,
J. Cui, Y. Aghajan, J. Lacroix, A. van Halteren, H. Aghajan,
Journal of Entertainment Computing, Dec. 2009.
Autonomous Learning of User's Preference of Music and Light Services in Smart Home Applications
A. Khalili, C. Wu, and H. Aghajan,
Behavior Monitoring and Interpretation Workshop at the German AI Conference, Sept. 2009.
User-centric Speaker Report: Ranking-based Effectiveness Evaluation and Feedback
T. Gao, C. Wu, H. Aghajan,
THEMIS Workshop at ICCV, Sept. 2009.
Home Exercise in a Social Context:
Real-Time Experience Sharing using Avatars,
Y. Aghajan, J. Lacroix, J. Cui, A. van Halteren, H. Aghajan,
3rd Int. Conf. on Intelligent Technologies for Interactive Entertainment (INTETAIN), June 2009.