Usability of in-vehicle systems has become increasingly important for ease of operation and driving safety. The user interface (UI) of in-vehicle systems is a critical focus of usability research. This paper studies how to use advanced computational, physiology-based, and behavior-based tools and methodologies to determine the affective/emotional states and behavior of an individual in real time, and in turn how to adapt human-vehicle interaction to meet users' cognitive needs based on this real-time assessment. Specifically, we set up a suite of physiological sensors capable of collecting EEG, facial EMG, skin conductance response, and respiration data, together with motion-sensing and tracking equipment capable of capturing eye movement and the objects with which the user is interacting. All hardware and software components are integrated into an augmented sensor platform that performs as "one coherent system," enabling multimodal data processing and information inference for context-aware analysis of emotional states and cognitive behavior based on a rough set inference engine. Subjective data are also recorded for comparison. A usability study of an in-vehicle system UI demonstrates the potential of the proposed methodology.
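The abstract does not detail the rough set inference engine; as a minimal sketch of the underlying idea, the following computes rough-set lower and upper approximations of an emotional-state class over discretized sensor features. The attribute names (`scr`, `emg`, `resp`), labels, and data values are hypothetical illustrations, not the paper's actual dataset:

```python
from collections import defaultdict

def equivalence_classes(records, attrs):
    """Group records that are indiscernible on the given condition attributes."""
    classes = defaultdict(list)
    for r in records:
        key = tuple(r[a] for a in attrs)
        classes[key].append(r)
    return classes.values()

def approximations(records, attrs, decision, label):
    """Rough-set lower/upper approximations of the decision class `label`."""
    lower, upper = [], []
    for eq in equivalence_classes(records, attrs):
        labels = {r[decision] for r in eq}
        if labels == {label}:
            lower.extend(eq)  # every indistinguishable record shares the label: certain members
        if label in labels:
            upper.extend(eq)  # at least one record has the label: possible members
    return lower, upper

# Hypothetical discretized multimodal observations.
data = [
    {"scr": "high", "emg": "high", "resp": "fast", "state": "stressed"},
    {"scr": "high", "emg": "high", "resp": "fast", "state": "stressed"},
    {"scr": "high", "emg": "low",  "resp": "fast", "state": "stressed"},
    {"scr": "high", "emg": "low",  "resp": "fast", "state": "calm"},
    {"scr": "low",  "emg": "low",  "resp": "slow", "state": "slow" if False else "calm"},
]

lower, upper = approximations(data, ["scr", "emg", "resp"], "state", "stressed")
# lower contains records certainly "stressed"; upper contains those possibly "stressed".
```

Records in the upper but not the lower approximation form the boundary region, where the sensor evidence alone cannot decide the emotional state; an inference engine can flag these cases for additional modalities or subjective data.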