Observer methodologies used in the field of information retrieval, focused mainly on user behavior and search performance – part of the paper I am currently working on…
Facial expression is one of the main ways in which humans communicate emotional states to one another (Ekman & Friesen, 1975). In 1999, Paul Ekman developed a classification system that recognized the six established basic emotions (fear, happiness, surprise, anger, sadness, and disgust) of the Darwinian discrete approach (Darwin, 1872). In addition, Ekman's contribution to the development of the Facial Action Coding System (FACS) includes the extraction of various facial cues that help analyze emotional states. Facial expression analysis consists of monitoring movements of the facial skin as well as changes in facial features, such as the eyebrows.
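As a purely illustrative sketch of how FACS-style coding relates to the six basic emotions, the snippet below maps combinations of facial action units (AUs) to emotion labels. The AU combinations follow commonly cited FACS prototypes (e.g. AU6 = cheek raiser, AU12 = lip-corner puller), but the matching rule and the idea of classifying from a bare AU set are simplifications for illustration; a real FACS coder scores AU presence and intensity frame by frame from video.

```python
# Simplified, illustrative mapping of FACS action-unit (AU) combinations
# to Ekman's six basic emotions. The prototypes below are commonly cited
# approximations, not a validated classifier.
EMOTION_PROTOTYPES = {
    frozenset({6, 12}): "happiness",          # Duchenne smile
    frozenset({1, 2, 5, 26}): "surprise",     # raised brows, jaw drop
    frozenset({1, 4, 15}): "sadness",         # inner-brow raise, frown
    frozenset({4, 5, 7, 23}): "anger",        # brow lower, lid tighten
    frozenset({1, 2, 4, 5, 20, 26}): "fear",
    frozenset({9, 15, 16}): "disgust",        # nose wrinkle
}

def classify(observed_aus):
    """Return the basic emotion whose AU prototype best overlaps the
    observed set of action units (simple Jaccard similarity)."""
    best, best_score = "neutral", 0.0
    for proto, emotion in EMOTION_PROTOTYPES.items():
        score = len(proto & observed_aus) / len(proto | observed_aus)
        if score > best_score:
            best, best_score = emotion, score
    return best
```

For example, an observed AU set of {6, 12} matches the happiness prototype exactly, while an empty set falls back to "neutral".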
More recently, facial expression reading has become a popular methodology in web search behavior studies. Arapakis (2009), for example, applied automatic facial analysis when examining the relationship between behavior and task difficulty in online search. Similarly, Lopatovska (2009) used facial analysis to study user responses during Google searches.
Although FACS is said to have high reading validity, is non-obtrusive, and has high accuracy (Cohn & Kanade, 2006), it does have its own limitations. Since FACS focuses solely on facial expression, it fails to take into account human body movements, which also play a large role in communication (Fasel & Luettin, 2003). In addition, FACS is unable to capture the context in which user reactions occur (Pantic & Rothkrantz, 2003).
There is much debate as to whether body movements are a valid indicator of human emotions and behavior. However, some studies have shown a clear connection between certain behaviors and certain body gestures (Boone & Cunningham, 1998). These findings relate directly to Darwinian theory, in which some body movements correspond to specific emotional states (Wallbott, 1998). Furthermore, Gunes and Piccardi (2007) identify six facial and body gestures, connecting them with various body movements.
More specifically, researchers have monitored hand movements in the hope of establishing correlations between emotions and these movements. In these studies, researchers either record the glove movements of users or have computer vision programs observe the movements. While glove-based studies analyze hand gestures using model-based techniques, vision-based approaches observe hands in images using appearance-based techniques (Chen et al., 2003).
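A minimal sketch of the appearance-based idea is pixel-level skin-color segmentation: candidate hand regions are found by testing each pixel against a color rule. The RGB thresholds below are a commonly cited uniform-daylight skin heuristic; the image representation (a 2-D list of RGB tuples) is an assumption for illustration, and real systems add far more (color-space normalization, morphology, tracking).

```python
def is_skin(r, g, b):
    """Commonly cited RGB skin-colour heuristic (uniform daylight rule):
    a crude appearance-based test used to segment hand/face regions."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

def hand_candidates(image):
    """Return (row, col) coordinates of skin-coloured pixels in an image
    given as a 2-D list of (r, g, b) tuples."""
    return [(y, x)
            for y, row in enumerate(image)
            for x, (r, g, b) in enumerate(row)
            if is_skin(r, g, b)]

# Toy 1x2 "image": one skin-toned pixel, one dark pixel.
toy = [[(200, 120, 90), (10, 10, 10)]]
```

The coordinate list returned for the toy image contains only the skin-toned pixel, which in a full pipeline would be grouped into blobs and matched against hand-shape templates.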
Some of the major limitations of this approach lie in the difficulty of analyzing hand movements in an unbiased and controlled environment (McAllister et al., 2002).
A considerable number of studies on verbal communication have led to a number of models of vocal communication. The lens model of perception, developed by Brunswik in the mid-20th century, is said to be one of the models that helps explain the transfer of a speaker's emotional state through certain acoustic characteristics of their voice (Miroff, 1977). These characteristics are usually the pitch, speech rate, and intensity of the voice (Pantic & Rothkrantz, 2003).
For example, over-stressing certain words produces variations in pitch and intensity (Cowie et al., 2001). Anger tends to increase blood pressure, which in turn increases the intensity of speech (Breazeal, 2001). Voice assessment systems and programs also carry their own limitations: data quality and capturing noise-free data, limited voice classification, and the context-independence of the audio, to name a few (Jaimes & Sebe, 2007).
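The two acoustic measures named above can be sketched in a few lines: intensity as root-mean-square (RMS) energy of a frame, and pitch via a naive autocorrelation search. This is a minimal illustration on a synthetic tone, assuming a mono signal given as a plain list of samples; real voice-analysis tools use far more robust estimators and noise handling.

```python
import math

def rms_intensity(samples):
    """Root-mean-square energy of a frame: a simple proxy for vocal intensity."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def pitch_autocorr(samples, sample_rate, f_min=60, f_max=400):
    """Naive autocorrelation pitch estimate (Hz) for a voiced frame:
    tests lags corresponding to f_min..f_max and returns the lag with
    the strongest self-similarity."""
    lo = int(sample_rate / f_max)          # smallest lag to test
    hi = int(sample_rate / f_min)          # largest lag to test
    best_lag, best_corr = lo, float("-inf")
    for lag in range(lo, min(hi, len(samples) - 1)):
        corr = sum(samples[i] * samples[i + lag]
                   for i in range(len(samples) - lag))
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return sample_rate / best_lag

# Synthetic 200 Hz tone sampled at 8 kHz, one 50 ms frame.
rate = 8000
frame = [math.sin(2 * math.pi * 200 * t / rate) for t in range(400)]
```

On the synthetic frame the estimator recovers 200 Hz, and the RMS of a unit-amplitude sine comes out near 1/√2; rising values of either measure across frames are the kind of cue the cited emotion studies rely on.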
Computer log files have become an increasingly popular method in the field of Information Retrieval for collecting information on user behavior (Kapoor et al., 2007). Data such as the number of visited pages or the time it took to find a specific web resource are captured and analyzed by researchers to infer, for example, user frustration. However, this interactive-behavior methodology still has a long way to go before it can establish reliable relationships between behavior and search performance.
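The kind of log analysis described above can be sketched as follows. The log format, event names, and session data are all hypothetical, invented for illustration rather than taken from any cited study; the point is how simple behavioral measures (pages visited, query reformulations, time to success) fall out of timestamped events.

```python
from datetime import datetime

# Hypothetical search-session log: (timestamp, event, detail).
SESSION = [
    ("2009-03-01 10:00:00", "query",      "jaguar speed"),
    ("2009-03-01 10:00:05", "visit_page", "result_3"),
    ("2009-03-01 10:00:40", "query",      "jaguar animal top speed"),
    ("2009-03-01 10:00:46", "visit_page", "result_1"),
    ("2009-03-01 10:01:30", "found",      "result_1"),
]

def summarize(session):
    """Derive simple behavioral measures from a timestamped event log:
    pages visited, query reformulations, and time-to-success in seconds."""
    fmt = "%Y-%m-%d %H:%M:%S"
    times = [datetime.strptime(ts, fmt) for ts, _, _ in session]
    pages = sum(1 for _, event, _ in session if event == "visit_page")
    queries = sum(1 for _, event, _ in session if event == "query")
    seconds = (times[-1] - times[0]).total_seconds()
    return {"pages_visited": pages,
            "reformulations": queries - 1,
            "seconds_to_success": seconds}
```

For this toy session the summary is 2 pages visited, 1 reformulation, and 90 seconds to success; many reformulations or a long time-to-success are the sort of signals researchers read as possible frustration.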