Challenges & Limitations: Smart Affective Neuro Search

In my previous post, Smart Affective Neuro Search, I briefly discussed my (in-progress) dissertation as well as my somewhat unconventional proposal regarding the implications this model may have for the industry. Here, I intend to examine and discuss some of the challenges and limitations of this model.

In my pilot study, I was able to test my research design along with its proposed methods and measurement techniques. One of these techniques was the observer method, more specifically the observation of body movements.

Shortly, I will discuss some of the challenges and limitations of my model. But first, let's review the different views on observing and measuring body movements.

Body Movement – Observer Methods

Some researchers debate whether body movements are valid indicators of human emotions. However, many studies provide strong evidence associating body movements with specific emotions (Wallbott, 1988; de Meijer, 2005). Boone and Cunningham (1998) were also able to show a connection between certain emotions and certain body movements. These findings pertain to the Darwinian theory of discrete emotions, in which some body movements relate directly to specific emotional states (Wallbott, 1998). Furthermore, Gunes and Piccardi (2007) identified six facial and body gestures, connecting them with various emotions (see table below).


Table. List of Bodily Emotions (Gunes & Piccardi, 2007)

 

This table suggests that certain emotions may be assessed through certain body movements. For example, hands resting on the waist or clenched into fists appear to be indicators of anger. Likewise, the table indicates that signs of anxiety in participants may be detected by observing the position of their hands on the table surface.

Researchers have also been monitoring hand movements in the hope of establishing correlations between emotions and those movements. In these studies, researchers either study users' glove movements or have computer programs observe the movements. While glove-based studies analyze hand gestures using model-based techniques, the vision-based approach observes hands in images using appearance-based techniques (Chen et al., 2003).

Challenges & Limitations

Although human body movements play a big role in emotional communication (Fasel & Luettin, 2003), the complex motor movements involved may contribute a major amount of 'noise' to readings of the brain's electrical signals. The EEG devices currently on the market are partly designed to include noise suppression. However, they still may not be able to fully suppress all the 'noise' emanating from major body movements, such as head or hand movements, while participants conduct search tasks on various computer devices.
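One common (and admittedly crude) way to deal with movement 'noise' is simple amplitude-threshold artifact rejection: epochs whose voltage swings far exceed normal cortical activity are discarded before analysis. The sketch below is a minimal illustration of that idea, not the noise-suppression method used by any commercial headset; the 100 microvolt threshold is an assumption for illustration.

```python
import numpy as np

def reject_artifacts(epochs, threshold_uv=100.0):
    """Drop EEG epochs whose peak amplitude exceeds a threshold.

    Very large deflections typically reflect head or hand movement
    and muscle activity rather than cortical signals.

    epochs: array of shape (n_epochs, n_samples), in microvolts.
    Returns the surviving epochs and a boolean keep-mask.
    """
    peak = np.abs(epochs).max(axis=1)
    keep = peak < threshold_uv
    return epochs[keep], keep

# Two quiet epochs and one contaminated by a large movement artifact
rng = np.random.default_rng(0)
clean = rng.normal(0, 10, size=(2, 128))   # ~10 uV background activity
noisy = rng.normal(0, 10, size=(1, 128))
noisy[0, 64] = 400.0                       # spike from a head movement
epochs = np.vstack([clean, noisy])

kept, mask = reject_artifacts(epochs)
print(mask)  # the third (contaminated) epoch is rejected
```

Real pipelines use far more sophisticated techniques (e.g. independent component analysis), but thresholding shows why gross body movements during search tasks are so problematic for EEG readings.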

These, I believe, are some of the challenges and limitations of this proposed model for developing Affective Neuro Search. It is my intention to address them through continued research.

In future posts, and as I progress in my dissertation, I will discuss proposed ways in which raw EEG data may best be analyzed for the purposes of Affective Neuro Search.

 

Please also read about exciting and emerging wearable computing and AI devices here: brain-controlled airplanes, neurogaming, and robots that learn behavior.


Smart (Affective) Neuro Search…

I am humbled by the overwhelming number of views and comments on my previous post, Affective Smart Search. Here, I intend to elaborate on my proposal regarding the implications that my doctoral dissertation (in progress) may have for the industry.

(Please also see the Disclaimer page.)

Through a pilot study, I was able to test one of my hypotheses that: ‘Aroused dimensions of emotions (high intensity emotions, such as anger) that include high-frequency Beta (and possibly Gamma) brain waves impact search performance (efficiency and effectiveness) negatively.’

Shortly, I will discuss my proposal regarding the implications that this doctoral dissertation may have for the industry. But first, let's review the different views on the dimensions of emotions and how they may correlate with high- and low-frequency brain waves.

Views on Structure of Emotions

There are two main views on the structure of emotion: 1) the discrete and 2) the continuous approach. Darwin, the father of the discrete approach, claimed that there exist six basic emotions: fear, happiness, surprise, anger, sadness, and disgust (Darwin, 1872; Ekman, 1992). These theorists argue that the six basic emotions are universal and that humans, regardless of cultural background, appear to both display and recognize these six distinct emotions. The continuous approach, on the other hand, addresses different 'dimensions' of emotions (Russell & Mehrabian, 1977; Russell, 1994). These theorists state that there are two dimensions of emotion, valence and arousal (Russell, 1994; Russell & Mehrabian, 1977; Russell & Steiger, 1982; Barrett & Russell, 1999).

While the discrete approach relies on a list of discrete emotions, the dimensional self-report approach utilizes the dimensions of arousal and valence (Wundt, 1904). The arousal dimension, as Wundt explains, measures the calmness versus excitement of an emotion, ranging from calming to exciting (or agitating) states. Valence, on the other hand, indicates the positivity versus negativity of an emotion, ranging from highly positive to highly negative states. In this method, participants thus indicate their subjective experience along these two coordinates.
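The two coordinates above can be pictured as a simple two-axis grid, with each quadrant corresponding to a rough family of emotions. The sketch below illustrates this mapping; the quadrant labels are illustrative examples only, and real self-report instruments use much finer scales than a single label per quadrant.

```python
def quadrant(valence, arousal):
    """Map a self-report (valence, arousal) pair, each in [-1, 1],
    to a rough emotional quadrant of the two-dimensional model.

    Labels are illustrative placeholders, not a validated taxonomy.
    """
    if valence >= 0 and arousal >= 0:
        return "excited / happy"   # positive, activated
    if valence >= 0:
        return "calm / content"    # positive, deactivated
    if arousal >= 0:
        return "angry / afraid"    # negative, activated
    return "sad / bored"           # negative, deactivated

print(quadrant(0.7, 0.8))    # high valence, high arousal
print(quadrant(-0.6, 0.9))   # low valence, high arousal
```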

Emotional Dimensions Associated with Brain Waves

While valence assesses the pleasantness (positivity/negativity) of an emotion, arousal represents its intensity. Positive (happy) emotions result in higher frontal coherence in alpha and higher right-parietal beta power, compared to negative emotions. Arousal (or excitation) appears to present higher beta power and coherence in the parietal lobe, plus lower alpha activity.

Russell's (1989) research associates these emotional dimensions with various brain waves:

  • Theta waves, also seen in meditative states (Cahn & Polich, 2006), indicate drowsiness or arousal in adults
  • Alpha waves are exhibited when the eyes are closed and during relaxation
  • Beta waves, linked with motor behavior, occur when the individual is actively moving (Pfurtscheller & da Silva, 1999); low beta frequencies are often associated with concentration and/or active thinking
  • Gamma waves represent cognitive or motor functions (Niedermeyer & da Silva, 2004)

Neurophysiologic Methods

Neurophysiologic methods aim to monitor and read human bodily responses, such as skin conductance, blood pressure, heart rate, and, most recently, brain activity, in order to infer human emotional states. Non-invasive EEG devices, such as those from Emotiv or Interaxon, are gaining increased respect in the research community.

Industry Implications
And here comes my unconventional, 'out of the box' proposal… I envision my dissertation adding to the body of knowledge of neuro information science by helping develop search engines that, through wearable computing devices, are able to read human brain waves, and the dimensions of emotions they reflect, in order to improve search results based on the neurological feedback the search engines receive from users' brain waves. In other words, search engines become an extension of the human brain, receiving brain waves that constantly provide neurological feedback on the search results they provide.

For example, while a search result is being presented on the screen, high-frequency brain waves may indicate high-intensity emotions, such as frustration. All the while, the search engine reads the user's brain waves through wearable computing devices. Gradually, the search engine may 'learn to improve' its results based on, for example, the alpha (calmer) brain waves it receives.
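To make the feedback loop concrete, here is a hypothetical sketch of the idea: a re-ranker that nudges results up or down based on the emotional state inferred while each result was on screen. Everything here, including the class name, the 'calm'/'frustrated' labels, and the scoring scheme, is an assumption for illustration; no such system exists in my dissertation yet.

```python
from collections import defaultdict

class AffectiveReRanker:
    """Hypothetical re-ranker: 'calm' feedback (alpha-dominated
    signal while a result is viewed) is treated as positive,
    'frustrated' feedback (high-beta) as negative."""

    def __init__(self):
        self.scores = defaultdict(float)

    def record_feedback(self, result_id, state):
        # Accumulate neurological feedback per result
        self.scores[result_id] += {"calm": 1.0, "frustrated": -1.0}[state]

    def rerank(self, results):
        # Promote results associated with calmer brain-wave feedback
        return sorted(results, key=lambda r: self.scores[r], reverse=True)

ranker = AffectiveReRanker()
ranker.record_feedback("page_a", "frustrated")  # high-beta while viewing
ranker.record_feedback("page_b", "calm")        # alpha-dominated while viewing
print(ranker.rerank(["page_a", "page_b"]))      # page_b now ranks first
```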

In my future posts, and as I progress in my dissertation, I will elaborate further and discuss the challenges and limitations of this model…

(Read more about exciting and emerging wearable computing and AI devices here: brain-controlled airplanes, neurogaming, and robots that learn behavior.)

(Please also see the Disclaimer page.)


‘Smart’ Affective Search: Brain Activities to Help Improve Search Results?

In an era where we are creating brain-controlled airplanes, neurogaming, and robots that learn behavior by reading human emotions, there appear to be no limits to having search engines read human emotions in order to improve search results based on the neurological feedback they receive from users' brain waves. Thanks to companies such as Interaxon and Emotiv, EEG devices have been made readily available to researchers interested in neuro-related studies who would otherwise not have had access to expensive fMRI machines. Although EEG devices measure different aspects of the brain than fMRI does, they nonetheless help enthusiastic but low-budget researchers (such as me!) conduct neuro-related studies.

My doctoral research aims to examine cognitive relationships between dimensions of human emotions and information retrieval, specifically search performance, in the field of neuro information science (Gwizdka, 2012). This study aims to increase our understanding of affective search, improve information systems design practices, and investigate ways to design 'smart' information systems that learn and improve search results based on neuro feedback.

To illustrate, emerging expressions such as “pleasurable engineering” or “emotional design” have not only become driving factors in information retrieval system design (Nahl & Bilal, 2007) but also illustrate the important role of emotions in human-computer interaction. Information retrieval entails complicated cognitive processes, composed of human cognitive processes as well as human physiological and neurological reactions (Picard, 2001). However, our understanding of how emotions affect information retrieval is limited (Nahl & Bilal, 2007), as is our understanding of the effects of physiological and neurological responses on information retrieval, more specifically on web search performance.

Hence, for us to design better search engines, we need to understand both human-computer interaction and brain-computer interaction processes, such that the two are not treated separately.

Keywords: affective information retrieval, affective search, neuro-information science, web search performance, affective information behavior, EEG in information retrieval, emotional design, brain computer interaction


German Researchers Build A Plane Controlled By Your Brain

My dream of further research on mind-controlled computing materializes as I see research results like these. A great read!


My Doctoral Confirmation of Candidature

Two years into the program, and I am having my confirmation of candidature seminar on August 7, 2013!! I am both excited and anxious to be defending my confirmation before the School of Information Systems panel and advisory. Here is the abstract, if interested…

 

Keywords: affective information retrieval, affective search, neuro-information science, web search performance, affective information behavior, EEG in information retrieval, emotional design

ABSTRACT

In the past decade, the affective component of information retrieval system design has increasingly become an essential part of research in information retrieval. Expressions such as “pleasurable engineering” or “emotional design” have become the driving factors in information design, where these expressions have also been extended to information retrieval system design (Nahl & Bilal, 2007). These emerging expressions indicate the important role of emotions in human-computer-interaction.

Information retrieval processes entail complicated cognitive processes. These sophisticated processes are composed of not only human cognitive processes but also human emotional responses (Picard, 2001), where these responses entail physiological as well as neurological reactions. In order to understand the role of affective responses in information retrieval, more specifically within the search process, researchers need to investigate these interactions from multiple perspectives (Scherer, 2005). However, our understanding of how emotions affect information retrieval, as revealed in search performance, is limited (Nahl & Bilal, 2007).

There is a gap in the current body of knowledge on the effect of physiological and neurological emotion responses on information retrieval, more specifically on web search processes and performance. This research aims to examine the causal relationship, if any, between dimensions of human emotions and web search performance. Specifically, I intend to contribute to affective web search studies by applying emerging, cutting-edge research technologies in the field of neuro-information science (Gwizdka, 2012)—such as electroencephalography (EEG)—thereby increasing our understanding of affective search and improving information systems design practices. By addressing this gap, I intend to make a significant contribution to the specific fields of affective search and neuro-information science.


Quantitative Measurements in EEG

In an attempt to investigate the data validity of one of the methods I use in my PhD dissertation in information science, it has become apparent to me that a lack of methodological standards in conducting EEG (electroencephalography) studies is the largest contributor to its biased reputation in this field. Here, as an advocate of the EEG methodology in the field of information science, I briefly present an overview and argue how this method may be used as a standardized method in the field, as well as in examining various dimensions of cognitive processes.

Recording Brain’s Electrical Signals – EEG

Raw EEG

Let me first give a quick overview of the brain's electrical signals as recorded by EEG and the quantitative aspect of this method. Devices that collect EEG signals actually collect the voltage fluctuations arising from changes in the ionic current flow of the brain's neurons (Niedermeyer & da Silva, 2004). These fluctuations are divided into different wave patterns depending on frequency: delta, theta, alpha, low beta, midrange beta, high beta, and gamma waves. Each of these brainwave types has its own specific frequency range, between roughly 0 and 100 Hz.
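The band boundaries can be expressed as a simple lookup. The cutoffs below are commonly cited approximate values; exact boundaries vary between sources, so treat them as illustrative rather than authoritative.

```python
def brainwave_band(freq_hz):
    """Classify an EEG frequency (Hz) into the conventional bands.

    Cutoffs are approximate; different references draw the lines
    slightly differently (e.g. alpha is sometimes 8-12 Hz).
    """
    if freq_hz < 4:
        return "delta"          # deep sleep
    if freq_hz < 8:
        return "theta"          # drowsiness, meditation
    if freq_hz < 13:
        return "alpha"          # relaxed, eyes closed
    if freq_hz < 16:
        return "low beta"       # concentration, active thinking
    if freq_hz < 21:
        return "midrange beta"
    if freq_hz < 30:
        return "high beta"      # agitation, high arousal
    return "gamma"              # cognitive/motor functions

print(brainwave_band(10))  # a relaxed, alpha-range frequency
print(brainwave_band(40))  # a gamma-range frequency
```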

The International 10-20 EEG System

10-20 System

The 10-20 system is a well-recognized technique that indicates specific and standardized locations of the scalp for EEG types of experiments (Niedermeyer & da Silva, 2004). The standardized locations on the scalp suggest areas on the scalp where the EEG electrodes can be set. These locations correlate with specific areas on the neocortex.
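The labeling convention of the 10-20 system is itself systematic: the letter(s) name the scalp region (Fp pre-frontal, F frontal, C central, P parietal, O occipital, T temporal), odd digits mark the left hemisphere, even digits the right, and 'z' the midline. The small decoder below illustrates that convention; the function name is my own.

```python
REGION = {"Fp": "pre-frontal", "F": "frontal", "C": "central",
          "P": "parietal", "O": "occipital", "T": "temporal"}

def describe_electrode(label):
    """Decode a 10-20 electrode label such as 'F3' or 'Cz'."""
    letters = label.rstrip("z0123456789")   # region code
    suffix = label[len(letters):]           # digit or 'z'
    if suffix == "z":
        side = "midline"
    elif int(suffix) % 2 == 1:
        side = "left hemisphere"
    else:
        side = "right hemisphere"
    return f"{REGION[letters]}, {side}"

print(describe_electrode("F3"))   # frontal site, left hemisphere
print(describe_electrode("Cz"))   # central site on the midline
```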

Fast Fourier Transformation (FFT)

EEG signals may be decomposed through a series of mathematical functions and filters. Jean Baptiste Fourier (1768-1830), by developing frequency analysis, laid the groundwork for algorithms such as the FFT, which converts signals between the time and frequency domains. One of the most widely used quantitative measures of EEG signals is FFT analysis. Contemporary EEG devices, such as the Emotiv EPOC neuroheadset, rely heavily on the FFT to 'translate' brain signals into the wave lines displayed digitally on screen.
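The FFT step can be demonstrated on a synthetic signal. Below, a toy 'EEG' trace is built from a strong 10 Hz (alpha-range) component plus a weaker 20 Hz (beta-range) component, and the FFT recovers the dominant frequency; the sampling rate and amplitudes are arbitrary choices for the example.

```python
import numpy as np

# A 2-second synthetic signal sampled at 128 Hz: a 10 Hz alpha-range
# component plus a weaker 20 Hz beta-range component.
fs = 128
t = np.arange(0, 2, 1 / fs)
signal = np.sin(2 * np.pi * 10 * t) + 0.3 * np.sin(2 * np.pi * 20 * t)

# FFT: decompose the time-domain signal into frequency components
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
power = np.abs(spectrum) ** 2

dominant = freqs[np.argmax(power)]
print(dominant)  # -> 10.0 (the alpha-range component dominates)
```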

I have covered these topics in great detail in the confirmation part of my dissertation, which I hope to publish soon!


Observer Methodology

Observer methodologies used in the field of information retrieval focus mainly on user behavior and search performance. This is part of a paper I am currently working on…

———————————————–

Facial Expression

Facial expression is one of the main ways in which humans communicate emotional states to one another (Ekman & Friesen, 1975). In 1999, Paul Ekman developed a classification system that recognized the established six basic emotions (fear, happiness, surprise, anger, sadness, and disgust) of the Darwinian discrete approach (Darwin, 1872). In addition, Ekman contributed to developing the Facial Action Coding System (FACS), which extracts various facial cues that successfully help analyze emotional states. These facial expression analyses consist of monitoring facial skin movements as well as changes in facial features, such as the eyebrows.

Most recently, facial expression reading has become a popular methodology in web search behavior studies. Arapakis (2009), for example, applied automatic facial analysis to examine the relationship between behavior and task difficulty in online search. Lopatovska (2009) likewise utilized facial analysis to study user responses during Google searches.

Although FACS is said to have high reading validity, non-obtrusiveness, and high accuracy (Cohn & Kanade, 2006), it does have its own limitations. Since FACS focuses solely on facial expression, it fails to take into account human body movements, which also play a big role in communication (Fasel & Luettin, 2003). In addition, FACS is unable to include the context in which user reactions occur (Pantic & Rothkrantz, 2003).

Body Movement

There are many debates as to whether studying body movements is a valid indicator of human emotions and behavior. However, some studies have been able to show a clear connection between certain behaviors and certain body gestures (Boone & Cunningham, 1998). These findings relate directly to the Darwinian theory, in which some body movements correspond to specific emotional states (Wallbott, 1998). Furthermore, Gunes and Piccardi (2007) identify six facial and body gestures, connecting them with various emotions.

More specifically, researchers have monitored hand movements in the hope of establishing correlations between emotions and these movements. In these studies, researchers either study users' glove movements or have computer programs observe the movements. While glove-based studies analyze hand gestures using model-based techniques, the vision-based approach observes hands in images using appearance-based techniques (Chen et al., 2003).

Some of the major limitations of this approach lie in the difficulty of analyzing hand movements in an unbiased and controlled environment (McAllister et al., 2002).

Verbal Communication

A considerable number of studies on verbal communication have led to a number of models of vocal communication. The lens model of perception, developed by Brunswik in the mid-20th century, is said to help explain the transfer of a speaker's emotional state through certain acoustic characteristics of their voice (Miroff, 1977). These characteristics are usually the pitch, speech rate, and intensity of the voice (Pantic & Rothkrantz, 2003).

For example, over-stressing certain words communicates variations in pitch and intensity (Cowie et al., 2001). Anger tends to increase blood pressure, which in turn increases the intensity of speech (Breazeal, 2001). Voice assessment systems and programs also carry their own limitations: data quality and capturing noise-free data, limited voice classification, and the context-independency of the audio, to name a few (Jaimes & Sebe, 2007).

Interactive Behavior

Computer log files have increasingly become a popular method in the field of information retrieval for collecting information on user behavior (Kapoor et al., 2007). Data such as the number of visited pages or the time it took to find a specific web resource are among the logs captured and analyzed by researchers, for example to infer user frustration. However, interactive behavior methodology still has a long way to go before it can establish relationships between behavior and search performance.
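A toy example of the kind of measure researchers derive from such logs is 'time to first click': how long a user took from issuing a query to clicking a result, with reformulations in between sometimes read as a sign of difficulty. The log format and the interpretation here are assumptions for illustration, not a standard of the field.

```python
from datetime import datetime

# Toy search-log entries: (timestamp, event). A long gap between the
# first query and the first click, or repeated reformulations, is
# sometimes interpreted as a sign of difficulty or frustration.
log = [
    ("2013-08-07 10:00:00", "query"),
    ("2013-08-07 10:00:45", "query"),   # reformulation
    ("2013-08-07 10:01:30", "click"),
]

def time_to_first_click(entries):
    """Seconds from the first logged event to the first click."""
    fmt = "%Y-%m-%d %H:%M:%S"
    start = datetime.strptime(entries[0][0], fmt)
    for ts, event in entries:
        if event == "click":
            return (datetime.strptime(ts, fmt) - start).total_seconds()
    return None  # the session ended without a click

print(time_to_first_click(log))  # -> 90.0
```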