Processing EEG Data Using EEGLAB in MATLAB

Many times I have been asked how I processed and graphed the EEG data that I collected for my doctoral studies. For my dissertation, I collected the EEG data using the Emotiv neuroheadset and used the EEGLAB open-source software to process and graph it. In this post, I have simplified the steps that I took to process my EEG data. Please note that I taught myself by reading through tutorials, forum discussions, help pages, and much, much more… I am positioning my doctoral work in the field of Neuro Information Science, which is a marriage between neuroscience and information science. By no means do I claim to be a neuroscientist or a medical professional. Hope this helps some of you out there. Happy EEGLABing!

———————————————————————

I used the EEGLAB software, an interactive MATLAB toolbox for processing continuous and event-related EEG data (among other kinds), to analyze the EEG data I had collected for my research experiment. I chose EEGLAB because it has been widely used in academia as well as in professional institutions, helping process complex EEG data while providing a robust graphical user interface for the processed and analyzed EEG data. Moreover, EEGLAB provided several data visualization graphs that helped me greatly in my work to find and establish patterns of brainwaves during each phase of the ISP model.

More specifically, I installed the compiled EEGLAB version for Windows OS. Below are the step-by-step instructions for how I used EEGLAB to process my EEG data:

  1. Open the EEGLAB software.
  2. Go to the ‘File’ menu and click on ‘Import Data’ from the File menu options. Choose ‘From EDF File’.
  3. Locate the EEG data file (saved in EDF format) on the hard drive and click ‘Open’ in order to import it into EEGLAB.
  4. The ‘Load Data Using BIOSIG’ window will open.
  5. In the ‘Channel List’ box, type the numbers 3 through 16, separated by single spaces. This will map the 14 channels of the Emotiv neuroheadset data correctly to the EEGLAB software.
  6. Click ‘Ok’.
  7. Name the file in the field ‘Name It’.
  8. Click ‘Ok’.
  9. Go to the ‘Edit’ menu and click on ‘Channel Location’ from the Edit menu options.
  10. Go to the text editor of the computer, create a channel-locations file, and save it as a CED file. These entries will map the 14 sensor channels of the Emotiv neuroheadset correctly to the EEGLAB software.
  11. Go back to the EEGLAB software and choose the ‘Read Locations’ button. (14location.ced)
  12. Choose the above CED file from your computer and highlight it.
  13. Click ‘Open’.
  14. Choose ‘Autodetect’ from the ‘File Format’ menu.
  15. Click ‘Ok’.
  16. Click ‘Ok’.
  17. Go to the ‘Tools’ menu and click on ‘Remove Baseline’ from the Tools menu options.
  18. Click ‘Ok’.
  19. Go to the ‘Tools’ menu and click on ‘Run ICA’ from the Tools menu options.
  20. Click ‘Ok’.
  21. A progress window will open at this point. Depending on the memory of the computer, this part may be time consuming.
  22. Go to the ‘Plot’ menu and click on ‘Channel Data’ from the Plot menu options.
  23. The plotted brainwaves include outlier data that shows up as irregularities in the waveforms.
  24. Highlight the outliers of the brainwaves. These outliers show as peaks in the brainwaves.
  25. Click on the ‘>>’ button in order to move forward on the screen.
  26. Repeat highlighting until the end of the data.
  27. Click ‘Reject’ in order to delete all the highlighted outlier data.
  28. Name this new data set in the field ‘Save it as File’.
  29. Click ‘Ok’.
  30. Go to the ‘Plot’ menu and click on ‘Channel Spectra and Maps’ from the Plot menu options.
  31. In ‘Frequencies to Plot as Scalp Maps (Hz)’, indicate the desired brainwave frequencies to be graphed and plotted.
  32. Click ‘Ok’.
  33. Depending on the chosen brainwave frequencies, the corresponding graph will be displayed.
  34. Save this plot as a JPG file.
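The screenshot of the original CED channel-locations file is not reproduced in this post. As a sketch (the label list is the standard 14-sensor Emotiv EPOC montage; the minimal two-column layout is my assumption: a CED file can carry only channel numbers and labels, and EEGLAB's channel editor can then look up standard 10-20 coordinates from the labels), the file could be generated like this:

```python
# Sketch: generate a minimal CED channel-locations file for the 14
# Emotiv EPOC sensors. The coordinate columns are deliberately omitted
# (an assumption, not the original file); EEGLAB can fill in standard
# 10-20 coordinates from the labels after loading it.
epoc_channels = ["AF3", "F7", "F3", "FC5", "T7", "P7", "O1",
                 "O2", "P8", "T8", "FC6", "F4", "F8", "AF4"]

def write_ced(path, labels):
    """Write a tab-separated CED file with Number and labels columns."""
    with open(path, "w") as f:
        f.write("Number\tlabels\n")
        for i, label in enumerate(labels, start=1):
            f.write(f"{i}\t{label}\n")

write_ced("14location.ced", epoc_channels)
```

After reading this file in EEGLAB, the looked-up coordinates can be verified visually by plotting the 2-D channel locations from the channel editor.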
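Two of the operations above, baseline removal and outlier rejection, can also be sketched numerically. This is an illustrative NumPy toy, not EEGLAB's implementation (the window length and amplitude threshold are my own assumptions): baseline removal subtracts each channel's mean, and the toy rejection rule drops any time window whose peak amplitude exceeds a threshold, mirroring the manual highlight-and-reject workflow.

```python
import numpy as np

def remove_baseline(data):
    """Subtract each channel's mean so signals are centered at zero."""
    return data - data.mean(axis=1, keepdims=True)

def reject_outlier_windows(data, win_len, threshold):
    """Drop windows whose peak absolute amplitude exceeds the threshold.

    data: (n_channels, n_samples) array; returns the concatenation of
    the windows that were kept.
    """
    n_windows = data.shape[1] // win_len
    kept = []
    for w in range(n_windows):
        window = data[:, w * win_len:(w + 1) * win_len]
        if np.abs(window).max() <= threshold:
            kept.append(window)
    return np.concatenate(kept, axis=1) if kept else data[:, :0]

# Toy demo: 2 channels, 4 windows of 128 samples, one window spiked.
rng = np.random.default_rng(0)
eeg = rng.normal(0, 1, size=(2, 512)) + 40.0   # constant 40 uV offset
eeg[0, 200] = 500.0                            # artifact spike in window 1
clean = reject_outlier_windows(remove_baseline(eeg), 128, 100.0)
print(clean.shape)  # 3 of 4 windows kept -> (2, 384)
```

In practice EEGLAB's rejection tools offer more principled criteria (probability, kurtosis, spectral thresholds) than this single peak-amplitude rule.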

Mapping the Affective Brain Activities of the Information Search Process Model

While quite challenging, it has been exciting to work towards positioning my thesis in the (new) field of Neuro Information Science, a marriage between neuroscience and information science. One of my main undertakings with this research is to map the affective and the neural patterns of the information search processes. To my knowledge, this would be the first attempt in the field.

In the field of information science, the affective component of information retrieval system design is increasingly becoming part of the design processes and design roadmaps. In addition, Artificial Neural Networks strive to model the human brain’s biological structure. These system designs and computations strive to understand and model human decision-making processes and aim to estimate a wide range of computational functions based on large sets of data inputs.

It is worth noting that artificial neural networks, while quite sophisticated in computing and recognizing patterns, at the moment primarily receive their input from digital data sets, such as pixel and binary formats. However, the human brain also entails emotional cognitive processes. Hence, it is essential to recognize that, if we are to mimic the human brain, we need to also add human emotions – one of the main components of the human cognitive processes – to the equation.

In order to do this, we first need to map the affective and neural patterns, in this case, of the information search processes. For these reasons, I decided to map and establish the neural patterns of the information search process during different affective states.

I propose that adding additional data inputs of human emotions may improve not only information system designs but also the design of the artificial neural networks.

One way to read these affective neural activities is to gather user brainwaves via wearable devices and to use these as additional data input into information system designs. However, in order to do this, we need to know how to input these affective neural types of data. This doctoral research sets the foundation for continued investigation of the ways in which to design ‘smart’ information systems that learn and improve information retrieval results based on user affective neuro feedback. By developing information search systems that become an extension of the brain via neuro wearable devices, we may be able to add human emotion readings as additional data input when developing information search systems as well as artificial neural networks. I call this the Smart Affective Search.

In order to map the affective and neural patterns of the information search processes, using Electroencephalogram (EEG) devices as one of my methods, I measured user electrical brain activities during information search processes and during different affective states. Next, I give an overview of the Information Search Process model, the underlying theoretical framework used for this study, and why it is important.

Information Search Process Model (ISP)

Kuhlthau (1991) was the first to successfully model the information-seeking phases of users. She established her findings as the Information Search Process (ISP) model. The ISP model attempts to define the various steps of the information search process in terms of the affective, cognitive, and physical realms. Kuhlthau (2004) developed the six steps of the ISP model as listed below:

  1. Initiation: when a person first becomes aware of a lack of knowledge or understanding and feelings of uncertainty and apprehension are common.
  2. Selection: when a general area, topic, or problem is identified and initial uncertainty often gives way to a brief sense of optimism and a readiness to begin the search.
  3. Exploration: when inconsistent, incompatible information is encountered, uncertainty, confusion, and doubt frequently increase, and people find themselves “in the dip” of confidence.
  4. Formulation: when a focused perspective is formed and uncertainty diminishes as confidence begins to increase.
  5. Collection: when information pertinent to the focused perspective is gathered and uncertainty subsides as interest and involvement deepens.
  6. Presentation: when the search is completed with a new understanding enabling the person to explain his or her learning to others or in some way put the learning to use.

While the ISP model is well-established and widely used in the field of information science, to my knowledge, the affective neurological patterns of this model have never been investigated nor established. In order to map the affective and neural patterns of the information search processes, I gathered extensive EEG data on the electrical activities of the various stages of the information search process during different affective states. As a result, and after months of data analysis, finally and excitingly, I was able to create heat maps (see the thumbnail of this post!) of the affective neural activities during specific stages of the ISP model! In my next posts, I will go into further details.


Paper Presentation at the Smart Data Conference 2015


I am honored to be presenting my (in-progress) doctoral work at the upcoming Smart Data Conference 2015, presented by DataVersity, a provider of high-quality educational resources for business and information technology professionals on the uses and management of data. The Smart Data Conference is designed to accommodate all levels of technical understanding. It will bring together emerging disciplines that are focused on more intelligent information gathering and analysis. So, for those interested, I thought I would share part of this paper here. And if you are planning on attending the conference, please attend my presentation on: Wednesday, August 19, 2015 – 04:45 PM – 05:30 PM at the San Jose Convention Center – 150 West San Carlos Street, San Jose, CA 95113 USA


Affective Search: How Does Affect Impact Web Search Performance? Towards Smart Emotional Neuro Search Engines: An Extension of the Human Brain

Information retrieval processes entail complicated cognitive processes, which also comprise human emotion responses (Picard, 2001). These entail physiological and neurological reactions. In order to understand the role of affective responses in information retrieval, more specifically within the search process, researchers need to investigate these interactions from multiple perspectives (Scherer, 2005). However, our understanding of how emotions affect information retrieval, as revealed in search performance, is limited (Nahl & Bilal, 2007).

There is a gap in the body of knowledge on the effects of physiological and neurological responses on information retrieval, more specifically on web search performance. My doctoral research aims to examine cognitive relationships between dimensions of human emotions and information retrieval, as in search performance. My aim is to increase our understanding in regards to affective search, improving information systems design practices, and investigating ways to design ‘smart’ information systems that learn and improve search results based on neuro feedback. This pilot study examined the neurological relationship between dimensions of emotions and web search performance by applying emerging and cutting edge research technologies, such as electroencephalography (EEG), thereby increasing our understanding of affective search and improving information systems design practices.

This research topic will be a beneficial addition to the current body of knowledge in the field of Neuro Information Science. We need to increase our body of knowledge and strive to understand how human affective responses impact human-computer interaction. This, in turn, will help us design smarter information retrieval systems. Most recently, Artificial Neural Networks, the complex adaptive deep learning systems (a step beyond machine learning) that use statistical learning algorithms, increasingly strive to model the human brain’s biological neuron networks and architecture. These computations, although artificial, strive to model human decision-making processes and aim to estimate a wide range of computational functions based on large sets of data inputs. It is worth noting that artificial neural networks, while quite sophisticated in computing and recognizing patterns, at the moment primarily receive their input from digital data types, such as pixel and binary formats. These artificial neural networks are codes that aim to simulate the way in which the human brain learns, more specifically in recognizing patterns or creating memories. The codes are organized in layers in order for the systems to learn to understand various data inputs. While artificial neural networks are still in their infancy, it is essential to recognize that, to this day and to my knowledge, they are based solely on digital data input. System programmers and architects fail to approach these efforts based on a holistic view of the human brain. In other words, the main component of emotion is missing from this equation. I propose that adding one additional data input of human emotion may improve these artificial neural networks. One of the main contributions of this research paper is my proposal to the scholars of Artificial Intelligence to include human emotion readings via wearable computing devices as an additional data input for their statistical learning algorithms when creating these artificial neural networks.

There is a gap in the current body of knowledge on the effects of physiological and neurological emotion responses in information retrieval, more specifically on web search. This pilot study aimed to examine the effect of different dimensions of emotions on web search performance, as revealed in search efficiency and search effectiveness.

In this session we will cover:

  • Q1: How do dimensions of emotions affect search effectiveness?
  • Q2: How do dimensions of emotions affect search efficiency?

Nilo Sarraf (@nilosarraf) is pursuing her PhD in Neuro Information Science in the SJSU School of Information Gateway PhD program, in partnership with Queensland University of Technology. Her doctoral research aims to examine cognitive relationships between dimensions of emotions and web search performance. Most recently, she was the first to propose “Smart Affective (Neuro)” engines that, through wearable computing, read human emotions in order to improve search results based on the neurological feedback they receive from the user’s brain waves.


The Heart Drum

Publishing one of my old free-writings as I’m picking back up my fictional writing. Note: This work has nothing to do with my dissertation.


I beat and you’re confused. I drum the beats and your superiors are doubtfulness, laziness. I beat and drum and even more drum. My rhythm of quakes is forever and steadfast. It knows not the foolish playfulness. It mixes not inside the clay of the desires, shapes of vacuum of hopes. In the playground of the puppets, the whirling sounds and waves of the drums remain distanced, unfamiliar.

The ropes to the dangling rides cannot hold much longer. What will they do when the rope breaks? Gravity takes over… and they will fall. No doubt there. How will they refrain from catching the cold of the clay onto their celestial bodies and minds? The clay, the sands and the waters that is, is waiting. Just the glimpse of their fall is amusing enough… it is worth the wait.

The ropes of the dangling rides are pulled and squeezed; the actions and the reactions to them. How much more pull can you take? Where is the point of the breakdown? Is it not soon? Will it not happen any moment? How about the law of the probabilities? The kings of the playgrounds know this rule and fear the fall, and the splash, and the covered up bodies in the clay. Quietly working in the background, they know the rules and they follow it; hsssshhh… make no sound, remain invisible, stay still… and wait. They will fall any moment, this is their promise to the muds. Their consensual agreement is laudable. The prey and the hunter are both one and the same.

The rulers, the kings, once were bodies too, you know. They, too, joyed the playground. The rope and the dangling rides were their only companions too, you see. At the end? They fell. They fell down into the mud. The bodies of the minds mingled with the clays and became one. Nowhere else to go… and they had nothing more to say and no more games to play. Their world became the rules of the playground. The only mission turned into the darkest one: to suck as many bodies as we can aim.

The heartbeat, the kind one, the one with tears of the eyes and the compassions of the chests could not watch. The pain got unbearable. Shall I let go of you? I can’t watch the tearing of the blood veins that go right through me. My life is their life. So, how should I let go? I love you too much. You should know: the world of the muds is not where you were destined to be.

The creation of your veins and minds was for the greater purpose than playgrounds and clays. So, how dare you to obey the invitations of the pots? The fiesta of promises is empty with no base. So, stay with me, get nourished in me, sing to me…. keep still… let me hold you, hug you, kiss you. Feel the rush of the love-cells in your veins and see your blood turn into wine… and then hold your cup and just drink. Let me love you, hold you, forever and ever, my only love.


A Position Paper at the iConference 2015

I am excited to have been invited to present a position paper on my doctoral work at the upcoming iConference, presented by the iSchools association. The conference focuses on research work by information science scholars worldwide and will be hosted by The Donald Bren School of Information and Computer Sciences at the University of California, Irvine. So, for those interested, I thought of sharing part of this paper here. And if you are planning on attending the conference, please come by the conference workshop!



1. Introduction

The affective component of information retrieval system design is becoming increasingly essential within the field of information retrieval, as it encompasses complicated human cognitive processes. Cognitive processes include not only mental processes but also emotion (or affective) processes and responses (Picard, 2001). Therefore, it is important to move toward understanding the affective dimension of information retrieval. The goal of this study is to explore and examine the neurological affective components of information retrieval systems, more specifically in web search processes and search performance.

Over the past decade, information retrieval research studies have evolved and become increasingly sophisticated. The system-oriented approach was one of the first types of studies, where information retrieval systems were the focal point of attention. However, researchers began to realize that not only do we need to examine machines but we also need to study user interaction with the systems. This, in turn, led to the user-oriented approach. Shortly thereafter, researchers began to detect sophisticated cognitive processes when dealing with information retrieval systems. As a result, studies began to turn to cognitive-oriented approaches.

Most recently, research communities are detecting how human emotions may play a significant role in human-computer-interaction. Expressions such as “pleasurable engineering” or “emotional design” have become the driving factors in system design, and these expressions have also been extended to information retrieval system design (Nahl & Bilal, 2007). These emerging factors and expressions indicate the important role of emotions in human-computer-interaction, highlighting the importance of including the affective dimensions when designing information retrieval systems. However, our understanding of how emotions affect search processes, as revealed in search performance—search effectiveness and search efficiency—is limited (Nahl & Bilal, 2007).

As a result, the emotion-oriented approach has risen to the surface, making researchers realize the potential effects of affective dimensions on user information retrieval processes. More specifically, researchers are increasingly exploring the neurological aspects of cognitive and emotion responses. This research intends to contribute to the emotion-oriented approach studies, aiming to add value to the evolution of information retrieval research approaches by further exploring neuro-information science.

2. Problem Statement and Research Questions

There is a gap in the current body of knowledge on the effects of physiological and neurological emotion responses in information retrieval, more specifically on web search. This pilot study aimed to examine the effect of different dimensions of emotions on web search performance, as revealed in search efficiency and search effectiveness.

  • Q1: How do dimensions of emotions affect search effectiveness?
  • Q2: How do dimensions of emotions affect search efficiency?
  • Q3: Are there any interactional effects between dimensions of emotions and search performance?

The hypothesis is that positive emotional states have positive effects on information retrieval and negative emotional states affect users’ web search performance negatively.

3. Industry Implications

In an era when humans are creating brain-controlled airplanes, neuro-gaming, and robots that learn behavior by reading human emotions, there appear to be no limits in having search engines read human emotions in order to improve search results based on the neurological feedback they receive from brain waves. Thanks to technologies such as Interaxon and Emotiv, EEG devices have been made readily available to researchers interested in neuro-related studies who otherwise would not have had access to expensive fMRI machines. Although EEG and fMRI measure different aspects of the brain, EEG devices nonetheless help researchers conduct neuro-related studies.

I envision my dissertation adding to the body of knowledge of Neuro Information Science by helping develop search engines that use wearable computing devices to read brain waves and dimensions of emotions in order to improve search results based on the neurological feedback the search engines receive. In other words, search engines become an extension of the human brain by receiving brain waves that constantly provide neurological feedback on the search results that they provide. All the while, the search engine reads brain waves by receiving the brain signals through wearable computing devices. Gradually, the search engine may ‘learn to improve’ its results based on, for example, the alpha (calm) brain waves received.

4. Conclusion

This research topic will be a beneficial addition to the current body of knowledge in the field of Neuro Information Science. We need to increase our body of knowledge and strive to understand how human affective responses impact human-computer-interaction. This, in turn, will help us design smarter information retrieval systems.

Most recently, Artificial Neural Networks, the complex adaptive deep learning systems (a step beyond machine learning) that use statistical learning algorithms, increasingly strive to model the human brain’s biological neuron networks and architecture. These computations, although artificial, strive to model human decision-making processes and aim to estimate a wide range of computational functions based on large sets of data inputs. It is worth noting that artificial neural networks, while quite sophisticated in computing and recognizing patterns, at the moment primarily receive their input from digital data types, such as pixel and binary formats. These artificial neural networks are codes that aim to simulate the way in which the human brain learns, more specifically in recognizing patterns or creating memories. The codes are organized in layers in order for the systems to learn to understand various data inputs.

While artificial neural networks are still in their infancy, it is essential to recognize that, to this day and to my knowledge, they are based solely on digital data input. System programmers and architects fail to approach these efforts based on a holistic view of the human brain. In other words, the main component of emotion is missing from this equation. I propose that adding one additional data input of human emotion may improve these artificial neural networks. One of the main contributions of this research paper is my proposal to the scholars of Artificial Intelligence to include human emotion readings via wearable computing devices as an additional data input for their statistical learning algorithms when creating these artificial neural networks.


Extracting & Analyzing EEG Data: Smart Neuro (Affective) Search

In my last posts, I wrote a summary of my (in-progress) dissertation and my proposal to the industry for developing search engines that, through wearable computing devices (such as Emotiv or Interaxon), would scan brain waves, compute the dimensions of emotions, and feed the results back into the search engines in order to improve search results based on the positive/negative neurological feedback received, a sort of machine-learning process.

I call this the Smart Neuro (Affective) Search engine, which would essentially be an extension of the human brain onto search engines. In this post, I intend to provide an introduction to some aspects of these EEG signals and some of the data analysis involved.

In recognizing the dimensions of emotions through EEG devices (i.e., how these may correlate with high- and low-frequency brain waves), studies overall appear to suggest that valence states may be measured through alpha asymmetry (present in states of low cognitive load) on the frontal lobes. On the other hand, a good indicator of arousal states appears to be the beta-to-alpha power ratio (present in states of high cognitive load) on the frontal lobes.
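As a sketch of these two indicators: the formulas below are common conventions in the affective-computing literature (log-ratio of right to left frontal alpha power for valence, beta/alpha power ratio for arousal), assumed here rather than taken from any specific device SDK, and the band-power numbers are hypothetical.

```python
import math

def alpha_asymmetry(alpha_left, alpha_right):
    """Frontal alpha asymmetry: ln(right) - ln(left) alpha power.
    Higher values are often read as more positive valence."""
    return math.log(alpha_right) - math.log(alpha_left)

def arousal_index(beta_power, alpha_power):
    """Beta-to-alpha power ratio on the frontal channels."""
    return beta_power / alpha_power

# Hypothetical band powers (uV^2) for frontal channels F3 (left), F4 (right)
valence = alpha_asymmetry(alpha_left=4.0, alpha_right=6.0)
arousal = arousal_index(beta_power=9.0, alpha_power=4.5)
print(round(valence, 3), arousal)  # 0.405 2.0
```

The log-ratio form is convenient because a perfectly symmetric pair of channels yields exactly zero, with the sign indicating which hemisphere dominates.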

For the purpose of my own research, I use the Emotiv neuroheadset. The very first thing that is needed is the raw EEG data. This data usually is not clean, and some preprocessing steps are needed. These preprocessing steps include applying high-pass and low-pass filters. A high-pass filter helps remove the low frequencies, and a low-pass filter helps remove the high-frequency brain waves, such as gamma waves.

Once the signals are preprocessed, we need to divide them into chunks of time in order to extract features from each of these pieces. MATLAB has many functions for filtering these signals, where one can set band-pass filters (e.g., alpha waves are between 8 Hz and 12 Hz).

It is also worth remembering that each of the sensor channels of the EEG neuroheadset collects a spectrum of brain waves. These spectra vary among the different types of brain waves. For example, one single channel may present different brain wave frequencies, each differing over time. Therefore, in order to extract these frequency bands, the spectrum should be computed through the Fourier transform.

To sum up, careful data processing and analysis must be done when collecting raw EEG data sets. Using the raw EEG data requires us to 1) pass each channel through a high-pass filter, 2) perform a Fourier transform on the data, and 3) filter for a key frequency band.
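Those three steps can be sketched on a synthetic single-channel signal. This is a NumPy/SciPy illustration under assumed parameters (128 Hz sampling, which matches the Emotiv EPOC; a 1 Hz high-pass; the 8-12 Hz alpha band), not the exact MATLAB code used in the study:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 128  # sampling rate in Hz (the Emotiv EPOC samples at 128 Hz)

def highpass(signal, cutoff=1.0, fs=FS, order=4):
    """Step 1: remove slow drift below the cutoff frequency."""
    b, a = butter(order, cutoff, btype="highpass", fs=fs)
    return filtfilt(b, a, signal)

def band_power(signal, band, fs=FS):
    """Steps 2-3: FFT the signal and sum power inside one frequency band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return power[mask].sum()

# Synthetic channel: a 10 Hz (alpha) tone, a weaker 25 Hz (beta) tone,
# and a constant offset that the high-pass filter should remove.
t = np.arange(0, 4, 1.0 / FS)
channel = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 25 * t) + 3.0

filtered = highpass(channel)
alpha = band_power(filtered, (8.0, 12.0))
beta = band_power(filtered, (13.0, 30.0))
print(alpha > beta)  # the stronger 10 Hz tone dominates -> True
```

Real recordings would add windowing (e.g., Welch's method) and per-epoch averaging on top of this, but the filter-then-transform-then-band-select structure is the same.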

In my future posts, I will cover additional details, as this data analysis can get quite complicated.

Read more about the exciting and emerging wearable computing and AI devices here: brain-controlled airplanes, neurogaming, robots that learn behavior, and MindRDR.


Challenges & Limitations: Smart Affective Neuro Search

In my previous post, Smart Affective Neuro Search, I partially discussed my (in-progress) dissertation as well as my somewhat unconventional proposal regarding the implications this model may have in the industry. Here, I intend to examine and discuss some of the challenges and limitations of this model.

In my pilot study, I was able to test my research design along with its proposed methods and measurement techniques. One of these techniques included the Observer Methods, more specifically body movements.

Shortly, I will discuss some of the challenges and limitations of my model. But before that, let’s review the different views when it comes to observing and measuring body movements.

Body Movement – Observer Methods

Some researchers debate whether body movements are valid indicators of human emotions. However, many studies include strong evidence that associates body movements with specific emotions (Wallbott, 1988; de Meijer, 2005). Boone and Cunningham (1998) were also able to show a connection between certain emotions and certain body movements. These findings pertain to the Darwinian theory of discrete emotions, where some body movements directly relate to specific emotional states (Wallbott, 1998). Furthermore, Gunes and Piccardi (2007) identified six facial and body gestures, connecting them with various emotions (see table below).


Table. List of Bodily Emotions (Gunes & Piccardi, 2007)

 

This table suggests that certain emotions may be assessed through certain body movements. For example, the table suggests that hands resting on the waist or made into fists appear to be indicators of anger in an individual. Moreover, the table indicates that signs of anxiety in participants may be detected by observing the location of their hands on the table surface.

Moreover, researchers have been monitoring hand movements in hope of establishing correlations between emotions and these hand movements. In these studies, researchers either study glove movements of the users or they have computer programs observe the movements. While glove-based studies analyze the hand gestures using model-based techniques, the computer-based approach observes hands in images using appearance-based techniques (Chen et al., 2003).

Challenges & Limitations

Although human body movements play a big role in emotional communication (Fasel & Luettin, 2003), the complex motor movements involved may contribute a major amount of ‘noise’ to the readings of brain electrical signals. The current EEG devices on the market today are partly designed to include noise suppression. However, they still may not be able to fully suppress all the ‘noise’ emanating from major body movements, such as head or hand movements while participants conduct search tasks on various computer devices.

These, I believe, are some of the challenges and limitations of this proposed model for developing Affective Neuro Search. It is my intention to address these challenges and limitations through continuous research.

In future posts, and as I progress in my dissertation, I will discuss proposed ways in which raw EEG data may be best analyzed for the purpose of Affective Neuro Search and such…

 

Please also see these exciting and emerging wearable computing and AI devices: brain-controlled airplanes, neurogaming, and robots that learn behavior.

