Course Details: A Step-by-Step Comprehensive Program to Collect and Analyze EEG Data with Emotiv EPOC Neuroheadset & EEGLAB MATLAB

The goals of this course include, but are not limited to, providing hands-on learning in setting up, collecting, processing, and visualizing raw EEG data using the Emotiv EPOC neuroheadset series and the EEGLab Matlab compiled open-source software.

Course Name: A Step-by-Step Comprehensive 12-Week Course to Collect and Analyze EEG Data with Emotiv EPOC Neuroheadset & EEGLAB MATLAB Compiled Version

Hands-On Learning

  • How to set up hardware and software.
  • How to collect raw EEG data from human participants.
  • How to process the raw EEG data.
  • How to visualize (2D and 3D) the processed EEG data.

Note: Interpreting the visualized EEG data is beyond the scope of this course and will NOT be covered. Interpretation is highly specific to each research project.

A One-Quarter Course Program

  • 12-week course
  • Certificate of Excellence/Completion
  • 1 curriculum per week
  • 12 sections

Week by Week High Level Overview

  1. Week 1: Introduction
  2. Week 2: Hardware setup
  3. Week 3: Software setup
  4. Week 4: Data collection: Importing raw EEG data
  5. Week 5: Data collection: Loading data and EDF file
  6. Week 6: Data collection: 14 EEG channel locations (CED) file (see the scripted loading sketch after this list)
  7. Week 7: Data processing: Running ICA and removing baseline
  8. Week 8: Data processing: Removing artifacts
  9. Week 9: Data visualization: Channel spectra and maps
  10. Week 10: Data visualization: Component maps in 2D
  11. Week 11: Data visualization: Component maps in 3D
  12. Week 12: Laboratory setup best practices
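
For a feel of what the data-collection weeks involve, below is a minimal scripted sketch of the Week 5–6 steps: loading an EDF recording and attaching the 14 EPOC channel locations. The course itself teaches these steps in the EEGLab Matlab compiled GUI; this sketch uses the MNE-Python library as an analogous, script-based alternative, and the file name "epoc_recording.edf" is hypothetical.

```python
# Minimal sketch (not part of the course materials): load an Emotiv EPOC EDF
# recording and attach the 14 channel locations, the scripted equivalent of
# EEGLab's EDF import and CED channel-locations steps.
import mne

# The 14 EPOC electrode labels follow the international 10-20 naming scheme.
epoc_channels = ["AF3", "F7", "F3", "FC5", "T7", "P7", "O1",
                 "O2", "P8", "T8", "FC6", "F4", "F8", "AF4"]

raw = mne.io.read_raw_edf("epoc_recording.edf", preload=True)  # hypothetical file

# Keep only the 14 EEG channels (EPOC exports extra counter/quality columns).
raw.pick([ch for ch in epoc_channels if ch in raw.ch_names])

# Attach standard 10-20 positions, playing the role of the CED file.
raw.set_montage(mne.channels.make_standard_montage("standard_1020"))
print(raw.info)
```

In the EEGLab compiled GUI, the same two steps correspond to importing the EDF file and loading the CED channel-locations file, which Weeks 5 and 6 walk through in detail.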

Recommended: Pre-Course Preparation

While not mandatory, we highly recommend that you complete this pre-course preparation list to the best of your ability. Because this course is quite hands-on, completing this list will help you get the most out of it:

Get Ready to Launch into Your EEG Journey

  • For best results, be prepared to dedicate 3-4 hours per week to the study materials of this course.
  • One lecture video per week that can be accessed and watched throughout the 12-week program. Private pre-recorded YouTube video links will be shared with students of this program only.
  • Optional: A 30-min Live Zoom session per week to help answer questions pertaining to that week’s lecture and material. 
  • Weekly PDFs and other online documents and resources pertaining to that week’s lecture and material.
  • Work in the comfort of your own home and/or self-created remote laboratory.

Detailed Overview of the Weekly Courses 

  • Week 1: Introduction
    • One exclusive lecture video (private) available on YouTube. 
    • PDF documents focused on 1) the physical signals in neuroscience and 2) the non-physical signals in neuroscience (EEG).
    • Online resources on 1) the physical signals in neuroscience and 2) the non-physical signals in neuroscience (EEG).
    • (Optional) One Live Zoom session for Q&A focused on the week’s materials.
  • Week 2: Hardware setup
    • One exclusive lecture video (private) available on YouTube.
    • PDF documents focused on Emotiv EPOC neuroheadset.
    • Online resources on Emotiv EPOC neuroheadset.
    • (Optional) One Live Zoom session for Q&A focused on the week’s materials.
  • Week 3: Software setup
    • One exclusive lecture video (private) available on YouTube, covering the EEGLab Matlab compiled version.
    • PDF documents focused on the EEGLab Matlab compiled version.
    • Online resources on EEGLab Matlab compiled version.
    • (Optional) One Live Zoom session for Q&A focused on the week’s materials.
  • Week 4: Data collection: Importing raw EEG data
    • One exclusive lecture video (private) available on YouTube.
    • PDF documents focused on how to collect and import raw EEG data.
    • Online resources on collecting and importing raw EEG data.
    • (Optional) One Live Zoom session for Q&A focused on the week’s materials.
  • Week 5: Data collection: Loading data and EDF file
    • One exclusive lecture video (private) available on YouTube.
    • PDF documents focused on how to load raw EEG data.
    • Online resources on loading raw EEG data.
    • (Optional) One Live Zoom session for Q&A focused on the week’s materials.
  • Week 6: Data collection: 14 EEG channel locations (CED) file
    • One exclusive lecture video (private) available on YouTube.
    • PDF documents focused on setting up the channel locations in EEGLab.
    • Online resources.
    • (Optional) One Live Zoom session for Q&A focused on the week’s materials.
  • Week 7: Data processing: Running ICA and removing baseline
    • One exclusive lecture video (private) available on YouTube.
    • PDF documents focused on how to run ICA and remove the baseline from a raw EEG dataset (a scripted sketch of these processing steps follows this weekly overview).
    • Online resources on ICA and baseline removal.
    • (Optional) One Live Zoom session for Q&A focused on the week’s materials.
  • Week 8: Data processing: Removing artifacts
    • One exclusive lecture video (private) available on YouTube.
    • PDF documents focused on how to remove artifacts, such as eye blinks.
    • Online resources on EEG artifact removal.
    • (Optional) One Live Zoom session for Q&A focused on the week’s materials.
  • Week 9: Data visualization: Channel spectra and maps
    • One exclusive lecture video (private) available on YouTube.
    • PDF documents focused on EEGLab spectra and maps.
    • Online resources on learning more about spectra and maps.
    • (Optional) One Live Zoom session for Q&A focused on the week’s materials.
  • Week 10: Data visualization: Component maps in 2D
    • One exclusive lecture video (private) available on YouTube.
    • PDF documents focused on how to visualize processed EEG data in 2D visual maps.
    • Online resources on 2D EEGLab visualization.
    • (Optional) One Live Zoom session for Q&A focused on the week’s materials.
  • Week 11: Data visualization: Component maps in 3D
    • One exclusive lecture video (private) available on YouTube.
    • PDF documents focused on how to visualize processed EEG data in 3D visual maps.
    • Online resources on 3D EEGLab visualization.
    • (Optional) One Live Zoom session for Q&A focused on the week’s materials.
  • Week 12: Laboratory setup best practices
    • One exclusive lecture video (private) available on YouTube.
    • PDF documents focused on best practices when it comes to setting up your own laboratory.
    • Online resources on laboratory settings. 
    • (Optional) One Live Zoom session for Q&A focused on the week’s materials.
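
To show how the processing and visualization weeks (7–11) fit together, here is a minimal scripted sketch of the same pipeline. The course teaches these steps in the EEGLab Matlab compiled GUI; this sketch uses MNE-Python as a script-based analogue, the file name is hypothetical (as in the earlier sketch), and the excluded component index is a placeholder that would normally be chosen by visual inspection.

```python
# Minimal sketch (not part of the course materials): ICA, artifact removal,
# and basic visualization with MNE-Python, mirroring Weeks 7-11.
import mne
from mne.preprocessing import ICA

raw = mne.io.read_raw_edf("epoc_recording.edf", preload=True)  # hypothetical file
raw.set_montage(mne.channels.make_standard_montage("standard_1020"),
                on_missing="ignore")

# Band-pass filter; the 1 Hz high-pass also removes DC offset and slow drift,
# playing the role of baseline removal in this scripted version (Week 7).
raw.filter(l_freq=1.0, h_freq=40.0)

# Run ICA and drop artifact components such as eye blinks (Weeks 7-8).
ica = ICA(n_components=14, random_state=97)
ica.fit(raw)
ica.plot_components()            # 2D component scalp maps (Week 10)
ica.exclude = [0]                # hypothetical blink component, chosen by eye
clean = ica.apply(raw.copy())

# Channel spectra and maps (Week 9).
clean.compute_psd(fmax=40.0).plot()
```

Week 11's 3D component head plots are a feature of the EEGLab compiled GUI itself and are not reproduced in this sketch.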

Course Tuition (tax and fees included)

  • TBD

Before registering, please carefully read our Terms and Conditions. By paying the tuition fee, you agree to our Terms and Conditions and to the non-refundable course tuition. Thanks.


Comprehensive Course Offering: A step-by-step 12-week course to learn how to set up, collect, and analyze EEG data with Emotiv EPOC and EEGLAB MATLAB

Earlier this year, a few students asked for this topic to be offered as a course. After working on the details, this course will be ready for launch in mid-August 2023. For those interested, please email your inquiries to: nilo.sarraf@gmail.com. See below for further details.

Welcome to the world of neuroscience and the exciting realm of EEG data analysis! Are you fascinated by the inner workings of the Neural Oscillations of the human brain and eager to explore the depths of its electrical signals? Look no further, because our online course is here to guide you through this captivating journey.

Introducing our comprehensive curriculum designed to provide you with a high-level summary of the vast landscape of neuroscience while also equipping you with specialized skills in setting up, collecting, processing, and visualizing raw EEG data. Whether you’re a student, researcher, or enthusiast, this course offers an invaluable opportunity to dive deep into the realm of neurophysiology and gain practical expertise in EEG analysis.

The program spans 12 weeks, equivalent to one academic quarter, allowing you to immerse yourself in the intricacies of neuroscience and master the art of EEG data manipulation. With a maximum of 6 credits, you can earn valuable academic recognition while building a strong foundation in this cutting-edge field.

Throughout the course, we will focus on using the Emotiv EPOC neuroheadset series, a popular and user-friendly EEG device, along with the powerful EEGLab Matlab compiled open source software. These tools will serve as your gateway to unlocking the mysteries hidden within brainwave patterns, enabling you to extract meaningful insights and visualize the electrical activity of the brain.

Our experienced instructor, a renowned expert in neuroscience and EEG analysis, will guide you step by step, ensuring that you grasp the fundamentals and progress towards advanced techniques. Through a combination of engaging lectures, hands-on practical sessions, and interactive assignments, you will gain the skills necessary to confidently navigate the world of EEG data analysis.

By the end of the course, you will have a comprehensive understanding of the broader field of neuroscience, as well as the ability to set up EEG experiments, collect raw data, process and analyze it using sophisticated algorithms, and visualize your findings in meaningful and impactful ways. These skills will open doors to various opportunities in research, clinical applications, and cognitive enhancement.

Don’t miss out on this exciting opportunity to expand your horizons and delve into the captivating realm of neuroscience and EEG data analysis. Enroll in our online course today and unlock the potential of the human mind!

Course Details

  • Goal
    • By the end of the quarter, depending on the student’s engagement with the curriculum, the student will learn how to build their own EEG lab for under $2,000 USD, using the Emotiv EPOC neuroheadset and the EEGLab Matlab compiled open-source software.
  • Course Duration
    • One Quarter (12 weeks)
    • Each student, upon completion of the course, will receive a Certificate of Excellence of 6 Credits. 
  • Format
    • Weekly lectures and Q&A
      • One lecture per week (30-45-min video will be shared)
      • One Q&A live session per week on Zoom (30-45 min) 
    • Weekly materials
      • Study materials for the week will be shared in the form of Google Docs and/or PDFs per week. 
  • Things needed to attend and complete this course
    • Hardware and software needed to complete the course – Optional (to get the most out of this course, we highly recommend that you have both the necessary hardware and software)
      • Emotiv EPOC neuroheadset – ~$849 USD
      • EEGLab Matlab compiled open source software – FREE
      • This course will walk you through how to install the hardware and the software too.
    • A reasonably fast computer of your choice with sufficient CPU and RAM.
    • A high-speed Internet connection of your choice.
    • A Zoom account – FREE
    • Availability to meet during US Pacific Time for the live Q&A sessions – Optional if you choose NOT to attend the Q&A
  • Tuition and Fees

Please send inquiries to: nilo.sarraf@gmail.com


Part III: Neuroscience in AGI/ASI – Future-Looking Practical Applications

In Part 1, we discussed how “the increased interest in neural networks, more specifically deep neural networks, and AI has greatly benefited various technological fields, these techniques are also expected to help further advance scientific fields, such as the neuroscience community.” We also covered that, to help simplify the ambiguous and overwhelming marriage between Neuroscience and AI, we should first categorize the space into two overarching areas: Theoretical and Practical applications. In my case, the connection between neuroscience and AI mostly lies between Neural Oscillations (a.k.a. brainwaves) and Neural Networks (ML/DL). Part 1, more specifically, looked at the current and the projected theoretical applications of neuroscience and AI.

In Part 2, we explored the practical applications of Neuroscience and AI. Unlike the research and academic communities, which primarily seek applications of a theoretical nature, such as conducting research to discover, explore, and explain anything and everything, we discussed that the practical approach mainly seeks applications of a practical nature, such as products or services to be marketed and sold. We also noted that the practical approach is not only about tech and/or retail products and services but also includes practical applications in the healthcare industry.

In the previous edition we discussed how the practical applications of Neuroscience and AI can be further categorized into 1) existing applications and 2) future-looking applications. While we already covered the existing applications, in this edition, we are going to explore the future-looking practical applications of Neuroscience and AI. Although it is extremely hard to predict the future, there are still some elements that, to some extent, we have control over, which in turn can help us foresee what is coming.

First, there is no doubt in my mind that technologies will become an extension of the human brain, not only in BCI applications but also in commercial products and offerings. The first time that I openly talked about this was in 2012 during one of my (lightning) talks at the Smart Data Conference in San Jose, CA. In that talk, I illustrated how the brain and various technologies will be able to harness and enhance each other’s power. Years later, the idea of commercial applications of implants in the human body is not as far-fetched as it once was.

While the ideas, the developments, and the production of implants have been in the works over the past few decades, their implementations and/or usage remain ever-evolving. For example, as we covered in the previous edition, BCI has addressed several use cases in the healthcare industry where patients who are unable to speak can write on screens by thinking of letters. But what remains yet to be discovered and developed are 1) easy, non-invasive ways to make the connection between the brain and machines and 2) commercially beneficial use cases that a seamless collaboration between brains and machines may offer.

Second, let us talk about AGI and ASI – at least this is what we call them now. Although still very forward-looking, AGI and ASI hold endless possibilities for future-looking practical applications in the realm of Neuroscience and AI. If you are not familiar with AGI and ASI yet, below are a few highlights of these future-looking hypotheticals:

AGI: Artificial General Intelligence

AGI refers to future AI systems that will be able to learn the intellectual tasks that we humans and/or animals are capable of performing in our daily lives. For example, it is predicted that AGIs of the future will be able to reason, strategize, learn, plan, use common sense, and much more. An AGI is predicted to be quite autonomous, surpassing humans in many tasks.

ASI: Artificial Super Intelligence

The super intelligence of an ASI system is predicted to surpass not only AGI systems but also humans. These hypothetical entities would be far more intelligent than the most intelligent humans: superior systems of some sort, able to learn, predict, and evolve indefinitely.

If you are like me and have been flirting with the AGI and ASI hypotheticals over the past decade, learning about and researching these systems, you may also be convinced that we are most likely headed towards AGI/ASI-type entities within the next few decades. While my own research has primarily been around the possible integrations between Neuroscience and AGI/ASI, I have generally focused on learning about these hypotheses through books, articles, and talks. The fact is that the more you learn, the more you realize the breadth of possibilities that these systems may offer.

There is much to cover about the potential role of Neuroscience in AGI and ASI. As I work through these mind-stretching theoretical and practical concepts, I will share my research findings in my upcoming book.

In the meantime, for those curious about some of the practical ways that we are able, today, to incorporate Neuroscience into AI, this book covers a few of the most common neural networks used in neuroscience, as well as the EEG input data that align best with neural networks, such as frequency-domain EEG signals, time-domain EEG signals, and EEG images:

Neural Oscillations in Neural Networks: Top neural networks that work best with EEG data and top EEG task classifications and data signals that work best with neural networks
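
To make the time-domain versus frequency-domain distinction concrete, here is a minimal, self-contained sketch using synthetic data (it is not an example from the book). It converts a single time-domain EEG-like signal into frequency-domain band-power features, one of the common input formats for neural networks.

```python
# Minimal sketch (synthetic data): from a time-domain signal to
# frequency-domain band-power features via Welch's method.
import numpy as np
from scipy.signal import welch

fs = 128                                    # EPOC's nominal output rate (Hz)
t = np.arange(0, 10, 1 / fs)
# Synthetic single-channel "EEG": a 10 Hz alpha-like rhythm plus noise.
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
band_power = {name: psd[(freqs >= lo) & (freqs < hi)].mean()
              for name, (lo, hi) in bands.items()}
print(band_power)                           # frequency-domain feature vector
```

The raw `signal` array is the time-domain representation; the `band_power` dictionary is a frequency-domain feature vector derived from it.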

Note that in this edition we only covered an eagle-eye view of the future-looking practical applications of Neuroscience and AI. We have yet to look at the details of these future-looking applications.

If you are an aspiring student, there has never been a time like today where we can be anywhere in the world, with pretty much any background, and still be able to get into the field of computational neuroscience. If you have not checked it out, visit Neuromatch Academy and their YouTube channel (I took their 2021 summer course and have no affiliation with them).

What do you think? Can you think of any additional practical applications of Neuroscience and AI? Let me know and I will add to the list. Thanks!

Disclaimer: For over a decade, I have been researching and exploring ways of applying Neuroscience (more specifically Neural Oscillations) in AI/ML/DL, Robotics, etc. All writings are my own, in the old-fashioned way; I do not use ChatGPT or the like to generate content. All views, research, and work are my own. All information provided on the Neuroscience & AI Newsletter is for general information purposes only and is the expressed opinion of myself, Nilo Sarraf, and not others. This includes (but is not limited to) my memberships, organizations, institutions, and/or employers.


Part II: The Existing Practical Applications of Neuroscience & AI

In Part 1, we discussed how “the increased interest in neural networks, more specifically deep neural networks, and AI has greatly benefited various technological fields, these techniques are also expected to help further advance scientific fields, such as the neuroscience community.” 

The previous edition (Part 1) also covered that, to help simplify the ambiguous and overwhelming marriage between Neuroscience and AI, we should first categorize the space into two overarching areas: Theoretical and Practical applications. In my case, the connection between neuroscience and AI mostly lies between Neural Oscillations (a.k.a. brainwaves) and Neural Networks (ML/DL). Please note that neuroscience is a huge field and neural oscillations are one part of it. However, much of what I research and write about can be thought of as applicable to the rest of neuroscience as well.

Furthermore, in Part 1 we looked at the current and projected theoretical applications of neuroscience (more specifically in my niche, neural oscillations/brainwaves) and AI. In this edition, we will explore the practical applications. Unlike the research and the academic communities, which primarily seek applications of theoretical nature, such as conducting research to discover, explore, and explain anything and everything, the practical approach mainly seeks applications of practical nature, such as products or services to be marketed and sold. But the practical approach is not only about tech and/or retail products and services. It also includes practical applications in the healthcare industry. Let us now dive into some of the existing practical applications in the field of neuroscience and neuroimaging in conjunction with AI.

The practical applications of Neuroscience and AI can perhaps be further categorized into two main spaces: the existing applications and the future-looking applications. In turn, the existing applications can be categorized even further into two major areas: 1) Brain Computer Interfaces (BCI) and 2) Neuroimaging. Each of these existing areas includes many sub-areas that have been, and still are, developing rapidly and extensively.

BCI (Brain Computer Interface) is one of the most exciting disciplines when it comes to the practical applications of Neuroscience and AI. These practical applications, both in hardware and in software, are designed to create direct communication between human and/or animal brains and external devices, such as computer software, external hardware, robotic limbs, and much more. While most people are quite unfamiliar with the BCI discipline – unless big names in the industry announce BCI findings as their own unique products – this branch started way back in the 70s and has seen constant and quite impressive developments ever since. To illustrate, BCI brain implants in humans were first done in the 90s and have been researched and developed even further during the past decade.

The collaboration between neuroscience and AI is making BCI applications even more sophisticated. Applying AI (ML/DL) models to EEG data (a.k.a. neural oscillations/brainwaves), as an example, is giving birth to all kinds of practical developments. For example, applying ML models to raw EEG data is helping us invent and develop jaw-dropping practical applications in robotics, cognitive state classification, mental disorder diagnosis, and mind-controlled devices (software and hardware), to name just a few real-world applications. Moreover, BCI applications, in conjunction with neural networks (ML/DL), can be used to monitor the cognitive and affective states of individuals, which can have many practical applications. One of the most important factors to consider when making BCIs more practical is the ability to improve the performance of the system across a variety of subjects. Deep neural networks are a promising candidate for this task.
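
As a hedged illustration of the kind of model mentioned above, the sketch below builds a classical machine-learning pipeline for a two-class cognitive-state problem. The features and labels here are randomly generated stand-ins; with real EEG band-power features and real labels, the same pipeline would apply unchanged.

```python
# Minimal sketch (synthetic data, purely illustrative): a classical ML
# pipeline for two-class cognitive-state classification from EEG features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_features = 200, 14 * 4           # 14 channels x 4 frequency bands
X = rng.normal(size=(n_trials, n_features))  # stand-in for real band powers
y = rng.integers(0, 2, size=n_trials)        # two hypothetical mental states

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```

With random features the accuracy hovers around chance (~0.5), which is exactly what an honest pipeline should report; meaningful scores require real EEG features and labels.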

Furthermore, machine learning and deep learning have been used successfully in various neuroscience-related practical applications, such as assessing cognitive workload, emotion recognition, seizure detection, motor imagery classification, and sleep stage scoring. However, there are still many challenges that need to be solved in order to improve the collaboration between Neuroscience and AI technologies. To learn more about BCI, please refer to the links included in this edition.

Neuroimaging in healthcare is the second existing practical area where Neuroscience and AI are contributing greatly. To illustrate, streamlined diagnostics are perhaps one of the most practical applications of neuroimaging and AI, and one that has been expanding quite a lot in recent years. One of these applications is the ability to automate tasks that are currently performed by human experts. For example, diagnosing epilepsy and/or sleep disorders using neural oscillations (EEG data) is widely regarded as a powerful way to assess the functional status of the brain. Traditionally, these diagnostics have been quite manual: EEG technicians collect, process, and visualize EEG data, which are then shared with physicians for proper diagnosis. However, with the use of neural networks (still in exploration mode), these diagnostic practices have the potential of becoming more and more automated and streamlined. This does not necessarily mean that deep neural networks will replace EEG technicians, but they would certainly help technicians process larger sets of data with more accuracy and in a shorter amount of time.

For example, analyzing raw datasets of EEG, LFP, fMRI, MRI, etc., using specific deep learning techniques, such as CNN, can help increase the level of accuracy and efficiency of data analysis, which may drastically help streamline patients’ diagnostics. To some extent, the neuroimaging technicians and physicians collaborate with neural networks towards the common goal of improving patient care.
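
For readers curious what a CNN applied to EEG can look like in code, here is a minimal, untrained sketch in PyTorch. The architecture and layer sizes are illustrative choices of mine, not a published model and not anything from this newsletter; it only demonstrates the shape of such a network.

```python
# Minimal sketch (untrained, synthetic shapes only): a compact 1-D CNN over
# multichannel EEG epochs. All layer sizes are arbitrary illustrative choices.
import torch
import torch.nn as nn

class TinyEEGNet(nn.Module):
    def __init__(self, n_channels=14, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3),  # temporal filters
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):                    # x: (batch, channels, samples)
        return self.classifier(self.features(x).squeeze(-1))

model = TinyEEGNet()
fake_epochs = torch.randn(8, 14, 256)        # 8 epochs, 14 channels, 2 s at 128 Hz
print(model(fake_epochs).shape)              # -> torch.Size([8, 2])
```

Running the snippet simply prints the output shape (8, 2): one score per class for each of the eight fake epochs. Training on real, labeled EEG epochs is where the diagnostic value discussed above would come from.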

As you can imagine, there are many technical skills that, in the above-mentioned practical application cases, are needed in order to set up and run neural networks in BCIs and Neuroimaging applications. Some of these technical skills are understanding deep learning techniques, such as CNN, knowing Python, understanding how to organize and set up datasets to feed into the neural networks, and much more. A neuroimaging technician, for example, may not have the computer science skills needed. Therefore, as a workaround, a new set of niche careers has emerged, called Computational Neuroscience, a.k.a. Theoretical Neuroscience.

A computational neuroscientist (also see the Neuromatch Academy) uses mathematical models to capture neurological features to help conduct a plethora of scientific research in the field of neuroscience. Since most traditional BCI or Neuroimaging technicians may not have, or even be interested in, the technical skills needed to use deep learning in their workflows and analysis, they may choose to partner with computational neuroscientists.

Talking about the collaboration areas between Neuroscience and Neural Networks, the first time that I ever publicly talked about this was in 2012 at the Smart Data Conference in San Jose, California. At the time, I was roughly two years into exploring this space and had noticed that most of the signals (in terms of data points) given to neural networks were oversimplified, in terms of the neurology and biology, in order to make AI models in neuroscience workable. Gradually, over the years, I watched the neuroscience community develop ways to discover and test the best neuroscience data types available – e.g. the EEG input data that align best with neural networks are frequency-domain EEG signals, time-domain EEG signals, and EEG images – some of which are documented in this book:

Neural Oscillations in Neural Networks: Top neural networks that work best with EEG data and top EEG task classifications and data signals that work best with neural networks

To sum up, one of the most exciting existing practical applications of neuroscience in AI is using deep learning techniques, such as CNNs, in BCI research to invent and create brain-to-computer communication software and hardware applications. Furthermore, in Neuroimaging, AI techniques help technicians and physicians streamline and improve patient diagnostics and healthcare recommendation practices.

Note that in this edition we only covered the existing practical applications of Neuroscience and AI. We have yet to look at the future-looking applications, such as AGI and ASI. We will do that in the next edition of this Newsletter. 

If you are an aspiring student, there has never been a time like today where we can be anywhere in the world, with pretty much any background, and still be able to get into the field of computational neuroscience. If you have not checked it out, visit Neuromatch Academy and their YouTube channel (I took their 2021 summer course and have no affiliation with them).

What do you think? Can you think of any additional practical applications of Neuroscience and AI? Let me know and I will add to the list. Thanks!

Disclaimer: For over a decade, I have been researching and exploring ways of applying Neuroscience (more specifically Neural Oscillations) in AI/ML/DL, Robotics, etc. All writings are my own, in the old-fashioned way; I do not use ChatGPT or the like to generate content. All views, research, and work are my own. All information provided on the Neuroscience & AI Newsletter is for general information purposes only and is the expressed opinion of myself, Nilo Sarraf, and not others. This includes (but is not limited to) my memberships, organizations, institutions, and/or employers.


Part I: Theoretical & Practical Applications of Neuroscience & AI

(This post also shared here on LinkedIn)

While the increased interest in neural networks, more specifically deep neural networks, and AI has greatly benefited various technological fields, these techniques are also expected to help further advance scientific fields, such as the neuroscience community. 

The marriage between Neuroscience and AI, even after decades of research and exploration, is still quite ambiguous and overwhelming. The buzz words thrown around by the media tend to be attention-grabbing titles designed to manipulate human emotions into reactions. But if we stepped back for a minute and looked at this space at an eagle-eye view, we would get a glimpse of the current reality as well as the exciting future possibilities that this union might offer, away from the current dramas. To do that, we need to first categorize the different areas in this space.

The collaboration areas between Neural Oscillations (in neuroscience), a.k.a. brainwaves, and Neural Networks (in AI/ML/DL), can be categorized into two overarching area sets: The Theoretical and the Practical applications. In this edition, let us first look at the current and projected theoretical applications of neuroscience (more specifically in my niche, neural oscillations/brainwaves) and AI.

Unlike industry, which primarily seeks applications of a practical nature, such as a product or a service to be marketed and sold, the academic world traditionally seeks applications of a theoretical nature. For example, in academia we are more interested in embarking on (research) projects that somehow help discover, explore, and explain anything and everything. We have curious scientists in literally any area we can imagine, with researchers who dedicate their lives to studying everything from the smallest to the largest entities in existence.

At their core, researchers and scientists work with sets of data that they gather rigorously through particular research designs. We then look for patterns during data analysis that can be interpreted against set hypotheses and the existing literature in the scientific body of knowledge. It would be safe to say that the key word here is patterns. To oversimplify, ‘looking for patterns’ can be thought of as one of the core elements that makes the case for the unification between neural networks and science.

While ‘looking for patterns’ is oversimplifying things, as I mentioned above, it is the stepping stone for the next set of benefits that neural networks have to offer the theoretical side of the scientific world. The next step, and perhaps the most unique and mind-blowing benefit of neural networks, is their capability to autonomously adapt, learn, and explore datasets of choice. With the correct set of configurations and model training, neural networks can quite accurately automate the rigorous work of data synthesis and data analysis, helping scientists not only analyze data much more efficiently but also interpret the data with increased levels of confidence.

For example, in the case of neural oscillations (brainwaves) in neural networks (AI), there are many theoretical applications that can benefit from the above-mentioned, oversimplified benefit of neural networks, such as deep learning techniques, in analyzing raw datasets of electrical brain activity gathered through EEG and/or LFP. In such cases, the level of accuracy and efficiency that a specific deep learning technique, such as CNN, brings to the forefront of a scientist’s meticulous work of exploring and potentially making scientific discoveries is priceless. Here, the scientist and the neural network become colleagues of some sort, collaborating with each other to help expand our scientific knowledge and discoveries of the brain.

Needless to say, a problem-free collaboration is quite unrealistic. Many components go into a satisfactory setup. Some of these prep-work steps are: a focused literature review, organized datasets, the right choice of neural network technique, the right setup and configuration of the neural network, and much more. But once the setup is right and a couple of satisfactory test runs have been completed, the researcher can safely keep the same setup and repeat the experiment over and over again, if that is the goal of the study.
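
One simple way to keep the same setup and repeat an experiment is to capture the configuration in a small file and fix the random seed. The sketch below is purely illustrative; the field names are hypothetical and not tied to any particular toolkit.

```python
# Minimal sketch (illustrative only): record the experiment setup in a plain
# config file and fix the random seed so the analysis can be repeated exactly.
import json
import random

import numpy as np

config = {
    "model": "cnn",                 # hypothetical choice of network technique
    "seed": 42,
    "epoch_length_s": 2.0,
    "bandpass_hz": [1.0, 40.0],
}

with open("experiment_config.json", "w") as f:
    json.dump(config, f, indent=2)

# Re-running the pipeline from the saved config reproduces the same setup.
random.seed(config["seed"])
np.random.seed(config["seed"])
```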

As you can imagine, there are many technical skills that, in this case, a scientist would need in order to set up and run neural networks. Some of these technical skills are understanding deep learning techniques, such as CNN, knowing Python, understanding how to organize and set up datasets to feed into the neural networks, and much more. A scientist, as you may also imagine, traditionally may not have the computer science skills needed. Therefore, as a workaround, a new niche career has emerged, called Computational Neuroscience, a.k.a. Theoretical Neuroscience.

A computational neuroscientist (also see the Neuromatch Academy) uses mathematical models to capture neurological features to help conduct a plethora of scientific research in the field of neuroscience. Since most traditional neuroscientists may not have, or even be interested in, the technical skills needed to use deep learning in their research design and analysis, they tend to partner with computational neuroscientists on research projects. In these cases, most likely, the neuroscientists have institutional laboratories where they design research, run the studies, and collect raw data from their experiments whereas the computational neuroscientists have access to computational tools needed to code in Python, train models, and run deep learning practices with the datasets given.

The partnership between laboratory neuroscientists and computational neuroscientists has been growing dramatically over the past decade. If goals align, this type of partnership can be extremely beneficial to both parties, where 1) the laboratory neuroscientist provides solid lab-grown data to the computational neuroscientist and 2) the computational neuroscientist provides the technical skills needed to set up and run deep learning techniques, such as CNNs. The win-win, if research goals align, is that each gives the other something the other needs but does not have.

Partnering early on helps both parties not only align on research goals but also determine set rules for the type of datasets needed to make the outcome as beneficial as possible. I say this because having a ‘good’ dataset is usually the number one complaint of a computational neuroscientist. Since a computational neuroscientist is fully dependent on the data and has no control over the datasets given to them, it is of utmost importance to start early on.

However, the reality is such that most computational neuroscientists, especially when they are students or early in their academic careers, do not have the luxury of first-hand access to institutional laboratories. Thankfully, many institutions, such as accredited universities or private labs, publish their raw datasets, making them readily available to the public. However, using these datasets is sometimes a headache for computational neuroscientists, for the data-related reasons mentioned above.

Talking about the raw datasets needed to feed neural networks, the first time that I ever publicly talked about this was in 2012 at the Smart Data Conference in San Jose, California. At the time, I was roughly two years into exploring this space and had noticed that most of the signals (in terms of data points) given to neural networks were oversimplified, in terms of the neurology and biology, in order to make AI models in neuroscience workable. Gradually, over the years, I watched the neuroscience community develop various ways to discover and test the best neuroscience data types available, some of which, related to neural oscillations (brainwaves), are documented in this book:

Neural Oscillations in Neural Networks: Top neural networks that work best with EEG data and top EEG task classifications and data signals that work best with neural networks

To sum up, one of the most important theoretical applications of neuroscience in AI is using deep learning techniques, such as CNNs, in scientific research to help the neuroscience community uncover and expand our knowledge of the brain even further, and most likely faster than ever before. Over the past decades, we have made tremendous developments in this field, but we are still in its infancy and have much more work to do.

If you are an aspiring student, there has never been a time like today where we can be anywhere in the world, with pretty much any background, and still be able to get into the field of computational neuroscience. If you have not checked it out, visit Neuromatch Academy and their YouTube channel (I took their 2021 summer course and have no affiliation with them).

What do you think? Can you think of any additional theoretical applications of neuroscience in AI? Let me know and I will add to the list. Thanks!

Disclaimer: For over a decade, I have been researching and exploring ways of applying Neuroscience (more specifically Neural Oscillations) in AI/ML/DL, Robotics, etc. All writings are my own, in the old-fashioned way; I do not use ChatGPT or the like to generate content. All views, research, and work are my own. All information provided on the Neuroscience & AI Newsletter is for general information purposes only and is the expressed opinion of myself, Nilo Sarraf, and not others. This includes (but is not limited to) my memberships, organizations, institutions, and/or employers.