Disclaimer: All work/ideas shared on my personal social media channels are my own and are independent of any institution and/or industry.
Science is a never-ending exploration and evolution of thoughts and ideas. The first time I publicly talked about exploring opportunities around neural oscillations in neural networks was ~2012. While it is taking longer than anticipated, the exploration continues.
This video offers a high-level discussion of one way in which EEGs (a.k.a. brainwaves/neural oscillations) can be used as a data input to convolutional neural networks to help read sentiments/expressions. I also briefly talk about DeepFace (a “lightweight face recognition and facial attribute analysis – age, gender, emotion and race – framework for python. It is a hybrid face recognition framework wrapping state-of-the-art models: VGG-Face, Google FaceNet, OpenFace, Facebook DeepFace, DeepID, ArcFace, Dlib and SFace.”) to home in on the capabilities of existing neural network models that can read facial images.
While DeepFace can help read facial images, we can now use the same types of models, such as convolutional neural networks, to read brainwave images using the same kinds of architecture. In that sense, we are expanding our capabilities from facial images to brain images. These technologies hold great promise, and if you are a student interested in these topics, I highly recommend exploring this space.
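To make the "same architecture, different input" point concrete, here is a minimal sketch of the core CNN operation, a 2D convolution, applied to two 2D arrays. The arrays (`face_patch`, `eeg_spectrogram`) and the kernel are illustrative placeholders, not real data: the point is that a convolutional layer is agnostic to whether the pixels came from a face photo or a time-frequency image of an EEG signal.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution: the basic building block of a CNN layer."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Element-wise multiply the kernel with each image window and sum
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Two hypothetical inputs: the convolution treats both identically.
face_patch = np.random.rand(8, 8)       # e.g. a grayscale face crop
eeg_spectrogram = np.random.rand(8, 8)  # e.g. an EEG time-frequency image

# A simple vertical-edge-style kernel, purely for illustration
edge_kernel = np.array([[1.0, -1.0],
                        [1.0, -1.0]])

print(conv2d(face_patch, edge_kernel).shape)       # (7, 7)
print(conv2d(eeg_spectrogram, edge_kernel).shape)  # (7, 7)
```

In a real model this operation is stacked with learned kernels, nonlinearities, and pooling (as in the DeepFace-wrapped architectures), but the input-agnostic nature of the convolution is exactly what lets the facial-image tooling carry over to brainwave images.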