I have spent over five months with this Emotiv neuroheadset for my dissertation, and it has been a nightmare ☹. In case anyone else without much expertise ends up working with this apparatus, I'd like to share my experience and some points to watch out for.
First of all, what is Emotiv? (I promise I am not advertising this product 😉) Emotiv makes two types of wireless, noninvasive, consumer-grade electroencephalography (EEG) devices: Emotiv Insight with 5 channels and EPOC+ with 14 channels (updated at the end of November). More information can be found on the official website: https://www.emotiv.com/
Its appeal is intuitively obvious. It is much cheaper than research-lab headsets with dozens or even hundreds of electrodes, and it can be bought and used by non-professionals such as ordinary consumers or students. In addition, it connects easily via Bluetooth to cell phones, personal computers, etc.
It takes some technique to wear the headset properly and get good contact quality. If you have one, I suggest paying attention to the following points.
- Charge first!
- Hair matters: move hair away from the sensors as much as possible.
- For EPOC+, always hydrate the sensors until they are FULLY saturated with saline solution, or try hydrating the hair directly where the sensors sit. BUT when they dry out, hydrate them again, ... and again. T^T
- Fit the sensors to their correct positions; the two reference sensors guide the others. When the reference sensors (over the mastoid bone behind the ear) are detected well, the others are detected much more easily.
- Sensors touching the scalp fully and firmly leads to good contact.
- For Insight, DO NOT take the device apart while you are still using it. If you do re-assemble it, recharge it before using it again.
- EPOC+ is not good for raw EEG data recording; most of my recorded data were ZERO!
【Remember to move hair away (especially for girls)】
Below are some of the official applications, free or paid.
MyEmotiv – phone app
Now for my trials of some software and applications. My first choice is MyEmotiv, a phone app that can assess your emotional performance in different situations. It provides a friendly interface for connecting the device and a visualized 3D brain for real-time neural signals.
【MyEmotiv with Insight】
【MyEmotiv - EPOC+ contact quality】
Both Insight and EPOC+ can be used with this app, and the contact quality is shown immediately and intuitively. In my experience, Insight reaches better contact quality (100%) with a shorter setup time (within 30 minutes) than EPOC+ (below 80%, over 30 minutes). Besides, the EPOC+ contact grows gradually weaker the longer the headset is used, probably because its sensors are supposed to stay hydrated all the time.
Emotiv EmoBot (V.188.8.131.52) – Windows desktop app
It is a virtual robot face that mimics the user's facial expressions; it only works with EPOC.
【EmoBot - Facial Expression】
If you are wondering about the accuracy, well, I have to say it is #@$!#*&...<(￣ ^ ￣)@m
Emotiv Xavier Control Panel (V3.3.3) – Windows desktop app
The control panel guides users through setting up the device, shows the real-time contact quality, and provides functions such as training and detection of mental commands and facial expressions.
Emotiv Xavier Pure.EEG (V3.4.3, subscribed up to Sept. 2017) – Windows desktop app
Note: after September there is a new premium application, EmotivPro.
I subscribed monthly to the raw EEG recordings before September.
【EEG line chart- good contact】
【EEG line chart- bad contact】
The software displays the real-time EEG oscillations as a line chart. In my experiments, some participants could be set up within a few minutes with good contact quality, while others still had bad contact after more than half an hour.
EMOTIV SDK (COMMUNITY EDITION - V3.5.0)
For developers, Emotiv provides APIs for device information, average band power, facial expressions, mental commands, motion data, performance metrics (0.1 Hz), raw EEG (paid), and performance metrics at 2 Hz. If you want access to raw EEG data, you have to subscribe to a monthly or annual quota 😩.
The SDK supports various programming languages, e.g. C++, C#, Java, and Python, and various platforms, e.g. Windows, Mac, Ubuntu, Android, and iOS. In general, C++ supports the most API functions, which is why I chose C++ as my development language.
BRAIN COMPUTER INTERFACE SYSTEM
I proposed a BCI typing system in which users can type a letter or word after training on specific typing pad(s).
【Prototype - typing words】
I explored different potential stimuli for neural-signal training and detection via mental-command experiments. The possible factors influencing EEG training and detection involved UI elements, human-induced factors, and environmental noise.
I mainly tested Emotiv Insight, which averaged over 90% contact quality, and abandoned EPOC+ because of its low contact quality (at most 78% /(T^T)~).
The results revealed that with Emotiv Insight, visual factors have only a minor influence, while most of the effect comes from the user themselves. More specifically, colors, sound effects, and animation during training help users recall the feeling and increase detection accuracy, while the size, shape, or position of UI elements makes little difference to training and detection. As expected, regular motions or strong emotion (speaking, blinking regularly, winking left or right regularly, moving the head regularly) help detection reach over 65% accuracy on average. But some motions, such as frowning, eye movements, arm movements, and leg movements, make little difference to detection, and irregular head movement just makes the EEG fluctuate.
Here is a demonstration video of the BCI typing system prototype.
(*￣▽￣*) Recently Emotiv finally released the Cortex service for developers, but my work was already done, so I had NO chance to try it (T_T). More services and applications can be found on the Emotiv website, which is updated periodically.
By the way, questions, discussion, and suggestions are welcome.