In the quest to unravel the secrets of how the human brain orchestrates our thoughts, emotions, and experiences, scientists have wielded many innovative technologies. One of these tools is electroencephalography (EEG), a valuable brain imaging technique that uses electrodes, or channels, strategically placed on the scalp to record the brain’s electrical activity. It provides real-time insights into the intricate symphony of neural processes, cognitive functions, and emotional responses, making it an essential instrument for understanding the complexities of the human brain.
Yet, the technique is not without its challenges. While capturing the desired signals from the brain, EEG also picks up artifacts, or noise, from the environment and from participants themselves. Whether it is the subtle movements of the eyes and face or the rhythmic contractions of the jaw and neck muscles, these artifacts can obscure the brain’s own signal, requiring researchers to ‘clean the data’ with preprocessing before they can delve into the brain’s mysteries.
One solution to this problem is having expert EEG researchers visually inspect the data to identify and remove these artifacts. However, this data-cleaning method is meticulous, time-consuming, and imprecise, since the definition of what constitutes an EEG artifact remains elusive. Each participant’s brain is unique, which further complicates matters, as the EEG signal can vary widely from one person to the next. To expedite this process, EEG data analysis software programs offer automated preprocessing methods so that researchers can get on with the business of understanding the brain’s secrets.
However, it is unclear just how useful these automated methods actually are, because quality metrics for comparing them don’t exist. IONS Scientist Arnaud Delorme decided to address this problem in a new study by creating an EEG data quality metric.
How Do You Make an EEG Quality Metric?
We designed the data quality metric to be a simple and robust way to measure the percentage of useful brain channels recorded with EEG. Put simply, the fewer the artifacts and the less the noise, the more channels should pick up significant brain activity. To see whether preprocessing methods – ways of cleaning the data – actually improve the data, we compared different software methods and examined whether they changed the percentage of useful channels.
To do this, we looked at brain activity in three different experiments to see how specific tasks affect it. Since the brain’s response to a task is consistent while noise is random, one strategy for removing noise from the brain data is to present a task to participants repeatedly and average the neural responses across repetitions. In the first experiment, participants had to spot animals in pictures; in the second, they judged faces for symmetry; and in the third, they responded to specific sounds. We then filtered the data using the software preprocessing steps, looked at specific time periods, and counted how many brain channels showed significant activity.
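The channel-counting idea above can be sketched in a few lines of code. This is a minimal illustration, not the study’s actual implementation: the epoch shape, the baseline split, and the paired t-test threshold are all assumptions chosen for clarity.

```python
# Illustrative sketch of a channel-based EEG quality metric:
# average over repeated trials implicitly via a per-channel statistical
# test, then report the percentage of channels with significant activity.
import numpy as np
from scipy import stats

def percent_significant_channels(epochs, baseline_end, alpha=0.05):
    """epochs: array of shape (n_trials, n_channels, n_samples).

    For each channel, compare mean post-stimulus amplitude against mean
    baseline amplitude across trials with a paired t-test, and return
    the percentage of channels with p < alpha. (Hypothetical helper.)
    """
    baseline = epochs[:, :, :baseline_end].mean(axis=2)  # (trials, channels)
    response = epochs[:, :, baseline_end:].mean(axis=2)  # (trials, channels)
    _, p = stats.ttest_rel(response, baseline, axis=0)   # per-channel p-values
    return 100.0 * np.mean(p < alpha)

# Synthetic example: 64 channels, half of which carry a task-evoked response.
rng = np.random.default_rng(0)
n_trials, n_channels, n_samples, baseline_end = 200, 64, 100, 50
epochs = rng.normal(size=(n_trials, n_channels, n_samples))
epochs[:, :32, baseline_end:] += 0.5  # evoked response on the first 32 channels
quality = percent_significant_channels(epochs, baseline_end)
```

On this synthetic data, roughly half the channels come out significant, matching the half that were given an evoked response; a noisier recording, or a preprocessing step that distorts the signal, would lower the percentage.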
EEG is Best Left Alone
We found that, except for a couple of methods (i.e., high-pass filtering and bad channel interpolation), most automated corrections didn’t really help and, in fact, sometimes made things worse. Some common techniques (i.e., referencing and advanced baseline removal) actually hurt performance. So, the results suggest that EEG data may be better left alone rather than cleaned using automated preprocessing.
What does this mean?
In this study, we created a new EEG quality metric based on how many channels picked up useful brain signals during specific tasks. We hoped this metric would be a beacon guiding researchers through the labyrinth of preprocessing methods on their quest to better understand the brain. Using this metric, we discovered that, surprisingly, even advanced preprocessing methods didn’t work as well as expected. These results suggest that simplicity reigns supreme in the realm of EEG.
The discovery holds the promise of significantly expediting neural data analysis, and this has the potential to revolutionize investigations aimed at unraveling the complexities of the human brain. By streamlining EEG data processing, scientists can enhance efficiency and focus more quickly on deciphering the intricate mechanisms governing our thoughts, emotions, and cognitive functions. The time saved on data analysis can be channeled toward deeper exploration and understanding of neural processes, potentially leading to groundbreaking insights into the mysteries of the brain.