BCI Design: Easier Than LEGO?

by Archynetys Health Desk

Credit: Authors

Imagine being able to compose an email or steer a wheelchair directly with your thoughts. For millions of people living with neurological disorders such as ALS, this possibility could be life-changing. Their ability to think and feel remains intact, but the connection to the outside world is often disrupted.

For decades, scientists have dreamed of bridging that gap with brain–computer interfaces (BCIs)—a technology that allows people to communicate and interact using only the power of thought. Yet for years, the development of BCI systems has been restricted to a small group of experts with niche interdisciplinary know-how and programming skills.

In our article published in PLOS One, we introduce PyNoetic, a framework designed to overcome one of the biggest barriers facing early-stage neuroscientists: the need for rigorous coding.

Fundamental challenges in BCI development

Two fundamental challenges stand in the way of widespread BCI adoption. First, the sheer complexity of the brain means that a one-size-fits-all approach rarely works in practice. Systems designed for one disorder—or even for one individual—often fail for another. This highlights the urgent need for tools that support rapid prototyping of highly customized BCIs tailored to each user.

Second, existing BCI development platforms often present steep learning curves, lack flexibility, and require researchers to juggle a patchwork of expensive, proprietary software. This not only drives up costs but also creates significant barriers to entry, slowing down progress across the field.

To address these issues, we developed PyNoetic: a free, open-source Python framework built to democratize BCI research. Our goal was to design a platform that is both powerful and comprehensive, yet also accessible to researchers regardless of their coding expertise.

A typical BCI system depicting the flow and processing of EEG data. Credit: Authors

What neuroscientists wanted: A no-code approach

We designed PyNoetic to provide the tools needed to create the highly customized algorithms that we believe are the future of BCI. At the heart of our framework is a powerful Graphical User Interface (GUI) featuring a unique “pick-and-place” configurable flowchart.

This allows any researcher to create a unique BCI recipe by dragging and dropping instruction cards. For example, a researcher can visually arrange cards labeled “Filter the Signal,” “Identify Key Channels,” and “Output,” making the complex process of pipeline design intuitive.
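The "recipe of instruction cards" idea can be sketched in plain Python: a pipeline is just an ordered list of processing steps applied in sequence. The function names below (`bandpass`, `select_channels`, `output`) are illustrative placeholders, not PyNoetic's actual API.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(eeg, fs=250.0, low=8.0, high=30.0):
    """'Filter the Signal' card: 4th-order Butterworth band-pass per channel."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=1)

def select_channels(eeg, picks=(0, 2)):
    """'Identify Key Channels' card: keep only the channels of interest."""
    return eeg[list(picks), :]

def output(eeg):
    """'Output' card: report the processed signal's shape."""
    print("processed:", eeg.shape)
    return eeg

# The "flowchart": an ordered list of cards, executed front to back.
pipeline = [bandpass, select_channels, output]

eeg = np.random.randn(4, 1000)   # 4 channels x 1000 samples of synthetic EEG
for step in pipeline:
    eeg = step(eeg)
```

Reordering the list, or swapping one function for another, reconfigures the whole pipeline without touching the remaining steps.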

We believe this no-code approach to be a game-changer. It will empower neuroscientists, clinicians, and other domain experts to rapidly prototype and test their ideas without getting bogged down in complex code. For our colleagues who are advanced programmers, we ensured that the framework still allows for the seamless integration of custom algorithms with minimal effort. This flexibility is critical for tailoring BCIs to individual needs.

Overview of functionality supported by PyNoetic and its various modules, including the live analysis and programmable flowchart. Credit: Authors

By allowing researchers to easily swap out algorithms and reconfigure the entire processing pipeline, we’ve made it possible to fine-tune a system for a specific person’s unique neural activity.

Crucially, most functions within PyNoetic are built with tunable parameters, giving researchers granular control to adjust everything from filter settings to machine learning model configurations. This ensures that every stage of the BCI pipeline can be precisely calibrated to an individual’s unique neurophysiology.
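One generic way to express this kind of per-user tunability in Python is to expose each stage's settings as keyword parameters and bind them per subject. The parameter names and frequency bands below are illustrative assumptions, not PyNoetic's real configuration schema.

```python
from functools import partial

import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(eeg, fs=250.0, low=1.0, high=40.0, order=4):
    """Band-pass filter with every setting exposed as a tunable parameter."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=1)

# Two users, two calibrations of the same stage:
user_a_filter = partial(bandpass, low=8.0, high=30.0)   # mu/beta motor rhythms
user_b_filter = partial(bandpass, low=5.0, high=20.0)   # lower SSVEP fundamentals

eeg = np.random.randn(8, 2000)   # 8 channels x 2000 samples of synthetic EEG
print(user_a_filter(eeg).shape, user_b_filter(eeg).shape)
```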

This entire system can be thought of as a digital workbench or, more appropriately, as a “LEGO set for building BCIs.” To make PyNoetic a truly stand-alone solution, we built it to cover the entire BCI pipeline:

  • Stimuli Generation: creating custom visual and auditory stimuli to elicit specific brain responses

  • Data Acquisition and Recording: connecting to EEG hardware to record brain activity

  • Pre-Processing and Filtering: cleaning up noisy EEG signals and removing artifacts such as eye blinks

  • Feature Extraction: identifying meaningful patterns in the brain data using a wide array of techniques

  • Classification: using machine learning and deep learning models to translate brain signals into commands

  • Real-time Simulation: testing the complete BCI system in a 2D or 3D simulated environment with visual and auditory feedback
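The stages from filtering through classification can be illustrated end to end on synthetic data using generic scientific-Python tools (NumPy, SciPy, scikit-learn). This is a toy sketch of the pipeline concept, not PyNoetic's internals: one class of epochs carries an added 10 Hz “alpha” burst, band power serves as the feature, and a linear discriminant classifies.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
fs, n_ch, n_samp = 250.0, 4, 500

def make_epoch(label):
    """Synthesize one epoch: class 1 adds a 10 Hz oscillation to the noise."""
    t = np.arange(n_samp) / fs
    x = rng.standard_normal((n_ch, n_samp))
    if label == 1:
        x += 2.0 * np.sin(2 * np.pi * 10 * t)
    return x

def band_power(epoch, low=8.0, high=12.0):
    """Feature extraction: mean power per channel in the alpha band."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return (filtfilt(b, a, epoch, axis=1) ** 2).mean(axis=1)

labels = np.array([i % 2 for i in range(60)])
X = np.stack([band_power(make_epoch(y)) for y in labels])

# Classification stage: train on 40 epochs, evaluate on the held-out 20.
clf = LinearDiscriminantAnalysis().fit(X[:40], labels[:40])
acc = clf.score(X[40:], labels[40:])
print(f"held-out accuracy: {acc:.2f}")
```

Swapping the feature or the classifier means replacing a single function or estimator, which is exactly the kind of reconfiguration the flowchart exposes without code.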

  • Illustration of the recording paradigm with PyNoetic's stimuli generation and recording module. (a) Picture of an SSVEP recording session. (b) Real-time channel selection and preprocessing in online mode. Credit: Authors

  • A pseudo live-stream of EEG data is generated, and a simple pick-and-place flowchart is designed for channel selection and filtering. The top plot displays the raw EEG signal, while the bottom plot shows the filtered EEG signal, with each instance representing data from a single epoch. Credit: Authors

Built for collaboration and the future

From the start, we designed PyNoetic to be free, open-source, and cross-platform (Windows, macOS, and Linux). We chose Python as its core language to tap into the vast ecosystem of scientific and ML libraries that are already the standard in the BCI community.

One of our most important architectural choices was modularity. We carefully divided PyNoetic into distinct modules, catering to different areas of BCI expertise. This design makes the system easier to navigate and, more importantly, encourages community collaboration. Our hope is that experts will feel empowered to contribute to and update specific modules, ensuring PyNoetic remains current with the latest state-of-the-art methods.

We know this is only the first step. The framework won’t meet every need immediately, but we see this release as the beginning of a broader conversation. We anticipate PyNoetic will grow through community contributions, extending well beyond its initial form.

By lowering the technical barrier to entry and offering a comprehensive, all-in-one platform, we believe PyNoetic can accelerate innovation in BCI research and empower a global community of scientists to bring the revolutionary promise of thought-controlled technology closer to reality.

This story is part of Science X Dialog, where researchers can report findings from their published research articles. Visit this page for information about Science X Dialog and how to participate.

More information:
Gursimran Singh et al, PyNoetic: A modular python framework for no-code development of EEG brain-computer interfaces, PLOS One (2025). DOI: 10.1371/journal.pone.0327791

Gursimran Singh received his Bachelor’s degree in Electronics and Communication Engineering from Thapar Institute of Engineering and Technology, India in 2023. He is currently working in Texas Instruments as an analog engineer in the Multiphase Processor Power Solutions team. His research interests include signal processing and mixed signal integrated circuits.

Aviral Chharia is a graduate student at Carnegie Mellon University. He has been awarded the ATK-Nick G. Vlahakis Graduate Fellowship at CMU, the Students’ Undergraduate Research Graduate Excellence (SURGE) fellowship at IIT Kanpur, India, and the MITACS Globalink Research Fellowship at the University of British Columbia. Additionally, he was a two-time recipient of the Dean’s List Scholarship as an undergraduate. His research interests include computer vision, computer graphics, and machine learning.

Citation:
Designing brain–computer interfaces is now easier than building with LEGO (2025, October 2)
retrieved 2 October 2025
from https://medicalxpress.com/news/2025-10-braincomputer-interfaces-easier-lego.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
