Brains interact with technology to turn imagination into reality.
NeuroGEARS is a technology company bridging Neuroscience, Games, Interaction, and Robotics, born from the desire to explore and create inspiring interfaces for augmenting human experience—and to make technology itself more accessible to everyone.
We are a network of explorers who like to imagine, use, and teach technology, with an extensive background in building virtual and augmented reality systems, interactive installations, and autonomous robots, as well as in neuroscience research.
Gonçalo is a software engineer turned neuroscientist, fascinated by the behaviour of intelligent systems. With a background in applied research on virtual and augmented reality, parallel processing, and autonomous agents, he joined the Champalimaud Neuroscience Programme in 2010, hoping to find better ways of building machines that learn by themselves. Gonçalo completed his PhD with Adam Kampff and Joe Paton, working to understand the role of motor cortex in the control of movement in non-primate mammals.
Along the way, he extended his experience making interactive systems to rodents and other animal models. Gonçalo developed the Bonsai visual programming language as a way to rapidly prototype interactive neuroscience experiments.
André is a software engineer who specializes in Arts and Technology, working with mixed reality, multimedia, robotics, physical computing, and interactive systems, and he is fascinated by technological creativity. In 2004 he started working in YLabs, the research division of YDreams, where he became an expert in the creation of interactive systems and augmented reality. In 2011 he founded Artica Creative Computing, which works on human-computer interaction, electronics, game design, creative design, AR & VR, video projection and mapping, physical computing, robotics, digital fabrication, prototyping, education, and all things creative.
André values the human side of development: involving everyone in agile development cycles and working together towards a functional, professional, and affordable solution under realistic timelines.
Dr. Danbee “Tauntaun” Kim (she/her/they/them) is a Korean-American neuroscientist and teacher who earned her BSc in Brain and Cognitive Sciences from MIT, where they also spent 10 years acting, directing, choreographing, costuming, and writing original content for the MIT Musical Theater Guild. While completing her PhD with Adam Kampff at the Champalimaud Neuroscience Program, Danbee authored a graphic novel version of their doctoral dissertation called The First VIRS.
Danbee’s current focus is developing a framework for precisely observing nervous systems in increasingly natural settings, aka “field neuroscience”. This includes studying cuttlefish, building interactive exhibits, and using comics and circus as tools for education and research. To learn more, you can visit www.danbeekim.org.
João is a roboticist and software engineer with extensive experience developing real-time interactive applications. Having worked for pioneering tech startups in robotics and augmented reality, he is used to mixing sensors, actuators, and embedded systems to create next-generation human-computer interfaces. His projects span many fields, including technologically enhanced dance performances, interactive museums, and autonomous search and rescue robots.
More recently, João joined the Champalimaud Neuroscience Programme to help design high-end systems for electrophysiology. He also collaborated on the development of the Bonsai programming language.
Pavel holds an MD and a PhD in Neuroscience (SISSA, Trieste, Italy), with more than 15 years of research experience in the fields of animal and human physiology and behaviour. He is passionate about developing new tools that make research easier and more fun. For the past 6 years, Pavel has been working on elucidating the molecular and neuronal mechanisms controlling feeding behaviour in the lab of Carlos Ribeiro at Champalimaud Research, using Drosophila melanogaster as a model organism.
He is the main developer of the flyPAD, an automated device that uses capacitive sensors for high-throughput measurement of feeding behaviour in Drosophila.