
Overview

Sunday, September 28th, 2014

In addition to basic science, research in the lab also seeks to develop and apply advanced neurotechnologies. We work with technology to extend our basic scientific understanding of the brain, as well as to develop novel brain-based devices to help treat people suffering from neurological disorders.

One important direction has been to examine the properties of electrical brain potentials in order to provide specifications for devices that can be implanted into the brain to form advanced brain-machine interfaces. The ability to extract information-rich signals from the brain over long periods of time remains a major challenge. Our interest in developing advanced brain-machine interfaces, such as interfaces capable of controlling modern robotic systems with many degrees of freedom, has led us to invest in virtual reality. We have developed a state-of-the-art virtual reality arena to examine the control of dexterous arm and hand movements using motion capture. We have also been investing in optogenetic tools in order to map large-scale circuits in the brain. By introducing modified proteins into neurons to make them light-sensitive and fluorescent, we can control the activity of anatomically defined brain circuits.


Virtual-reality + motion capture

Sunday, September 28th, 2014

We have been exploring the combination of virtual reality (VR) and motion capture, pairing technologies that track movements with technologies that render those movements in a virtual world. We have developed a combined VR-motion capture system to understand how we coordinate and control visually guided movements. When people are in the virtual world, they no longer see the real world; they see their movements only in the form of a computer avatar we present on a 3D display. We are using VR-motion capture to study how the brain learns to control dexterous, coordinated movements. Using a virtual reality interface makes it easy for us to program the mapping between intention and dexterous movements, and we are also using this interface as a virtual prosthesis. Understanding how the brain learns to control this virtual interface will teach us how to train people to control advanced robotic devices in novel brain-machine interfaces.
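As an illustration of what a programmable intention-to-movement mapping can look like, here is a minimal sketch in Python. The specific transform (a visuomotor rotation plus a gain) and all names are hypothetical, for illustration only; they are not the lab's actual VR rendering pipeline.

```python
import numpy as np

def remap_hand_to_avatar(hand_xyz, rotation_deg=0.0, gain=1.0):
    """Map a tracked hand position to an avatar hand position.

    A minimal sketch of a programmable mapping between tracked movement
    and rendered movement: a visuomotor rotation about the vertical axis
    plus a uniform gain. The transform and parameters are illustrative
    assumptions, not the lab's actual pipeline.
    """
    theta = np.radians(rotation_deg)
    # Rotate in the horizontal (x, y) plane; leave height (z) unchanged.
    rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                    [np.sin(theta),  np.cos(theta), 0.0],
                    [0.0,            0.0,           1.0]])
    return gain * rot @ np.asarray(hand_xyz)

# Veridical feedback: the avatar reproduces the tracked movement.
print(remap_hand_to_avatar([0.2, 0.1, 0.0]))
# Perturbed feedback: a 30-degree rotation the subject must learn to undo.
print(remap_hand_to_avatar([0.2, 0.1, 0.0], rotation_deg=30.0))
```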

Project Funded By
DARPA Biological Technologies Office


Overview

Saturday, September 27th, 2014

How brain activity gives rise to our thoughts, feelings and conscious awareness lies at the heart of understanding what it means to be human. Neurons are the fundamental building blocks of the brain and understanding how the brain controls behavior ultimately involves understanding the activity of individual neurons. The problem is that our brains contain many neurons connected in countless ways, and our ability to probe each neuron individually in the working brain is woefully limited. Progress in understanding the brain depends on finding the right way to simplify. In particular, we need to effectively reduce the complexity of the brain by understanding how neurons work together in groups and ensembles.

In order to make a seemingly intractable problem tractable, we need to simplify by exploiting regularities (correlations) that exist in the activity of groups of neurons. Correlations reflect the presence of patterns in the firing of ensembles of neurons.


One of my earliest contributions was to show how individual neurons contribute to neural ensembles. I did this by measuring how well the time of each action potential can be predicted from local field potential (LFP) activity. LFP activity is an electrical potential generated by current flow in the vicinity of a recording electrode. When I started my work on LFP activity in the monkey, the LFP received relatively little serious attention because it was unclear why it was important. Neurons are the building blocks of the brain, not LFPs. My contribution has been to show that by estimating spike-field coherence, the coherence between the spike train of a neuron and the LFP, I can reveal whether a given neuron does or does not display correlated activity with groups of other neurons. Thus, coherence simplifies because it provides a definition for how neurons can be grouped into ensembles: group neurons according to whether or not they show significant coherence with LFP activity. I can also identify neurons in one part of the brain that show coherence with LFP activity in another part of the brain. As I will describe below, coherent ensembles perform particular computations that guide behavior, and ensembles of neurons that do not fire coherent patterns of spikes do not perform these computations. As a result, I propose that LFP activity is important because it can reveal, and even help us understand, how neurons form ensembles.
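To make the idea concrete, here is a minimal sketch of a spike-field coherence estimate in Python using numpy and scipy. Binning spikes at the LFP sampling rate and using Welch-averaged coherence are simplifying assumptions; a serious analysis would use multitaper estimators and trial-shuffled null distributions to assess significance.

```python
import numpy as np
from scipy.signal import coherence

def spike_field_coherence(spike_times, lfp, fs, nperseg=256):
    """Estimate coherence between a spike train and an LFP trace.

    A minimal sketch, assuming spike times in seconds and an LFP
    sampled at fs Hz. Spikes are binned at the LFP sampling rate and
    Welch-averaged coherence is computed with scipy.
    """
    binned = np.zeros(len(lfp))
    idx = np.clip((np.asarray(spike_times) * fs).astype(int), 0, len(lfp) - 1)
    np.add.at(binned, idx, 1.0)  # spike count per LFP sample
    freqs, coh = coherence(binned, lfp, fs=fs, nperseg=nperseg)
    return freqs, coh

# Toy example: a neuron locked to a 20 Hz field oscillation.
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
lfp = np.sin(2 * np.pi * 20 * t) + 0.5 * np.random.randn(len(t))
spike_times = t[np.sin(2 * np.pi * 20 * t) > 0.95]  # fire near peaks
freqs, coh = spike_field_coherence(spike_times, lfp, fs)
print(freqs[np.argmax(coh)])  # coherence peaks near 20 Hz
```

Grouping then follows the definition above: a neuron whose coherence exceeds a significance threshold in some band is assigned to the corresponding ensemble.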


Brain-machine interfaces

Tuesday, July 29th, 2014

Selecting signals

Figure 1 – Neural recordings measured at different spatial scales differ in their performance and reliability.

A brain-machine interface (BMI) records neural activity from the brain and processes it to generate a control signal. Neural activity recorded in the motor system, for example, can be used to move a prosthetic device such as a computer interface, wheelchair or robotic arm. The ideal brain-machine interface should be reliable and support high-performance control. However, neural signals can be measured in many different ways (see Sidebar), and they differ in their performance and reliability (see Figure 1). Spiking activity can support high-performance control but is difficult to obtain reliably. In contrast, neural activity measured at larger scales (LFP/ECoG) offers less detailed control than spiking but may be more reliable. We are interested in understanding how to optimize BMI performance and reliability by incorporating neural signals at different spatial scales. We have been comparing the information contained in different neural signals. We have also recently engineered a new device that allows us to record spiking, LFP and ECoG signals at the same time from the same group of neurons.
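As an illustration of how signals at different scales can feed a single decoder, here is a minimal sketch of a ridge-regularized linear velocity decoder in Python. The feature layout (spike counts stacked with LFP/ECoG band powers) and all names are assumptions for illustration, not a description of the lab's decoder.

```python
import numpy as np

def fit_linear_decoder(features, velocity, ridge=1e-3):
    """Fit a ridge-regularized linear map from neural features to velocity.

    A minimal sketch, assuming each row of `features` stacks spike
    counts and LFP/ECoG band powers for one time bin, and `velocity`
    holds the matching 2-D cursor velocities.
    """
    X = np.column_stack([features, np.ones(len(features))])  # add bias term
    W = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ velocity)
    return W

def decode(features, W):
    X = np.column_stack([features, np.ones(len(features))])
    return X @ W

# Toy usage: 500 bins, 40 spike channels + 8 LFP band-power features.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 48))
v = X[:, :2] + 0.1 * rng.standard_normal((500, 2))  # synthetic velocities
W = fit_linear_decoder(X, v)
print(decode(X[:5], W))  # decoded velocities for the first five bins
```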

Another direction in BMI is inspired by the success of Deep-Brain Stimulation. Stimulation-based BMIs process neural activity to control patterned brain stimulation. These systems close the loop between stimulation and recording to correct abnormal brain activity patterns. This approach has potential to treat a wide range of neurological disorders (see Brain Stimulation).

Project Funded By
DARPA Biological Technologies Office
New York University Grand Challenge


Decision-making

Tuesday, July 29th, 2014


Figure 1 – Decisions can be based on different kinds of information that the brain extracts and stores from the world.

Each time we make a decision, interactions between neurons in different circuits across the brain determine whether we choose one option over another. Different kinds of decisions are governed by these circuits (Figure 1). We can decide rapidly, typically in a way that leads to more automatic responses. We can also make decisions based on what we expect to receive for making a choice. These decisions are more flexible but take longer for us to make. When we are making a decision, we can also indicate the choice with different kinds of movements. We can move our eyes. We can reach or point. We can speak. We are interested in decisions because they are controlled by different circuits in the brain. This means that studying decisions lets us understand how communication in the brain guides our voluntary behavior.

Figure 2 – We find that neurons that predict decisions the earliest are the ones that fire spikes coherently with their neighbors as well as with neurons in other parts of the brain.

To understand how neurons in different brain circuits work together to make choices, we have been analyzing how neurons in different parts of the brain fire spikes together with each other (see Figure 2). In the posterior parietal cortex, neurons that fire dual coherently, that is, coherently with both their neighbors and with neurons in other parts of the brain, are the ones that appear to be involved in making the choice.

Project Funded By
National Institutes of Health
McKnight Endowment Fund for Neuroscience
Simons Foundation
DARPA Biological Technologies Office


Speech

Tuesday, July 29th, 2014


Among the many branches of systems neuroscience, the study of the human brain is one of the least developed. We are interested in the basic organization of one of the most uniquely human faculties: speech and language. We believe that the organization of speech in the human brain shares neural mechanisms with other actions such as reaching and grasping. Exploring these connections offers opportunities to develop new insights and understanding.

Figure 1 – Electrode recording sites for an example subject. Responses at each site were typically sensitive to listening (auditory), to speaking (production), or to both, in a sensory-motor response.

The neural circuits that perform transformations between sound and speaking are thought to reside in the dominant, typically left, hemisphere of the brain. We collaborate with Orrin Devinsky, Thomas Thesen and others at NYUMC’s Comprehensive Epilepsy Center to study how the human brain listens and speaks.

To study speech, we have developed tasks inspired by work on looking and reaching (see Sidebar). We find that neural responses at particular sites in the human brain are driven by listening and speaking. Interestingly, some sites respond to both listening and speaking, responses we call sensory-motor. These responses predict not only what the subjects are listening to but also what they will say in the future. Surprisingly, we find that the speech transformations are not only present in the dominant hemisphere: they are bilateral, equally present in both hemispheres of the brain.
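A minimal sketch of how a recording site could be labeled from per-trial responses follows, assuming (hypothetically) high-gamma response amplitudes during listening, speaking and baseline periods. The simple t-test recipe is an illustrative assumption, not the published analysis.

```python
import numpy as np
from scipy.stats import ttest_ind

def classify_site(listen_resp, speak_resp, baseline, alpha=0.05):
    """Label a site as auditory, production, sensory-motor, or unresponsive.

    A minimal sketch, assuming arrays of per-trial response amplitudes
    (e.g., high-gamma power) during listening, speaking and baseline.
    """
    auditory = ttest_ind(listen_resp, baseline).pvalue < alpha
    production = ttest_ind(speak_resp, baseline).pvalue < alpha
    if auditory and production:
        return "sensory-motor"
    if auditory:
        return "auditory"
    if production:
        return "production"
    return "unresponsive"

rng = np.random.default_rng(1)
base = rng.normal(0.0, 1.0, 50)
print(classify_site(rng.normal(2.0, 1.0, 50),   # responds to listening
                    rng.normal(2.0, 1.0, 50),   # and to speaking
                    base))                      # -> "sensory-motor"
```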

Bilateral speech transformations suggest that there is an interesting distinction between speech and language: although speech transformations are bilateral, the computational system for language is lateralized. We propose that the brain systems for speech may access language through a unified sensory–motor speech interface. Our ongoing efforts aim to understand how speech and language interact through communication between groups of neurons in different regions of the brain.

Project Funded By
National Institutes of Health


Brain mapping

Tuesday, July 29th, 2014

The primate brain contains billions of neurons arranged into a dense mosaic of areas connected by fiber tracts. We are using a combination of magnetic resonance imaging and large-scale recordings, together with electrical and optogenetic stimulation, to map the brain. We then try to understand how the sum of the parts makes the whole.

Project Funded By
Simons Foundation
DARPA Biological Technologies Office


Spike-field approach

Tuesday, July 29th, 2014

Neurons are the fundamental building blocks of the brain, and understanding the brain ultimately involves understanding the activity of individual neurons. In doing so, we face tremendous challenges because the brain is astronomically complex. The idea behind the spike-field approach is to exploit regularities (correlations) that exist in the activity of groups of neurons in order to tackle the enormous complexity of the brain.

Understanding neuronal communication is a grand challenge in neuroscience, and we have developed the spike-field approach to link activity across the brain. In brief, the approach is to use spike-field measurements as a way to label neurons as participating in local and/or long-range circuits. We can then relate the firing of these neurons to the animal's ongoing behavior to reveal inter-regional communication.

Neurons communicate with other areas by sending action potentials, or spikes, along axons that connect with other neurons across synapses. Electrodes placed in the brain pick up the action potential signals directly as spikes. The electrodes also pick up the activity of groups of neurons as the local field potential, or LFP. The LFP is predominantly generated by synaptic activity. Therefore, the LFP reflects both the local activity in the vicinity of the electrode and the input activity arriving on projections from neurons in other areas. Our proposal is that communication can occur when spikes in one area are correlated with the LFP in another, so if we can identify how spikes are correlated with LFPs, then we can measure the communication. LFP activity contains different signals at different frequencies. To measure communication, we process spike-field activity in different frequency bands using the spike-field coherence. We can then resolve which neurons are coordinating their activity and at which frequencies.
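Here is a minimal sketch of the labeling step in Python, assuming a spike train already binned at the LFP sampling rate, one LFP from the neuron's own area and one from a distant area. The frequency band and fixed coherence threshold are illustrative assumptions; in practice significance is assessed against null distributions, not a fixed cutoff.

```python
import numpy as np
from scipy.signal import coherence

def label_circuit_membership(binned_spikes, lfp_local, lfp_remote, fs,
                             band=(10.0, 30.0), threshold=0.1):
    """Label a neuron as participating in local and/or long-range circuits.

    A minimal sketch of the spike-field labeling idea: compute the
    coherence of the spike train with a local and a remote LFP and
    check the mean coherence within a frequency band of interest.
    """
    labels = []
    for name, lfp in (("local", lfp_local), ("long-range", lfp_remote)):
        freqs, coh = coherence(binned_spikes, lfp, fs=fs, nperseg=512)
        in_band = (freqs >= band[0]) & (freqs <= band[1])
        if coh[in_band].mean() > threshold:
            labels.append(name)
    return labels or ["uncoupled"]

# Usage (with `spikes` binned at fs, and two equal-length LFP traces):
# print(label_circuit_membership(spikes, lfp_A, lfp_B, fs=1000.0))
```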



Coordination

Tuesday, July 29th, 2014

Visual information is processed to guide different movements by distinct yet overlapping circuits that connect sensation to action.

Figure 1 – Our ability to look at a cup and our ability to reach to the same cup depends on at least two different systems in the brain – a reach system to move the arm and a saccade system to control rapid eye movements.

Visual information is processed by multiple brain systems. These systems connect what we see, hear and feel with what we want to do, and they do so according to the kind of movement we are making (see Figure 1). One system is active when we are thinking about moving the eyes (red; saccade), and another is active when we are thinking about moving the arm (green; reach). In the human brain, there are similar circuits that let us listen and speak (see Speech).

We are interested in how neurons in different regions of the brain work together to control our behavior. Coordination is a hallmark of natural behavior. Eye movements guide reaching movements of the arm and grasping movements of the hand. Since the brain systems for moving each have a different behavioral output, coordination provides an opportunity to investigate how communication guides behavior. In particular, we can ask: how does coordination depend on communication between different regions of the brain? Our recent work indicates that neurons in the eye movement and arm movement systems of posterior parietal cortex (known as areas LIP and PRR) communicate with each other in order to coordinate movements of the eyes and the arm. Interestingly, posterior parietal neurons appear to use temporal patterns of activity, called neuronal coherence, to communicate with other groups of neurons.

Project Funded By
National Institutes of Health
National Science Foundation
McKnight Endowment Fund for Neuroscience
The Pew Charitable Trusts


Communication in the Brain

Wednesday, March 5th, 2014

Understanding how the brain works depends on understanding how communication between neurons in different parts of the brain leads to behavior. Our research focuses on movements like looking, reaching, grasping and speaking. We ask how individual neurons and groups of neurons in the brain communicate with each other when we are thinking about moving.

Many areas of the brain are active when we are thinking about moving, and they are connected into large-scale networks (Figure 1). Our view is that understanding these areas depends on understanding what they are saying to each other. How can we measure communication between brain regions? How does activity within one brain region depend on activity in another? Can we think of one area effectively exciting or inhibiting activity in another? Do areas gate activity in other areas? Which influences are direct and which involve many areas working in a network? How does behavior depend on communication between different brain regions? If we disrupt communication, can we predict changes in how people move? If we strengthen communication, can we improve what they will do?


Figure 1 – Diffusion spectrum imaging of the macaque brain showing the connections that form large-scale networks. Schmahmann et al. (2007)

Our work uses a combination of behavioral, physiological and mathematical methods. We examine how different signals, such as current sensory input, recent motor commands and prior experience, guide our movements. We perform experimental measurements and manipulations of neuronal activity across large-scale brain networks to see how brain processes lead to movements. Finally, we employ sophisticated mathematical tools to extract information and draw inferences from our experimental measurements.


Figure 2 – The dorsal and ventral visual streams.

A great deal of our work examines the dorsal visual stream (Figure 2). This pathway is active when we are planning to move and converts visual inputs into plans for looking, reaching and grasping. In humans, we also study the dorsal stream that processes visual and auditory inputs into speech when we are thinking about talking (Figure 3). Substantial interplay exists between our work in systems neuroscience and our work in neural engineering, providing new ways to think about moving.
