DIPARTIMENTO DI MEDICINA E CHIRURGIA CORSO DI LAUREA MAGISTRALE IN PSICOBIOLOGIA E NEUROSCIENZE COGNITIVE
MONKEY VENTRAL PREMOTOR NEURONS DURING CONSTRAINED AND FREELY MOVING CONDITIONS: FUNCTIONAL PROPERTIES
NEURONI DELLA CORTECCIA PREMOTORIA VENTRALE DELLA SCIMMIA IN CONDIZIONI DI RESTRIZIONE E DI LIBERTÀ DI MOVIMENTO: PROPRIETÀ FUNZIONALI
Relatore:
Xxxxx.xx Xxxx. XXXX XXXXXX
Correlatore:
Xxxxx.xx Prof.ssa XXXXXX XXXXXXXX
Laureanda:
XXXXXX XXXXXXXXX
ANNO ACCADEMICO 2020-2021
SUMMARY
Abstract (English)
Abstract (Italiano)
1. Introduction
1.1 Functional Properties and Organization of the Premotor Cortex
1.2 Fronto-Parietal networks
1.3 A concrete application: Brain Machine Interfaces (BMI)
1.4 Wireless Recordings as a means to improve ethological and ecological validity
2. Aims of the Study
3. Materials and Methods
3.1 Ethical Statement
3.2 Subjects
3.3 Surgical Procedures
3.4 Behavioural Paradigms and Setup
3.5 Video Acquisition
3.6 Behavioural Scoring
3.7 Neural Recordings
3.8 Data Analysis
4. Results
4.1 Firing rate properties in the two conditions
4.2 Single Units Analysis
4.3 Multi Units Analysis
5. Discussion
Bibliography
Abstract (English)
The development of wireless neural recording techniques has made it possible to explore the brain-behaviour relationship in more ethologically and ecologically valid contexts. In this study we implemented a two-step paradigm that allowed us to compare the functional properties of ventral premotor neurons investigated in the classical head-fixed, chair-restrained condition (CHR) and in a freely-moving condition in the NeuroEthoRoom (NER). We found that only a portion of the neurons showing a modulation in the CHR condition were modulated by similar behaviours in the NER condition; in particular, the response to mouth behaviours seems to be better conserved across conditions than that to hand behaviours. This difference may be due to the influence of axial and postural components, which are virtually absent in the CHR condition but become an important variable in an unrestrained context. Overall, this study highlights the need for caution when generalizing the conclusions obtained in classical neurophysiological experiments to natural contexts and supports the need to develop new ethological methodologies to investigate the neural substrates of non-human primates’ behaviours.
Abstract (Italiano)
Lo sviluppo di tecniche wireless di registrazione neurale ha reso possibile esplorare la relazione cervello-comportamento in contesti più etologicamente ed ecologicamente validi. In questo studio abbiamo sviluppato un paradigma in due fasi che ci ha permesso di comparare le proprietà funzionali dei neuroni della corteccia premotoria ventrale investigate nella classica condizione a testa fissa e con il corpo parzialmente limitato dalla presenza della sedia per primati (CHR) e in una condizione di movimento libero nella NeuroEthoRoom (NER). Abbiamo trovato che solo una porzione dei neuroni che mostrano una modulazione nella condizione CHR è modulata da simili comportamenti anche nella condizione NER, in particolare la risposta a comportamenti di bocca sembra essere meglio conservata tra condizioni rispetto a quella a comportamenti di mano. Questa differenza potrebbe essere dovuta ad un’influenza di componenti assiali e posturali, le quali sono virtualmente assenti nella condizione CHR ma diventano una variabile importante in un contesto privo di limitazioni fisiche. Nel complesso, questo studio evidenzia la necessità di cautela quando si generalizzano le conclusioni ottenute dai classici esperimenti neurofisiologici a contesti naturali e supporta il bisogno di uno sviluppo di nuove metodologie etologiche nell’indagine dei substrati neurali dei comportamenti dei primati non umani.
1. Introduction
Over the past century, neuroscientific studies have begun to shed light on how our brain works and how our behaviours, emotions and thoughts could be the outcomes of its activity. The idea, central to early cognitive psychology, of a highly segregated brain in which every function has its own specific and unique cerebral area has been slowly replaced by a more modern approach in which neural networks can be devoted to certain functions, while the same area can also be reused in other networks underlying other behaviours. In particular, motor areas, which for decades were considered to be devoted only to the execution of motor actions - and therefore peripheral to the cognitive domain - have been re-evaluated in order to explain recent findings, such as the sensory properties of many of their neurons.
The advent of wireless neural recording technologies has made it possible to study the neuronal correlates of animals’ natural behaviours in completely new settings, where the subjects can move freely and choose by themselves the behaviour to perform. This new approach could lead to experimental results with a much higher ethological validity, which is an essential feature for translational applications capable of functioning in real-life settings, such as the development of rehabilitation approaches or brain-machine interfaces.
1.1 Functional Properties and Organization of the Premotor Cortex
The motor cortex corresponds to Brodmann’s areas 4 and 6, which are located in the posterior part of the frontal lobe, in front of the central sulcus (Figure 1).
Figure 1. Mesial and lateral views of the monkey brain showing the parcellation of the motor, premotor and posterior parietal cortices. Fronto-parietal circuits are represented by illustrating related areas with the same colour. (Xxxxxxxxxx and Xxxxxxx, 2001)
These areas have a cytoarchitectonic hallmark: they are agranular cortices, because they lack layer IV while having an extremely developed layer V, with large pyramidal neurons. Brodmann’s area 4 corresponds to the primary motor cortex (F1 or M1) and, since the pioneering studies with dogs conducted by Xxxxxxx and Hitzig (1870), it has been known that electrical stimulation of this area evokes contralateral movements. Indeed, this area contains a somatotopic map of the body, as demonstrated by Xxxxxxx in the monkey brain (1952) and by Xxxxxxxx in the human brain (1937); this representation is not faithfully proportional to body part size but exhibits a magnification of those body parts that, being highly innervated, allow more sophisticated motor control. Neurons in this area encode simple movement parameters, such as force (Xxxxxx, 1968) and direction (Xxxxxxxxxxxx et al., 1982).
Brodmann’s area 6 lies anteriorly to F1 and corresponds to the premotor cortex, a group of areas with different functional properties and roles but with some shared features, such as being generally less excitable than F1 (a higher current intensity is required to elicit observable movements). Here we can distinguish six areas: F3 and F6 in the mesial sector, F2 and F7 in the dorsal sector, and F4 and F5 in the ventral sector (Figure 1). Somatotopic maps have also been demonstrated in these areas, but most of them are only partial (for example, in F5 only face and hand movements are represented, whereas in F4 there is a representation of arm, neck and face movements). These areas can control movement through direct projections to the spinal cord or through their connections with F1. Premotor areas are known to be involved in higher-order functions such as sensorimotor transformations (Xxxxxxxxxx and Xxxxxxx, 2001), action planning (Xxxxx and Xxxxx, 2006) and monitoring (Fornia et al., 2020), and it has been suggested that they play a role in action recognition (Xxxxxx et al., 2014; Xxxxxxxxx et al., 2008; Xxxxxxx et al., 2014). Neurons in these areas seem to encode the goal of an action rather than mere muscle and joint movements. In F4 there are neurons that encode reaching actions toward objects but are not activated when the goal of the action is pushing the objects away (Xxxxxxxxxx et al., 1988). Umiltà et al. (2008) demonstrated this property also in F5 neurons that encode grasping actions: these neurons fire when the monkey grasps food with either a normal or a reverse plier, both of which allow it to grasp the target but with an opposite sequence of hand muscle activation (opening vs closing the hand to take possession of the food), suggesting that the goal of “grasping” the food is the main coding principle of these neurons.
Some neurons in area F4, especially in its dorsal portion, display tactile and visual properties (bimodal neurons): their tactile receptive field is often larger than that of primary somatosensory cortex neurons and they show, in addition, visual responses to objects approaching the tactile receptive field (Xxxxxxxxxx et al., 1981a; Xxxxxxxxxx et al., 1981b). The visual receptive field of these neurons is coded in somatocentric coordinates, because it does not depend on the direction of the monkey’s gaze, and it has an extension in depth of about 30-40 cm or less, which corresponds to the maximal possible extension of the monkey’s arm in the surrounding space (Figure 2). However, further studies (Xxxxxxx et al., 1996) demonstrated that the receptive field of these neurons seems to expand in depth as the approaching speed of the stimulus increases, supporting the idea that they code a potential motor action. Xxxxxxxx et al. (1999) also demonstrated the presence in F4 of trimodal neurons, which additionally show an acoustic response to sounds generated in a direction grossly perpendicular to the tactile field.
Figure 2. Examples of tactile and visual receptive fields of F4 bimodal neurons. (Xxxxxxx et al., 1996)
Bimodal neurons have also been found in the dorsal sector of the ventral primary motor cortex (Maranesi et al., 2012) - F1vd, an area strongly connected with the dorsal sector of area F4 (Matelli et al., 1986). Moreover, when these two areas are stimulated with short trains (50 ms), the monkey executes axio-proximal and forelimb movements (Maranesi et al., 2012), whereas long-train stimulation (500 ms - a behaviourally relevant time scale) evokes coordinated, complex postures that involve many joints (Xxxxxxxx et al., 2002; Figure 3). Altogether, these properties suggest a possible involvement of these areas in defensive and reaching behaviours.
Figure 3. Motor actions evoked by electrical stimulation on the behaviourally-relevant timescale of 500ms in the motor cortex (Xxxxxxxx and Xxxxxx, 2007)
Area F5 is composed of three subregions: F5c on the post-arcuate convexity, and F5p (medially) and F5a (ventrally), both located inside the arcuate sulcus. In general, F5 neurons are thought to be modulated by the goal of the action (for example grasping, placing or holding) and can generalize across the different effectors used to perform it (Xxxxxxxxxx et al., 1988; Figure 4 - left); in addition, they can also exhibit selectivity for the movement parameters to be specified in order to obtain a particular goal (for example, grasping with a precision grip vs whole-hand prehension, or by directing the hand toward a particular direction; Figure 4 - right).
Figure 4. On the left, an example of an F5 neuron that encodes the goal of an action regardless of the effector used to perform it: it shows a response when the grasping action is performed with the mouth (A), as well as with the right (B) and left (C) hand. On the right, an example of an F5 neuron that exhibits selectivity for a particular type of grasping: it shows a response for a precision grip performed with the right (D) or left (E) hand but no response if the monkey performs a whole-hand prehension with the right (F) or left (G) hand. (Xxxxxxxxxx et al., 1988)
In F5 there are also neurons with visual properties: “canonical neurons” fire not only in relation to the execution of reaching-grasping actions but also during the simple observation of an object that could evoke a specific type of prehension (Xxxxxx et al., 1997; Xxxx et al., 2006). Furthermore, “mirror neurons” are known to fire both when the monkey performs an action and when it observes someone else acting (Xxxxxxx et al., 1996; Xxxxxxxxxx et al., 1996). Canonical neurons are more typically ascribed to F5p (Xxxxxxxxxx and Xxxxxxx, 2012) and play a role in the encoding of visuomotor transformations for grasping objects (Fogassi et al., 2001); mirror neurons are more sparsely present in area F5, particularly F5c (Xxxxxxxxxx and Xxxxxxx, 2012), and it has been proposed that they play a role in a larger variety of perceptual and cognitive functions (Xxxxxx et al., 2014; Xxxxxxxxx et al., 2008; Xxxxxxx et al., 2014). However, more recent studies have raised doubts about this apparently sharp segregation: indeed, canonical and mirror neurons can also be found at the same cortical site (Xxxxxx et al., 2014). Moreover, a new class of cells showing responses to both action observation and object presentation (“canonical-mirror neurons”) has been demonstrated. Another important difference between visuomotor neuron categories concerns the space selectivity of their visual responses: mirror neurons can be selective for either peripersonal or extrapersonal space (Caggiano et al., 2009), whereas almost all canonical neurons respond only when the object is presented in the animal’s peripersonal space (Xxxxxx et al., 2014). Canonical-mirror neurons show mixed functional properties regarding space selectivity as well: most of them respond to object presentation only when the stimulus is in the peripersonal space, whereas action observation responses are present both when the stimuli are located in the peripersonal and in the extrapersonal space (Caggiano et al., 2009; Xxxxxx et al., 2014), although with a clear-cut prevalence for the peripersonal space when highly dichotomic space sectors are tested (Maranesi et al., 2017). Interestingly, objects and actions in the peripersonal space appear to be mainly encoded in an operational (action possibility), rather than metric (absolute distance), frame of reference in premotor cortices (Caggiano et al., 2009; Xxxxxx et al., 2014; Xxxx et al., 2019).
1.2 Fronto-Parietal networks
Premotor cortices show bidirectional connections with different areas of the posterior parietal cortex (Xxxxxx & Goldman‐Rakic, 1989). These circuits have been proposed
as the anatomical substrate of sensorimotor transformations, the processes by which the characteristics of the external environment obtained through the sensory channels are converted into forms more appropriate for the execution of goal-directed motor actions in response to sensory stimuli (e.g., the transformation from allocentric to egocentric coordinates). The posterior parietal cortex shares with the premotor cortices similar somatotopic representations and most of its neurons also have motor properties, but the parietal sensory representations are richer than the premotor ones, whereas the descending pathways to the spinal cord are usually weaker in parietal than in premotor cortices (Fogassi and Luppino, 2005).
The most studied fronto-parietal networks are the VIP-F4 and the AIP-F5 circuits. These circuits connect the two ventral premotor areas described in the previous section with two areas located inside the intraparietal sulcus, in its ventral (area VIP) and anterior (area AIP) portions respectively (Figure 1). Neurons in area VIP can show visual responses - even with directional preference - thanks to the area’s connections with area MT, a high-level visual area that analyses the motion components of visual stimuli (Xxxxx et al., 1993), as well as somatosensory responses. It is possible to evoke body movements through electrical stimulation of VIP (with parameters similar to those used in the premotor cortex to evoke complex movements) and these are similar to those evoked by F4 stimulation - in particular face, arm and neck movements. In area VIP there are also neurons showing bimodal responses to tactile and visual stimuli, as in F4. An important difference between these two areas concerns the coordinate system used to detect visual stimuli: while most neurons in F4 show a somatocentric coding, most VIP neurons use a retinocentric coordinate system, and only about 15% of them are somatocentric (Xxxxxxx et al., 1997). This circuit has been proposed to be involved in reaching and avoidance actions related to stimuli presented in the monkey’s peripersonal space.
The AIP-F5 circuit is considered to be mainly involved in the visuomotor transformation of object sensory features, allowing the subject to plan and execute adequate prehension movements. In AIP, three types of neurons have been recorded: visuomotor neurons, firing during the executed grasping action as well as during the presentation of a potential target object (like the so-called canonical neurons found in area F5p); motor-dominant neurons; and visual-dominant neurons. The last type of neuron fires when the object is observed but not when the grasping movement is executed in the dark; such neurons have not been described so far in F5p but can be found in F5a (Theys et al., 2012; Theys et al., 2013). Furthermore, the response of AIP neurons differs when distinct objects that could be grasped in the same way are presented, suggesting a more visually based coding compared to that of F5 neurons (Xxxxxx et al., 2000; Xxxxxxxxxxxxx and Xxxxxxxxxxx, 2016).
A model proposed to explain these data suggests that visual-dominant neurons in AIP code all the possible affordances of an object and send this information to canonical F5 neurons which, via their direct and indirect afferences from the prefrontal cortex (Brodmann’s area 46), allow the selection of the most adequate motor plan to be turned into action in the current context and for the ongoing purposes, projecting back to AIP to inhibit the affordances not chosen (Xxxx and Arbib, 1998). Borra et al. (2008) demonstrated that area AIP has bidirectional connections with the inferotemporal cortex, which is involved in the pictorial description of an object, in particular its semantic categorization. The inactivation of area AIP has been demonstrated to cause erroneous opening of the hand when grasping an object (Xxxxxxx et al., 1994), similarly to the consequences of F5 inactivation (Fogassi et al., 2001), supporting its involvement in the coding of object affordances for grasping.
Although direct tracing studies cannot be performed in the human brain, indirect functional evidence for the existence and similarity of these circuits in humans has been obtained in several fMRI (Xxxxxxxxx et al., 1999; Culham et al., 2003; Xxxxxxx et al., 2001) and TMS (Xxxxx et al., 2005; Xxxxxx et al., 2006; Xxxx et al., 2006) experiments. Likewise, the study of the kinematics of movements performed by patients with lesions of the posterior parietal cortex, including or not including AIP, showed that this area plays a specific role in grasping actions in the human brain as well (Xxxxxxxxx et al., 1998).
1.3 A concrete application: Brain Machine Interfaces (BMI)
The understanding of the principles governing the functioning of motor networks is crucial for the development of techniques and devices that can help people affected by motor deficits to regain their autonomy in everyday life. Indeed, traumatic lesions of the central nervous system - especially stroke and spinal cord injury - as well as neurodegenerative disorders - such as multiple sclerosis - can leave patients with partial or almost total body paralysis (Xxxxxx et al., 2016), with a huge negative impact on their quality of life and self-dependence. In these situations, there are two possible alternative approaches: rehabilitation or replacement.
To restore motor functions after spinal cord injuries, a classical approach aims to reconstruct the connectivity and functionality of damaged nerve fibres (Al-Xxxxx et al., 2000; Xxxxx et al., 2001; Xxxxxx, 2002), but this is only possible when some limb mobility is preserved. An alternative pathway, which has become more and more promising with recent advances in computer technology and hi-tech engineering, consists in the development of brain-machine interfaces (BMIs). This approach was first proposed by Xxxxxxx in 1980 and assumes that direct interfaces between spared cortical or subcortical motor centres and artificial actuators could be employed to “bypass” spinal cord injuries so that paralyzed patients could enact their voluntary motor intentions (Xxxxxxx, 1980). A BMI is composed of three major components:
- A device that records neural activity;
- An effector which is controlled by the neural signal;
- An algorithm that analyses and interprets the neural signal as motor commands.
The type of effector can vary widely, ranging from a visual signal (such as a cursor on a screen) to a complex prosthetic limb. The neural inputs utilized by BMIs can also be very different and obtained with different levels of invasiveness, ranging from single-unit (SUA) and multi-unit (MUA) activity (Xxxxxxx et al., 2003;
Xxxxxxxx et al., 2004; Xxxxxxxx et al., 2006; Xxxxxxxx et al., 2008) to EEG signal (Xxxxxx and XxXxxxxxx, 2004; Xxxxxx et al., 2004; Xxxxxx et al., 2016).
On the one hand, invasive recordings have a high signal-to-noise ratio, but the invasiveness of the implant and the likely long-term rejection related to glial scarring of the brain tissue have yielded limited success outside the laboratory. Moreover, a high number of recording channels is required for the correct and long-term functioning of complex BMIs; indeed, Xxxxxxx and Xxxxxxxxx (2011) stated that the performance of BMI decoders increases linearly with the logarithm of the number of cortical neurons recorded simultaneously. On the other hand, non-invasive techniques provide a less informative signal; nevertheless, it can be sufficient to decode simple motor intentions. A brain-controlled wheelchair based on the EEG signal has been developed and proved to be functional and safe in an office environment (Xxxxxxxx et al., 2010); moreover, the relatively low cost and non-invasiveness of EEG make it well suited for commercial applications such as games and other recreational products controlled by thought commands (Xxxxxxxx, 2008).
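The scaling trend mentioned above (Xxxxxxx and Xxxxxxxxx, 2011) can be written compactly as follows; this is only a sketch of the stated relationship, where N is the number of simultaneously recorded neurons and a, b are generic fitted constants assumed here for illustration, not values taken from the cited work:

```latex
\text{decoder performance}(N) \;\approx\; a + b \,\log N
```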
Finally, a general-purpose BMI requires two types of control mechanisms: a continuous control - fundamental for motor behaviours such as writing, drawing, and reaching that require precise trajectory and path control - and a discrete control - essential for movement initiation and termination, typing, and discrete grasp and postural configurations. Xxxxxxxxxxx et al. (2004) demonstrated a double dissociation in which ensemble activity in M1 more accurately reconstructs continuous movement,
whereas dPMC ensemble activity can more effectively predict upcoming movements
to discrete targets. A similar dissociation was found by Xxxxxxxx et al. (2004) between vPMC and M1. These findings are highly consistent with the distinct functional properties of the neurons in these areas discussed in the previous sections.
1.4 Wireless Recordings as a means to improve ethological and ecological validity
The development of everyday BMIs requires a deep understanding of how the neural mechanisms underlying specific behaviours operate in ecological settings; indeed, paralyzed patients need to use these technologies in a wide range of natural contexts. Therefore, a fundamental distinction needs to be introduced: ecological and ethological validity are often used as synonyms, even in the literature, but they have slightly different meanings. They are both examples of external validity, which is the validity of applying the conclusions of a scientific study outside its particular context; this is often opposed to the concept of internal validity, which is the extent to which a piece of evidence supports an experimental hypothesis within the context of a particular research design. I propose to define ecological validity as a construct that depends mostly on the environmental setting, measuring how much the experimental setup resembles the natural environment of the animal model, whereas ethological validity relates to the species-specific behaviours tested in an experiment: the more similar they are to the natural responses the animal would give to an equivalent natural stimulus, the higher the ethological validity will be.
Classical experiments with non-human primates are typically conducted in physically restrained conditions, such as the animal sitting in a primate chair. This well-structured setting allows high internal validity, because it is possible to control spatial parameters like head position, gaze direction, and body and arm posture; indeed, these can represent confounding variables that influence the phenomena under investigation. This traditional approach has led to many valuable insights into the neural correlates of visually guided movements but, because of these physical restraints, the results have been mostly limited to hand or arm movements in the immediately reachable space. Tethered neural recordings in freely-moving animals are only possible with small animals - such as squirrel monkeys (Xxxxxx et al., 2004) or marmosets (Xxxxxxxxx et al., 2019; Xxxxxxx et al., 2017) - or in the absence of obstacles and with low channel counts - as reported by Hazama and Xxxxxx (2019) in Japanese macaques. These constraints long excluded numerous more complex and ethologically relevant behaviours from neuroscientific investigation, at least in larger species such as macaques. The recent development of wireless neural recording technologies, in combination with chronically implanted microelectrode arrays, made it possible to overcome these boundaries, and complex behaviours like locomotion and foraging have become objects of neuroscientific study. There are two ways to record neural activity wirelessly: online and offline. In the first case, neural data are transmitted in real time and instantaneously synchronized with the behavioural data, whereas in the second case neural activity is stored on the headstages and then paired with the other synchronous data sources offline, at the end of the recording session.
In this highly novel field, Xxxxxx et al. (2020) developed a setup to investigate the planning and execution of spatially and temporally structured goal-directed movements that required locomotion in a relatively large environment. In this experiment, monkeys had to perform controlled memory-guided reaching movements with instructed delays to targets within and beyond the immediately reachable space. Wirelessly recording single-unit activity in three cerebral areas, the authors demonstrated that neurons in the parietal reach region (PRR) and in the dorsal premotor cortex (dPMC), but not in M1, already code the target location of far-located walk-and-reach targets during the planning period, before and during the walk-and-reach movement.
Another ethologically relevant behaviour for macaques is foraging: monkeys spend a considerable amount of time searching for food, because the resources in their natural environment are usually sparse. Shahidi et al. (2019) aimed to better understand the neural basis of planning and decision-making strategies while monkeys performed a foraging task and neuronal activity in the dorsolateral prefrontal cortex (dlPFC) was wirelessly recorded. They discovered that monkeys make foraging decisions based on reward probabilities inferred indirectly from the hidden rules of the task: this information is encoded in dlPFC neurons and can predict the animal’s future actions. The authors claimed that previous studies had underestimated the cognitive capacity of monkeys during foraging, primarily because of their restrictive experimental paradigms, whereas their free-roaming setting enabled them to implement the switching cost between two reward options simply by letting the monkey walk between them, which is the most ecologically valid translation of the construct.
Even the investigation of spatial learning and memory, whose hippocampal correlates have been deeply explored in the rodent model (X'Xxxxx and Xxxxxxxxxx, 1971), has proven quite difficult in monkeys. Tethered recordings impose the use of two-dimensional computerized visual mazes, which lack ecological validity because the monkey has to sit in a primate chair and therefore cannot actively explore the environment as it could in more natural three-dimensional mazes; moreover, they require a long training time. On the other hand, the T-maze, V-maze and other non-regular mazes used with rats are far too simple for non-human primates. These are the reasons why the development of large-scale, three-dimensional maze models for non-human primates, combined with wireless neural recording techniques, could represent an important step towards the future of neuroscientific research (Xxxxx et al., 2008).
A further important new field regards the neural correlates of arousal states, which have mainly been investigated in the rodent model due to the technical limitations of recording from larger freely-moving animals for several hours. Xxxxxx et al. (2020) used a wireless recording system to examine the dynamics of population activity in the dlPFC of unrestrained monkeys, concentrating in particular on LFP and single-unit activity during active wakefulness, quiet wakefulness and rest. Classical studies with rats showed that, during sleep and rest, cortical populations are intrinsically synchronized in the low-frequency range, whereas during wakefulness they are actively desynchronized by cholinergic inputs received from subcortical areas. The authors’ results confirm this hypothesis, indicating that the main features of cortical state have remained evolutionarily unchanged across species, which supports the idea that they must convey substantial functional advantages to the organism.
Wireless recording technologies have also been used with other experimental animals, such as rats and bats. Xxxx et al. (2018), using a spatial observational learning task, demonstrated the existence of a particular subclass of place cells in the dorsal CA1 of the bat hippocampus that code the 3D position of a conspecific in allocentric coordinates. Grieves et al. (2020) focused on rats’ place cells and showed that, although rodents primarily move on 2D surfaces, they still create a volumetric map of space, with oval receptive fields that are more elongated (so that the coding is less precise) in the directions that are less easily traversable for the animal - normally the vertical dimension, due to gravity - although this map can change plastically in relation to the environmental affordances.
A few field-specific preliminary studies have been carried out to assess whether restrained conditions could alter some of the behavioural variables under investigation, and the answer seems to be positive. Xxxxxx et al. (2004) showed that cats’ accuracy in the localization of visual and auditory stimuli improved substantially when moving from the classical head-restrained condition to a more ethological setup in which they were free to move their head; therefore, the use of more ethologically relevant head-unrestrained gaze shifts would be superior to head-restrained eye saccades for investigating these mechanisms. Xxxxxxxx (2008) critically analysed the scientific literature regarding saccadic movements for visual orientation: when the head is not restrained, changes in the direction of the line of sight can involve not only an eye movement but also a simultaneous head movement, and therefore the rules that helped define head-restrained saccadic eye movements may have to be revised; to account for these new findings, modifications to the hypotheses developed in head-restrained subjects are fundamental. While it may seem fairly obvious that the possibility of moving one’s own head has an important impact on the movements of one’s own eyes, restrained body conditions can likewise alter other types of movement, such as reaching or grasping, as well as their underlying neural mechanisms.
Obviously, allowing animals to move freely in a complex environment drastically increases the number of variables that can influence neuronal firing. Therefore, it is extremely important to combine these wireless recording technologies with sophisticated video-recording systems that allow a detailed description of the animal’s ongoing body postures. Only in this way can these variables be taken into account and contribute to a more detailed comprehension of the relationships between neural activity and behaviour. Another important reason why wireless recording systems represent the future of neuroscientific research is related to animal welfare: being restrained in a primate chair is a stressful procedure for a monkey and usually requires several months of training; the possibility of studying animals’ free behaviour will lead to considerable time savings for researchers and to far better physical and psychological conditions for the animals.
2. Aims of the Study
Although in recent years wireless technology has become more sophisticated and affordable, most of the neuroscientific literature is still based on experiments carried out in constrained conditions where the animals have to perform well-structured tasks. This is especially true when the animal model is a primate, because of intrinsic spatial (e.g., the size of the animal) and ethological (e.g., the complexity of its natural behaviours) difficulties.
This study aims to investigate whether and to what extent the neural correlates of motor behaviours, studied with the classical head-fixed approach in the primate chair (CHR), are generalizable to a freely-moving condition, the NeuroEthoRoom (NER).
3. Materials and Methods
3.1 Ethical Statement
All experimental protocols complied with the European (Directive 2010/63/EU) and national (D.lgs 26/2014) laws on the protection of animals used for scientific purposes, they were approved by the Veterinarian Animal Care and Use Committee of the University of Parma (Prot. 52/OPBA/2018) and authorized by the Italian Ministry of Health (Aut. Min. 802/2018- PR).
3.2 Subjects
The experiments were carried out on two adult male Macaca mulatta - Mk1 and Mk2 (8 and 10 years old). Before starting with the recording sessions, they were trained with
positive reinforcement to spontaneously sit in a primate chair and to be brought from their home cage to the laboratory. When the monkeys were confident with these phases, they were habituated to perform the experimental tasks in two different conditions - the primate chair (CHR) and the NeuroEthoRoom (NER), as explained below. At this point, they underwent surgical procedures for the implantation of the head fixation system and, subsequently, of the microelectrode arrays.
3.3 Surgical Procedures
For each monkey, a first surgical operation under deep anaesthesia, followed by post-surgical pain medication, was carried out to implant the headpost. The headpost is a titanium cylinder with four feet shaped according to the 3D reconstruction of the cranial bone curvature based on a previously acquired MRI scan of the monkey’s head. Monkeys were prepared for anaesthesia with atropine administration (0.03 mg/kg) 15 minutes prior to induction. Next, anaesthesia was induced with ketamine (Lobotor, 4.5 mg/kg) and medetomidine hydrochloride (Domitor, 0.05 mg/kg), and maintained via inhaled isoflurane (IsoFlo, 100% p/p). Then, the monkey’s head was shaved and the skin and muscles were cut. Once the headpost was positioned and the screws fixed along its feet, muscles and skin were separately sutured so that only the cylinder required to fix the monkey’s head during CHR sessions protruded. Finally, the monkey was awakened by administering atipamezole hydrochloride (Antisedan, 0.05 mg/kg), a synthetic α2-adrenoreceptor antagonist.
A second operation was performed to implant four floating 32-channel microelectrode arrays (FMAs) in Mk1 and six in Mk2 (Figure 5). Surgical and anaesthetic procedures were the same as those described above, but in this case a craniotomy was performed, the portion of the brain chosen according to the magnetic resonance images was exposed, and the microelectrode arrays were positioned and slowly lowered into the cortical tissue. The dura mater was sutured and the bone flap repositioned and fixed to the skull with dental cement and micro bone screws. The chamber was fixed to the skull with bone screws and dental cement and the Omnetics connectors were positioned in their recess on top of it, before sealing it with the protective cap. The muscles and skin were sutured and the monkey was awakened after the appropriate pharmacological treatment. The monkey was allowed to fully recover for three weeks before the start of the neural recording sessions.
Figure 5. Floating microelectrode arrays (FMAs) implanted in the macaques. Schematic representation of a 36-channel FMA (A); placement of the microelectrode arrays in Mk1 (B) and in Mk2 (C). Anatomical landmarks: CS - central sulcus; AS - arcuate sulcus; PS - principal sulcus.
3.4 Behavioural Paradigms and Setup
Each experimental session comprised two steps: the primate chair (CHR) condition and the NeuroEthoRoom (NER) condition. In the CHR condition the monkey sat head-fixed in the primate chair and had to perform various active and passive tasks: tactile stimulation, visual stimulation, hand and mouth motor tasks and an action observation task. Tactile stimulation was performed using a plastic stick with a small reflective marker on its end. Visual stimulation was performed using a long stick with a large ball or a square at its extremity, and the stimulation was carried out both in the contralateral and in the ipsilateral hemifield by moving the stimuli in different tangential directions and toward/away from the monkey’s body. Motor tasks (at least 10 trials per condition) included grasping of food morsels presented in front of the animal: a piece of fruit was hidden by a sheet of paper that, once removed, revealed the target of the forthcoming reach-to-grasp movement, either in the peripersonal space of the animal - near condition - or in its extrapersonal space - far condition (in the latter, motor preparation was longer). Additional motor tasks included sucking fruit juice from a syringe held by the experimenter, biting and chewing pieces of food received directly into the mouth, and grasping metallic rings and a rope with a finger prehension (with the wrist rotated by 0° and 90°, respectively) to get a liquid reward. In the action observation task, the monkey had to refrain from moving while watching the experimenter grasp a piece of fruit and eat it, in order to get a liquid reward directly into its mouth from a syringe. The whole testing in the CHR condition lasted around one hour.
After this phase, the monkey was released into the NER. The NER is a custom-made transparent Plexiglass box (W x H x D: 208 x 205 x 181 cm) that can be equipped with several enrichment items prior to the beginning of the freely-moving session (Figure 6): a wooden structure to climb and sit on, a hanging rope and four footholds by which the monkey could reach the upper level of the cage, several holes where fruit pieces were placed, and two hanging hooks attached to a nylon thread that allowed the experimenter to regulate their height and to replace the fruit pieces after the monkey ate them. This part of the session lasted around 30 minutes and allowed the study of a wide range of the monkeys’ natural behaviours - like walking, searching for food, jumping and climbing - in an ethologically relevant context. Moreover, through the holes in the cage, the experimenter could give the monkey liquid and solid food directly into the mouth, making these behaviours maximally comparable with those investigated in the CHR condition.
Figure 6. A 3D reconstruction (A) and an image (B) of the NER, highlighting the two main doors (blue), the two monkey doors (red) and some of the holes (red arrows) through which the experimenter could give the monkey solid and liquid rewards.
3.5 Video Acquisition
Monkey behaviour was recorded through a system of eight high-resolution cameras placed around the cage (four in the upper corners and four at middle height). We used Dual Gigabit Ethernet machine vision cameras (mvBlueCOUGAR-XD, Matrix Vision) with a resolution of 1936×1214 at up to 164 frames per second, set to 50 Hz. The cameras were equipped with a global shutter with a 1/2” format sensor (5.86 μm pixel), manual C-mount lenses with 5 mm focal length (CCTV Lens, Kowa Optical Products Co., Ltd) and LED ring lights. Each camera had two RJ-45 Gigabit Ethernet connectors with screw-locking and two industry-standard 12-pin locking connectors to provide transmission of images and signals to the computer, and to synchronize all cameras through a synchronization box connected to both the cameras and the computer. A dedicated, commercially available software package for 3D motion data acquisition and analysis - SIMI Motion - was used to load, visualize and preliminarily extract the 3D positions of the retroreflective markers. In the CHR condition, markers were placed at the end of the sticks used for tactile and visual stimulation, whereas in the NER condition they were attached to a custom-made structure secured to the monkey’s headpost (Figure 7).
Figure 7. Image of the custom-made structure with the four retroreflective markers which is attached to the monkey’s head-post in the NER condition in order to extract the 3D position of the animal.
3.6 Behavioural Scoring
Video acquisitions were adjusted in terms of brightness, rotation and zooming and then used to perform offline behavioural scoring by means of the Behavioural Observation Research Interactive Software (BORIS), a free and open-source event-logging software (Xxxxxx and Gamba, 2016). We used a purposefully defined ethogram that includes the behaviours of interest that could be observed in the CHR condition and/or in the NER condition (Table 2). The monkey’s behaviours were given accurate operational descriptions and divided into state events or point events depending on their duration or temporal unfolding. Examples of behaviours we considered as state events are tactile and visual stimulation in the CHR condition or rest and walk in the NER condition, whereas behaviours we considered as point events were temporally more immediate actions, like the contact between the hand or the mouth and a piece of food or an object in order to grasp it, eat it or drink. We also distinguished actions performed with the contralateral and ipsilateral hand in order to differentiate neurons with unilateral and bilateral receptive fields. Once the ethogram was defined, a BORIS project was created and the videos were analysed, using the frame-by-frame mode to achieve a precision of 20 ms, by at least two independent observers. Inter-rater reliability was then calculated using Xxxxx’x kappa statistic with a time unit of 1 s and was considered acceptable when above 0.75 (Xxxxx, 1960). Finally, we generated an output for each observation containing all the behaviours with their start and stop times - for state events - or with the exact time at which they occurred - for point events - which was then used for further Matlab data analysis.
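As an illustration of this reliability check, the following is a minimal Python sketch (not the BORIS/MATLAB workflow actually used; the event format and the function name are assumptions): each observer's scoring is converted into a sequence of 1-s bin labels and a kappa statistic is computed on the two sequences.

```python
# Minimal sketch (assumed helper, not the thesis' actual code): time-unit kappa
# between two observers' scorings, computed over 1-s bins.
import numpy as np

def kappa_time_unit(events_a, events_b, session_len_s, bin_s=1.0):
    """events_*: list of (label, start_s, stop_s) tuples for one observer."""
    n_bins = int(np.ceil(session_len_s / bin_s))

    def to_labels(events):
        labels = np.array(["none"] * n_bins, dtype=object)
        for label, start, stop in events:
            labels[int(start // bin_s):int(np.ceil(stop / bin_s))] = label
        return labels

    a, b = to_labels(events_a), to_labels(events_b)
    p_obs = np.mean(a == b)                              # observed agreement
    cats = np.unique(np.concatenate([a, b]))
    p_chance = sum(np.mean(a == c) * np.mean(b == c) for c in cats)
    return (p_obs - p_chance) / (1.0 - p_chance)

# Example: two observers scoring a 10-s clip; agreement accepted if kappa > 0.75
obs1 = [("Walk", 0.0, 4.0), ("Rest", 4.0, 10.0)]
obs2 = [("Walk", 0.0, 5.0), ("Rest", 5.0, 10.0)]
print(kappa_time_unit(obs1, obs2, session_len_s=10.0) > 0.75)  # True (kappa = 0.8)
```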
3.7 Neural Recordings
Neural recordings were performed using 32-channel Floating Microelectrode Arrays (FMAs), with electrodes of alternating 4 and 2.5 mm length, implanted in the premotor cortex (PMC) of the left hemisphere, between the arcuate sulcus and the central sulcus (Figure 5). Each FMA was connected through an Omnetics connector to the recording system, a 128-channel wireless neural data logger (xxxxx://xxxxxxxxxxxx.xxx/) (Figure 8).
Figure 8. Deuteron wireless neural data logger connected to its battery compared to the size of a 2€ coin
The logger communicated via a radio signal with the transceiver, updating its internal clock and allowing synchronization with the videos through a unique 50 Hz signal generated by a LabView-based software and transmitted via a BNC cable. The signal was band-pass filtered in the range 2 to 7000 Hz and recorded at a conversion rate of 00000 Hz on each channel, thereby enabling the collection of both local field potentials (LFP) and single- and multi-unit signals. Neural signals were amplified, digitized and stored on a MicroSD memory card (64 GB), thus preventing any possible transmission error. The device was powered by a small external battery connected with a short cable. Once the logger was linked to the electrode arrays inside the chamber, all the components were sealed within a cover screwed on top of the chamber. In addition, the logger had a magnetic on/off switch, so that it could be switched on and off even when the device was sealed inside the protective chamber, with no need to physically touch the animal or remove any component. All formal signal analyses were performed offline. Single-neuron activity was extracted by means of a fully automated software package (MountainSort, Xxxxx et al., 2017), using -3.0 standard deviations of each channel's signal-to-noise ratio as the threshold for detecting units. Units were classified as single or multi-units using the noise overlap, a parameter that can vary between 0 and 1, with units below 0.1 considered single units. Single-unit isolation was further verified using standard criteria (ISI distribution, refractory period > 1 ms, and absence of cross-correlated firing with a time lag of ≈ 0 relative to other isolated units on the same channel, to avoid oversampling). Possible artefacts were removed and all the remaining waveforms that could not be classified as single units formed the multi-unit activity.
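As a rough sketch of how such criteria separate single units from multi-units (Python, for illustration only; the actual pipeline was MountainSort, and the 1% ceiling on refractory-period violations below is an assumption rather than a parameter reported here):

```python
# Illustrative sketch: classify a sorted unit as single- or multi-unit using the
# noise-overlap criterion (< 0.1) plus a refractory-period check on the ISIs.
import numpy as np

def classify_unit(spike_times_s, noise_overlap,
                  refractory_s=0.001, max_violation_frac=0.01):
    """spike_times_s: spike times of one sorted unit, in seconds."""
    isi = np.diff(np.sort(spike_times_s))
    violation_frac = np.mean(isi < refractory_s) if isi.size else 0.0
    if noise_overlap < 0.1 and violation_frac <= max_violation_frac:
        return "single-unit"
    return "multi-unit"

# Example: a well-isolated unit firing at about 10 Hz, with no ISI below 2 ms
rng = np.random.default_rng(0)
spikes = np.cumsum(rng.exponential(0.1, size=1000) + 0.002)
print(classify_unit(spikes, noise_overlap=0.05))   # -> "single-unit"
```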
3.8 Data Analysis
Once single-unit and multi-unit activity had been extracted and the behavioural scoring completed, we used these data to create four matrices in MATLAB for each experimental session. First, we compared various single-unit features between the two conditions: we analysed three standard measures - the burst index, the coefficient of variation of the ISI (interspike interval) and the position of the ISI maximum (Xxxxxxxxxxxxxx and Goldman-Rakic, 2002). In addition, we also compared the mean firing rate, the variability of the firing rate and the peak of the firing rate. These three measures were computed in the same way as in the subsequent analyses: for every unit, the whole session was binned in 200 ms windows with a step of 20 ms (sliding-window procedure) and the mean firing rate for each window was calculated. The mean firing rate of every neuron was the mean of the per-window firing rates; the variability of the firing rate was operationalized as the standard deviation of the per-window firing rates; and the peak of the firing rate was the maximum firing rate observed in a 200 ms window.
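These three sliding-window measures can be illustrated with the following sketch (Python, purely illustrative: the thesis analyses were implemented in MATLAB, and the function names and exact windowing details here are assumptions):

```python
# Sketch of the sliding-window firing-rate measures: 200 ms windows, 20 ms step.
import numpy as np

def sliding_rates(spike_times_s, session_len_s, win_s=0.2, step_s=0.02):
    starts = np.arange(0.0, session_len_s - win_s + 1e-9, step_s)
    counts = np.array([np.sum((spike_times_s >= t) & (spike_times_s < t + win_s))
                       for t in starts])
    return counts / win_s                       # firing rate (Hz) per window

def firing_rate_summary(spike_times_s, session_len_s):
    rates = sliding_rates(spike_times_s, session_len_s)
    return {"mean_rate": rates.mean(),          # mean firing rate
            "rate_sd": rates.std(ddof=1),       # variability of the firing rate
            "peak_rate": rates.max()}           # peak of the firing rate
```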
Second, for the assessment of the functional properties of the recorded units, only behaviours that occurred at least seven times were taken into account in the analysis. In order to determine whether a single or multi-unit was positively modulated by a behaviour, a 2 s epoch around that behaviour was binned in 200 ms windows with a step of 20 ms (sliding-window procedure) and we performed a one-sample right-tailed t-test between the mean firing rate within each window and the mean firing rate of that unit during the whole task. Since distinct behaviours had different numbers of trials - ranging from 7 to 106 - we randomly extracted 7 trials from that behaviour's pool and repeated this extraction 100 times (bootstrap method). We took the median of the 100 p-values multiplied by 2 (Xxxxxxxxxxxx and Xxxxxxxx, 2002; Xxxx and Xxxx, 2020) as an estimate of the corrected p-value for each window. If at least 5 consecutive windows had a corrected p-value lower than 0.05 (significance window), the single or multi-unit was considered modulated by that behaviour. In order to avoid erroneous attributions of a significance window, we removed the attribution to a behaviour if, in at least half of the trials, another behaviour was closer to the centre of the significance window than the behaviour currently analysed.
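A compact sketch of this modulation test is given below (Python, illustrative only; the original analysis was run in MATLAB, and details such as resampling without replacement and the input format are assumptions):

```python
# Sketch: bootstrap test of whether a unit is positively modulated by a behaviour.
import numpy as np
from scipy import stats

def is_modulated(trial_rates, baseline_rate, n_boot=100, n_trials=7,
                 alpha=0.05, min_consecutive=5, seed=0):
    """trial_rates: (n_occurrences, n_windows) per-window firing rates for the
    2-s epochs around one behaviour; baseline_rate: mean rate over the task."""
    rng = np.random.default_rng(seed)
    n_occ, n_win = trial_rates.shape
    pvals = np.empty((n_boot, n_win))
    for b in range(n_boot):
        sample = trial_rates[rng.choice(n_occ, size=n_trials, replace=False)]
        # right-tailed one-sample t-test against the session-wide mean rate
        _, p = stats.ttest_1samp(sample, baseline_rate, axis=0,
                                 alternative="greater")
        pvals[b] = p
    corrected = 2.0 * np.median(pvals, axis=0)   # 2 x median of the 100 p-values
    # at least 5 consecutive windows with corrected p < 0.05 -> modulated
    run = 0
    for sig in corrected < alpha:
        run = run + 1 if sig else 0
        if run >= min_consecutive:
            return True
    return False
```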
Table 1. Temporal structure of the tasks in the CHR condition.
- Stimulation
  - Tactile: Face; Upper Body
  - Visual
    - Ball: Contralateral; Ipsilateral
    - Square: Contralateral; Ipsilateral
- Motor
  - Grasp Food
    - Far: Contralateral; Ipsilateral
    - Near: Contralateral; Ipsilateral
  - Liquid Reward
  - Solid Reward
  - Finger Prehension 0°: Contralateral; Ipsilateral
  - Finger Prehension 90°: Contralateral; Ipsilateral
- Action Observation
Table 2. The ethogram used to perform the behavioural scoring. Behaviours only observable during CHR condition are represented in red, behaviours only observable during NER condition in blue and behaviours observable in both conditions in yellow.
Behaviour | Type of Event | Operational Description |
Grasp Near R | Point Event | Monkey grasps a food piece with right hand when the food is presented in front of him - Start when hand touches food |
Grasp Near L | Point Event | Monkey grasps a food piece with left hand when the food is presented in front of him - Start when hand touches food |
Grasp Far R | Point Event | Monkey grasps a food piece with right hand when the food is presented far from him and then brought near - Start when hand touches food |
Grasp Far L | Point Event | Monkey grasps a food piece with left hand when the food is presented far from him and then brought near - Start when hand touches food |
Grasp Food R | Point Event | Monkey grasps a food piece with right hand - Start when hand touches food |
Grasp Food L | Point Event | Monkey grasps a food piece with left hand - Start when hand touches food |
Grasp Solid Reward R | Point Event | Monkey passively receives a solid reward (fruit pieces, raisins or other) from the experimenter and grasps it with the right hand - Start when hand touches the food
Grasp Solid Reward L | Point Event | Monkey passively receives a solid reward (fruit pieces, raisins or other) from the experimenter and grasps it with the left hand - Start when hand touches the food
Failed Grasp | Point Event | Monkey tries to grasp a food piece with left or right hand but fails - Start when hand touches food |
Finger Prehension 0° R | Point Event | Monkey grasps carabiner with right hand - Start when hand closes around the carabiner
Finger Prehension 0° L | Point Event | Monkey grasps carabiner with left hand - Start when hand closes around the carabiner
Finger Prehension 90° R | Point Event | Monkey grasps rope with right hand - Start when hand closes around the rope
Finger Prehension 90° L | Point Event | Monkey grasps rope with left hand - Start when hand closes around the rope
Grasp Nylon Thread R | Point Event | Monkey grasps nylon thread with right hand - Start when hand touches the nylon thread |
Grasp Nylon Thread L | Point Event | Monkey grasps nylon thread with left hand - Start when hand touches the nylon thread |
Grasp Rope R | Point Event | Monkey grasps rope with right hand - Start when hand touches the rope |
Grasp Rope L | Point Event | Monkey grasps rope with left hand - Start when hand touches the rope |
Grasp for Climbing R | Point Event | Monkey grasps foothold or structure (but not rope) with right hand for climbing - Start when hand touches the object |
Grasp for Climbing L | Point Event | Monkey grasps foothold or structure (but not rope) with left hand for climbing - Start when hand touches the object |
Grasp for Grooming R | Point Event | Monkey grasps something in his fur with right hand - Start when the precision grip closes
Grasp for Grooming L | Point Event | Monkey grasps something in his fur with left hand - Start when the precision grip closes
Active Food to the Mouth R | Point Event | Monkey actively places a food piece into the mouth with right hand - Start when right hand reaches the mouth |
Active Food to the Mouth L | Point Event | Monkey actively places a food piece into the mouth with left hand - Start when left hand reaches the mouth |
Solid Reward | Point Event | Monkey passively receives a solid reward (fruit pieces, raisins or other) directly in the mouth - Start when the food touches the mouth - Only if the experimenter gives it
Liquid Reward | Point Event | Monkey passively receives a liquid reward directly in the mouth from a syringe - Start when the mouth touches the syringe
Grasp Food with Mouth | Point Event | Monkey eats food directly with its mouth (he doesn't pick it up with hands) - Start when mouth and food get in contact |
Grasp Experimenter | Point Event | Experimenter grasps a food piece with left or right hand - Start when fingers stop closing |
Active Food to the Mouth Experimenter | Point Event | Experimenter actively places a food piece into the mouth with left or right hand - Start when the distance between the mouth and the food is at its minimum (if not really eaten) or when hand reaches the mouth |
Food Presentation | Point Event | Monkey sees the food piece to grasp, either near or far - Start when panel is removed to show food |
Tactile Stimulation | State Event | Monkey is being haptically stimulated through the stick - Start when the stick touches the monkey, stop when the stick stops touching the monkey |
Visual Stimulation | State Event | Monkey is being visually stimulated through the long stick - Start when the marker on the stick is aligned with the cage door entering it, stop when the marker on the stick is aligned with the cage door exiting it |
Step Hand R | Point Event | Monkey takes a step with right hand - Start when hand touches the floor/structure |
Step Hand L | Point Event | Monkey takes a step with left hand - Start when hand touches the floor/structure |
Power Step R | Point Event | Monkey grasps the wooden structure for walking - Start when right hand touches the structure |
Power Step L | Point Event | Monkey grasps the wooden structure for walking - Start when left hand touches the structure |
Scratching | Point Event | Monkey is scratching - Start when the hand touches the body the first time |
Yawn | Point Event | Monkey is yawning - Start when the mouth starts opening |
Threat | Point Event | Monkey is doing a facial expression to threat experimenter - Start when monkey starts moving the mouth |
Autogrooming | State Event | Monkey does autogrooming - Start when hand touches the body, stop when hand stops touching the body |
Walk | State Event | Monkey moves from one location to another (not climbing) - Context independent - Start when first limb touches the floor, stop when last limb touches the floor – Minimum: two steps with each hand |
Rest | State Event | Monkey stands still: in this moment monkey isn't walking - Start when the rear-end touches the ground (if he sits) or when last limb touches the ground (if he stands, the frame after the end of walk), stop when first limb is moved for walking – Minimum: 2 s |
4. Results
We isolated 98 single units (n=60 in Mk1; n=38 in Mk2) with highly restrictive criteria (see Methods) during two recording sessions, one for each monkey. We recorded from 128 electrodes of the chronic arrays implanted in Mk1 (Figure 9A) and from 128 electrodes of arrays C, D, E and F implanted in Mk2 (Figure 9B). Neuronal activity was recorded in a series of naturalistic conditions performed with the monkey’s head fixed in the primate-chair (CHR condition), and next while the monkey freely moved in the NER (NER condition).
Figure 9. Schematic representation of the microelectrode arrays’ insertion sites in the left premotor
cortices of Mk1 (A) and Mk2 (B)
We scored the behavioural events as described in the previous section (see Methods). Figure 10 shows an example of the distribution of the behavioural occurrences along the session timelines: in the NER condition each category of behavioural events is evenly distributed across the session, whereas in the CHR condition the regular blocks of the task group specific events into definite periods of the testing session.
Figure 10. Example of the timeline of the occurrences of all the behavioural events during the CHR condition (A) and during the NER condition (B) for Mk2.
4.1 Firing rate properties in the two conditions
It is important to note that the neural signal recorded during the CHR and NER conditions has been sorted after merging the two datasets, in order to ensure that the same parameters and processes of spike detection and classification were applied. However, to be able to compare possible differences in terms of neuronal functional properties it is of critical importance to be sure that the same individual neuron was steadily recorded across the two conditions.
To this purpose, we compared various firing properties of well-isolated single neurons between the two conditions. Specifically, for each of the 98 recorded neurons, we assessed 1) the mean firing rate, 2) the variability of the firing rate and 3) the peak of the firing rate (see Methods). The mean firing rate in the CHR condition was 6.45 ±
8.87 Hz whereas the mean firing rate in the NER condition was 6.35 ± 9.12 Hz. A
paired-sample two-tail t-test revealed that the difference between the mean firing rates in the two conditions was not significant (t=0.40; p=0.69), whereas the correlation between the two distributions was high and significant - r=0.96, p<0.001 (Figure 11A). The standard deviation in the CHR condition was 6.38 ± 4.94 Hz whereas in the NER condition it was 6.60 ± 5.22 Hz. Also in this case, a paired-sample two-tail t-test revealed that the difference was not significant (t=-1.43; p=0.16) and the correlation between the two distributions was high and highly significant - r=0.95, p<0.001 (Figure 11B). Finally, the peak of the firing rate in the CHR condition was 69.69 ± 43.78 Hz whereas in the NER condition it was significantly higher: 77.19 ± 47.31 Hz (t=-3.43; p<0.001), although the correlation between the two distributions was highly significant - r=0.89, p<0.001 (Figure 11C). These findings indicate that individual neurons’ firing rates remain overall remarkably stable between the NER and CHR conditions; the slightly greater peak firing rate in the NER may be accounted for by the greater variety of behaviours, which increases the probability of identifying the optimal condition for a neuron’s discharge.
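As an illustration of the statistical comparison described above, the following is a minimal sketch (assuming one value per neuron and per condition, e.g. the mean firing rate in Hz; it is not the analysis code used in the study) of the paired two-tailed t-test and Pearson correlation between the CHR and NER values.

```python
# Minimal sketch of the paired comparison of a firing-rate property between conditions,
# assuming one value per neuron in each condition.
import numpy as np
from scipy import stats

def compare_conditions(chr_values, ner_values):
    """Paired two-tailed t-test and Pearson correlation between the two conditions."""
    chr_values = np.asarray(chr_values, dtype=float)
    ner_values = np.asarray(ner_values, dtype=float)
    t, p_t = stats.ttest_rel(chr_values, ner_values)   # paired-sample t-test
    r, p_r = stats.pearsonr(chr_values, ner_values)    # correlation across neurons
    return {"mean_chr": chr_values.mean(), "sd_chr": chr_values.std(ddof=1),
            "mean_ner": ner_values.mean(), "sd_ner": ner_values.std(ddof=1),
            "t": t, "p_t": p_t, "r": r, "p_r": p_r}

# Illustrative example with simulated values for 98 neurons.
rng = np.random.default_rng(0)
chr_fr = rng.gamma(shape=1.0, scale=6.5, size=98)
ner_fr = chr_fr + rng.normal(0.0, 1.5, size=98)
print(compare_conditions(chr_fr, ner_fr))
```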
In addition, we also analysed three neural firing features: the burst index, the XXX’x coefficient of variation and the position of XXX’x maximum (Xxxxxxxxxxxxxx and Xxxxxxx-Xxxxx, 2002). The mean burst index was 0.10 ± 0.21 in the CHR condition and 0.10 ± 0.16 in the NER condition. A paired-sample two-tail t-test revealed that the difference in the burst index was not significant (t=-0.53; p=0.60) and the correlation between the two distributions was highly significant - r=0.86, p<0.001 (Figure 11D).
The mean XXX’x coefficient of variation was smaller in the CHR condition (1.09 ± 0.30)
than in the NER condition (1.15 ± 0.29) - paired-sample two-tail t-test: t=-3.70; p<0.001, but the correlation between the values in the NER and CHR was high - r=0.87, p<0.001 (Figure 11E). The mean position of XXX’x maximum was 46.58 ± 55.99 in the CHR condition and 43.24 ± 54.58 in the NER condition. A paired-sample two-tail t-test revealed that the difference in the means of the positions of XXX’x maximum was not significant (t=0.76; p=0.45) and the values of the two conditions were significantly correlated - r=0.69, p<0.001 (Figure 11F).
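For completeness, a possible computation of such firing features from a unit’s spike train is sketched below, assuming that the anonymized metrics correspond to interspike-interval (ISI) statistics and adopting one common, simplified definition of the burst index (the fraction of ISIs below a fixed threshold); the exact definitions used in the study may differ.

```python
# Minimal sketch (assumed, simplified definitions) of ISI-based firing features for one unit:
# coefficient of variation of the interspike intervals, position (in ms) of the ISI-histogram
# maximum, and a simple burst index defined here as the fraction of ISIs shorter than 10 ms.
import numpy as np

def isi_features(spike_times_s, bin_ms=1.0, max_ms=200.0, burst_thr_ms=10.0):
    isi_ms = np.diff(np.sort(spike_times_s)) * 1000.0            # interspike intervals in ms
    cv = isi_ms.std(ddof=1) / isi_ms.mean()                      # coefficient of variation
    counts, edges = np.histogram(isi_ms, bins=np.arange(0.0, max_ms + bin_ms, bin_ms))
    isi_peak_ms = edges[np.argmax(counts)]                       # position of the ISI maximum
    burst_index = np.mean(isi_ms < burst_thr_ms)                 # fraction of "burst-like" ISIs
    return cv, isi_peak_ms, burst_index

# Toy spike train (~6-7 Hz), for illustration only.
spikes = np.cumsum(np.random.default_rng(1).exponential(0.15, size=500))
print(isi_features(spikes))
```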
Overall, the remarkable stability of the firing rate properties of the single units across the two conditions (NER and CHR) supports the assumption that we were indeed able to stably monitor the activity of the same single neurons, and hence to compare the functional properties of each individual neuron between the two conditions.
Figure 11. Relationship between the firing properties of the single units in the CHR and in the NER condition: the mean firing rate (A), the variability of the firing rate (B), the peak of the firing rate (C), the burst index (D), the XXX’x coefficient of variation (E), and the position of XXX’x maximum (F). In red, the regression line is shown.
4.2 Single Units Analysis
In the CHR condition we found 57 out of 98 (58%) neurons (31/60 - 52% - in Mk1 and 26/38 - 68% - in Mk2) with a significant modulation of their discharge during at least one behaviour, whereas the remaining fraction did not appear to be responsive to any of the scored behaviours (Figure 12A). In the NER condition the number of neurons modulated for at least one behaviour dropped to 34 out of 98 (35%), in particular 17/60 (28%) in Mk1 and 17/38 (45%) in Mk2 (Figure 12B).
Figure 12. Percentages of neurons modulated by at least one behaviour in the CHR (A) and in the NER (B) condition.
Figure 13 shows an example of a “mouth-related” neuron recorded in Mk1: in the CHR condition, this neuron is modulated when the monkey brings food morsels to the mouth, regardless of the hand used, and during mouth behaviours, such as passively receiving pieces of fruit and fruit juice directly into the mouth and the subsequent motor behaviours of chewing and swallowing; in the NER condition, this neuron tends to maintain the same general functional properties, showing modulation both for active and passive mouth movements, despite a noisier baseline and less clearly tuned responses.
Figure 14 shows an example of another “mouth” neuron recorded in Mk2 that, instead, discharges specifically when the monkey actively brings a piece of fruit to the mouth with the contralateral hand in the CHR condition, whereas no modulation can be found in the NER condition.
Figure 15 shows an example of another neuron from Mk2 which increases its firing rate when the monkey performs a finger prehension toward neutral objects with the contralateral hand in the CHR condition but does not maintain its functional properties in the NER condition, where no significant modulation emerges.
Lastly, Figure 16 shows an example of a neuron from Mk2 which is modulated by all the grasping actions performed with the contralateral hand (toward food or non-food items) in the CHR condition; it also shows a modulation for grasping actions in the NER condition but, in this case, it is modulated by grasping actions performed to climb up or down the wooden structure, regardless of the hand used, and it shows no modulation for grasping actions towards fruit pieces.
Based on these example neurons, it is clear that the correspondence between the response properties in the CHR and NER conditions can be very broad or entirely absent.
Figure 13. Modulations of Mk1’s single unit 96a in the CHR (A) and in the NER (B) conditions. This is an example of a broad “mouth” neuron which keeps its functional properties across conditions. Behaviours which significantly modulate the unit are highlighted in yellow.
Figure 14. Modulations of Mk2’s single unit 8a in the CHR (A) and in the NER (B) conditions. This is an example of a “mouth” neuron which shows a strict selectivity in the CHR condition but does not maintain its functional properties in the NER condition. Behaviours which significantly modulate the unit are highlighted in yellow.
Figure 15. Modulations of Mk2’s single unit 35a in the CHR (A) and in the NER (B) conditions. This is an example of a “hand” neuron which shows a relatively strict selectivity in the CHR condition but does not maintain its functional properties in the NER condition. Behaviours which significantly modulate the unit are highlighted in yellow.
Figure 16. Modulations of Mk2’s single unit 86a in the CHR (A) and in the NER (B) conditions. This is an example of a “hand” neuron which increases its firing rate in the CHR condition for all the grasping actions performed with the contralateral hand. It also shows a modulation for grasping actions in the NER condition, but the correspondence is only broadly congruent. Behaviours which significantly modulate the unit are highlighted in yellow.
In Figure 17 we report the number of neurons showing an increase in their firing rate for each behaviour tested in the two conditions. For every grasping behaviour considered in the analysis, the number of neurons modulated in the CHR condition was markedly higher than the number modulated in the NER condition. Notably, in the CHR condition the number of neurons modulated by grasping actions performed with the contralateral hand was higher than that modulated by grasping actions with the ipsilateral hand (this is particularly true for Mk2), but this difference disappears in the NER condition.
A high number of neurons was also modulated by mouth behaviours (either passive or active) in the CHR condition, whereas in the NER condition this number dropped for active mouth behaviours but slightly increased for passive behaviours, in particular when the monkey was receiving fruit juice through a syringe. Passive delivery of “Solid reward” was not tested for Mk1 in the NER condition, making it impossible to compare the coding of this behaviour between the two contexts. Interestingly, most of the neurons modulated by mouth behaviours were found in Mk1, likely because of the slightly more ventral positioning of the microelectrode arrays.
Lastly, a very small number of neurons showed a significant response in the CHR condition when the monkey passively observed fruit morsels before grasping them and when it observed the experimenter grasping and eating fruit morsels in front of it, as well as to walking-related behaviours in the NER condition. We also found 5 neurons which appeared to be specifically modulated when the monkey yawned
(Figures 18 and 19); unfortunately, this behaviour was not observed in Mk2 with a sufficiently high frequency to be statistically analysed.
Figure 17. Number of neurons, for each monkey and altogether, modulated by each behaviour in the CHR and in the NER condition. Grasp food in the CHR condition was obtained grouping together grasp far and grasp near behaviours. Some behaviours have only been analysed for one monkey because of a lack of trials for the other: finger prehension 90° L and R in the CHR condition and solid reward and scratching in the NER condition were only analysed for Mk2 whereas grasp nylon L and R, grasp for grooming L and yawn in the NER condition were only analysed for Mk1.
Figure 18. Modulations of Mk1’s single unit 5a for mouth behaviours in the NER condition. This is an example of a neuron specifically modulated when the monkey yawns; it shows no further significant modulation for other behaviours either in the NER or in the CHR condition. Behaviours which significantly modulate the unit are highlighted in yellow.
Figure 19. Modulations of Mk1’s single unit 16b for mouth behaviours in the NER condition. This is an example of a neuron specifically modulated when the monkey yawns, which also increases its firing rate when the monkey receives fruit juice through a syringe; it shows no further significant modulation for other behaviours either in the NER or in the CHR condition. Behaviours which significantly modulate the unit are highlighted in yellow.
Next, we assessed the possible match between the neuronal properties in the two contexts at the single-neuron level, under the null hypothesis that, if testing neurons in head-restrained conditions captures the neuronal response and the firing rate/behaviour relationship in a reliable and ecologically relevant manner, then the relationship between a single neuron’s tuning and behaviour should remain the same when tested in the NER, which is the context closest to the one in which we ultimately need to clarify the brain-behaviour relationship. In total, we found 65 single units modulated for at least one scored behavioural event in the two sessions. The same neuron could be active exclusively in one of the two contexts, either the CHR (n=31) or the NER (n=8), whereas a set of neurons was active in both contexts (n=26; Figure 20).
Figure 20. Venn diagram showing the condition(s) associated with the modulation of the single units. In particular, in the first subject (Mk1), we found a total of 36 neurons responsive to at least one scored behaviour, of which 12 are significantly active in both conditions (33%), 19 are responsive only in the CHR condition (53%) and 5 only in the NER condition (14%). In the second subject (Mk2), we found a total of 29 responsive neurons, of which 14 are significantly activated in both conditions (48%), 12 are responsive only in the CHR condition (41%) and 3 only in the NER condition (10%).
Regarding the behaviours that could be compared between the CHR and NER conditions, a fundamental question is whether the recorded neurons maintained their tuning across conditions. To assess this point, we report the number of neurons that kept their selectivity across conditions compared to the number of neurons that showed a particular behavioural preference only in the CHR or only in the NER condition. We analysed these properties at two different levels: grouping together all mouth behaviours or keeping them distinct (Figure 21). Taken together, mouth behaviours modulated a high number of single units in the CHR condition; a good portion of these were modulated by similar behaviours also in the NER condition, whereas only a few neurons responded exclusively in the latter. Analysing the mouth behaviours individually, we observe a good overlap for Active Food to the Mouth R and Liquid Reward, but larger differences for Active Food to the Mouth L and Solid Reward (the latter tested only in Mk2).
Figure 21. Venn diagram representing the relationship between the neurons which respond to at least one mouth behaviour in both CHR and NER condition and the neurons which respond to mouth behaviours only in CHR or in NER condition (A). Venn diagrams for active food R (B), for active food L (C), for liquid reward (D) and for solid reward (E). In the last case, only Mk2’s neurons have been considered. Colour code as in Figure 20.
The same analysis was also conducted on grasping behaviours (Figure 22). In the CHR condition a high number of neurons increased their firing rate when the monkey was performing a grasping behaviour with the contralateral or ipsilateral hand, but less than half of these showed this modulation also in the NER condition. Analysing the hand behaviours individually, we observe a fair overlap when the monkey grasped a piece of fruit with the contralateral or ipsilateral hand or a neutral object with the contralateral hand, but a larger difference when the monkey grasped a neutral object with the ipsilateral hand.
Figure 22. Venn diagram representing the relationship between the neurons which respond to at least one grasping behaviour in both CHR and NER condition and the neurons which respond to grasping behaviours only in CHR or in NER conditions (A). Venn diagrams for grasp food R - which groups together grasp far and grasp near R in the CHR condition (B), for grasp food L - which groups together grasp far and grasp near L in the CHR condition (C), for grasp no food R - which groups together finger prehension 0° and 90° R in the CHR condition and grasp for climbing and grasp nylon thread R in the NER condition (D) and for grasp no food L - which groups together finger prehension 0° and 90° L in the CHR condition and grasp for climbing, grasp for grooming and grasp nylon thread L in the NER condition (E). Colour code as in Figure 20.
Finally, we aimed to explore the relationship between the neural tuning for behaviours within and across the two conditions, without any preliminary assumption of behavioural resemblance. For each pair of behaviours, a coefficient of similarity was calculated as the ratio between the number of neurons modulated by both behaviours and the number of neurons modulated by at least one of them; a comprehensive similarity matrix was then generated (Figure 23). Within the CHR condition, a high similarity can be observed for grasping behaviours performed with the same hand as well as for mouth behaviours. Within the NER condition the similarity matrix becomes somewhat noisier, due to the relatively small number of neurons modulated in this condition; a high index can be found for walking-related behaviours as well as for grasping food and actively bringing food to the mouth. Across conditions, liquid reward is the behaviour exhibiting the highest similarity, but this similarity is shared with all the other mouth behaviours in the CHR condition.
Figure 23. Similarity matrix showing the relationship between the neural tuning for behaviours within and across conditions. For each pair of behaviours, a coefficient of similarity is calculated as the ratio between the number of neurons modulated by both behaviours and the number of neurons modulated by at least one of them. CHR-CHR behaviour pairs are shown in the upper-left quadrant, CHR-NER pairs in the upper-right and lower-left quadrants, and NER-NER pairs in the lower-right quadrant.
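The similarity coefficient described above is equivalent to a Jaccard index computed over the sets of modulated units. A minimal sketch of its computation is shown below (not the analysis code used in the study); the behaviour labels and unit IDs are purely illustrative.

```python
# Minimal sketch of the similarity matrix: for each pair of behaviours,
# similarity = (# units modulated by both) / (# units modulated by at least one).
import numpy as np

def similarity_matrix(modulated_units):
    """modulated_units: dict mapping behaviour label -> set of modulated unit IDs."""
    labels = list(modulated_units)
    n = len(labels)
    sim = np.zeros((n, n))
    for i, a in enumerate(labels):
        for j, b in enumerate(labels):
            union = modulated_units[a] | modulated_units[b]
            inter = modulated_units[a] & modulated_units[b]
            sim[i, j] = len(inter) / len(union) if union else np.nan
    return labels, sim

# Toy example with hypothetical unit IDs.
labels, sim = similarity_matrix({
    "CHR: Grasp Food R": {1, 2, 3, 5},
    "NER: Grasp Food R": {2, 3, 8},
    "CHR: Liquid Reward": {4, 5, 9},
})
print(labels)
print(np.round(sim, 2))
```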
4.3 Multi Units Analysis
In the CHR condition we found 162 out of 256 (63%) multi-units (89/128 - 70% - in Mk1 and 73/128 - 57% - in Mk2) with a significant modulation of their discharge during at least one behaviour, whereas the remaining fraction did not appear to be responsive to any of the scored behaviours (Figure 24A). In the NER condition the number of units modulated for at least one behaviour was 166 out of 256 (65%) - 87/128 (68%) in Mk1 and 79/128 (62%) in Mk2 (Figure 24B).
Figure 24. Percentages of multi-units modulated by at least one behaviour in the CHR (A) and in the NER (B) condition.
Figure 25 shows an example of a unit recorded in Mk1 that broadly maintained its selectivity across conditions. In the CHR condition, it is clearly modulated when the monkey actively brings food morsels to the mouth, regardless of the hand used, and during passive mouth behaviours, such as receiving pieces of fruit and fruit juice directly into the mouth; in the NER condition it is still modulated by mouth behaviours but, because of a noisier baseline and less clearly tuned responses, the only activation that reaches the significance threshold is receiving fruit juice directly into the mouth.
Figure 25. Modulations of Mk1’s multi-unit 32a in the CHR (A) and in the NER (B) conditions. This is an example of a broad “mouth” unit in the CHR condition; in the NER condition it is still modulated when the monkey passively receives fruit juice through a syringe, but no significant modulation can be found when the monkey actively brings food to the mouth. Behaviours which significantly modulate the unit are highlighted in yellow.
In Figure 26 we report the number of multi-units showing an increase in their firing rate for each behaviour tested in the two conditions. Unlike the single-unit results, not all the grasping behaviours modulated more units in the CHR condition than in the NER condition: a distinction based on the target of the grasping action seems necessary to explain these data. In particular, grasping pieces of fruit modulated a much higher number of units in the CHR condition than in the NER condition, independently of the hand used. This is not true for grasping actions directed toward neutral objects, which showed a comparable number of tuned multi-units in the two conditions when performed with the contralateral hand and a much higher number of modulated units in the NER than in the CHR condition when performed with the ipsilateral hand.
A high number of units, especially for Mk1, was also modulated by mouth behaviours - either passive or active - in the CHR condition, whereas in the NER this number generally dropped, in particular for bringing food to the mouth with the ipsilateral hand. Passively receiving fruit juice through a syringe is an exception to this rule, because this behaviour modulates more units in the NER condition than in the CHR condition.
Similarly to the single-unit analysis, a very small number of units showed a significant response in the CHR condition when the monkey passively observed fruit morsels before grasping them and when it observed the experimenter grasping and eating fruit morsels in front of it, as well as to walking-related behaviours in the NER condition.
Figure 26. Number of multi-units, for each monkey and altogether, modulated by each behaviour in the CHR and in the NER conditions. Grasp food in the CHR condition was obtained grouping together grasp far and grasp near behaviours. Some behaviours have only been analysed for one monkey because of a lack of trials for the other: finger prehension 90° L and R in the CHR condition and solid reward and scratching in the NER condition were only analysed for Mk2 whereas grasp nylon L and R, grasp for grooming L and yawn in the NER condition were only analysed for Mk1.
We also found 202 multi-units modulated for at least one scored behavioural event in the two sessions. The same unit could be active exclusively in one of the two contexts, either the CHR (n=36) or the NER (n=40), but most of the units were active in both contexts (n=126; Figure 27).
Figure 27. Venn diagram showing the condition(s) associated with the modulation of the multi-units. In particular, in the first subject (Mk1), we found a total of 107 units responsive to at least one scored behaviour, of which 69 are significantly firing in both conditions (64%), 20 are responsive only in the CHR condition (19%) and 18 only in the NER condition (17%). In the second subject (Mk2), we found a total of 95 responsive units, of which 57 are significantly activated in both conditions (60%), 16 are responsive only in the CHR condition (17%) and 22 only in the NER condition (23%).
Regarding the behaviours that were available both in CHR and NER conditions, we compared the activations for mouth and hand behaviours in the two contexts, individually or grouping them together based on the effector used.
When the monkey was performing a mouth behaviour, almost half of the modulated units increased their firing rate in both conditions, but a considerable portion of them were modulated only in the CHR or only in the NER condition (Figure 28A). Analysing the mouth behaviours individually, we observe a good overlap only for liquid reward, whereas active mouth behaviours and solid reward (the latter tested only in Mk2) show larger differences between the two conditions (Figure 28B, C, D, E).
Figure 28. Venn diagrams representing the relationship between the multi-units which respond to at least one mouth behaviour in both CHR and NER conditions and the multi-units which respond to mouth behaviours only in CHR or in NER condition (A). Venn diagrams for active food R (B), for active food L (C), for liquid reward (D) and for solid reward (E). In the last case, only Mk2’s units have been considered. Colour code as in Figure 27.
Regarding hand behaviours, most of the modulated units increased their firing rate only in the CHR or only in the NER condition; however, a considerable portion maintained their modulation across conditions (Figure 29A). Analysing the hand behaviours individually, the overlap between the activations in the CHR and NER conditions was small for all the groups created; in particular, when the monkey used the ipsilateral hand to grasp a piece of fruit only a few units were modulated in the NER condition, whereas we observe an opposite pattern for neutral objects, where many more units became active in the NER condition than in the CHR context (Figure 29B, C, D, E).
Figure 29. Venn diagrams representing the relationship between the multi-units which respond to at least one grasping behaviour in both CHR and NER conditions and the multi-units which respond to grasping behaviours only in CHR or in NER condition (A). Venn diagrams for grasp food R - which groups together grasp far and grasp near R in the CHR condition (B), for grasp food L - which groups together grasp far and grasp near L in the CHR condition (C), for grasp no food R - which groups together finger prehension 0° and 90° R in the CHR condition and grasp for climbing and grasp nylon thread R in the NER condition (D) and for grasp no food L - which groups together finger prehension 0° and 90° L in the CHR condition and grasp for climbing, grasp for grooming and grasp nylon thread L in the NER condition (E). Colour code as in Figure 27.
In Figure 30 the similarity matrix of the multi-units’ tuning is shown. Within the CHR condition, a high similarity index can be observed for mouth behaviours as well as for grasping behaviours performed with the same hand. Within the NER condition, mouth behaviours appear largely separate from each other, whereas some degree of similarity can be found between some grasping behaviours and for walking-related behaviours. Across conditions, liquid reward is still the most steadily encoded behaviour, but this similarity is shared with all the mouth behaviours and, albeit with a lower index, also with grasping food behaviours in the CHR condition (likely because they are a prelude to mouth opening). A weak similarity also emerges among grasping behaviours, in particular those performed towards neutral objects in the CHR condition and grasp for climbing and failed grasp in the NER condition.
Figure 30. Similarity matrix showing the relationship between the neural tuning for behaviours within and across conditions. For each pair of behaviours, a coefficient of similarity is calculated as the ratio between the number of multi-units modulated by both behaviours and the number of multi-units modulated by at least one of them. CHR-CHR behaviour pairs are shown in the upper-left quadrant, CHR-NER pairs in the upper-right and lower-left quadrants, and NER-NER pairs in the lower-right quadrant.
5. Discussion
In this study, we wirelessly recorded single- and multi-unit activity from the premotor cortex while monkeys were performing naturalistic behaviours in two different contexts: a classical head-restrained setup in the primate chair (CHR) and a new freely moving setup in the NeuroEthoRoom (NER). These preliminary results shed light on the relationship between the functional properties of neurons in conditions with different levels of ecological validity. In particular, we could verify the possibility of stably recording the same neurons’ activity across the two contexts, as supported by the application of the same spike-sorting algorithm to the whole session and by the considerable stability of individual neurons’ firing features. However, the functional properties of both single and multi-units characterized in the classical head-restrained condition do not appear to be strong predictors of those observed when the animal is tested in a freely moving context, both because of an overall reduced number of modulated units in the NER relative to the CHR and because there is often a large variation in individual units’ tuning across conditions. Among the various studied behaviours, actions performed with the mouth generally show a greater across-context similarity, whereas distal manual actions exhibit the most distinct neural substrates in the two contexts.
As summarized above, the main and most important finding emerging from our analyses is that the context in which the behaviours are performed exerts a strong influence on the neural responses of the investigated sector of the PMC. Indeed, in the
CHR condition 58% of the single units we recorded were modulated by at least one
scored behaviour, whereas in the NER condition only 35% of the recorded neurons were positively modulated (even though the number of behaviours considered was higher in this condition). Moreover, considering the neurons modulated for at least one behavioural event in the whole session, only 40% of them were active in both conditions, whereas 48% were modulated only in the CHR and 12% only in the NER condition.
The neuroscientific literature abounds with data supporting the extraordinary encoding plasticity of the premotor cortex: its neurons can represent the direction of an action (Xxxxx et al., 2001), the relative position of the target, hand, and eye in the planning of a reaching action (Xxxxxxx et al., 2006), the goal and/or the hand configuration of a grasping action (Xxxxxxxxxx et al., 1988; Xxxxxxxxxx and Luppino, 2001; Xxxxxx et al., 2008), defensive movements (Xxxxxxxx et al., 2002; Xxxxx and Xxxxxxxx, 2004), others’ actions (Xxxxxxx et al., 1996; Xxxxxx, 2017) and decision-making strategies (Xxxxx et al., 2011; Xxxxxx-Xxxxxxx and Xxxx, 2019). Therefore, it is possible that the premotor cortex is mostly involved in goal-directed or partially learned behaviours, which usually show a strong voluntary component (Xxxxxxxxxx et al., 1988; Xxxxxx et al., 2008), whereas natural behaviours occurring in a more implicit manner and with less attentional control could rely more heavily on automated, subcortical mechanisms (Xxxxxx, 1993; Xxxxxx and Xxxxxx, 2001).
Nevertheless, an important difference regarding the effector utilized in the
behaviours which were compared in the two conditions emerged from our study: 52% of the responsive neurons showed a modulation for mouth actions in both conditions
whereas the percentage of responsive neurons showing a modulation for grasping actions in both conditions was only 37%. This difference, even if less marked, emerges also in the multi-unit analysis: 47% of units showed an activation for mouth actions in both contexts, whereas the units activated for hand actions in both contexts represented 43%. In particular, the behaviour showing the highest consistency between the two conditions is passively receiving fruit juice through a syringe, which activates 53% of the modulated single units and 61% of the modulated multi-units in both conditions. A plausible explanation for this difference lies in the fact that behavioural events involving the mouth typically consist of defined patterns of repetitive movements and are only to a lesser extent dependent on other variables (e.g., variable axial and/or distal components, or visual input changing with the monkey’s gaze position). In contrast, behaviours involving distal effectors are usually embedded in complex body dynamics in which different postural variables can contribute to the final movement, making them strongly dependent on context-related postural and synergistic controls that are markedly reduced or even eliminated in constrained conditions.
This hypothesis is supported by several studies that have shown an involvement of the premotor cortex in axio-proximal muscle control. First, axio-proximal movements can be elicited by electrical stimulation of the dorsal sector of the premotor cortex, as well as of the dorsal sector of the ventral primary motor cortex (Xxxxxxxx et al., 2002; Xxxxxxxx et al., 2012). Second, the premotor cortex - as well as the primary and supplementary motor cortex - shows direct anatomical connections to
the spinal cord and to the nuclei from which reticulospinal tract fibres originate, in
particular with more extensive projections from the areas controlling movements of proximal than of distal parts of the limbs (Xxxxxx and Xxxxxxx 1984; Xxxxxx and Xxxxxxx 1989; Xxxxxxxxx and Xxxxxx, 2006). Third, bilateral damage of the human PMC results in deficient postural control of the body, whereas unilateral lesions are associated with weakness of the contralateral hip and shoulder muscles and incoordination of movements requiring the interaction of both arms or both legs (Xxxxxx, 1984). Lastly, a recent study in freely-moving rats showed that the posterior parietal cortex and the frontal motor cortex are modulated by the posture of the head, neck and back, and that with synchronous recordings from both regions it is possible to decode ongoing whole-body behaviours (Xxxxxx et al., 2018). Therefore, it is possible that a large portion of the functional properties of premotor neurons investigated in the head-restrained condition could, in fact, be due to the involvement not of a distal but of an axio-proximal component (e.g., neck and back muscles), a component that differs remarkably when the monkey is in a freely-moving context, where the control of body posture is carried out by automatic schemes and reflexes.
Another possibility to explain the observed discrepancies in the activation of hand-related neurons between the two conditions invokes a difference in visuo-motor coding, which might be influenced by the variability in gaze direction and, therefore, in eye-hand coordination (Xxxxx et al., 1996; Xxxxxxxx and Xxxxxxxxxxx, 2018). Further studies, integrating body postural tracking as well as eye tracking in the freely-moving monkey, are needed to test these possibilities. Elucidating these
variables and their impact on the cortical control of behaviour in freely-moving
conditions should drastically improve our understanding of neural activity in ecological conditions.
Altogether, the results of the present study suggest the necessity of caution when generalising the neural properties investigated in classical head-restrained experiments to other contexts that are ecologically relevant for the animal. This conclusion is supported by a few other studies comparing behavioural variables as well as neuronal activations in constrained and unconstrained settings. Xxxxxx et al. (2004) and Xxxxxxxx (2008) found important differences in the saccadic movements for visual orientation depending on whether the animal could move its head. More recently, Xxxxxxxxx et al. (2022) showed that neurons in the non-human primate prefrontal cortex were differently modulated by social signal processing when the animals were in contexts with different degrees of freedom (restrained, freely-moving or active communication).
A possible limitation of this study lies in the low number of experimental sessions analysed so far. Although we recorded a larger number of sessions, completing a cross-validated ethological scoring of a whole session, the neural data analysis, and the matching of the two data sources with the subsequent processing requires considerable time, and this work will continue in the coming months; moreover, since the individually isolated neurons tend to change slightly across weeks, adding new sessions offers the possibility of expanding the data set and the number of isolated neurons. The difference between the functional properties of the two monkeys’ neurons (Mk1 had more single units modulated by mouth behaviours whereas Mk2 had more neurons modulated by hand behaviours) can easily be explained given that the
ventral premotor cortex shows a somatotopic map where hand actions are represented dorsally and mouth actions are represented ventrally (Xxxxxxxxxx and Xxxxxxxxxx, 1988; Xxxxxxxxx et al., 1995), which is completely consistent with the known difference in the implantation sites.
In conclusion, we developed a two-step approach aiming to compare the traditional, head-restrained neurophysiological setup for investigating the functional properties of the cortical motor system with an unconstrained, neuroethological approach in rhesus macaques. This study contributes to a recently emerged effort carried out by an increasingly large part of the neuroscientific community (Xxxxxx et al., 2020; Xxxxxx et al., 2020; Xxxxxxxxxx et al., 2020), which aims to develop new paradigms allowing a more ecologically and ethologically valid understanding of the brain-behaviour relationship, especially in non-human primates, which constitute the most translationally relevant model of human brain function. Indeed, new approaches to generalize the brain mechanisms identified in the laboratory to ecologically and ethologically relevant conditions are required not only to provide a better understanding of how the brain works in natural contexts - which is the ultimate aim of neuroscientific studies - but also to develop neuroprosthetic devices and clinical interventions in the case of brain damage or disease, which must ultimately be applicable and functional in a wide range of everyday situations.
Bibliography
Xx-Xxxxx, A. A., Xxxxxxx, C. M., Xxxxxxxx, T. M., & Xxxxxx, T. (2000). Brief electrical stimulation promotes the speed and accuracy of motor axonal regeneration. Journal of Neuroscience, 20(7), 2602- 2608.
Xxxxxx, B. S., Xxxxxxxx-Xxxx, E. A., Xxx, M. H., Xxxxxxx, X., & Xxxxxx, A. (2016). Prevalence and causes of paralysis—United States, 2013. American journal of public health, 106(10), 1855-1857.
Xxxxxx, X., Xxxx, N. S., & Xxxx, A. (2020). Wireless recording from unrestrained monkeys reveals motor goal encoding beyond immediate reach in frontoparietal cortex. Elife, 9, e51322.
Xxxxxx, N. A., Xxxxxxxxxxxxxxx, X., Xxxxxxxxxx, B., Xxxx, E. J., Xxxxxxxxxx, N., Xxxxx, A. A., ... & Xxxxxxxxx-Xxxxx, X. X. (2016). Design and optimization of an EEG-based brain machine interface (BMI) to an upper-limb exoskeleton for stroke survivors. Frontiers in neuroscience, 10, 122.
Xxxxxxxxxxxx, X., & Xxxxxxxx, D. (2002). Median of the p value under the alternative hypothesis. The American Statistician, 56(3), 202-206.
Xxxxxxxxx, X., Xxxxx, X., Xxxxx, S., Xxxxxxx, K. M., Xxxxxx, X., Xxxxx, R. J., & Xxxxxx, H. J. (1998). Human anterior intraparietal area subserves prehension: a combined lesion and functional MRI activation study. Neurology, 50(5), 1253-1259.
Xxxxxxxxx, X., Buccino, X., Xxxxx, X., Xxxxx, R. J., Xxxxxxxxxx, X., & Xxxxxx, H. J. (1999). A fronto‐ parietal circuit for object manipulation in man: evidence from an fMRI‐study. European Journal of Neuroscience, 11(9), 3276-3286.
Xxxxx, H. M., Xxxxxxx, K. R., Xxxxxxxx, B. J., Xxxxxx, P., & Xxxxx, J. P. (2001). Spinal axon regeneration evoked by replacing two growth cone proteins in adult neurons. Nature neuroscience, 4(1), 38-43.
Xxxxxx, X., Xxxxxxxx, M., Xxxx, X., Xxxxxxx, X., & Xxxxxxxxxx, G. (2014). Space-dependent representation of objects and other's action in monkey ventral premotor grasping neurons. Journal of Neuroscience, 34(11), 4108-4119.
Xxxxxx, X. (2017). The extended mirror neuron network: Anatomy, origin, and functions. The Neuroscientist, 23(1), 56-67.
Xxxxx, X., Xxxxxxxx, X., Xxxxxxxxx, X., Gerbella, X., Xxxxxx, X., Xxxxx, X., & Xxxxxxx, G. (2008). Cortical connections of the macaque anterior intraparietal (AIP) area. Cerebral Cortex, 18(5), 1094- 1111.
Xxxxxxx, X., Xxxxxxx, X., Xxxx, N. J., Xxxxxxx, O., Xxxxxxxxx, M., Xxxxxxxx, K. P., ... & Xxxx, G. R. (2001). Polymodal motion processing in posterior parietal and premotor cortex: a human fMRI study strongly implies equivalencies between humans and monkeys. Neuron, 29(1), 287-296.
Caggxxxx, X., Xxxxxxx, X., Xxxxxxxxxx, X., Xxxxx, X., & Xxxxxx, X. (2009). Mirror neurons differentially encode the peripersonal and extrapersonal space of monkeys. Science, 324(5925), 403-406.
Xxxxxxx, J. M., Xxxxxxx, M. A., Xxxxx, R. E., X'Xxxxxxx, J. E., Xxxxxxxx, D. M., Xxxxxxxx, D. F., ... & Xxxxx, X. (2003). Learning to control a brain–machine interface for reaching and grasping by primates. PLoS biology, 1(2), e42.
Xxxxxx, X., & Xxxxxxx‐Rakic, P. S. (1989). Posterior parietal cortex in rhesus monkey: II. Evidence for segregated corticocortical networks linking sensory and limbic areas with the frontal lobe. Journal of Comparative Neurology, 287(4), 422-445.
Xxxxx, X. X., Magland, J. F., Xxxxxxx, A. H., Tolosa, V. M., Xxxxxx, A. C., Xxx, K. Y., ... & Xxxxxxxxx,
L. F. (2017). A fully automated approach to spike sorting. Neuron, 95(6), 1381-1394.
Xxxxx, X. (1960). A Coefficient of Agreement for Nominal Scales. Educational and Psychological Measurement, 20(1), 37–46
Xxxxx, C. L., Xxxxxxx, X. X., & Xxxxxxxx, M. E. (1993). Ventral intraparietal area of the macaque: anatomic location and visual response properties. Journal of neurophysiology, 69(3), 902-914.
Xxxxxxxxxxxxxx, X., & Xxxxxxx-Xxxxx, P. S. (2002). Correlated discharges among putative pyramidal neurons and interneurons in the primate prefrontal cortex. Journal of neurophysiology, 88(6), 3487- 3497.
Xxxxx, X. X., & Xxxxxxxx, M. S. (2004). Sensorimotor integration in the precentral gyrus: polysensory neurons and defensive movements. Journal of neurophysiology, 91(4), 1648-1660.
Xxxxxxxxx, H. S., Xxxxxxx, S. U., Xxxxx, M., Xxxxx, G. W., Xxxxxxx, X., Xxxxxxxxxxxx, X., & Xxxxxx,
C. T. (2019). Spatial encoding in primate hippocampus during free navigation. PLoS biology, 17(12), e3000546.
Culham, J. C., Xxxxxxxx, S. L., Xx Xxxxx, X. X., Xxxx, J. S., Xxxxx, R. S., & Xxxxxxx, M. A. (2003). Visually guided grasping produces fMRI activation in dorsal but not ventral stream brain areas. Experimental brain research, 153(2), 180-189.
Xxxxxx, X., Xxxxxx, M., Xxxxxxx, X., Xxxxxxxx, X. X., & Xxxxxxx, X. (2006). Dissociating the role of ventral and dorsal premotor cortex in precision grasping. Journal of Neuroscience, 26(8), 2260-2268.
Xxxxxxx, J. R., Xxxxxxx, F., Xxxxx, S. B., & Xxxx, X. (1997). Spatial invariance of visual receptive fields in parietal cortex neurons. Nature, 389(6653), 845-848.
Xxxxxx, E. V. (1968). Relation of pyramidal tract activity to force exerted during voluntary movement.
Journal of neurophysiology, 31(1), 14-27.
Xxxx, X. X., & Xxxxx, M. A. (1998). Modeling parietal–premotor interactions in primate control of grasping. Neurxx Xxxxxxxx, 00(0-0), 0000-0000.
Xxxxxxx, X., Xxxxxxx, V., Xxxxxx, X., Xxxxxxx, X., Xxxxxxx, X., & Xxxxxxxxxx, G. (1996). Coding of peripersonal space in inferior premotor cortex (area F4). Journal of neurophysiology, 76(1), 141-157.
Xxxxxxx, X., Xxxxxxx, V., Xxxxxxx, G., Xxxxxxxxx, X., Xxxxxx, X., & Xxxxxxxxxx, G. (2001). Cortical mechanism for the visual guidance of hand grasping movements in the monkey: A reversible inactivation study. Brain, 124(3), 571-586.
Xxxxxxx, X., & Xxxxxxx, G. (2005). Motor functions of the parietal lobe. Current opinion in neurobiology, 15(6), 626-631.
Fornia, X., Xxxxxxx, G., Xxxxxxxx, X., Xxxxx, X., Xxxxx, X., Xxxxx, X., & Xxxxxxxxx, F. (2020). Direct electrical stimulation of the premotor cortex shuts down awareness of voluntary actions. Nature communications, 11(1), 1-11.
Xxxxxxxx, X.X. (2008). Reality Bites. Inc. Magazine Available from: xxxx://xxx.xxx.xxx/xxxxxxxx/00000000/xxxxxxx-xxxxx.xxxx.
Xxxxxx, X. X. (1984). Premotor areas in man. Trends in neurosciences, 7(12), 481-483.
Xxxxxxx, X., & Xxxxxx, X. (1870). Ueber die elektrische Erregbarkeit des Grosshirns. Trans. by X. xxx Xxxxx. In The Cerebral Cortex, ed. WW Xxxxxxxx, 73-96.
Xxxxxxx, X., Xxxxxx, X., Xxxxxx, M., Xxxx, N., & Xxxxxx, X. (1994). Deficit of hand preshaping after muscimol injection in monkey parietal cortex. Neuroreport: An International Journal for the Rapid Communication of Research in Neuroscience.
Xxxxxxx, X., Xxxxxx, X., Xxxxxxx, X., & Xxxxxxxxxx, G. (1996). Action recognition in the premotor cortex. Brain, 119(2), 593-609.
Xxxxxxxxxxxx, A. P., Xxxxxxx, J. F., Xxxxxxxx, X., & Xxxxxx, J. T. (1982). On the relations between the direction of two-dimensional arm movements and cell discharge in primate motor cortex. Journal of Neuroscience, 2(11), 1527-1537.
Xxxxxxxxx, M., Xxxx, A. R., xxx Xxxx, B., & xxx xxx Xxxxx, H. (1995). Somatotopy of monkey premotor cortex examined with microstimulation. Neuroscience research, 23(3), 269-279.
Xxxxxxxx, M. S., Xxxxx, L. A., & Xxxxx, C. G. (1999). A neuronal representation of the location of nearby sounds. Nature, 397(6718), 428-430.
Xxxxxxxx, M. S., Xxxxxx, C. S., & Xxxxx, T. (2002). Complex movements evoked by microstimulation of precentral cortex. Neuron, 34(5), 841-851.
Xxxxxxxx, M. S., & Xxxxxx, T. N. (2007). Mapping behavioral repertoire onto the cortex. Neuron, 56(2), 239-251.
Xxxxxxx, R. M., Xxxxxx-Xxxxx, S., Xxxxxxxxxxxx, X., Xxx, X., Xxxxxxxxxxx, X., & Xxxxxxx, K. J. (2020). The place-cell representation of volumetric space in rats. Nature communications, 11(1), 1- 13.
Xxxxxxxxxxx, X., Xxxxx, X., & X'Xxxxx, J. G. (2004). Decoding continuous and discrete motor behaviors using motor and premotor cortical ensembles. Journal of neurophysiology, 92(2), 1165- 1174.
Hazama, X., & Xxxxxx, R. (2019). Effects of self-locomotion on the activity of place cells in the hippocampus of a freely behaving monkey. Neuroscience letters, 701, 32-37.
Xxxxxxxx, L. R., Xxxxxxx, M. D., Xxxxxx, G. M., Xxxxxx, J. A., Xxxxx, M., Xxxxxx, A. H., ... & Xxxxxxxx, J. P. (2006). Neuronal ensemble control of prosthetic devices by a human with tetraplegia. Nature, 442(7099), 164-171.
Xxxxx, X., & Xxxxx, X. (2006). Differential involvement of neurons in the dorsal and ventral premotor cortex during processing of visual signals for action planning. Journal of neurophysiology, 95(6), 3596-3616.
Xxxxxxxxx, X., & Xxxxxx, S. A. (2006). How can corticospinal tract neurons contribute to ipsilateral movements? A question with implications for recovery of motor functions. The Neuroscientist : a review journal bringing neurobiology, neurology and psychiatry, 12(1), 67–79.
Xxxxxxxxx, X., Xxxxxxxx, A. R., de la Mothe, X., Xxx, K. F., & Xxxxxx, C. T. (2022). Behavioral context affects social signal representations within single primate prefrontal cortex neurons. Neuron, S0896- 6273(22)00059-9. Advance online publication.
Xxxxx, X., Xxxxxxx, D. S., & Xxxxxx, P. L. (2001). Direction of action is represented in the ventral premotor cortex. Nature neuroscience, 4(10), 1020-1025.
Xxxxxx, X., & Xxxxxxx, H. G. J. M. (1984). Distribution of corticospinal neurons with collaterals to lower brain stem reticular formation in cat. Experimental brain research, 54(1), 107-120.
Xxxxxx, X., & Xxxxxxx, H. G. J. M. (1989). Distribution of corticospinal neurons with collaterals to the lower brain stem reticular formation in monkey (Macaca fascicularis). Experimental Brain Research, 74(2), 311-318.
Xxxxx, X., Xxxxxxxxxxx, S., Xxxxxxxxxxx, X., & Xxxx, X. (2011). Choosing goals, not rules: deciding among rule-based action plans. Neuron, 70(3), 536-548.
Xxxxxxx, M. A., & Xxxxxxxxx, M. A. (2011). Toward a whole-body neuroprosthetic. Progress in brain research, 194, 47-60.
Xxxx, X., Xxxxxxxxxx, M., Xxxxxxxx, M., Xxxxxxx, X., Xxxxxxxxxx, X., & Xxxxxx, L. (2019). Agent-based representations of objects and actions in the monkey pre-supplementary motor area. Proceedings of the National Academy of Sciences, 116(7), 2691-2700.
Xxxxxx, X., Xxxx, H. M., Xxxxx, B. C., & Xxxxxx, J. M. (2004). Detecting location-specific neuronal firing rate increases in the hippocampus of freely-moving monkeys. Brain research, 1014(1-2), 97- 109.
Xxxxxxxx, X., Xxxx, X., Xxxxxx, X., Xxxxx, X., Ferrari, P. F., Xxxxxxx, X., & Xxxxx, G. (2012). Anatomo‐functional organization of the ventral primary motor and premotor cortex in the macaque monkey. European Journal of Neuroscience, 36(10), 3376-3387.
Xxxxxxxx, X., Xxxx, X., & Xxxxxx, L. (2017). Spatial and viewpoint selectivity for others’ observed
actions in monkey ventral premotor mirror neurons. Scientific reports, 7(1), 1-7.
Xxxxxx, X., & Xxxxxx, D. (2001). Central pattern generators and the control of rhythmic movements. Current biology, 11(23), R986-R996.
Xxxxxxx, X., Xxxxxxx, X., Xxxxxxxxxx, X., & Xxxxxxxxxx, G. (1986). Afferent and efferent projections of the inferior area 6 in the macaque monkey. Journal of Comparative Neurology, 251(3), 281-298.
Xxxxxxx, X., Xxxxxxxx, X., Xxxxxx, X., Xxxx, T., Xxxxxxx, X., Xxxxxxxxx, M., & Xxxxx, C. D. (2014). Continuous theta-burst stimulation demonstrates a causal role of premotor homunculus in action understanding. Psychological Science, 25(4), 963-972.
Xxxxxxxx, X. X., & Xxxxxxxxxxx, X. (2018). Population coding of grasp and laterality-related information in the macaque fronto-parietal network. Scientific reports, 8(1), 1-15.
Xxxxxx, X. X., Xxxxxxx, X., Xxxxxxx, X., & Gerstner, X. (2004). Noninvasive brain-actuated control of a mobile robot by human EEG. IEEE Transactions on biomedical Engineering, 51(6), 1026-1033.
Xxxxxx, X., Xxxxxxx, N., & Xxxxxx, V. (2020). Dynamic states of population activity in prefrontal cortical networks of freely-moving macaque. Nature communications, 11(1), 1-10.
Xxxxxx, X., Xxxx, B. A., Xxxxxx, X., Xxxxx, V. S., & Xxxxxxxx, J. R. (2018). Efficient cortical coding of 3D posture in freely behaving rats. Science, 362(6414), 584-589.
Xxxxxx, X., Xxxxxx, X., Xxxxxxx, X., Xxxxxxx, V., Raos, V., & Xxxxxxxxxx, G. (1997). Object representation in the ventral premotor cortex (area F5) of the monkey. Journal of neurophysiology, 78(4), 2226-2230.
Xxxxxx, X., Xxxxxxx, V., Xxxxxxx, G., Xxxxxx, M., & Xxxxxx, H. (2000). Selectivity for the shape, size, and orientation of objects for grasping in neurons of monkey parietal area AIP. Journal of neurophysiology, 83(5), 2580-2601.
Xxxxxxxx, X., Xxxxxxx, B. D., Xxxxxx, X., Xxxxxxxxxxx, X., & Xxxxxxxx, R. A. (2004). Cognitive control signals for neural prosthetics. Science, 305(5681), 258-262.
Xxxxxxxxxx, X., Xxxxxxxxxx, X., Xx, C. L. A., Xxxxxx, S., Ormen, Y., Xxxxxxx-Xxxxx, C., ... & Xxxxx,
D. (2020). EthoLoop: automated closed-loop neuroethology in naturalistic environments. Nature Methods, 17(10), 1052-1059.
Xxxxxxx, S. U., Xxxxxxxxx, V., xx xx Xxxxx, X., & Xxxxxx, C. T. (2017). Social context-dependent activity in marmoset frontal cortex populations during natural conversations. Journal of Neuroscience, 37(29), 7036-7047.
X'Xxxxx, X., & Xxxxxxxxxx, X. (1971). The hippocampus as a spatial map: preliminary evidence from unit activity in the freely-moving rat. Brain research.
Xxxx, D. B., Xxxxxx, S. R., Xxx, X., & Xxxxxxxxx, N. (2018). Social place-cells in the bat hippocampus. Science, 359(6372), 218-224.
Xxxxxx, M. G. (1993). The role of the cerebellum in motor control and perception. Brain, behavior and evolution, 41(1), 39-50.
Xxxxxxxxx, M., Xxxxxx, N., Corato, X., & Xxxxxxx, S. M. (2008). Neural underpinnings of gesture discrimination in patients with limb apraxia. Journal of Neuroscience, 28(12), 3030-3041.
Xxxxxxxx W, Xxxxxxx E (1937): Somatic motor and sensory representation in the cerebral cortex of man as studied by electrical stimulation. Brain 37: 389–443.
Xxxxxxx, X., Xxxxxx, M. J., & Xxxxxxxx, R. A. (2006). Dorsal premotor neurons encode the relative position of the hand, eye, and goal during reach planning. Neuron, 51(1), 125-134.
Raos, X., Xxxxxx, M. A., Xxxxxx, X., Xxxxxxx, X., & Xxxxxxx, V. (2006). Functional properties of grasping-related neurons in the ventral premotor area F5 of the macaque monkey. Journal of neurophysiology, 95(2), 709-729.
Xxxxxxxx, B., Xxxx, X., Xxxxx, X., Xxxx, X., Xxx, C., Xxx, M. H., & Xxxxxx, X. (2010). A brain controlled wheelchair to navigate in familiar environments. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 18(6), 590-598.
Xxxx, N. J., Xxxxx, X., & Xxxxxxx, S. T. (2006). The anterior intraparietal sulcus mediates grasp execution, independent of requirement to update: new insights from transcranial magnetic stimulation. Journal of Neuroscience, 26(31), 8176-8182.
Xxxxxxxxxx, X., Xxxxxxxxxx, C., Xxxxxxx, M., & Xxxxxxxxxx, M. (1981a). Afferent properties of periarcuate neurons in macaque monkeys. I. Somatosensory responses. Behavioural brain research, 2(2), 125-146.
Xxxxxxxxxx, X., Xxxxxxxxxx, C., Xxxxxxx, X., & Xxxxxxxxxx, M. (1981b). Afferent properties of periarcuate neurons in macaque monkeys. II. Visual responses. Behavioural brain research, 2(2), 147-163.
Xxxxxxxxxx, X., Xxxxxxx, X., Xxxxxxx, L., Xxxxxxxxxx, M., Xxxxxxx, G. & Xxxxxxx, M. (1988) Functional organization of inferior area 6 in the macaque monkey. II. Area F5 and the control of distal movements. Experimental Brain Res., 71, 491–507.
Xxxxxxxxxx, X., & Xxxxxxxxxx, M. (1988). Motor and visual-motor functions of the premotor cortex.
Neurobiology of neocortex, 42, 269-84.
Xxxxxxxxxx, X., Xxxxxx, L., Xxxxxxx, X., & Xxxxxxx, X. (1996). Premotor cortex and the recognition of motor actions. Cognitive brain research, 3(2), 131-141.
Xxxxxxxxxx, X., & Xxxxxxx, G. (2001). The cortical motor system. Neuron, 31(6), 889-901.
Xxxxxxxxxx G, Xxxxxxx J (2012) Voluntary movement: the parietal and premotor cortex. In: Principles of neural science, Xx 0 (Xxxxxx E, Xxxxxxxx J, Xxxxxxx T, Xxxxxxxxxx S, Xxxxxxxx XX, eds). New York: McGraw-Xxxx.
Xxxxxxxxxxxxx, X., & Xxxxxxxxxxx, X. (2016). Object vision to hand action in macaque parietal, premotor, and motor cortices. elife, 5, e15278.
Xxxxxxx, EM (1980). Single neuron recording from motor cortex as a possible source of signals for control of external devices. Ann Biom Eng 8: 339–349.
Xxxxxx, M. E. (2002). Repairing the injured spinal cord. Science, 295(5557), 1029-1031.
Xxxxxxxx, A. B., Xxxxx, D. W., & Xxxxx, G. A. (2004). Differential representation of perception and action in the frontal cortex. Science, 303(5656), 380-383.
Xxxxxxx, D. A., Xxxxxxx, M. A., Xxxxxx, T. L., Xxxxxxxx, D. F., Xxxxx, G., Xxxxx, J., ... & Nicolelis,
M. A. (2014). Chronic, wireless recordings of large-scale brain activity in freely moving rhesus monkeys. Nature methods, 11(6), 670-676.
Shahidi, X., Xxxxxxxx, P., Xxxxxx, X., Xxxxxx, X., & Xxxxxx, V. (2019). Population coding of strategic variables during foraging in freely-moving macaques. BioRxiv, 811992.
Xxxxx, X., Xxxxxxxx, X., Xxxxx, X., & Xxxxx, X. (1996). Role for cells in the presupplementary motor area in updating motor plans. Proceedings of the National Academy of Sciences, 93(16), 8694-8698.
Xxxxxx-Xxxxxxx, X., & Xxxx, A. (2019). Complementary encoding of priors in monkey frontoparietal network supports a dual process of decision-making. ELife, 8, e47581.
Theys, X., Xxxx, P., xxx Xxxx, X., Xxxxxx, X., & Xxxxxxx, P. (2012). Selectivity for three-dimensional shape and grasping-related activity in the macaque ventral premotor cortex. Journal of Neuroscience, 32(35), 12038-12050.
Theys, X., Xxxx, P., xxx Xxxx, X., Xxxxxx, X., & Xxxxxxx, P. (2013). Three-dimensional shape coding in grasping circuits: a comparison between the anterior intraparietal area and ventral premotor area F5a. Journal of cognitive neuroscience, 25(3), 352-364.
Xxxxx, X., Xxxx, S. H., & Xxxxxxx, S. T. (2005). Virtual lesions of the anterior intraparietal area disrupt goal-dependent on-line adjustments of grasp. Nature neuroscience, 8(4), 505-511.
Xxxxxx, M. A., Xxxxxxxxxxx, I., Grammont, F., Xxxxxx, M., Xxxxxxx, F., Xxxxxxx, A., ... & Xxxxxxxxxx, G. (2008). When pliers become fingers in the monkey motor system. Proceedings of the National Academy of Sciences, 105(6), 2209-2213.
Xxxxxx, C., Xxxxxxx, X., & Avenanti, A. (2014). Neuroanatomical substrates of action perception and understanding: an anatomic likelihood estimation meta-analysis of lesion-symptom mapping studies in brain injured patients. Frontiers in human neuroscience, 8, 344.
Xxxxxxxx, X., Xxxxx, X., Xxxxxxxx, M. C., Xxxxxxxx, A. S., & Xxxxxxxx, A. B. (2008). Cortical control of a prosthetic arm for self-feeding. Nature, 453(7198), 1098-1101.
Xxxx, X., & Xxxx, R. (2020). Combining p-values via averaging. Biometrika, 107(4), 791-808.
Xxxxxxx, X., Xxxxxxx, X., Xxxxx, C., Xxxx, X., Xxxxxxx, X., & Xxxxxxx, C. (2009). A review on directional information in neural signals for brain-machine interfaces. Journal of Physiology-Paris, 103(3-5), 244-254.
Xxxxxx, X. X., & XxXxxxxxx, D. J. (2004). Control of a two-dimensional movement signal by a noninvasive brain-computer interface in humans. Proceedings of the national academy of sciences, 101(51), 17849-17854.
Xxxxxxx CN, Xxxxxxxx PH, Meyer DR, Xxxxxx W, Xxxxx TP, Xxxxxx AM (1952): Patterns of localization in precentral and ‘supplementary’ motor areas and their relation to the concept of a premotor area. Res Pub Assoc Res Nerv Ment Dis 30: 238–264.
Xxxxx, X., Xxx, X., Xxx, N. L., Xxxx, X. X., Xxxx, Z. Q., Xx, C. Y., ... & Ma, Y. Y. (2008). Maze
model to study spatial learning and memory in freely moving monkeys. Journal of neuroscience methods, 170(1), 111-116.