Brain-machine interfaces (BMIs) aim to improve the autonomy of human patients. However, beyond the restoration of movement, fine control of prosthetics requires the recovery of tactile sensory feedback. Although BMIs with artificial somatosensory inputs have recently been implemented in patients, few studies have focused on the temporal constraints of feedback integration.
In this study, we explored the impact of the temporal latency between a motor command and the feedback update after movement. To this end, we developed an ultra-fast bidirectional BMI based on chronic electrophysiological recordings in M1 and 2D patterned optogenetic stimulation of the primary somatosensory cortex (S1) in mice. Thanks to our control algorithm, based on incremental displacement of the prosthesis triggered by single spikes, we achieved a minimal loop latency of 4.4 ms — the fastest closed-loop BMI to our knowledge.
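The spike-incremental control scheme can be illustrated with a minimal sketch. All names and parameters here (`step_deg`, the angular bounds) are illustrative assumptions, not the authors' implementation:

```python
def update_angle(angle, spike_detected, step_deg=1.0,
                 min_deg=-90.0, max_deg=90.0):
    """Advance the virtual bar by one fixed increment per detected spike.

    Each spike from the conditioned M1 neuron rotates the prosthesis by
    `step_deg`; the angle is clamped to the bar's range of motion.
    Parameter values are hypothetical.
    """
    if spike_detected:
        angle = min(max(angle + step_deg, min_deg), max_deg)
    return angle
```

Because each single spike maps to a small, bounded displacement, the loop can update the prosthesis as soon as a spike is detected, which is what makes millisecond-scale closed-loop latencies possible.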
In our protocol, single M1 neurons were conditioned to control the rotation of a virtual bar. The photostimulation pattern on S1 provided feedback of the prosthesis angular position to the animal. In a subset of animals, we showed that such photostimulations could generate perceptions similar to those evoked by equivalent tactile stimuli.
Results showed that our incremental algorithm enabled fine control: animals produced well-guided trajectories with a 50-ms feedback latency. Decreasing the latency to 5 ms or increasing it to 300 ms impaired the animals' ability to move and stabilize the prosthesis in the target area, suggesting the existence of a specific range of time windows in the S1-M1 dialogue.
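The latency manipulation amounts to feeding back the prosthesis angle from a fixed number of loop iterations in the past. A minimal sketch, assuming a discrete-time loop in which one iteration corresponds to one feedback-update step (class and parameter names are hypothetical):

```python
from collections import deque

class DelayedFeedback:
    """FIFO buffer that returns the angle sampled `latency_steps`
    iterations ago, modeling an imposed feedback latency
    (e.g. 5, 50, or 300 ms, depending on the loop period)."""

    def __init__(self, latency_steps, initial=0.0):
        # Pre-fill with the initial angle so early reads are defined.
        self.buf = deque([initial] * latency_steps,
                         maxlen=latency_steps + 1)

    def push(self, angle):
        """Store the current angle; return the delayed one."""
        self.buf.append(angle)
        return self.buf.popleft()
```

With `latency_steps=0` the feedback is immediate; larger values delay the S1 photostimulation update relative to the M1-driven movement, which is the variable manipulated across the 5/50/300-ms conditions.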