Augmentation technology is a rapidly expanding field, and with it comes growing interest in how such devices interface with the body. When users learn to control augmentation devices, one important sensory input is the tactile feedback received at the body site where the device is worn, described as intrinsic touch. We asked whether the brain gathers information from intrinsic tactile inputs to construct an internal representation of the device.
To investigate such changes in somatosensory processing, we are using a supernumerary robotic finger (the Third Thumb, Dani Clode Design). In our ongoing study, we are assessing changes to inter-finger sensory representations before and after a week of altered finger-synchronisation motor training: either extended Third Thumb training or training to play the keyboard. We are using fMRI to measure representational similarity patterns across the biological fingers and the Third Thumb (via intrinsic touch) before and after training, delivering tactile stimulation with a soft pneumatic actuator system. We are also using a psychophysics paradigm to explore changes in sensory integration, examining tactile temporal order judgements involving the biological fingers and the Third Thumb. Preliminary results show that, in the Third Thumb training group, localisation ability improved and neural representational similarity increased between the Third Thumb and the biological fingers it collaborates with most frequently, consistent with the brain gaining familiarity with integrating these somatosensory inputs. This work will allow us to demonstrate the brain's ability to integrate an artificial limb into the biological body's sensory model.
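As an illustrative aside, the sketch below shows one common way inter-finger representational similarity can be quantified from fMRI activity patterns: building a representational dissimilarity matrix (RDM) of correlation distances between per-digit patterns. The digit count, voxel count, and simulated values here are hypothetical and are not drawn from our analysis pipeline or results.

import numpy as np
from scipy.spatial.distance import pdist, squareform

def finger_rdm(patterns):
    # patterns: (n_digits, n_voxels) array, one activity pattern per digit
    # (e.g. D1-D5 plus the Third Thumb). Returns a square matrix of
    # correlation distances between every pair of digits.
    return squareform(pdist(patterns, metric="correlation"))

rng = np.random.default_rng(0)
pre = rng.standard_normal((6, 200))   # simulated pre-training patterns
post = pre.copy()
# Simulate the Third Thumb pattern (row 5) drifting towards D5 (row 4)
# after training, i.e. increased representational similarity.
post[5] = 0.5 * post[5] + 0.5 * post[4]

rdm_pre, rdm_post = finger_rdm(pre), finger_rdm(post)
print("Thumb-D5 correlation distance, pre: ", round(rdm_pre[5, 4], 2))
print("Thumb-D5 correlation distance, post:", round(rdm_post[5, 4], 2))

A smaller correlation distance corresponds to greater representational similarity; in a real analysis, pattern distances would typically be cross-validated across scanning runs rather than computed on single averaged patterns as in this toy example.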