
In bioRxiv: the preprint server for biology

Human verbal communication requires a rapid interplay between speech planning, production, and comprehension. These processes are subserved by local and long-range neural dynamics across widely distributed brain areas. Precisely how linguistic information is represented during natural conversation, however, and what shared neural processes are involved, remain largely unknown. Here we used intracranial neural recordings in participants engaged in free dialogue, together with deep learning natural language processing models, and found a striking similarity not only between neural and artificial network activities but also between how linguistic information is encoded in the brain during production and comprehension. Neural activity patterns that encoded linguistic information were closely aligned with those reflecting speaker-listener transitions, and were reduced after word utterance or when no conversation was held. They were also observed across distinct mesoscopic areas and frequency bands during both production and comprehension, suggesting that these signals reflect the hierarchically structured information conveyed during dialogue. Together, these findings suggest that linguistic information is encoded in the brain through similar neural representations during both speaking and listening, and begin to reveal the distributed neural dynamics subserving human communication.
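The abstract does not detail the analysis pipeline, but work of this kind typically relates word-level embeddings from a language model to word-aligned neural features through a linear encoding model. The sketch below is a hypothetical illustration of that general approach, not the authors' method; all variable names, shapes, and data are placeholders, and the ridge-regression encoding score is one common choice among several.

```python
# Hypothetical sketch (not the authors' pipeline): relating word embeddings from a
# pretrained language model to per-word neural features via a ridge encoding model.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)

# Assumed shapes: one row per word produced or heard during the dialogue.
n_words, emb_dim, n_channels = 500, 300, 64
X = rng.standard_normal((n_words, emb_dim))     # word embeddings from an NLP model (placeholder data)
Y = rng.standard_normal((n_words, n_channels))  # per-word neural features, e.g. band-limited power (placeholder)

def encoding_score(X, Y, alpha=10.0, n_splits=5):
    """Cross-validated correlation between predicted and observed neural features, per channel."""
    scores = np.zeros(Y.shape[1])
    for train, test in KFold(n_splits=n_splits, shuffle=True, random_state=0).split(X):
        model = Ridge(alpha=alpha).fit(X[train], Y[train])
        pred = model.predict(X[test])
        for ch in range(Y.shape[1]):
            scores[ch] += np.corrcoef(pred[:, ch], Y[test, ch])[0, 1] / n_splits
    return scores

# With real data, channels with high scores would be those whose activity
# tracks the linguistic content carried by the model embeddings.
print(encoding_score(X, Y).mean())
```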

Jing Cai, Alex E. Hadjinicolaou, Angelique C. Paulk, Ziv M. Williams, Sydney S. Cash

2023-Mar-11