
In European Urology; h5-index 128

Several barriers prevent the integration and adoption of augmented reality (AR) in robotic renal surgery despite the increased availability of virtual three-dimensional (3D) models. Apart from correct model alignment and deformation, not all instruments are clearly visible in AR. Superimposing a 3D model on top of the surgical video stream, including the instruments, can create a potentially hazardous surgical situation. We demonstrate real-time instrument detection during AR-guided robot-assisted partial nephrectomy and show that our algorithm generalizes to AR-guided robot-assisted kidney transplantation. We developed an algorithm using deep learning networks to detect all nonorganic items in the surgical field; the networks were trained on 65 927 manually labeled instruments across 15 100 frames. Our setup, which runs on a standalone laptop, was deployed in three different hospitals and used by four different surgeons. Instrument detection is a simple and feasible way to enhance the safety of AR-guided surgery. Future investigations should optimize video processing to minimize the 0.5-s delay currently experienced. General AR applications also need further optimization, including detection and tracking of organ deformation, before full clinical implementation.
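The core compositing idea described above — superimpose the 3D model on the video stream, but restore the pixels of segmented instruments so they remain visible on top of the overlay — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function name, array shapes, and the fixed `alpha` blending weight are assumptions for the sake of the example, and the instrument mask is presumed to come from a separate segmentation network.

```python
import numpy as np

def composite_ar_overlay(frame, overlay, instrument_mask, alpha=0.6):
    """Blend a rendered 3D-model overlay onto a surgical video frame,
    keeping pixels covered by detected instruments fully visible.

    frame:           HxWx3 uint8 video frame
    overlay:         HxWx3 uint8 rendered 3D model (same size as frame)
    instrument_mask: HxW bool, True where an instrument was segmented
    alpha:           overlay opacity outside the instrument mask
    """
    # Alpha-blend the rendered model over the video frame
    blended = (alpha * overlay.astype(np.float32)
               + (1.0 - alpha) * frame.astype(np.float32))
    out = blended.astype(np.uint8)
    # Instruments stay on top: restore the original frame pixels there,
    # so the overlay never occludes a tool from the surgeon's view
    out[instrument_mask] = frame[instrument_mask]
    return out
```

In a real-time pipeline this compositing step would run per frame after segmentation, which is where the reported 0.5-s end-to-end delay accumulates.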

De Backer Pieter, Van Praet Charles, Simoens Jente, Peraire Lores Maria, Creemers Heleen, Mestdagh Kenzo, Allaeys Charlotte, Vermijs Saar, Piazza Pietro, Mottaran Angelo, Bravi Carlo A, Paciotti Marco, Sarchi Luca, Farinha Rui, Puliatti Stefano, Cisternino Francesco, Ferraguti Federica, Debbaut Charlotte, De Naeyer Geert, Decaestecker Karel, Mottrie Alexandre

2023-Mar-18

Augmented reality, Deep learning, Instrument segmentation, Kidney transplantation, Partial nephrectomy, Real time, Renal cell carcinoma, Robotic surgery, Three-dimensional models