Citation link (DOI)
10.26092/elib/5844

Shadow-free illumination and projection in surgical environments

Publication date
2026-03-16
Author
Mühlenbrock, Andre
Reviewers
Zachmann, Gabriel
Magnor, Marcus
Abstract
Modern operating rooms are highly complex, safety-critical environments in which surgeons must maintain sustained concentration and an uninterrupted workflow to ensure optimal patient outcomes. Despite significant technological advances, several fundamental challenges remain insufficiently addressed. One such challenge is achieving optimal and consistent illumination of the surgical field. In current operating rooms, surgical lights must often be manually repositioned to compensate for shadows cast by medical personnel, leading to frequent workflow interruptions and increased cognitive load. Another persistent challenge is providing surgeons with immediate and non-distracting access to critical information. As the amount of information required during surgery continues to grow, presenting this information on remote monitors can disrupt workflow and divert attention away from the surgical field. Both interruptions can compromise safety and efficiency.

In this dissertation, I address both challenges by developing algorithms based on multi-view depth camera technology that enable shadow-free illumination of the surgical field and shadow-free projection of information adjacent to it. For shadow-free illumination, I propose algorithms for an autonomous surgical lighting system that detect occlusions caused by medical personnel and dynamically control ceiling-mounted light modules to ensure optimal illumination. For shadow-free projection, I develop algorithms for a multi-projector system that displays information directly next to the surgical site. While multiple projectors are employed to facilitate shadow-free projection, my algorithms specifically enable precise geometric distortion correction on dynamic, uneven surfaces. In addition, I present a blur-mitigation strategy that significantly reduces ghosting artifacts caused by overlapping projections and unavoidable minor errors in dynamic geometry reconstruction.
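The abstract does not detail the distortion-correction algorithm itself. As a generic illustration of the underlying geometry, the following is a minimal sketch of mapping a reconstructed 3D surface point into projector pixel coordinates under a calibrated pinhole model; the function name and parameters are illustrative assumptions, not taken from the dissertation:

```python
import numpy as np

def world_to_projector_pixel(X, K, R, t):
    """Illustrative sketch: map a 3D surface point X (e.g., from a
    depth-camera reconstruction) to the projector pixel that would
    illuminate it, assuming a calibrated pinhole projector with
    intrinsics K and pose (R, t)."""
    Xc = R @ X + t            # world -> projector coordinate frame
    x = K @ (Xc / Xc[2])      # perspective divide, then apply intrinsics
    return x[:2]              # pixel coordinates (u, v)
```

Warping projected content through such a mapping, per frame and per surface point, is one standard way to keep imagery geometrically correct on a moving, uneven surface.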

Beyond the medical application itself, I developed novel, general-purpose multi-view point cloud algorithms that enable these systems: (a) a fast, accurate, and robust multi-depth-sensor registration method that allows rapid registration between surgeries while complying with data privacy constraints; (b) BlendPCR, a real-time multi-camera point cloud rendering technique that preserves the native sensor resolution while enabling seamless visualization and precise geometric correction for projection onto dynamic, deformable surfaces; and (c) TemPCC, an experimental approach for temporal point cloud completion.
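The registration method itself is not specified in this abstract. As a generic illustration of the rigid multi-sensor alignment problem it solves, here is a minimal Kabsch/SVD sketch that recovers the rotation and translation between two corresponding point sets; this is a textbook baseline, not the dissertation's method:

```python
import numpy as np

def kabsch(P, Q):
    """Illustrative sketch: find rotation R and translation t that
    best align point set P (N x 3) onto Q (N x 3) in the least-squares
    sense, i.e. R @ p + t ~= q for corresponding rows."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                  # cross-covariance matrix
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

A practical multi-sensor registration pipeline adds correspondence search and outlier rejection on top of such a closed-form alignment step.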

In a validation study with practicing surgeons (n = 11), I demonstrate that the autonomous surgical lighting system significantly improves illumination quality and reduces shadows in a simulated surgical environment compared to conventional surgical lights. Furthermore, the multi-projector system enables shadow-free projection onto dynamic, deformable surfaces with a visual accuracy of 1–2 mm using point cloud–based rendering, while the proposed blur mitigation pipeline significantly reduces blurring and ghosting (n = 23).

Beyond the surgical domain, my proposed multi-depth-sensor registration method attains an accuracy of 1.6 mm at a distance of 2 m without requiring RGB or infrared data, and could therefore be adapted to LiDAR-based setups. My multi-point cloud rendering technique, BlendPCR, enables low-latency, real-time surface reconstruction and rendering. It achieves frame rates exceeding 200 fps at ultra-high-definition resolution while processing data from seven RGB-D sensors simultaneously, making it suitable for live virtual reality and telepresence scenarios.

In summary, my dissertation establishes a technical and algorithmic foundation for reducing distractions in the operating room while also contributing general-purpose methods for multi-view point cloud registration, rendering, and completion, providing a basis for future research and applications in computer graphics and computer vision.
Keywords
point cloud; surgical lighting; projection mapping; depth camera; registration; rendering
Institution
Universität Bremen
Department
Fachbereich 03: Mathematik/Informatik (FB 03)
Institute
AG Computer Graphik und Virtuelle Realität (CGVR)
Document type
Dissertation
License
All rights reserved
Language
English
Files
Name: Shadow-free illumination and projection in surgical environments.pdf
Size: 106.27 MB
Format: Adobe PDF
Checksum (MD5): 49c5c755ecd1d48a4ca26907f5ebb959
