Unified Calibration Technique for Augmented-Reality Ultrasound-Guided Interventions

Published in MetroXRAINE, 2022

Recommended citation: Chen, E. C. S., Allen, D. R., Cambranis-Romero, J., and Peters, T. M. (2022). "Unified Calibration Technique for Augmented-Reality Ultrasound-Guided Interventions." In IEEE MetroXRAINE, pp. 495-500. https://doi.org/10.1109/MetroXRAINE54828.2022.9967585

Accurate spatial calibration for mobile imaging modalities is an essential enabling technology for augmented-reality-based surgical navigation systems. Despite years of research, spatial calibration for surgical cameras and freehand ultrasound remains an area of active research. In this paper, we present a unified spatial calibration framework for ultrasound probe calibration and camera hand-eye calibration based on the same mathematical principle. By treating spatial calibration as a registration problem between paired points and lines, our framework provides i) efficient solutions with guaranteed convergence properties, and ii) based on an error propagation model, a set of heuristic rules for fiducial placement that consistently leads to accurate calibration. Monte Carlo simulation demonstrated that accurate camera hand-eye calibration (≈ 5 pixels) is achievable with the Microsoft HoloLens 2 using as few as 6 fiducial measurements, and that accurate ultrasound probe calibration can be obtained consistently using as few as 12 fiducial measurements.
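The sketch below is a minimal illustration (not the paper's exact solver) of the core idea of registering paired 3D points to 3D lines: each fiducial contributes a point in one coordinate frame and a line (origin plus unit direction) in the other, and a rigid transform is found by alternating between projecting the transformed points onto their lines and solving the resulting point-to-point problem with the Kabsch/Procrustes method. All function names and parameters are illustrative assumptions, not the authors' API.

```python
import numpy as np

def kabsch(src, dst):
    """Rigid transform (R, t) that best maps src onto dst in the least-squares sense."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

def point_line_registration(points, line_origins, line_dirs, iters=200, tol=1e-10):
    """Iteratively register 3D points to 3D lines paired by index (illustrative sketch)."""
    d = line_dirs / np.linalg.norm(line_dirs, axis=1, keepdims=True)
    R, t = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        q = points @ R.T + t                              # points under current estimate
        s = np.einsum('ij,ij->i', q - line_origins, d)    # scalar projection onto each line
        targets = line_origins + s[:, None] * d           # closest points on the lines
        R, t = kabsch(points, targets)                    # solve the point-to-point subproblem
        err = np.linalg.norm(points @ R.T + t - targets, axis=1).mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R, t, err
```

In this framing, ultrasound probe calibration registers fiducial points segmented in the image frame to tracked lines (e.g., a wire or stylus axis) in the tracker frame, while camera hand-eye calibration registers fiducials to the rays back-projected from their image detections; the details of how each modality defines its points and lines follow the paper rather than this sketch.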
