
Thesis

Language: French

ID: 10670/1.cpahl1

Ultrasound guided needle simulator for training

Abstract

The work presented in this manuscript addresses the need to modernize and improve training in the medical gesture of ultrasound-guided needle insertion. The proposed simulator comprises a rendering component and a haptic component. After presenting the goals of this work, we review the state of the art of ultrasound image rendering in computer graphics, followed by a review of needle insertion models and simulators. We then introduce a GPU-based real-time method for rendering ultrasound images of a 3D scene. This method cuts 3D objects on the fly to create 2D surfaces that depend on the position of the ultrasound probe, and then works on these surfaces to produce the different effects that appear in an ultrasound image: shadows and reflections, absorption, granularity, and the fiber orientation of biological tissues. To our knowledge, this last effect has never been presented in a real-time ultrasound image generation simulator. We then describe the interactions between the medical tools (probe and needle, each manipulated through a haptic interface) and the generated image, and show how this image is deformed to match the deformation of the 3D tissues using deformation models. Finally, we present the results and discuss how such a simulator can be validated, taking as an example a comparable simulator with visual and haptic feedback, the laparoscopy simulator.
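
To make the image-formation steps named in the abstract more concrete, the following minimal sketch shows how an ultrasound-like 2D image could be synthesized from a 3D scene: a slice is extracted along the probe's imaging plane, then depth-wise absorption, shadowing behind bright structures, and speckle (granularity) are applied. This is an illustrative CPU/NumPy toy under assumed data structures (a voxel echogenicity volume, probe axes, simple exponential attenuation); the function names and parameters are assumptions and do not reproduce the thesis's GPU implementation over cut mesh surfaces.

# Illustrative sketch only: toy ultrasound image formation
# (slice extraction, absorption, shadowing, speckle).
# All names and parameter values are assumptions, not the thesis code.
import numpy as np

def extract_slice(volume, origin, u_axis, v_axis, shape, spacing=1.0):
    """Sample the probe's 2D imaging plane from a 3D echogenicity volume
    by nearest-neighbour lookup along two in-plane axes."""
    h, w = shape
    rows = np.arange(h)[:, None] * spacing          # depth direction
    cols = np.arange(w)[None, :] * spacing          # lateral direction
    pts = (origin[None, None, :]
           + rows[..., None] * v_axis[None, None, :]
           + cols[..., None] * u_axis[None, None, :])
    idx = np.clip(np.rint(pts).astype(int), 0, np.array(volume.shape) - 1)
    return volume[idx[..., 0], idx[..., 1], idx[..., 2]]

def render_ultrasound(echo_slice, attenuation=0.02, shadow_gain=0.15, seed=0):
    """Apply depth-wise absorption, cumulative shadowing behind bright
    (high-impedance) structures, and multiplicative speckle noise."""
    depth = np.arange(echo_slice.shape[0])[:, None]
    absorbed = echo_slice * np.exp(-attenuation * depth)           # absorption
    shadow = np.exp(-shadow_gain * np.cumsum(echo_slice, axis=0))  # shadows
    rng = np.random.default_rng(seed)
    speckle = rng.rayleigh(scale=1.0, size=echo_slice.shape)       # granularity
    return np.clip(absorbed * shadow * speckle, 0.0, 1.0)

if __name__ == "__main__":
    vol = np.zeros((64, 64, 64))
    vol[20:30, 10:50, 10:50] = 0.9   # a bright, shadow-casting structure
    sl = extract_slice(vol,
                       origin=np.array([0.0, 0.0, 32.0]),
                       u_axis=np.array([0.0, 1.0, 0.0]),   # lateral axis
                       v_axis=np.array([1.0, 0.0, 0.0]),   # depth axis
                       shape=(64, 64))
    img = render_ultrasound(sl)
    print(img.shape, float(img.min()), float(img.max()))

In this toy formulation the shadow term is simply the exponential of the accumulated echogenicity along the beam direction, which is one common approximation; the fiber-orientation effect highlighted in the abstract would require anisotropic scattering terms that are omitted here.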
