VRSkills Lab, Project Univ. Evry CIR 2017


Project Description

Context

This project is part of the IBISC laboratory's research focus on personalized medicine. It supports the VR-Skills Lab research project, co-funded by Genopole under its 2015 call for semi-heavy equipment projects, and Ms. Aylen Ricca's PhD thesis, funded by a doctoral contract from Paris-Saclay University (beginning Oct. 2016). The partners in this multidisciplinary project are the IBISC laboratory (Univ. Evry), the Centre Hospitalier Sud Francilien (CHSF), the IRCCyN laboratory (Ecole des Mines de Nantes), and the EREL laboratory (Wright State University, USA).

Scientific objectives

Surgical simulators based on virtual reality technologies can significantly improve surgeon training [4]. These tools enable training in the technical and psychomotor skills required for surgical tasks (e.g. instrument handling, knot tying, tissue dissection). However, there is currently a lack of guidelines for defining appropriate levels of interaction fidelity for these systems. In this context, and as part of the HIFI-VR project, we plan to study the fidelity of haptic interactions in order to design a virtual surgical simulation laboratory that enables practitioners to train immersively in technical surgical skills.

To achieve this objective, the first step was to design a tool for learning technical gestures. We first chose a target surgical task: needle insertion to perform a biopsy in interventional radiology. We then modeled the virtual learning environment, including the surgical tools, target objects, and organs (Figure 1), as well as its physical interface. Next, we designed interaction techniques for navigating the virtual environment [5], enabling the learner to perform the task correctly. Finally, we carried out an experimental study to evaluate our system. The results showed that a natural navigation technique based on tracking the position of the user's head is best suited to this task [5]. However, the evaluation also identified several system limitations, notably linked to the lack of fidelity of the haptic interface.
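The natural navigation technique mentioned above maps the tracked position and orientation of the user's head directly onto the virtual camera, so that leaning or moving the head changes the viewpoint in the scene. The sketch below is a minimal, generic illustration of that idea, not the project's actual implementation; the tracker coordinate convention and the 1:1 workspace scale are assumptions.

```python
import numpy as np

# Minimal sketch of head-tracked ("natural") navigation: the tracked head
# pose is mapped one-to-one onto the virtual camera.  The tracker source
# and the workspace scale are illustrative assumptions.

WORKSPACE_SCALE = 1.0   # 1:1 mapping between real and virtual motion (assumed)

def head_to_view_matrix(head_position, head_rotation, scene_origin):
    """Build a 4x4 view matrix from a tracked head pose.

    head_position : (3,) head position in tracker coordinates (metres)
    head_rotation : (3, 3) head rotation matrix in tracker coordinates
    scene_origin  : (3,) position of the tracker origin in the virtual scene
    """
    camera_pose = np.eye(4)
    camera_pose[:3, :3] = head_rotation
    camera_pose[:3, 3] = scene_origin + WORKSPACE_SCALE * head_position
    # The view matrix is the inverse of the camera pose.
    return np.linalg.inv(camera_pose)

# Example: head 1.7 m above the tracker origin, looking straight ahead.
view = head_to_view_matrix(np.array([0.0, 1.7, 0.0]), np.eye(3), np.zeros(3))
print(view)
```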

Indeed, the current haptic interface only provides force feedback over 3 degrees of freedom (DOF), which impacts the user experience when manipulating the needle and interacting with the virtual environment. To overcome this problem, we have proposed a model for evaluating the haptic fidelity of a surgical simulator [4]. The HIFI-VR project aims to validate this model through experimental studies. Our request within the framework of this project is the acquisition of a new 6-DOF haptic interface to improve the fidelity of user interaction with the virtual environment.
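To make the 3-DOF versus 6-DOF distinction concrete, the sketch below contrasts a device that can only render a translational force at the tool tip with one that can also render a torque, for example when an inserted needle is constrained in orientation by the tissue it has pierced. This is a generic virtual-coupling illustration under assumed stiffness values, not the simulator's actual haptic rendering code.

```python
import numpy as np

# Illustrative virtual-coupling sketch (not the simulator's actual code):
# a 3-DOF device renders only a force from the position error, whereas a
# 6-DOF device can also render a torque from the orientation error.

K_TRANS = 200.0   # translational stiffness, N/m (assumed value)
K_ROT = 2.0       # rotational stiffness, N*m/rad (assumed value)

def coupling_feedback(device_pos, proxy_pos, device_axis, proxy_axis, dof=6):
    """Return (force, torque) sent back to the haptic device."""
    force = K_TRANS * (proxy_pos - device_pos)      # renderable with 3 DOF
    if dof < 6:
        return force, np.zeros(3)                   # torque cannot be rendered
    # Torque that rotates the device axis toward the constrained proxy axis.
    torque = K_ROT * np.cross(device_axis, proxy_axis)
    return force, torque

# Example: the virtual needle is held by tissue slightly off the device pose.
args = (np.zeros(3), np.array([0.0, 0.0, 0.005]),
        np.array([0.0, 0.0, 1.0]), np.array([0.1, 0.0, 0.995]))
print("3-DOF:", coupling_feedback(*args, dof=3))   # force only
print("6-DOF:", coupling_feedback(*args, dof=6))   # force plus torque
```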

  • Project leader: Amine CHELLALI (Associate Professor, Univ. Evry)
  • Project start: January 2017
  • Project duration: 1 year
  • Approximate amount: 25 k€
  • Platform concerned: EVR@
  • Targeted equipment: 6-DOF force-feedback arm