Research Article, J Surg Clin Pract Vol: 1 Issue: 1
Evaluation of Eye-Tracking vs Color-code Tracking for Robotic Camera Assistance in Minimally Invasive Surgery
Ahmed Elsherbiny1, Sebastian Koller1, Nils Kohn1, Daniel Ostler1, Armin Schneider1, Thomas Vogel1,2, Dirk Wilhelm1,2, Helmut Friess2, Hubertus Feussner1,2 and Michael Kranzfelder1,2*
1Workgroup MITI (Minimally invasive Interdisciplinary Therapeutical Intervention), Klinikum rechts der Isar, Technische Universität München, 81675 München, Germany
2Department of Surgery, Technische Universität München, D-81675 München, Germany
*Corresponding author: Michael Kranzfelder
Department of Surgery, Technische Universität München, D-81675 München, Germany
Tel: +49-89-4140-5088
E-mail: Michael.kranzfelder@tum.de
Received: November 02, 2016 Accepted: November 16, 2017 Published: November 21, 2017
Citation: Elsherbiny A, Koller S, Kohn N, Ostler D, Schneider A, et al. (2017) Evaluation of Eye-Tracking vs Color-code Tracking for Robotic Camera Assistance in Minimally Invasive Surgery. J Surg Clin Pract 1:1
Abstract
Purpose: Robotic camera assistance enhances manual dexterity, precision, and ergonomic control in minimally invasive surgery. However, the optimal control interface for robotic camera assistance is still a matter of research and development. Tracking systems seem to offer a potential solution for autonomous maneuvering of a robotic camera holder.
Methods: We evaluated two potential tracking solutions in a preliminary ex-vivo study (n=20 participants), using either eye-tracking or color-code tracking for control of the robotic camera holder SOLOASSIST (AKTORMed, Barbing, Germany). Performance time (maneuvering the robotic camera holder to five distinct markers) and the system usability scale (SUS) were evaluated. Joystick control of the SOLOASSIST (standard control interface) was used as reference. Each participant carried out three repetitions with each navigation modality.
Results: Camera control by joystick (81 ± 32.1 sec.) was quicker than eye-tracking (124.3 ± 68.1 sec., p = 0.36) or color-code tracking (114.2 ± 59.1 sec., p = 0.17), although these differences were not statistically significant. No statistically significant difference between eye- and color-code tracking was noted (p = 0.36). The system usability scale (SUS) scored highest for joystick control (87.5 ± 13.4 pts.); color-code tracking reached a SUS of 73 ± 20 pts. and eye-tracking of 57.8 ± 19.6 pts. Despite these findings, participants would prefer eye- or color-code tracking for real surgery.
Conclusion: Our study showed that maneuvering the robotic camera holder SOLOASSIST by eye- and color-code tracking is feasible. Clinicians prefer control interfaces other than the joystick (the current standard). Further efforts should be made to develop intuitive control interfaces, e.g. by use of remote eye-tracking, to facilitate a broader use of robotic camera assistance systems in surgery.
Keywords: Color-code tracking; Eye-tracking; Robotic camera assistance; Autonomy
Introduction
Since the mid-1990s, minimally invasive surgery (MIS) has matured into an established surgical discipline that reduces surgical trauma, scarring, blood loss, and patient recovery time [1]. However, despite advances in instrument development and visualization techniques, manual camera guidance remains uncomfortable [2]. To overcome this issue, surgical robots that further enhance manual dexterity, precision, and ergonomic control were introduced in MIS [3]. Mechatronic support systems such as robotic camera holders give the surgeon independent camera control and are already used successfully in clinical routine for distinct indications [4].
However, the optimal control interface for robotic camera assistance is still a matter of research [5]. Most currently deployed systems use a joystick, foot pedal or speech recognition for maneuvering the robotic arm. Although solo surgery thereby becomes possible, the surgeon still needs to actively trigger the control mechanism (manually or by speech) to move the arm, which increases the workload during the operation. To further enhance the clinical application of robotic camera holders and decrease this additional workload, the development of “intuitive and autonomous” control interfaces would be most valuable [5]. Tracking systems seem to offer a potential solution for autonomous maneuvering of a robotic camera holder [6]. In general, mechanical, optical, acoustic and electromagnetic tracking have to be distinguished. In principle, two different tracking methods can be used for robotic camera assistance.
Passive (indirect) tracking
For passive tracking, markers (e.g. color or pattern codes) attached to a laparoscopic instrument are detected either optically from a distance, e.g. by wall-mounted camera systems, or directly within the laparoscopic video image by computerized image-processing algorithms. The latter is especially suitable for application in minimally invasive surgery, as the marked (dominant) instrument from which the tracking data are obtained is always in the center of the laparoscopic video [7].
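To illustrate the principle of detecting a color marker within the laparoscopic video, a minimal sketch in Python using OpenCV is shown below. The marker color, HSV thresholds and function name are illustrative assumptions; the actual tracking software used in this study is not published.

```python
import cv2
import numpy as np

# Hypothetical HSV range for a green instrument marker; the thresholds
# used by the study's tracking software are not published.
LOWER_GREEN = np.array([45, 80, 80])
UPPER_GREEN = np.array([75, 255, 255])

def find_marker_center(frame_bgr):
    """Return the (x, y) pixel centroid of the color marker, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_GREEN, UPPER_GREEN)
    # Remove pixel noise before computing the blob centroid.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    m = cv2.moments(mask)
    if m["m00"] == 0:  # marker not visible in this frame
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]
```

A camera controller would then steer the robotic arm so that the detected centroid stays near the image center, keeping the marked instrument in view.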
Active (direct) tracking
For active tracking, the sensor that detects motion is attached directly onto the instrument (acoustic and electromagnetic tracking). Since mounting additional sensors and cabling onto sterile instruments interferes with the surgical workflow, direct instrument tracking as well as purely mechanical tracking systems are not suitable for application in minimally invasive surgical procedures.
However, eye-gaze tracking techniques, which also belong to the group of active (direct) tracking, may allow direct control of a robotic camera holder without a joystick, foot pedal or speech recognition. To date, eye tracking has mainly been used in non-medical fields, e.g. in consumer behavior studies or visual marketing [8].
To evaluate the potential of direct and indirect tracking techniques for controlling a robotic camera assistance system, we set up a pilot study investigating the performance of an eye-tracking and a color-code tracking system.
Materials and Methods
In our study, the robotic camera holder SOLOASSIST (AKTORMed, Barbing, Germany) was used in an ex-vivo setting with a standard laparoscopy unit attached to it. For maneuvering the arm, either the appropriate joystick (reference standard), an eye-tracking system or a color-code tracking system was used. For eye-tracking, we deployed a head-mounted system (Ergoneers GmbH, Geretsried, Germany) connected to the SOLOASSIST control interface, allowing the steering commands “up – down – left – right” according to the eye movement. For color-code tracking, a standard laparoscopic instrument was equipped with a color marker that was detected within the laparoscopic video by specially developed tracking software; the SOLOASSIST moved according to the instrument movement in the video (Figure 1).
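How a tracked 2-D point (the gaze position for eye-tracking, or the marker centroid for color-code tracking) could be translated into the four discrete steering commands is sketched below. The dead-zone width and the axis-dominance rule are assumptions for illustration, not the system's documented control law.

```python
DEAD_ZONE = 0.15  # hypothetical: no movement while the point is near the center

def steering_command(x, y):
    """Map a point in normalized image coordinates ([0, 1] x [0, 1],
    origin top-left) to 'up', 'down', 'left', 'right' or None (hold)."""
    dx, dy = x - 0.5, y - 0.5
    if max(abs(dx), abs(dy)) < DEAD_ZONE:
        return None  # point is centered: keep the camera still
    if abs(dx) >= abs(dy):  # horizontal deviation dominates
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"  # image y-axis points downward
```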
Twenty participants (thirteen clinicians and seven engineers) were included in the study and asked to navigate the SOLOASSIST with the attached laparoscope to five consecutively numbered markers inside the abdominal cavity of the OR phantom ELITE (CLA, Coburg, Germany), using either eye- (direct) or color-code (indirect) tracking. Additionally, all participants had to perform the task using the joystick interface (reference measurement). Each participant carried out three repetitions with each navigation modality; the task order was randomly assigned. The ELITE phantom has been proven an effective training unit for MIS in previous studies [9]. Elapsed time per run was documented, and a system evaluation was performed using the system usability scale (SUS; rating of effectiveness, efficiency and satisfaction of each system). A SUS score between 60 and 80 pts. is considered “average to good” for usability of a system [10]. The SUS in our study consists of a 10-item questionnaire with five response options per item, ranging from strongly disagree to strongly agree (Table 1).
| Item | I think/found … | Eye-tracking | Color-code tracking | Joystick control |
|---|---|---|---|---|
| Q1 | … that I would like to use this system frequently | 3.35 | 3.6 | 4.5 |
| Q2 | … the system unnecessarily complex | 2.5 | 1.8 | 1.3 |
| Q3 | … the system was easy to use | 2.9 | 3.9 | 4.7 |
| Q4 | I would need the support of a technical person to be able to use this system | 3.3 | 2.2 | 1.8 |
| Q5 | The various functions in this system were well integrated | 3.3 | 3.2 | 4.5 |
| Q6 | … there was too much inconsistency in the system | 2.9 | 2.3 | 1.5 |
| Q7 | … that most people would learn to use this system very quickly | 3.4 | 4.3 | 4.9 |
| Q8 | … the system very cumbersome to use | 2.6 | 2.3 | 2.0 |
| Q9 | I felt very confident using the system | 3.2 | 3.4 | 4.4 |
| Q10 | I needed to learn a lot of things before I could get going with this system | 2.5 | 1.7 | 1.5 |
Table 1: System usability scale (SUS). Questionnaire consisting of ten distinct questions for usability evaluation of eye-tracking, color-code tracking and joystick control (modified according to Brooke [10]). Values are mean ratings on a Likert scale ranging from strongly disagree (1 point) to strongly agree (5 points).
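The 0–100 SUS values reported in the Results are derived from 1–5 ratings such as those above using Brooke's standard scoring scheme [10]: odd-numbered (positively worded) items contribute (rating − 1) points, even-numbered (negatively worded) items contribute (5 − rating) points, and the sum is multiplied by 2.5. A minimal sketch:

```python
def sus_score(ratings):
    """Convert ten 1-5 Likert ratings into a 0-100 SUS score (Brooke, 1996)."""
    assert len(ratings) == 10
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(ratings, start=1))
    return total * 2.5

# Illustration with the mean joystick ratings from Table 1 (the published
# score of 87 pts. was computed per participant, not from these means):
print(sus_score([4.5, 1.3, 4.7, 1.8, 4.5, 1.5, 4.9, 2.0, 4.4, 1.5]))  # 87.25
```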
Results
Elapsed time for task completion
Control of the SOLOASSIST by joystick revealed a total task completion time (mean ± SD) of 81 ± 32.1 sec., by eye-tracking of 124.3 ± 68.1 sec. and by color-code tracking of 114.2 ± 59.1 sec. Minimum elapsed times were 26.6, 48.1 and 32.7 sec., and maximum times 142.7, 300.1 and 299.7 sec., respectively (Table 2). All participants completed the task; no hardware failure occurred. Camera control by joystick was quicker than eye-tracking (p = 0.36) or color-code tracking (p = 0.17), but the differences were not statistically significant. No statistically significant difference between the two tracking modalities was noted (p = 0.36).
| Method | N | Mean | Std. Deviation | Minimum | Maximum |
|---|---|---|---|---|---|
| Eye tracking | 20 | 124.2575 | 68.06260 | 48.10 | 300.10 |
| Color tracking | 20 | 114.2345 | 59.08345 | 32.70 | 299.70 |
| Joystick | 20 | 80.9670 | 32.12951 | 26.60 | 142.70 |
| Total | 60 | 106.4863 | 57.41605 | 26.60 | 300.10 |
Table 2: Elapsed time (sec.) until task completion, differentiated according to the control interface of the robotic camera assistance (joystick, eye (direct) and color-code (indirect) tracking): mean ± standard deviation (SD), minimum and maximum. Each participant carried out three repetitions with each navigation modality.
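The paper does not state which statistical test produced the reported p-values; a plausible reconstruction of the summary statistics and a two-sample comparison, assuming Welch's t-test, is sketched below. The function and variable names are illustrative.

```python
import numpy as np
from scipy import stats

def summarize_and_compare(times_a, times_b):
    """Print mean ± SD for two samples of completion times (sec.) and
    return the p-value of a two-sided Welch t-test (assumed test; the
    study does not report which test was actually used)."""
    for name, t in (("A", times_a), ("B", times_b)):
        t = np.asarray(t, dtype=float)
        print(f"{name}: {t.mean():.1f} ± {t.std(ddof=1):.1f} sec. "
              f"(min {t.min():.1f}, max {t.max():.1f})")
    _, p_value = stats.ttest_ind(times_a, times_b, equal_var=False)
    return p_value
```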
The joystick control interface achieved a SUS of 87 ± 11.8 pts., color-code tracking of 70.1 ± 20 pts. and eye-tracking of 55.9 ± 23.1 pts. (Table 3). Subgroup analysis revealed that 80% of participants (n = 16) ranked the joystick as the best control interface and 20% (n = 4) preferred color-code tracking. A SUS below 45 pts. was noted twice for color-code tracking and eight times for eye-tracking.
| Method | N | Mean | Std. Deviation | Minimum | Maximum |
|---|---|---|---|---|---|
| Eye tracking | 20 | 55.875 | 23.0884 | 7.5 | 92.5 |
| Color tracking | 20 | 70.125 | 20.0406 | 27.5 | 100.0 |
| Joystick | 20 | 87.000 | 11.8265 | 57.5 | 100.0 |
Table 3: System usability scale (SUS) differentiated according to the control interface of the robotic camera assistance (joystick, eye (direct) and color-code (indirect) tracking): mean (pts.) ± standard deviation (SD), minimum and maximum. A SUS score between 60 and 80 is considered “average to good” for usability of a system.
Joystick control would be used frequently for controlling the robotic camera holder by 90% of participants, eye- or color-code tracking by 50%. Eye-tracking was considered complex to use by almost 50% of participants, color-code tracking by 20% and joystick control by only 5%. Technical support was requested by 50% of participants while using the eye-tracking system, by 45% while using color-code tracking and by only 20% during joystick control. Details of the participants’ answers to the individual questions are presented in Table 1.
Comparison of eye-tracking vs. joystick revealed a distinct SUS answer deviation (Likert scale: range 1 point (strongly disagree) to 5 points (strongly agree)) in question 3, “system easy to use” (2.9 vs. 4.7 points), question 4, “support is needed” (3.3 vs. 1.8 points), and question 7, “system easy to learn” (3.4 vs. 4.9 points). Comparison of eye-tracking vs. color-code tracking showed an answer deviation in favor of the latter for these three questions.
Although both tracking modalities were appraised as inferior to joystick control ex vivo, debriefing of the participants revealed that during real surgery eye- and color-code tracking would be the preferred control interfaces for the SOLOASSIST.
Discussion
Maneuvering the SOLOASSIST by eye- and color-code tracking proved feasible. Although the system usability scale (SUS) scored highest for joystick control, participants would prefer eye- or color-code tracking for application during real surgery. Both direct (eye) and indirect (color-code) tracking revealed good results in terms of elapsed time per task and SUS, although joystick control scored best. Further efforts, e.g. the use of remote eye-tracking, should be made to close this gap, as the eye-tracking technique itself was considered satisfactory; the head-mounted system used in this study, however, was perceived as disturbing by most participants.
Analysis of the SUS questionnaires revealed that, compared to joystick control, the head-mounted eye-tracking system was considered more difficult to use, required more technical support and was judged to have a longer learning curve. These results can be explained by the participants’ complaints about mounting the eye-tracking system on the head (Figure 2).
As outlined above, the development of “intuitive and autonomous” control interfaces would be most valuable for enhancing the clinical application of robotic camera holders and decreasing the additional workload [5], and tracking systems seem to offer potential solutions [6]. The development of innovative control interfaces for robotic camera assistance is therefore currently in the focus of different research groups. Tadano and Kawashima [11] reported on a system using the surgeon’s head movements for camera navigation (direct tracking).
Although most currently available systems for controlling robotic camera holders are still in the laboratory research phase, user feedback is promising and justifies intensified research efforts [12].
Acknowledgments
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
References
1. Feussner H, Siewert J (2001) Reduktion des Zugangstraumas: gesicherte Vorteile [Reduction of access trauma: proven benefits]. Der Chirurg 72: 236-244.
2. Holländer S, Klingen H, Fritz M, Djalali P, Birk D (2014) Robotic Camera Assistance and Its Benefit in 1033 Traditional Laparoscopic Procedures: Prospective Clinical Trial Using a Joystick-guided Camera Holder. Surg Technol Int 25: 19-23.
3. Kwok KW, Sun LW, Mylonas GP, James DR, Orihuela-Espina F, et al. (2012) Collaborative gaze channelling for improved cooperation during robotic assisted surgery. Ann Biomed Eng 40: 2156-2167.
4. Gillen S, Pletzer B, Heiligensetzer A (2014) Solo-surgical laparoscopic cholecystectomy with a joystick-guided camera device: a case-control study. Surg Endosc 28: 164-170.
5. Kranzfelder M, Schneider A, Fiolka A (2015) What Do We Really Need? Visions of an Ideal Human-Machine Interface for NOTES Mechatronic Support Systems From the View of Surgeons, Gastroenterologists, and Medical Engineers. Surg Innov 22: 432-440.
6. Chmarra M, Grimbergen C, Dankelman J (2007) Systems for tracking minimally invasive surgical instruments. Minimally Invasive Therapy & Allied Technologies 16: 328-340.
7. Oropesa I, Sánchez-González P, Chmarra MK (2013) EVA: Laparoscopic instrument tracking based on endoscopic video analysis for psychomotor skills assessment. Surg Endosc 27: 1029-1039.
8. Wedel M, Pieters R (2008) Eye tracking for visual marketing. Now Publishers Inc, Boston.
9. Fiolka A, Gillen S, Meining A, Feussner H (2010) ELITE - The ex vivo training unit for NOTES: Development and Validation. Minimally Invasive Therapy & Allied Technologies 19: 281-286.
10. Brooke J (1996) SUS - A quick and dirty usability scale. Usability Evaluation in Industry 189: 4-7.
11. Tadano K, Kawashima K (2015) A pneumatic laparoscope holder controlled by head movement. Int J Med Robot 11: 331-340.
12. Klausen A, Rohrig R, Lipprandt M (2016) Feasibility of Eyetracking in Critical Care Environments - A Systematic Review. Stud Health Technol Inform 228: 604-608.