Detailed Information

Cited 0 times in Web of Science; cited 3 times in Scopus

A GPU-accelerated model-based tracker for untethered submillimeter grippers

Full metadata record
DC Field: Value
dc.contributor.author: Scheggi, Stefano
dc.contributor.author: Yoon, ChangKyu
dc.contributor.author: Ghosh, Arijit
dc.contributor.author: Gracias, David H.
dc.contributor.author: Misra, Sarthak
dc.date.available: 2021-02-22T08:46:34Z
dc.date.issued: 2018-05
dc.identifier.issn: 0921-8890
dc.identifier.issn: 1872-793X
dc.identifier.uri: https://scholarworks.sookmyung.ac.kr/handle/2020.sw.sookmyung/4508
dc.description.abstract: Miniaturized grippers with an untethered structure are suited to a wide range of tasks, from micromanipulation and microassembly to minimally invasive surgical interventions. To perform such tasks robustly, it is critical to properly estimate their overall configuration. Previous studies on tracking and control of miniaturized agents estimated mainly their 2D pixel position, mostly using cameras and optical images as a feedback modality. This paper presents a novel solution to the problem of estimating and tracking the 3D position, orientation, and tip configuration of submillimeter grippers from marker-less visual observations. We cast this as an optimization problem, which is solved using a variant of the Particle Swarm Optimization algorithm. The proposed approach has been implemented on a Graphics Processing Unit (GPU), which allows a user to track the submillimeter agents online. It has been evaluated on several image sequences obtained from a camera and on B-mode ultrasound images obtained from an ultrasound probe. The sequences show the grippers moving, rotating, opening/closing, and grasping biological material. Qualitative results obtained using both hydrogel (soft) and metallic (hard) grippers with different shapes and sizes, ranging from 750 microns to 4 mm (tip to tip), demonstrate the capability of the proposed method to track the agent in all the video sequences. Quantitative results obtained by processing synthetic data reveal a tracking position error of 25 and an orientation error of 1.7 ± 1.3 degrees. We believe the proposed technique can be applied to different stimuli-responsive miniaturized agents, allowing the user to estimate the full configuration of complex agents from visual marker-less observations.
dc.format.extent: 11
dc.language: English
dc.language.iso: ENG
dc.publisher: Elsevier BV
dc.title: A GPU-accelerated model-based tracker for untethered submillimeter grippers
dc.type: Article
dc.publisher.location: Netherlands
dc.identifier.doi: 10.1016/j.robot.2017.11.003
dc.identifier.scopusid: 2-s2.0-85044867837
dc.identifier.wosid: 000430764100009
dc.identifier.bibliographicCitation: Robotics and Autonomous Systems, v.103, pp. 111-121
dc.citation.title: Robotics and Autonomous Systems
dc.citation.volume: 103
dc.citation.startPage: 111
dc.citation.endPage: 121
dc.description.isOpenAccess: N
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.identifier.url: http://www.sciencedirect.com/science/article/pii/S0921889017304499
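The abstract above casts gripper pose estimation as an optimization solved by a Particle Swarm Optimization variant. The paper's GPU implementation and image-similarity cost are not reproduced here; below is only a minimal sketch of a basic PSO search over a 7-D pose vector (3D position, three orientation angles, tip opening angle), where the quadratic cost function and all parameter values are illustrative stand-ins:

```python
# Minimal sketch of a Particle Swarm Optimization (PSO) pose search.
# The cost function is a hypothetical stand-in for a model-based
# image-similarity term; hyperparameters are common textbook defaults.
import numpy as np

def pso_minimize(cost, lower, upper, n_particles=64, n_iters=100,
                 w=0.72, c1=1.49, c2=1.49, seed=0):
    """Return the best parameter vector found by a basic PSO."""
    rng = np.random.default_rng(seed)
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    dim = lower.size
    pos = rng.uniform(lower, upper, size=(n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()                                  # personal bests
    pbest_cost = np.array([cost(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()           # global best
    for _ in range(n_iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Velocity update: inertia + cognitive + social terms.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lower, upper)          # stay in bounds
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved] = pos[improved]
        pbest_cost[improved] = costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest

# Toy usage: recover a known 7-D "pose" by minimizing squared distance.
true_pose = np.array([0.5, -0.2, 1.0, 0.1, -0.3, 0.2, 0.7])
cost = lambda p: float(np.sum((p - true_pose) ** 2))
est = pso_minimize(cost, lower=[-2.0] * 7, upper=[2.0] * 7)
```

In the tracker described by the abstract, the cost would instead compare a rendered 3D gripper model at a candidate pose against the camera or B-mode ultrasound image, with the per-particle evaluations parallelized on the GPU.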
Appears in Collections
College of Engineering > Division of Mechanical Systems Engineering > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Yoon, Chang Kyu
College of Engineering (Division of Mechanical Systems Engineering)
