Developing and Evaluating a Low-Cost Tracking Method Based on a Single Camera and a Large Marker



Rahchamani M (1); Soboute MI (2); Samadzadehaghdam N (1); Abadi BM (1)
Affiliations
  1. Dept. of Medical Physics and Engineering, Tehran University of Medical Sciences (TUMS), Tehran, Iran
  2. Dept. of Nuclear Physics, Payame Noor University, Damavand, Iran

Source: 2018 25th Iranian Conference on Biomedical Engineering and 2018 3rd International Iranian Conference on Biomedical Engineering (ICBME 2018). Published: 2018


Abstract

Camera pose estimation is an important problem in many applications that require localization of cameras, devices, or instruments, such as robotics, surgical operations, and augmented reality. It is important to provide a cost-effective, real-time, accurate, and easy-to-use system for pose estimation. Two kinds of optical tracking methods are employed by camera pose estimation algorithms: model-based and feature-based methods. Here, we developed a feature-based camera pose estimation method utilizing just a single camera and a large marker. The keypoint features of the scene image and the marker are detected by the Speeded Up Robust Features (SURF) detector. Their descriptors are then extracted by the Scale Invariant Feature Transform (SIFT) and matched using brute-force (BF) matching. A perspective transform is assumed to map the coordinates of the image keypoints to the coordinates of the corresponding 3D points on the marker. This problem is solved using OpenCV functions, and the final camera pose matrix is obtained. To evaluate the proposed method, we developed a 3D-printed calibrator with known placeholder positions. The proposed system can be realized using a smartphone camera (in webcam mode) and a large marker on the wall. As the results show, the proposed method achieves acceptable accuracy, namely an average error of approximately 1.4 cm in position and 0.02 rad in orientation. © 2018 IEEE.