Volume 7, Issue 3 (September 2011)                   IJEEE 2011, 7(3): 168-178

Kasaei S, Shabani Nia E. Moving Vehicle Tracking Using Disjoint View Multicameras. IJEEE. 2011; 7(3): 168-178
URL: http://ijeee.iust.ac.ir/article-1-346-en.html
Abstract:
Multicamera vehicle tracking is a necessary component of any video-based intelligent transportation system, used to extract traffic parameters such as link travel times and origin/destination counts. In many applications, traffic cameras must be deployed with disjoint, non-overlapping views in order to cover a wide area. This paper presents a method for tracking moving vehicles in such camera networks. For the single-camera tracking phase, it introduces a new approach to handling inter-object occlusions, the most challenging part of that phase: the silhouettes of moving objects are chain-coded before and after occlusion, and occluded vehicles are separated by computing the longest common substring of the corresponding chain codes. In addition, to improve tracking accuracy in the multicamera phase, a new feature based on the relationships among surrounding vehicles is formulated. This feature complements appearance (or space-time) features when they cannot discriminate between corresponding and non-corresponding vehicles due to noise or the dynamic conditions of traffic scenes. A graph-based approach is then used to track vehicles across the camera network. Experimental results demonstrate the efficiency of the proposed methods.
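The occlusion-handling step described above matches silhouettes by the classic longest-common-substring computation over chain-code strings. A minimal sketch of that computation follows; the Freeman chain-code strings here are hypothetical illustrations, not data from the paper, and this is a standard dynamic-programming routine rather than the authors' implementation.

```python
def longest_common_substring(a: str, b: str) -> str:
    """Longest contiguous substring shared by two chain-code strings."""
    best_len, best_end = 0, 0
    # dp[j] = length of the common suffix of a[:i] and b[:j]
    dp = [0] * (len(b) + 1)
    for i in range(1, len(a) + 1):
        prev = 0  # dp[j-1] from the previous row
        for j in range(1, len(b) + 1):
            cur = dp[j]
            if a[i - 1] == b[j - 1]:
                dp[j] = prev + 1
                if dp[j] > best_len:
                    best_len, best_end = dp[j], i
            else:
                dp[j] = 0
            prev = cur
    return a[best_end - best_len:best_end]

# Hypothetical 8-directional Freeman chain codes of a vehicle silhouette
# before and after an occlusion event:
code_before = "00112233445566"
code_after = "7700112255"
print(longest_common_substring(code_before, code_after))  # → "001122"
```

A long shared substring indicates that a contiguous stretch of the silhouette boundary survived the occlusion, which is what lets the merged blob be split back into its constituent vehicles.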
Full-Text [PDF 1121 kb]
Type of Study: Research Paper | Subject: Multimedia Systems
Received: 2010/11/14 | Accepted: 2011/07/18 | Published: 2013/12/30

© 2019 by the authors. Licensee IUST, Tehran, Iran. This is an open access journal distributed under the terms and conditions of the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license.