Volume 16, Issue 4 (December 2020) | IJEEE 2020, 16(4): 461-473

Zabihi S M, Ghanei-Yakhdan H, Mehrshad N. An Improved Motion Vector Estimation Approach for Video Error Concealment Based on the Video Scene Analysis. IJEEE. 2020; 16 (4) :461-473
URL: http://ijeee.iust.ac.ir/article-1-1704-en.html
Abstract:
To enhance the accuracy of motion vector (MV) estimation and to reduce error propagation during the estimation, this paper proposes a new adaptive error concealment (EC) approach based on information extracted from the video scene. First, the motion information of the scene around the degraded macroblock (MB) is analyzed to estimate the motion type of the degraded MB. If the neighboring MBs exhibit uniform motion, the degraded MB imitates their behavior by adopting the MV of the collocated MB. Otherwise, the lost MV is estimated by the second proposed EC technique, the IOBMA. Unlike conventional EC techniques based on the boundary matching criterion, the IOBMA not only evaluates each boundary distortion over both the luminance and chrominance components of the boundary pixels, but also computes the total boundary distortion of each candidate MV as a weighted average of the available boundary distortions. Simulation results show that the proposed approach outperforms state-of-the-art EC techniques in both objective and subjective quality assessments.
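The boundary-matching candidate selection described in the abstract can be sketched as follows. This is a minimal illustration of outer-boundary matching with a plain average over the available sides, not the authors' IOBMA: the paper's technique additionally scores distortion over the chrominance components and uses non-uniform weights, whereas this sketch is luminance-only with uniform weights. All function and variable names are hypothetical.

```python
# Hedged sketch (assumption, not the authors' code): for each candidate MV,
# compare the outer boundary pixels of the lost macroblock in the current
# frame (taken from correctly received neighbors) with the corresponding
# outer boundary pixels of the candidate block in the reference frame, and
# pick the candidate with the smallest average side distortion. Sides whose
# neighbor MB is also lost are skipped, mirroring the "available boundary
# distortions" idea in the abstract.

def sad(a, b):
    """Sum of absolute differences between two equal-length pixel rows."""
    return sum(abs(x - y) for x, y in zip(a, b))

def boundary_distortion(ref, frame, mb_x, mb_y, mv, size, available):
    """Average outer-boundary distortion for one candidate MV.

    ref, frame : 2-D lists of luma samples (reference / current frame).
    (mb_x, mb_y): top-left corner of the lost macroblock in `frame`.
    mv         : (dx, dy) candidate motion vector.
    available  : dict side -> bool, True if that neighbor MB was received.
    """
    dx, dy = mv
    rx, ry = mb_x + dx, mb_y + dy  # candidate block position in reference
    sides = []
    if available.get("top"):
        sides.append(sad(frame[mb_y - 1][mb_x:mb_x + size],
                         ref[ry - 1][rx:rx + size]))
    if available.get("bottom"):
        sides.append(sad(frame[mb_y + size][mb_x:mb_x + size],
                         ref[ry + size][rx:rx + size]))
    if available.get("left"):
        sides.append(sad([frame[y][mb_x - 1] for y in range(mb_y, mb_y + size)],
                         [ref[y][rx - 1] for y in range(ry, ry + size)]))
    if available.get("right"):
        sides.append(sad([frame[y][mb_x + size] for y in range(mb_y, mb_y + size)],
                         [ref[y][rx + size] for y in range(ry, ry + size)]))
    # Uniform weights here; the paper uses a weighted average instead.
    return sum(sides) / len(sides) if sides else float("inf")

def estimate_mv(ref, frame, mb_x, mb_y, size, candidates, available):
    """Return the candidate MV with the smallest boundary distortion."""
    return min(candidates,
               key=lambda mv: boundary_distortion(ref, frame, mb_x, mb_y,
                                                  mv, size, available))
```

On a synthetic frame pair where the content is displaced by exactly one candidate MV, that candidate yields zero outer-boundary distortion and is selected, which is the property the boundary matching criterion exploits.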
Full-Text [PDF 1263 kb]
Type of Study: Research Paper | Subject: Image Processing
Received: 2019/11/06 | Revised: 2020/03/23 | Accepted: 2020/05/28

© 2020 by the authors. Licensee IUST, Tehran, Iran. This is an open access journal distributed under the terms and conditions of the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license.