The MOtion-tuned Video Integrity Evaluation (MOVIE) index is a model and set of algorithms for predicting the perceived quality of digital television and cinematic pictures, as well as other kinds of digital images and videos.
It was developed by Kalpana Seshadrinathan and Alan Bovik in the Laboratory for Image and Video Engineering (LIVE) at The University of Texas at Austin, and was described in the 2010 technical paper "Motion Tuned Spatio-Temporal Quality Assessment of Natural Videos".[1] The original MOVIE paper received an IEEE Signal Processing Society Best Journal Paper Award in 2013.
Model overview
The MOVIE index is a neuroscience-based model for predicting the perceptual quality of a (possibly compressed or otherwise distorted) motion picture or video against a pristine reference video. The MOVIE index is thus a full-reference metric. The MOVIE model differs from many other models in that it is built on models of how the human brain processes visual signals at various stages along the visual pathway, including the lateral geniculate nucleus, the primary visual cortex (area V1), and the motion-sensitive extrastriate visual area MT.
MOVIE processes spatial and temporal motion picture information in an approximately separable manner. Spatial MOVIE predicts the spatial (frame) quality of a video by computing a space-time frequency decomposition of both the reference and the test (distorted) videos using a Gabor filter bank. Following divisive normalization, based on a model of cortical (area V1) processing in the brain, the processed reference and test videos are combined in a weighted difference to produce a prediction of spatial picture quality.
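The spatial pathway can be illustrated with a heavily simplified, frame-level sketch. The toy 2-D Gabor bank, the filter parameters, and the normalization constant `k` below are illustrative assumptions, not the published implementation, which operates on a full space-time (3-D) Gabor decomposition:

```python
import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(freq, theta, sigma=2.0, size=11):
    """Real (cosine-phase) 2-D Gabor kernel at spatial frequency `freq`
    and orientation `theta`. Illustrative parameters, not the paper's bank."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * freq * xr)

def spatial_movie_sketch(ref, dist, freqs=(0.1, 0.2), n_orient=4, k=0.1):
    """Toy per-frame spatial distortion score: Gabor responses of the
    reference and distorted frames are differenced after divisive
    normalization by local response energy (plus a saturation constant k).
    Returns 0 for identical frames; larger values mean more distortion."""
    err, count = 0.0, 0
    for f in freqs:
        for i in range(n_orient):
            g = gabor_kernel(f, i * np.pi / n_orient)
            r = convolve2d(ref, g, mode="valid")
            d = convolve2d(dist, g, mode="valid")
            # Divisive normalization: squared response differences are
            # scaled by the local response energy of both signals.
            err += np.sum((r - d) ** 2 / (r ** 2 + d ** 2 + k))
            count += r.size
    return err / count
```

Distorting a frame (for instance, by adding noise) raises the score, while an undistorted frame scores exactly zero, mirroring how the spatial component penalizes deviations from the reference.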
At the same time, a prediction of the temporal (time-varying, or inter-frame) motion picture quality is computed from the responses of the same Gabor space-time frequency decomposition of the reference and test videos, but in a different manner. Temporal MOVIE applies an excitatory-inhibitory weighting to the Gabor responses that motion-tunes them in accordance with a local measurement of video motion. The motion measurements are made with the same space-time filter bank, via a perceptually relevant, phase-based optical flow computation. These measurements on the reference and test videos are then differentially combined and divisively normalized to produce a prediction of temporal picture quality.
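The idea behind motion tuning is that an image patch translating at velocity v occupies a plane in space-time frequency, so filters whose center frequencies lie near that plane should be emphasized (excited) and filters far from it suppressed (inhibited). A minimal sketch of such a weight, with an assumed Gaussian fall-off rather than the paper's exact weighting function, might look like:

```python
import numpy as np

def motion_tuned_weight(u0, v, sigma=1.0):
    """Excitatory-inhibitory weight for a space-time Gabor filter with
    center frequency u0 = (ux, uy, ut), given local image velocity
    v = (vx, vy). A translating patch concentrates its spectrum on the
    plane ut = -(vx*ux + vy*uy); filters near that plane get weights
    near +1 (excitatory), distant filters approach -1 (inhibitory).
    The Gaussian fall-off with scale `sigma` is an illustrative choice."""
    ux, uy, ut = u0
    # Perpendicular distance of the filter center from the motion plane.
    dist = abs(ut + v[0] * ux + v[1] * uy) / np.sqrt(1.0 + v[0] ** 2 + v[1] ** 2)
    # Map distance to a weight in (-1, 1]: +1 on the plane, negative far away.
    return 2.0 * np.exp(-dist ** 2 / (2.0 * sigma ** 2)) - 1.0
```

For example, with rightward motion v = (1, 0), a filter centered on the corresponding spectral plane receives the maximal weight, while a purely temporal-frequency filter far off the plane receives a negative (inhibitory) weight.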
The overall MOVIE index is then defined as the simple product of the Spatial and Temporal MOVIE indices, pooled over time (frames).
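The final combination is a simple product of the two pooled components. Assuming, purely for illustration, a mean over per-frame scores (the published model uses its own pooling statistic), the combination reduces to:

```python
import numpy as np

def movie_index(spatial_scores, temporal_scores):
    """Overall MOVIE index as the product of the temporally pooled
    Spatial and Temporal components. Mean pooling over frames is an
    illustrative simplification, not the paper's pooling statistic."""
    return float(np.mean(spatial_scores) * np.mean(temporal_scores))
```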
Performance
According to the original paper, the MOVIE index delivers better perceptual motion picture quality predictions than traditional methods such as the peak signal-to-noise ratio (PSNR) and mean squared error (MSE), which are inconsistent with human visual perception.[1] In the same paper, the authors also show that it outperforms other video quality models, such as the ANSI/ISO standard VQM and the popular Structural Similarity (SSIM) model, in terms of motion picture quality prediction.
In another comparison, the MOVIE Index topped other models in terms of correlation with human judgments of motion picture quality on the LIVE Video Quality Database, which is a tool for assessing the accuracy of picture quality models.[2]
Usage
The MOVIE index is commercially marketed as part of the Video Clarity line of video quality measurement tools, which are used throughout the television and motion picture industries.
References
1. Seshadrinathan, K.; Bovik, A. C. (February 2010). "Motion Tuned Spatio-Temporal Quality Assessment of Natural Videos". IEEE Transactions on Image Processing. 19 (2): 335–350. CiteSeerX 10.1.1.153.9018. doi:10.1109/TIP.2009.2034992. ISSN 1057-7149. PMID 19846374.
2. Seshadrinathan, K.; Soundararajan, R.; Bovik, A. C.; Cormack, L. K. (June 2010). "Study of Subjective and Objective Quality Assessment of Video". IEEE Transactions on Image Processing. 19 (6): 1427–1441. doi:10.1109/TIP.2010.2042111. ISSN 1057-7149. PMID 20129861.