Automatic Video Quality Assessment (VQA) is an important component of modern video compressors and streaming applications. Motion plays a central role in video, and therefore has a substantial influence on perceived video quality.
Yet most prevalent VQA algorithms today do not explicitly account for motion.
Our goal is to examine the influence of motion on the way humans perceive video quality, and to develop a new full-reference VQA algorithm that achieves high correlation with subjective human scores.
During the project, we proposed new methods and examined existing ones.
The algorithms were implemented in MATLAB and compared on the LIVE and Netflix datasets.
Because of the differences between the datasets, a separate method was proposed for each; both achieved high correlation in the tests we performed.
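Correlation with subjective scores is commonly measured with the Spearman rank-order correlation coefficient (SROCC) between an algorithm's predictions and the subjective mean opinion scores (MOS). As a minimal sketch of this evaluation step (the original work used MATLAB; the score values below are hypothetical and the formula assumes no tied ranks):

```python
def ranks(xs):
    # Assign rank 1..n by ascending value (no ties assumed in this sketch).
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def srocc(pred, mos):
    # Spearman rank-order correlation: 1 - 6*sum(d^2) / (n*(n^2-1)).
    n = len(pred)
    rp, rm = ranks(pred), ranks(mos)
    d2 = sum((a - b) ** 2 for a, b in zip(rp, rm))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical predicted quality scores and subjective MOS values.
predicted = [62.1, 48.3, 71.9, 55.0, 80.4, 43.7]
mos       = [60.0, 50.5, 70.2, 61.0, 78.9, 45.0]

print(f"SROCC = {srocc(predicted, mos):.3f}")  # close to 1 indicates good agreement
```

An SROCC near 1 means the algorithm ranks the videos in nearly the same order as human viewers did, which is the usual success criterion for a VQA method.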