
Subjective assessment and instrumental prediction of mobile online gaming on the basis of perceptual dimensions

Motivation

The assessment of the quality perceived by the user (Quality of Experience, QoE) of pure audio and video material differs in many respects from the quality assessment of computer games. The latter involve a variety of additional factors due to their interactive nature. Not only do the characteristics of complex and innovative game systems have an impact on the QoE, but so do the players themselves. A quality judgment, which results from comparing the expected and the perceived composition of an entity, depends highly on the preferences, expectations, and abilities of the player.

In this still young area of research, standard methods for determining the QoE are not directly applicable. This becomes apparent when games are compared to task-oriented human-computer interaction, where a goal is to be achieved with minimal effort. In a game, by contrast, the player deliberately exerts effort in order to influence the outcome and thereby becomes emotionally involved. As a result, new concepts such as immersion and flow, a state of happiness arising from an equilibrium between competence and challenge, come into play.

An analysis of the gaming market shows that the proportion of mobile games has risen sharply in recent years. Mobile games are special in the sense that mobile devices such as smartphones or tablets were not originally designed for gaming and are therefore not optimally adapted to it. The same applies to the newer concept of cloud gaming, where the entire game is executed on a server and only the video and audio material is transferred to the end user.


Aim of the project

The aim of this research project is to develop methods to assess the QoE of mobile games. In addition, based on a database of subjective quality judgments, a model similar to the well-known E-model is to be constructed to predict the QoE. The following concrete steps are planned for this purpose:

  • Setup and modification of a testbed for conducting experiments, including a cloud gaming system for mobile games
  • Development of a classification of games in order to choose representative games and to identify system and user factors
  • Building a questionnaire covering a large space of relevant quality dimensions
  • Identification of quality-relevant perceptual dimensions and analysis of their impact on the overall quality
  • Analyzing the performance of current objective metrics, originally proposed for other contents and services, when applied to mobile gaming
  • Building a QoE model based on game, system, and network characteristics as well as user and context factors (a sketch of the intended analysis is given below this list)
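The last two steps can be illustrated with a small, purely hypothetical analysis script. The sketch below assumes a CSV export of the subjective database with made-up column names (mos, vmaf, psnr, ssim, bitrate_kbps, framerate, delay_ms, packet_loss); it first correlates objective metric scores with the mean opinion scores and then fits an ordinary least-squares model as a stand-in for the planned E-model-like QoE model. It is not project code.

```python
# Minimal sketch, not project code: correlate objective metric scores with
# subjective ratings, then fit a simple linear QoE model.
# The CSV file and all column names are hypothetical placeholders.

import pandas as pd
from scipy.stats import pearsonr, spearmanr
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

# One row per test condition: mean opinion score (MOS) plus the objective
# metric scores and system/network parameters measured for that condition.
df = pd.read_csv("subjective_ratings.csv")

# Step 1: performance of objective metrics against the subjective MOS.
for metric in ["vmaf", "psnr", "ssim"]:            # assumed column names
    plcc, _ = pearsonr(df[metric], df["mos"])      # linear correlation
    srocc, _ = spearmanr(df[metric], df["mos"])    # rank-order correlation
    print(f"{metric}: PLCC={plcc:.3f}, SROCC={srocc:.3f}")

# Step 2: a first, purely illustrative QoE model as a linear combination of
# system and network characteristics (ordinary least squares used here as a
# placeholder for the planned E-model-like impairment structure).
features = ["bitrate_kbps", "framerate", "delay_ms", "packet_loss"]
X, y = df[features], df["mos"]
model = LinearRegression().fit(X, y)
rmse = mean_squared_error(y, model.predict(X)) ** 0.5
print("Coefficients:", dict(zip(features, model.coef_.round(3))))
print(f"Training RMSE: {rmse:.3f} MOS")
```

In the actual project, such a model would additionally need to account for game, user, and context factors and would have to be validated on test conditions not used for fitting.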
Time Frame: 01/2016 – 06/2019
T-Labs Team Members: Steven Schmidt
Funding by: Deutsche Forschungsgemeinschaft (DFG)
Project Number: MO 1038/21-1

List of Publications

An Evaluation of Video Quality Assessment Metrics for Passive Gaming Video Streaming
Barman, Nabajeet; Schmidt, Steven; Zadtootaghaj, Saman; Martini, Maria G.; Möller, Sebastian. In: Proceedings of the 23rd Packet Video Workshop (PV '18), Amsterdam, Netherlands. New York, NY, USA: ACM, June 2018, pp. 1–6. ISBN 978-1-4503-5773-9. DOI: 10.1145/3210424.3210434.
Abstract: Video quality assessment is imperative to estimate and hence manage the Quality of Experience (QoE) in video streaming applications to the end-user. Recent years have seen a tremendous advancement in the field of objective video quality assessment (VQA) metrics, with the development of models that can predict the quality of the videos streamed over the Internet. However, no work so far has attempted to study the performance of such quality assessment metrics on gaming videos, which are artificial and synthetic and have different streaming requirements than traditionally streamed videos. Towards this end, we present in this paper a study of the performance of objective quality assessment metrics for gaming videos considering passive streaming applications. Objective quality assessment considering eight widely used VQA metrics is performed on a dataset of 24 reference videos and 576 compressed sequences obtained by encoding them at 24 different resolution-bitrate pairs. We present an evaluation of the performance behavior of the VQA metrics. Our results indicate that VMAF predicts subjective video quality ratings the best, while NIQE turns out to be a promising alternative as a no-reference metric in some scenarios.
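As a purely illustrative complement to the evaluation described in the abstract, the sketch below shows one common way to obtain a full-reference VMAF score for a single reference/compressed pair, using FFmpeg's libvmaf filter. The file names and the exact structure of the JSON log are assumptions and depend on the local FFmpeg/libvmaf build; this is not the tooling used in the paper.

```python
# Illustrative sketch: compute a pooled VMAF score for one video pair via
# FFmpeg's libvmaf filter (requires an FFmpeg build with libvmaf enabled).
# File names are hypothetical; the JSON log layout varies between versions.

import json
import subprocess

distorted = "encoded_720p_2000kbps.mp4"   # hypothetical compressed sequence
reference = "reference_1080p60.mp4"       # hypothetical source sequence
log_path = "vmaf_log.json"

# libvmaf convention in FFmpeg: the distorted video is the first input,
# the reference the second.
subprocess.run(
    ["ffmpeg", "-i", distorted, "-i", reference,
     "-lavfi", f"libvmaf=log_fmt=json:log_path={log_path}",
     "-f", "null", "-"],
    check=True,
)

with open(log_path) as f:
    log = json.load(f)

# Newer libvmaf versions report the pooled score under "pooled_metrics";
# older ones use a top-level "VMAF score" field. Adjust to your build.
pooled = log.get("pooled_metrics", {}).get("vmaf", {}).get(
    "mean", log.get("VMAF score"))
print("Mean VMAF over the sequence:", pooled)
```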
