
Subjective assessment and instrumental prediction of mobile online gaming on the basis of perceptual dimensions

Motivation

The assessment of the quality perceived by the user (Quality of Experience, QoE) of pure audio and video material differs in many ways from the quality assessment of computer games. The latter involve a variety of additional factors due to their interactive nature. Not only characteristics of complex and innovative game systems have an impact on the QoE, but also the players themselves. A quality judgment, which results from comparing the expected and perceived composition of an entity, depends highly on the preferences, expectations, and abilities of the player.

In this still young area of research, standard methods for determining the QoE are not directly applicable. In task-oriented human-computer interaction, the goal should be achieved with minimal effort; in a game, however, the player deliberately exerts effort to influence the outcome and thereby becomes emotionally engaged. Thus, new concepts appear, such as immersion and flow, a state of happiness experienced while being in an equilibrium between competence and challenge.

An analysis of the gaming market shows that the proportion of mobile games has risen sharply in recent years. Mobile games are special in the sense that mobile devices such as smartphones or tablets were originally not designed for games and are therefore not optimally adapted to them. This also applies to the new concept of cloud gaming, where the entire game is executed on a server and only the video and audio material is transferred to the end user.


Aim of the project

The aim of this research project is to develop methods to assess the QoE of mobile games. In addition, based on a database containing subjective quality judgments, a model similar to the well-known E-model is to be constructed to predict the QoE. The following concrete steps are planned for this purpose:

  • Setup and modification of a testbed for conducting experiments, including a cloud gaming system for mobile games
  • Development of a classification of games to choose representative games and identify system and user factors
  • Building a questionnaire covering a large space of relevant quality dimensions
  • Identification of quality-relevant perceptual dimensions and analysis of their impact on the overall quality
  • Analyzing how current objective metrics, which were proposed for other contents and services, perform in the mobile gaming context
  • Building a QoE model based on game, system and network characteristics as well as user and context factors
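The E-model (ITU-T Rec. G.107) predicts speech quality by starting from a basic transmission rating and subtracting additive impairment factors (R = Ro − Is − Id − Ie,eff + A). A gaming QoE model "similar to the E-model", as planned above, could follow the same additive structure. The sketch below is purely illustrative: the factor names, thresholds, and coefficients are invented for demonstration and are not results of this project.

```python
# Hypothetical additive impairment-factor model in the spirit of the E-model.
# All coefficients and thresholds below are invented for illustration only;
# a real model would fit them to the project's subjective-rating database.

def predict_gaming_qoe(delay_ms: float, packet_loss_pct: float,
                       framerate_fps: float) -> float:
    """Return a MOS-like score in [1, 5] from network/system factors."""
    r_max = 100.0                                    # base rating scale
    i_delay = 0.05 * max(delay_ms - 50.0, 0.0)       # impairment past 50 ms
    i_loss = 8.0 * packet_loss_pct                   # impairment per % loss
    i_framerate = 1.5 * max(30.0 - framerate_fps, 0.0)  # penalty below 30 fps
    r = max(r_max - i_delay - i_loss - i_framerate, 0.0)
    # Map the rating scale linearly onto a 5-point MOS for readability.
    return 1.0 + 4.0 * r / 100.0

# A well-provisioned session is rated near the top of the scale,
# while a degraded one loses rating points per impairment factor.
good = predict_gaming_qoe(delay_ms=40, packet_loss_pct=0.0, framerate_fps=60)
bad = predict_gaming_qoe(delay_ms=200, packet_loss_pct=2.0, framerate_fps=20)
```

The additive structure is what makes E-model-style approaches attractive: each impairment factor can be studied in a separate subjective experiment and then combined, which matches the per-factor study design described in the publication below.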
Time Frame: 
01/2016 - 06/2019
T-labs Team Members:
Steven Schmidt
Funding by:
Deutsche Forschungsgemeinschaft (DFG)
Project Number:
MO 1038/21-1

List of Publications

Assessing Interactive Gaming Quality of Experience using a Crowdsourcing Approach
Citation key schmidt2020a
Author Schmidt, Steven and Naderi, Babak and Sabet, Saeed Shafiee and Zadtootaghaj, Saman and Carsten and Möller, Sebastian
Title of Book 2020 Twelfth International Conference on Quality of Multimedia Experience (QoMEX)
Pages 1–6
Year 2020
ISBN 978-1-7281-5965-2
DOI 10.1109/QoMEX48832.2020.9123122
Location Athlone, Ireland
Month may
Publisher IEEE
Series QoMEX ’20
How Published Fullpaper
Abstract Traditionally, the Quality of Experience (QoE) is assessed in a controlled laboratory environment where participants give their opinion about the perceived quality of a stimulus on a standardized rating scale. Recently, the usage of crowdsourcing micro-task platforms for assessing media quality is increasing. Crowdsourcing platforms provide access to a geographically distributed and demographically diverse pool of workers who participate in the experiment in their own working environment and using their own hardware. The main challenge in crowdsourcing QoE tests is to control the effect of interfering influence factors such as a user's environment and device on the subjective ratings. While in the past the crowdsourcing approach was frequently used for speech and video quality assessment, research on quality assessment for gaming services is rare. In this paper, we present a method to measure gaming QoE under typically considered system influence factors including delay, packet loss, and framerates as well as different game designs. The factors are artificially manipulated through controlled changes in the implementation of the games. The results of a total of five studies using a developed evaluation method, based on a combination of ITU-T Rec. P.809 on subjective evaluation methods for gaming quality and ITU-T Rec. P.808 on subjective evaluation of speech quality with a crowdsourcing approach, will be discussed. To evaluate the reliability and validity of results collected using this method, we finally compare subjective ratings regarding the effect of network delay on gaming QoE gathered from interactive crowdsourcing tests with those from equivalent laboratory experiments.
