
Reviewed Conference Papers


Assessing Interactive Gaming Quality of Experience using a Crowdsourcing Approach
Citation key schmidt2020a
Author Schmidt, Steven and Naderi, Babak and Sabet, Saeed Shafiee and Zadtootaghaj, Saman and Griwodz, Carsten and Möller, Sebastian
Title of Book 2020 Twelfth International Conference on Quality of Multimedia Experience (QoMEX)
Pages 1–6
Year 2020
ISBN 978-1-7281-5965-2
DOI 10.1109/QoMEX48832.2020.9123122
Location Athlone, Ireland
Month May
Publisher IEEE
Series QoMEX ’20
How Published Fullpaper
Abstract Traditionally, Quality of Experience (QoE) is assessed in a controlled laboratory environment where participants rate the perceived quality of a stimulus on a standardized scale. Recently, the use of crowdsourcing micro-task platforms for media quality assessment has been increasing. These platforms provide access to a geographically distributed and demographically diverse pool of workers who participate in the experiment in their own environment and with their own hardware. The main challenge in crowdsourced QoE tests is controlling the effect of interfering influence factors, such as a user's environment and device, on the subjective ratings. While the crowdsourcing approach has frequently been used for speech and video quality assessment, research on quality assessment for gaming services is rare. In this paper, we present a method to measure gaming QoE under typically considered system influence factors, including delay, packet loss, and framerate, as well as different game designs. These factors are manipulated artificially through controlled changes to the implementation of the games. We discuss the results of a total of five studies that used an evaluation method combining ITU-T Rec. P.809 on subjective evaluation methods for gaming quality with ITU-T Rec. P.808 on subjective evaluation of speech quality using a crowdsourcing approach. To evaluate the reliability and validity of results collected with this method, we finally compare subjective ratings on the effect of network delay on gaming QoE gathered from interactive crowdsourcing tests with those from equivalent laboratory experiments.
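
The abstract states that system factors such as network delay are manipulated artificially through controlled changes to the game implementation. As a minimal illustrative sketch of how such a manipulation could work, the Python snippet below buffers input events and releases them only after a configurable artificial delay; the class name, event handling, and condition values are assumptions for illustration, not the authors' implementation.

import time
from collections import deque

# Example test conditions (milliseconds of artificial input delay);
# the specific values here are illustrative assumptions.
DELAY_CONDITIONS_MS = [0, 50, 100, 200]

class DelayedInputQueue:
    """Buffers input events and releases them after delay_ms milliseconds,
    mimicking network delay without a real network."""

    def __init__(self, delay_ms: int):
        self.delay_s = delay_ms / 1000.0
        self._buffer = deque()  # holds (arrival_timestamp, event) pairs

    def push(self, event) -> None:
        # Timestamp each event on arrival.
        self._buffer.append((time.monotonic(), event))

    def pop_ready(self) -> list:
        # Release only events whose artificial delay has elapsed.
        now = time.monotonic()
        ready = []
        while self._buffer and now - self._buffer[0][0] >= self.delay_s:
            ready.append(self._buffer.popleft()[1])
        return ready

In a test session, the game loop would drain pop_ready() once per frame, so under the 100 ms condition every player action takes effect 100 ms late, which is one way to realize the delay conditions described in the abstract.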

