Steven Schmidt
Research Fields
- Quality of Experience (QoE) for Cloud Gaming Services
- Engagement in Virtual Reality
Research Topics
- Identification and quantification of perceptual quality dimensions for gaming QoE
- Prediction of gaming QoE based on encoding and network parameters
- Classification of game content
- Crowdsourcing for gaming evaluation
Biography
Steven Schmidt received his M.Sc. degree in Electrical Engineering from TU Berlin with a major in Communication Systems. Since 2016, he has been employed as a research assistant at the Quality and Usability Lab, where he is working towards a PhD in the field of Quality of Experience in Mobile Gaming.
Projects
ITU-T SG12 Activities:
- ITU-T Rec. G.1032 - Influence Factors on Gaming Quality of Experience (2017)
- ITU-T Rec. P.809 - Subjective Evaluation Methods for Gaming Quality (2018)
- ITU-T Rec. G.1072 - Opinion Model Predicting Gaming QoE for Cloud Gaming Services (2020)
Address
Quality and Usability Lab
Technische Universität Berlin
Ernst-Reuter-Platz 7
D-10587 Berlin, Germany
Tel: +49 151 12044969
Publications
Citation key: schmidt2020a
Author: Schmidt, Steven and Naderi, Babak and Sabet, Saeed Shafiee and Zadtootaghaj, Saman and Griwodz, Carsten and Möller, Sebastian
Booktitle: 2020 Twelfth International Conference on Quality of Multimedia Experience (QoMEX)
Pages: 1–6
Year: 2020
ISBN: 978-1-7281-5965-2
DOI: 10.1109/QoMEX48832.2020.9123122
Location: Athlone, Ireland
Month: May
Publisher: IEEE
Series: QoMEX ’20
How published: Full paper
Abstract: Traditionally, Quality of Experience (QoE) is assessed in a controlled laboratory environment where participants give their opinion about the perceived quality of a stimulus on a standardized rating scale. Recently, the use of crowdsourcing micro-task platforms for assessing media quality has been increasing. Crowdsourcing platforms provide access to a pool of geographically distributed and demographically diverse workers who participate in the experiment in their own environment and using their own hardware. The main challenge in crowdsourced QoE tests is to control the effect of interfering influence factors, such as a user's environment and device, on the subjective ratings. While the crowdsourcing approach has frequently been used for speech and video quality assessment in the past, research on quality assessment for gaming services is rare. In this paper, we present a method to measure gaming QoE under typically considered system influence factors, including delay, packet loss, and framerate, as well as different game designs. These factors are artificially manipulated through controlled changes in the implementation of the games. We discuss the results of a total of five studies that use an evaluation method combining ITU-T Rec. P.809 on subjective evaluation methods for gaming quality with ITU-T Rec. P.808 on subjective evaluation of speech quality using a crowdsourcing approach. To evaluate the reliability and validity of the results collected with this method, we finally compare subjective ratings on the effect of network delay on gaming QoE gathered in interactive crowdsourcing tests with those from equivalent laboratory experiments.
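The abstract states that system factors such as delay are manipulated through controlled changes in the game implementations themselves rather than at the network level. The sketch below illustrates one way such an in-game manipulation could look: player inputs are held in a FIFO buffer until a configurable artificial delay has elapsed. This is a minimal, hypothetical illustration only; the class name, the 200 ms condition, and the event strings are assumptions and not the implementation used in the paper.

```python
# Illustrative sketch (not the authors' code): emulating network delay
# inside a game loop by buffering input events until an artificial
# delay has elapsed.
import time
from collections import deque

ARTIFICIAL_DELAY_S = 0.200  # hypothetical test condition, e.g. 200 ms


class DelayedInputQueue:
    """Releases input events only after a fixed artificial delay."""

    def __init__(self, delay_s: float):
        self.delay_s = delay_s
        self._buffer: deque = deque()  # (arrival_time, event) pairs

    def push(self, event) -> None:
        """Record an input event together with its arrival time."""
        self._buffer.append((time.monotonic(), event))

    def pop_ready(self) -> list:
        """Return all events whose artificial delay has elapsed."""
        now = time.monotonic()
        ready = []
        while self._buffer and now - self._buffer[0][0] >= self.delay_s:
            ready.append(self._buffer.popleft()[1])
        return ready


# Usage inside a (simplified) game loop:
queue = DelayedInputQueue(ARTIFICIAL_DELAY_S)
queue.push("jump")                  # player presses a key
time.sleep(0.25)                    # one or more frames pass
for event in queue.pop_ready():     # input takes effect ~200 ms later
    print("applying delayed input:", event)
```

Manipulating delay inside the game rather than on the network path keeps the condition identical across crowdworkers' heterogeneous connections, which is the point of the controlled-implementation approach described in the abstract.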