
Saman Zadtootaghaj


Research Field
- Assessing and predicting QoE of gaming applications

Research Topics

- Video quality assessment of computer-generated content (see the sketch after this list)

- Cloud Gaming Quality of Experience

- Deep learning-based quality assessment of image/video content

- Image/video quality enhancement
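
As a minimal illustration of full-reference video quality assessment for computer-generated content, the sketch below compares a reference gaming clip with a distorted (e.g., encoded or streamed) version frame by frame using PSNR and SSIM. The file names are placeholders and the script is not one of the lab's tools; it only shows the basic comparison loop.

```python
# Minimal full-reference sketch: frame-level PSNR and SSIM between a reference
# and a distorted gaming clip. File names are placeholders, not lab assets.
import cv2
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def frame_quality(reference_path: str, distorted_path: str):
    """Yield (psnr, ssim) for each temporally aligned frame pair of two videos."""
    ref_cap = cv2.VideoCapture(reference_path)
    dis_cap = cv2.VideoCapture(distorted_path)
    while True:
        ok_r, ref = ref_cap.read()
        ok_d, dis = dis_cap.read()
        if not (ok_r and ok_d):
            break
        # Compare on the luma-like grayscale channel, as many quality metrics do.
        ref_y = cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY)
        dis_y = cv2.cvtColor(dis, cv2.COLOR_BGR2GRAY)
        yield (peak_signal_noise_ratio(ref_y, dis_y, data_range=255),
               structural_similarity(ref_y, dis_y, data_range=255))
    ref_cap.release()
    dis_cap.release()

if __name__ == "__main__":
    scores = list(frame_quality("reference.mp4", "distorted.mp4"))
    if scores:
        psnr, ssim = np.mean(scores, axis=0)
        print(f"mean PSNR: {psnr:.2f} dB, mean SSIM: {ssim:.3f}")
```

Learning-based metrics of the kind listed in the third topic typically replace the hand-crafted PSNR/SSIM computation with a trained network, but the frame-wise evaluation loop stays much the same.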

Current Project:

Adaptive Edge/Cloud Compute and Network Continuum over a Heterogeneous Sparse Edge Infrastructure to Support Nextgen Applications (ACCORDION)

Past Project:

Methods and Models for assessing and predicting the QoE linked to Mobile Gaming (QoE-NET/MSCA-ITN Network)


Biography

Saman Zadtootaghaj is a researcher at the Quality and Usability Lab at Technische Universität Berlin, working on modeling the gaming Quality of Experience under the supervision of Prof. Dr.-Ing. Sebastian Möller. His main interest is the subjective and objective quality assessment of computer-generated content. He received his bachelor's degree from IASBS and his master's degree in information technology from the University of Tehran.

He worked as a researcher at the Telekom Innovation Laboratories of Deutsche Telekom AG from 2016 to 2018 as part of the European project QoE-Net. He is currently the chair of the Computer-Generated Imagery (CGI) group at the Video Quality Experts Group (VQEG).

Roles: 

Chair of the Computer-Generated Imagery (CGI) group at VQEG

Local coordinator of the HCID track of the EIT master's program

Visiting Researcher:

MMSPG lab, EPFL (2017)

LST group, DFKI (2019)

Teaching experience:

Advanced Projects at the Quality and Usability Lab (Deep Learning for Video Quality Assessment and Enhancement), SS 2020

Usability Engineering exercise, SS 2017, SS 2018, SS 2019, SS 2020

Quality and Usability Seminar (Applied Statistics), WS 2019/2020

Quality and Usability Seminar (Gamification), SS 2018

Teaching assistant: Multiagent (University of Tehran, 2014), Computer Networks (IASBS, 2011).

Talks: 

VQEG meeting at Nokia, Madrid, Spain, March 2018

VQEG meeting at Google (remote), USA, November 2018

VQEG meeting at Deutsche Telekom, Germany, March 2019

VQEG meeting at Tencent, China, October 2019

VQEG meeting, online, March 2020

Involvement in Standardization Activities: 

Active in the following work items:

ITU-T P.BBQCG: Parametric bitstream-based Quality Assessment of Cloud Gaming Services

ITU-T G.CMVTQS: Computational model used as a QoE/QoS monitor to assess videotelephony services

ITU-T G.OMMOG: Opinion Model for Mobile Online Gaming applications

Contributed to the following recommendations:

ITU-T G.1032: Influence factors on gaming quality of experience  

ITU-T P.809: Subjective evaluation methods for gaming quality  

ITU-T G.1072: Opinion model predicting gaming quality of experience for cloud gaming services  

Reviewed papers for IEEE TCSVT, the Quality and User Experience journal, the Journal of Electronic Imaging, QoMEX 2017–2019, ICC 2019, ICME 2020, and the PQS workshop 2016.

 

Tools for Quality Prediction of Gaming Content:

NDNetGaming: Deep-learning-based quality metric for gaming content

GamingPara: Parametric video quality models for gaming

Implementation of ITU-T Recommendation G.1072 (an illustrative parametric sketch follows this list)
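
To make the idea of a parametric (planning) model concrete, here is a toy sketch that maps encoding bitrate, framerate, and end-to-end delay to a MOS-like score on the usual 1–5 scale. The functional form and every coefficient are invented for illustration; they are not the standardized coefficients of ITU-T Rec. G.1072 or the GamingPara models.

```python
# Illustrative parametric quality sketch (NOT the standardized ITU-T G.1072
# model): maps encoding bitrate, framerate and network delay to a MOS-like
# score on the usual 1-5 scale. All coefficients are made up for illustration.
import math

def mos_from_parameters(bitrate_kbps: float, framerate_fps: float, delay_ms: float) -> float:
    """Toy planning-model style estimate: video quality minus a delay penalty."""
    # Diminishing returns of bitrate, modeled with an exponential saturation.
    video_quality = 1.0 + 4.0 * (1.0 - math.exp(-bitrate_kbps / 2000.0))
    # Low framerates reduce temporal quality; 60 fps is treated as transparent.
    framerate_factor = min(framerate_fps, 60.0) / 60.0
    # Interactivity penalty that grows with round-trip delay.
    delay_penalty = 2.0 / (1.0 + math.exp(-(delay_ms - 100.0) / 30.0))
    mos = 1.0 + (video_quality - 1.0) * framerate_factor - delay_penalty
    return max(1.0, min(5.0, mos))

if __name__ == "__main__":
    print(round(mos_from_parameters(bitrate_kbps=6000, framerate_fps=60, delay_ms=40), 2))
    print(round(mos_from_parameters(bitrate_kbps=3000, framerate_fps=30, delay_ms=80), 2))
```

Real planning models of this kind are fitted to subjective test data and distinguish more influence factors (e.g., packet loss and game sensitivity classes), but they share this overall structure: a video-quality term degraded by network-related penalties and clamped to the MOS range.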

 

Datasets:

GamingVideoSET: https://kingston.box.com/v/GamingVideoSET

Cloud Gaming Video Dataset: https://github.com/stootaghaj/CGVDS

Image Gaming Quality Dataset: https://github.com/stootaghaj/GISET (a fetch sketch for the two GitHub-hosted datasets follows this list)
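
For convenience, a small fetch script for the two GitHub-hosted datasets is sketched below; the repository URLs are taken from the list above, while the local directory layout is an assumption. GamingVideoSET is hosted on Box and has to be downloaded manually.

```python
# Convenience sketch for cloning the GitHub-hosted datasets listed above.
# The local "datasets" directory layout is an assumption for illustration.
import subprocess
from pathlib import Path

DATASETS = {
    "CGVDS": "https://github.com/stootaghaj/CGVDS",
    "GISET": "https://github.com/stootaghaj/GISET",
}

def fetch(target_dir: str = "datasets") -> None:
    """Clone each dataset repository if it is not already present."""
    root = Path(target_dir)
    root.mkdir(parents=True, exist_ok=True)
    for name, url in DATASETS.items():
        dest = root / name
        if dest.exists():
            print(f"{name}: already present, skipping")
            continue
        subprocess.run(["git", "clone", "--depth", "1", url, str(dest)], check=True)

if __name__ == "__main__":
    fetch()
```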

 

Find me on ResearchGate, LinkedIn, Google Scholar, and GitHub.


Address
Quality and Usability Lab
Deutsche Telekom Laboratories
TU Berlin
Ernst-Reuter-Platz 7
D-10587 Berlin, Germany

Email:  
Tel:  +49 30 8353 58394

 

 

Publications:

Assessing Interactive Gaming Quality of Experience using a Crowdsourcing Approach
Citation key: schmidt2020a
Authors: Schmidt, Steven and Naderi, Babak and Sabet, Saeed Shafiee and Zadtootaghaj, Saman and Carsten and Möller, Sebastian
Booktitle: 2020 Twelfth International Conference on Quality of Multimedia Experience (QoMEX)
Pages: 1–6
Year: 2020
ISBN: 978-1-7281-5965-2
DOI: 10.1109/QoMEX48832.2020.9123122
Location: Athlone, Ireland
Month: May
Publisher: IEEE
Series: QoMEX ’20
How published: Full paper
Abstract: Traditionally, the Quality of Experience (QoE) is assessed in a controlled laboratory environment where participants give their opinion about the perceived quality of a stimulus on a standardized rating scale. Recently, the usage of crowdsourcing micro-task platforms for assessing the media quality is increasing. The crowdsourcing platforms provide access to a pool of geographically distributed, and demographically diverse group of workers who participate in the experiment in their own working environment and using their own hardware. The main challenge in crowdsourcing QoE tests is to control the effect of interfering influencing factors such as a user's environment and device on the subjective ratings. While in the past, the crowdsourcing approach was frequently used for speech and video quality assessment, research on a quality assessment for gaming services is rare. In this paper, we present a method to measure gaming QoE under typically considered system influence factors including delay, packet loss, and framerates as well as different game designs. The factors are artificially manipulated due to controlled changes in the implementation of games. The results of a total of five studies using a developed evaluation method based on a combination of the ITU-T Rec. P.809 on subjective evaluation methods for gaming quality and the ITU-T Rec. P.808 on subjective evaluation of speech quality with a crowdsourcing approach will be discussed. To evaluate the reliability and validity of results collected using this method, we finally compare subjective ratings regarding the effect of network delay on gaming QoE gathered from interactive crowdsourcing tests with those from equivalent laboratory experiments.
