Other Conference Papers
| Citation key | zequeirajimenez2018a |
|---|---|
| Author | Zequeira Jiménez, Rafael and Fernández Gallardo, Laura and Möller, Sebastian |
| Book title | 44. Deutsche Jahrestagung für Akustik (DAGA) |
| Pages | 303–306 |
| Year | 2018 |
| ISBN | 978-3-939296-13-3 |
| Month | March |
| Publisher | Deutsche Gesellschaft für Akustik DEGA e.V. |
| How published | Full paper |
| Abstract | The crowdsourcing (CS) paradigm offers small tasks to anonymous users on the Internet. Human-centered speech quality assessment studies have traditionally been conducted under controlled laboratory conditions. Nowadays, CS provides an exceptional opportunity to transfer such experiments to the Internet and reach a wider and more diverse audience. However, data from CS can be corrupted by users' neglect, and hence quality control mechanisms are required to ensure reliable outcomes. While previous works have employed trapping questions or majority voting to ensure good results, this work introduces user-environment noise recording to discard unreliable users located in noisy places. To this end, a speech quality assessment study was conducted on the clickworker CS platform. The speech stimuli are taken from database 501 of ITU-T Rec. P.863, and the results are contrasted with the existing lab ratings. This work analyzes whether environmental noise recording can be used to identify unreliable workers. Furthermore, the effects of discarding users deemed untrustworthy on the correlation between the CS and the lab results are studied. Our outcomes highlight the importance of controlling for users' background noise to ensure reliable results in speech quality assessments conducted via CS. |