
Neslihan Iskender

Research Group

Crowdsourcing and Open Data

Teaching

  • Study Project Quality & Usability (since SS 2018)
  • Interdisziplinäres Medienprojekt (Interdisciplinary Media Project; since SS 2018)
  • Usability Engineering (exercise, SS 2018)

Biography

Neslihan Iskender received her Bachelor of Science and Master of Science degrees in Industrial Engineering and Management from the Karlsruhe Institute of Technology. During her studies, she focused on the management of new technologies and innovation management. Since May 2017, she has been employed as a research assistant at the Quality and Usability Lab, where she is working towards a PhD in the field of crowdsourcing. Her research topics are:

  • Crowd assessments: usability, UX, QoE, quality
  • Real-time interaction, human computation as a service (HuaaS)
  • Hybrid workflows for micro-task crowdsourcing
  • Internal crowdsourcing

Current Projects

Past Projects

Contact

E-Mail: neslihan.iskender@tu-berlin.de

Phone: +49 (30) 8353-58347 

Fax: +49 (30) 8353-58409

Address

Quality and Usability Lab

Deutsche Telekom Laboratories

Technische Universität Berlin

Ernst-Reuter-Platz 7

D-10587 Berlin, Germany

Publications

Crowdsourcing versus the laboratory: towards crowd-based linguistic text quality assessment of query-based extractive summarization
Citation key iskender2020a
Author Iskender, Neslihan and Polzehl, Tim and Möller, Sebastian
Booktitle Proceedings of the Conference on Digital Curation Technologies (Qurator 2020)
Pages 1–16
Year 2020
Address Berlin, Germany
Month January
Note Online
Publisher CEUR
Series QURATOR
How published Full paper
Abstract Curating text manually in order to improve the quality of automatic natural language processing tools can become very time-consuming and expensive. Especially in the case of query-based extractive online forum summarization, curating complex information spread across multiple posts from multiple forum members to create a short meta-summary that answers a given query is a very challenging task. To overcome this challenge, we explore the applicability of microtask crowdsourcing as a fast and cheap alternative for query-based extractive text summarization of online forum discussions. We measure the linguistic quality of crowd-based forum summarizations, which is usually assessed in a traditional laboratory environment with the help of experts, via comparative crowdsourcing and laboratory experiments. To our knowledge, no other study has considered query-based extractive text summarization and summary quality evaluation as an application area of microtask crowdsourcing. By conducting experiments in both crowdsourcing and laboratory environments, and comparing the results of linguistic quality judgments, we found that microtask crowdsourcing shows high applicability for determining the factors overall quality, grammaticality, non-redundancy, referential clarity, focus, and structure & coherence. Further, our comparison of these findings with a preliminary set of expert annotations suggests that crowd assessments can reach results comparable to experts, specifically when determining factors such as overall quality and structure & coherence mean values. Eventually, preliminary analyses reveal a high correlation between the crowd and expert ratings when assessing low-quality summaries.

2018

Barz, Michael and Büyükdemircioglu, Neslihan and Prasad Surya, Rikhu and Polzehl, Tim and Sonntag, Daniel (2018). Device-Type Influence in Crowd-based Natural Language Translation Tasks. Proceedings of the 1st Workshop on Subjectivity, Ambiguity and Disagreement (SAD) in Crowdsourcing 2018, and the 1st Workshop CrowdBias'18: Disentangling the Relation Between Crowdsourcing and Bias Management, 93–97.
