
Investigating the Influence of Crowdworker Attitudes on Document Annotations

Talk given at the Second Symposium on Biases in Human Computation and Crowdsourcing (Cyprus, online).


  1. Investigating the Influence of Crowdworker Attitudes on Document Annotations. Tim Draws, Nava Tintarev, Ujwal Gadiraju; TU Delft, The Netherlands. t.a.draws@tudelft.nl | timdraws.net
  2. Biases in web search. My work: measuring and mitigating algorithmic and cognitive biases in the context of web search on debated topics. Needed: data sets of search results with viewpoint annotations. (Slide graphic: a search result list labeled with supporting "Yes!" and opposing "No!" viewpoints.) Problem: potential bias in the viewpoint annotations due to crowdworkers' personal attitudes.
  3. Biased annotations? Concern: a tendency to annotate in line with one's personal stance (confirmation bias; false consensus effect). (Slide graphic: example results labeled "Opposing!" and "Supporting!".)
  4. This work. RQ: do crowdworkers have a tendency to label in line with their personal attitude when annotating search results for viewpoints? We ran a crowdsourcing study to collect viewpoint annotations and analyzed the relationship between crowdworker attitudes and their annotations.
  5. Crowdsourcing viewpoints. • Retrieved search results from Bing for two debated topics: "Should zoos exist?" and "Are social networking sites good for our society?" • Top 50 results for 14 queries per topic. • Set up the task on Amazon Mechanical Turk. (A hedged retrieval sketch follows below the transcript.)
  6. Viewpoint labels. Example topic: "Should we all be vegan?" Seven-point scale: Extremely opposing (-3), Opposing (-2), Somewhat opposing (-1), Neutral (0), Somewhat supporting (+1), Supporting (+2), Extremely supporting (+3). (A minimal mapping sketch follows below the transcript.)
  7. Viewpoint annotation task. • Step 1: instructions; personal knowledge & attitude. • Step 2: annotate 14 search results on one topic and complete two attention checks.
  8. Results. Descriptive: • 717 search result items • 140 annotators. Spearman correlation analysis: • IV: crowdworker attitude [-3, 3] • DV: mean annotation [-3, 3] • ρ = 0.26, p = 0.003. (Slide figure: scatterplot of crowdworker attitude vs. mean annotation. An analysis sketch follows below the transcript.)
  9. A difference between topics? (Slide figure: per-topic scatterplots of crowdworker attitude vs. mean annotation.) Social media: ρ = 0.26, p = 0.025. Zoos: ρ = 0.27, p = 0.041. (A per-topic sketch follows below the transcript.)
  10. Mild vs. strong attitudes. • Divided crowdworkers into mild (middle of the attitude scale) and strong (ends of the scale) groups. • Mild attitudes: ρ = -0.03, p = 0.829. • Strong attitudes: ρ = 0.26, p = 0.035. (A subgroup sketch follows below the transcript.)
  11. Worker requirements too low? • First study's worker requirements: HIT approval rate > 95%; location: United States. • Second study with higher requirements: HIT approval rate > 98%; location: US, AUS, NZ, UK, GER, FIN, SWE, NO, CH, AUT; Master workers only. • Compared subsets of overlapping items (112), where the only difference was the requirements: subset of first study ρ = 0.22, p = 0.022 (n = 114); subset of second study ρ = 0.06, p = 0.77 (n = 25). (A comparison sketch follows below the transcript.)
  12. Discussion. • Crowdworker attitudes affected viewpoint annotations, irrespective of topic; workers with stronger opinions and lower qualifications seem more prone to bias. • Other considerations: influence of self-reported knowledge? Asking workers for sincerity? The type of document could also play a role (more ambiguous, more bias?). • Future work: what specifically causes the bias? Mitigation strategies.
  13. Take home and future work. Cognitive biases can affect (viewpoint) annotations by crowdworkers: be aware (test!); design the task to make crowdworkers aware of biases; if possible, remove ambiguous items. Material related to this research is available at https://osf.io/kbjgp/. t.a.draws@tudelft.nl | timdraws.net
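
For slide 5's retrieval step, here is a minimal sketch of how a top-50 crawl could be reproduced. The talk does not say which interface was used; this assumes the Bing Web Search API v7, and the endpoint, key, and example queries are placeholders rather than details from the study.

```python
import requests

# Assumption: Bing Web Search API v7; the key and queries are placeholders.
ENDPOINT = "https://api.bing.microsoft.com/v7.0/search"
API_KEY = "YOUR-SUBSCRIPTION-KEY"

def top_50_results(query: str) -> list:
    """Fetch the top 50 web results for a single query."""
    response = requests.get(
        ENDPOINT,
        headers={"Ocp-Apim-Subscription-Key": API_KEY},
        params={"q": query, "count": 50},  # the API caps count at 50
    )
    response.raise_for_status()
    return response.json()["webPages"]["value"]

# The study used 14 queries per topic; these two are illustrative only.
results = {q: top_50_results(q) for q in ["should zoos exist", "are zoos good for animals"]}
```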
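The seven-point scale on slide 6 maps labels directly to integer scores; a minimal rendering (the dictionary and function names are mine):

```python
# The seven-point viewpoint scale from slide 6, as label -> score.
VIEWPOINT_SCALE = {
    "Extremely opposing": -3,
    "Opposing": -2,
    "Somewhat opposing": -1,
    "Neutral": 0,
    "Somewhat supporting": 1,
    "Supporting": 2,
    "Extremely supporting": 3,
}

def score(label: str) -> int:
    """Map an annotator's label to its numeric viewpoint score."""
    return VIEWPOINT_SCALE[label]
```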
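Slide 8's analysis is a plain Spearman rank correlation between each worker's self-reported attitude and that worker's mean annotation. A runnable sketch with SciPy, using illustrative values rather than the study's data (which are linked from the OSF page):

```python
from scipy.stats import spearmanr

# Illustrative values only, one entry per worker; both variables live in [-3, 3].
attitudes = [-3, -2, -1, 0, 1, 2, 3, -1, 2]          # IV: self-reported attitude
mean_annotations = [-1.1, -0.6, 0.2, -0.1, 0.4, 0.8, 1.3, -0.2, 0.5]  # DV

rho, p = spearmanr(attitudes, mean_annotations)
print(f"rho = {rho:.2f}, p = {p:.3f}")  # the study reports rho = 0.26, p = 0.003
```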
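Slide 9 repeats the same correlation within each topic. A sketch with pandas; the table layout, column names, and values are assumptions for illustration:

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical per-worker table; values are illustrative only.
workers = pd.DataFrame({
    "topic": ["zoos"] * 4 + ["social media"] * 4,
    "attitude": [-2, 0, 1, 3, -3, -1, 2, 3],
    "mean_annotation": [-0.4, 0.1, 0.3, 1.0, -0.9, -0.2, 0.6, 1.1],
})

# One Spearman test per topic, mirroring the per-panel results on the slide.
for topic, group in workers.groupby("topic"):
    rho, p = spearmanr(group["attitude"], group["mean_annotation"])
    print(f"{topic}: rho = {rho:.2f}, p = {p:.3f}")
```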
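Slide 10 splits workers by attitude strength before correlating. A sketch assuming the cutoff suggested by the scale graphic (|attitude| ≤ 1 is "mild", |attitude| ≥ 2 is "strong"); the worker records are illustrative:

```python
from scipy.stats import spearmanr

# Illustrative workers; "attitude" is self-reported, in [-3, 3].
workers = [
    {"attitude": -3, "mean_annotation": -1.0},
    {"attitude": -1, "mean_annotation": 0.2},
    {"attitude": 0, "mean_annotation": -0.1},
    {"attitude": 1, "mean_annotation": 0.0},
    {"attitude": 2, "mean_annotation": 0.7},
    {"attitude": 3, "mean_annotation": 1.2},
]

# Assumed cutoff: middle of the scale is mild, the extremes are strong.
mild = [w for w in workers if abs(w["attitude"]) <= 1]
strong = [w for w in workers if abs(w["attitude"]) >= 2]

for name, group in [("mild", mild), ("strong", strong)]:
    rho, p = spearmanr(
        [w["attitude"] for w in group],
        [w["mean_annotation"] for w in group],
    )
    print(f"{name}: rho = {rho:.2f}, p = {p:.3f}")
```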
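Slide 11's comparison restricts each study to the 112 items both studies annotated and re-runs the per-study correlation. A sketch of that restriction; the record layout ('worker', 'attitude', 'item', 'label') is an assumption, not the study's actual format:

```python
from collections import defaultdict
from scipy.stats import spearmanr

def rho_on_shared_items(annotations, shared_item_ids):
    """Spearman correlation between worker attitude and mean annotation,
    computed only over items that both studies have in common.

    `annotations` is a hypothetical list of per-label records with keys
    'worker', 'attitude', 'item', and 'label'.
    """
    labels_by_worker = defaultdict(list)
    attitude_by_worker = {}
    for a in annotations:
        if a["item"] in shared_item_ids:
            labels_by_worker[a["worker"]].append(a["label"])
            attitude_by_worker[a["worker"]] = a["attitude"]
    ordered = sorted(labels_by_worker)
    attitudes = [attitude_by_worker[w] for w in ordered]
    means = [sum(labels_by_worker[w]) / len(labels_by_worker[w]) for w in ordered]
    return spearmanr(attitudes, means)  # run once per study, then compare
```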
