Paper published in an international journal
Authors: Yoshimasa Majima, Kaoru Nishiyama, Aki Nishihara, Ryosuke Hata
Title: Conducting Online Behavioral Research Using Crowdsourcing Services in Japan
Journal (bibliographic information): Frontiers in Psychology, 8:378, 2017.
doi: 10.3389/fpsyg.2017.00378
論文URL: http://journal.frontiersin.org/article/10.3389/fpsyg.2017.00378/full
Abstract:
Recent research on human behavior has often collected empirical data from the
online labor market through a process known as crowdsourcing. In addition to the
United States and the major European countries, several crowdsourcing services
operate in Japan. For research purposes, Amazon's Mechanical Turk (MTurk) is the
most widely used of these platforms. Previous validation studies have shown
many commonalities between MTurk workers and participants from traditional samples
based on not only personality but also performance on reasoning tasks. The present
study aims to extend these findings to non-MTurk (i.e., Japanese) crowdsourcing
samples in which workers have different ethnic backgrounds from those of MTurk.
We conducted three surveys (N = 426, 453, 167, respectively) designed to compare
Japanese crowdsourcing workers and university students in terms of their
demographics, personality traits, reasoning skills, and attention to instructions.
The results generally align with previous studies and suggest that non-MTurk
participants are also eligible for behavioral research. Furthermore, small-screen
devices were found to impair participants' attention to instructions. Several
recommendations concerning this sample are presented.
Author contact email:
majima.y[at]hokusei.ac.jp