Data Collection During the COVID-19 Pandemic
Collecting Spoken Responses Using a Self-Developed Website
Shi Chen, Ph.D.
Institutional Research, Assessment & Policy Studies
University of California, Santa Cruz
Collecting data during the pandemic has posed significant challenges for researchers whose work would normally involve meeting with study participants in person. When the pandemic began, I was planning to abandon my original dissertation idea, a web-based pragmatic speaking test for English language learners with an online role-playing component, and to conduct in-person research instead. I had encountered several technical difficulties with the web-based design, but the constraints of the COVID-19 pandemic made those difficulties seem surmountable after all. Indeed, I found that by working with a web developer I could create a website to host the online second language pragmatic speaking test. I achieved the original goal of web-based testing: participants were able to speak into a microphone and take the test wherever they had internet access, except in a few countries with internet restrictions.
The dissertation instrument piloting process was exciting and, at the same time, nerve-wracking. Although I had run the speaking test many times on my own computer with multiple web browsers, I was still nervous about whether it would work well on participants' computers. I piloted the test with about 40 undergraduate students enrolled in a large linguistics course, then began data collection. Participants received the instructions by e-mail and completed the consent form on Qualtrics. After signing the IRB form, participants could go to the website, register an account, and take the speaking test online. Their audio recordings were automatically uploaded and stored in Amazon Web Services (AWS). Students who completed all three sections of the test were compensated with an Amazon gift card. The speaking test took each student approximately an hour, and they were instructed to take it in one sitting. Much to my relief, the test-taking process was generally smooth, especially after revisions based on feedback from the pilot study, although technical difficulties prevented a few students' tests from being recorded. I was able to reach out to those participants and ask them to retake the speaking test. What I learned from remediating that problem was the importance of having students test their microphones before proceeding with the test.
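For readers building a similar test site, a pre-test microphone check can be wired into the start page so that recording failures surface before the first task rather than after. The sketch below is a minimal, hypothetical illustration using the browser MediaDevices API, not the author's actual code; the function names are my own.

```javascript
// A minimal sketch of a pre-test microphone check for a browser-based
// speaking test (hypothetical; assumes the MediaDevices API).
// makeMicCheck takes a navigator-like object so the logic is testable
// outside a browser; in a real page you would pass window.navigator.
function makeMicCheck(nav) {
  // Returns an async function that resolves to true if a microphone
  // stream can be opened, and false otherwise (API unavailable,
  // permission denied, or no input device found).
  return async function micCheck() {
    if (!nav || !nav.mediaDevices || !nav.mediaDevices.getUserMedia) {
      return false; // browser does not expose the MediaDevices API
    }
    try {
      const stream = await nav.mediaDevices.getUserMedia({ audio: true });
      stream.getTracks().forEach((track) => track.stop()); // release the device
      return true;
    } catch (err) {
      return false; // user denied permission or no microphone was found
    }
  };
}
```

On the start page, the test could refuse to advance until `makeMicCheck(window.navigator)` resolves to true, with an on-screen prompt telling the participant to plug in or enable a microphone.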
I observed several pros and cons of online data collection in the process. The greatest advantage of collecting data online is that, without geographical constraints, I could potentially reach a larger audience. Recruitment can leverage online communities such as social media groups and email listservs, and instructors teaching English around the world could potentially administer the test I created. Another advantage is that the format is completely flexible with respect to time: students can take the test at any hour, in any time zone. It is also far less time-consuming for the researcher to collect responses online than to meet with every participant individually, and, as my study demonstrated, it is possible to record participants' voices online and thus retain some of the advantages of spoken data. The main disadvantage of online data collection is the loss of face-to-face contact with students. Although video-conferencing tools such as Zoom have made it easier to connect with people around the world, it is still easier to persuade participants to complete a study in person. My experience suggests that email reminders are not as effective as in-person recruitment: had I been able to visit every classroom with potential participants, describe my study, and answer questions about it, I suspect my participation and test completion rates would have been higher.
My experience also yielded some useful insights for collecting data online that I hope may serve other researchers. To begin with, it is crucial to describe the online data collection mode in the project narrative of your IRB application and to explain how students will sign the IRB consent form. I found the Qualtrics platform effective because test takers can sign the consent form with an electronic signature. Additionally, if you will be providing compensation, be aware that students in other countries might not be able to use U.S.-based services, such as U.S. Amazon gift cards; since participants in Canada might not be able to use them, I had to find another type of gift card (e.g., a Starbucks gift card) to compensate them for their time. Some instructors also have preferences about how to encourage students to participate in a research study; some, for example, preferred to give participants extra credit. I had to revise my IRB narrative and consent form multiple times to reflect these different compensation methods, and all of these points need to be reflected in the IRB application and its amendments.
In an online format, it is vital to ensure that the test-taking instructions are concise and clear: you will not have the same opportunity to field questions that an in-person format allows, and some participants may not be fluent in the language in which you deliver the test. Even fluent participants may be intimidated by instructions that are not written in simple language. If possible, provide the instructions in multiple languages, such as translations into participants' native languages; what I did was ask an instructor to answer, in the students' native language, any questions they had about the test. Finally, I suggest testing the data collection procedure yourself many times before asking others to use it. I tested my website repeatedly and found room for improvement on almost every attempt, such as adjusting the font, fixing typos, and making sure the response time allowed was appropriate. Putting yourself in the participants' shoes, so that you understand the issues they will come across while participating in your study, is a good way to make sure the process runs smoothly.
Shi Chen, Ph.D., is, as of Fall 2021, an Assessment Analyst at the University of California, Santa Cruz. She received her Ph.D. in Applied Linguistics, specializing in language testing and assessment, from Northern Arizona University. During her Ph.D. studies, she received multiple awards to support her dissertation project, including the ETS TOEFL Small Grant for Doctoral Research in Second or Foreign Language Assessment, the Duolingo English Test's 2020 Doctoral Award, and the Northern Arizona University Support for Graduate Students Award.
Chen, S. (2021, September 28). Data Collection During the COVID-19 Pandemic: Collecting Spoken Responses Using a Self-Developed Website. AAAL Graduate Student Council Blog. https://www.aaal-gsc.org/post/data-collection-during-the-covid-19-pandemic