Are we getting what we are after? Our validation process

Multiple revisions and notes took place both digitally and on paper

Preparation of the Teaching Literacies with Technology Survey was a lengthy process. From collaborating with peers to develop the content, to validating with experts in the field, to taking the feedback and rebuilding the survey, the process took almost eight months before we were ready to launch. I joined the Digitally Literate Research Team in October 2015, in the midst of the final pre-validation revisions to the survey and the preparation of applications for each institution’s research ethics board.

The initial survey was put together by Michelle Schira Hagerman, Ian W. O’Byrne, Denise Johnson, and Greg McVerry. Michelle, Ian, and I (Heather) then combed through this initial version. We wanted to make sure we covered basic demographic information, facets of technology use within and outside of work, and educators’ own digital literacy knowledge and skills.

Discussions among the research team at this stage focused on whether we were truly capturing educators’ digital literacy skills and teaching practices, the wording and tone of questions, and the repetitiveness of some questions. Here we made some minor adjustments and opted to put the demographics at the end of the survey. This decision was made to ensure that, should participants only fill out a portion of the survey, we would still have usable data about their teaching practices. However, as I discuss below, we revisited this decision during validation.

Once we were comfortable with the survey, we invited close to 40 practitioners and scholars to provide feedback on it; we received 31 responses. Validators provided feedback at the end of each page of the survey. From this feedback, I was tasked with summarizing the comments and formulating a plan of attack to improve the survey. We then combed through the survey, page by page, question by question.

Informed Consent

The feedback we received on the informed consent letter questioned its length. We were unable to change much of this content, however, as it contains all of the information that our respective Institutional Review Board (IRB) and Research Ethics Board (REB) required. As suggested, we did add clarifying language regarding the target audience for the survey: “You are invited to participate in a research study because you are a PK-16 educator who may use instructional technologies as part of literacy instruction”. We also added headings to help direct readers through the letter in a more systematic way. Finally, one of the most common points of feedback was that the survey would not take 60 minutes, as we had stated in the first informed consent letter. Validators reported that it would probably take about 30 minutes, and that listing 60 minutes might turn educators away from participating in the study.

Demographics

Despite our thoughts about moving the demographics to the end of the survey, many of the validators questioned this decision and felt it would be best to place them at the start. After discussion, we agreed, as this would allow us to systematically check that the participants who completed the survey were not characteristically different from those who did not fully complete it. This, in turn, would add to the reliability of our anticipated analyses.
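
Placing the demographics first also means that partial responses carry enough information for this kind of check. As a rough illustration only (not part of the survey or our actual analysis pipeline, and using a hypothetical export file and column names), such a comparison might look like this:

```python
# Illustrative sketch: compare completers with partial responders on one
# demographic item. The file name and the columns "completed" and
# "institution_type" are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

responses = pd.read_csv("survey_responses.csv")  # hypothetical export
table = pd.crosstab(responses["completed"], responses["institution_type"])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
# A non-significant p-value would suggest no evidence that completers differ
# from partial responders on this characteristic.
```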

It was also suggested that we add questions about teaching experience and the grades being taught. As this is a global survey, we opted to ask what age group educators were teaching, so as to avoid confusion between grade systems in different countries. Based on the feedback, we also added questions about where educators were from and where they were teaching, as well as the languages they spoke and taught in.

Again, we clarified some of the language. For example, the response options for the type of institution in which participants were teaching were expanded from public and private to: Publicly funded by taxpayers; Private institution, not funded by government; and Other: [please specify]. We also added a description to help prompt respondents for the question “Please describe your current role”.

Main Content

The main content of the survey underwent a fairly major overhaul after the validation feedback came in. One of the major adjustments was the addition of questions specific to digital literacy. Some validators felt that we had not adequately captured educators’ digital literacy skills or how they implemented digital literacy education within the classroom, and we had to agree: there was a heavy focus on how educators used technology, but not enough on digital literacy itself. So we added questions regarding the theoretical approaches that educators may use to teach digital literacy and integrate technology. We also added survey items regarding how educators encourage their students to Use, Participate, and Create in a digital world.

We initially had the question:

How, if at all, does the use of instructional technology positively impact students’ non-academic outcomes (e.g., social-emotional, behavioral, motivational)? (Select all that apply.)

  • The physical safety of students has improved.
  • Students are more actively engaged in class.
  • There are fewer classroom issues.
  • Improved classroom climate.
  • Increased time on task.
  • Students’ interactions with each other are more positive.
  • Other (please specify)

This question was altered to remove its positive framing, so that we could assess whether educators thought instructional technology impacted these factors in a positive or a negative way. Each option is now rated on a Likert scale from negative to positive impact on student outcomes.
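
To give a sense of how the revised item works for analysis, here is a small, purely illustrative sketch of how the ratings might be coded numerically; the labels and scores shown are hypothetical, not the exact wording used in the survey:

```python
# Illustrative only: the exact Likert labels and outcomes in the survey may differ.
LIKERT_CODES = {
    "Strong negative impact": -2,
    "Slight negative impact": -1,
    "No impact": 0,
    "Slight positive impact": 1,
    "Strong positive impact": 2,
}

# One hypothetical respondent's ratings for two of the listed outcomes
ratings = {
    "Students are more actively engaged in class": "Slight positive impact",
    "Classroom climate": "No impact",
}

scores = {outcome: LIKERT_CODES[label] for outcome, label in ratings.items()}
print(scores)
# {'Students are more actively engaged in class': 1, 'Classroom climate': 0}
```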

Additionally, we added examples throughout to aid clarification. We also elaborated on, and generalized, the applications that educators may be using within their daily personal and professional lives, opting to move away from name brands where alternatives were available. For example, we had originally included Microsoft Word as an option, but generalized this to word processing applications so as to include Google Docs, OpenOffice, and other alternatives.

The biggest comment about the main content of the survey concerned its organization; specifically, there was a lack of flow. The reviewers felt that similar questions were scattered throughout, making it difficult to tell which topics were being explored. Once we had reviewed and integrated all of the feedback, the survey was printed so that we could view it as a whole. This allowed me to visually group similar questions. Once this stage was done, Michelle and I reviewed the survey to ensure that the order of items was logical.

Overall, we aimed to assess which technologies educators were using in the classroom and at home, how they were using them, their digital literacy skills, and how they were fostering digital literacies in their students. We also wanted to include a short section on the potential barriers educators were facing in implementing technologies and digital literacy skills in the classroom. From the survey responses we can not only gain a better understanding of how educators from around the world teach digital literacy, but also select participants for the second stage of the research, the in-depth interviews, where we will be able to further explore how educators are teaching digital literacies.