
DESCRIPTION OF THE RESEARCH PROJECT
Contribution to the Development, Testing, and Improvement of Practices
Nowadays, anyone with a smartphone and an internet connection can publish whatever they wish, rapidly
and widely. It is therefore no wonder that youth struggle to know which information to
trust. Knowing how to evaluate online information plays a key role in the academic success of
students across multiple subject areas and disciplines, particularly as they advance through their
secondary and post-secondary education. Whether it be a high school science project debating the
safety of vaccinations or a doctoral history dissertation involving the analysis of digital archives to
corroborate accounts, evaluating online information is crucial to students’ academic performance.
Indeed, research has demonstrated that information evaluation and academic performance are
closely intertwined. For example, evaluation processes are implicit in reading to learn,
discriminating reliable from unreliable evidence, and communicating the claim-plus-evidence structure
in argumentative writing (Wiley et al., 2009). Furthermore, these evaluation processes empower
students to make important decisions regarding their political, social, and economic lives. Without
evaluation skills, students are forced to rely on the judgements of others, who may have agendas at
odds with the students’ own values or best scientific evidence. Indeed, a lack of ability to evaluate
credibility allows disinformation to spread, jeopardizing democracy (Wineburg et al., 2016).
Despite its importance, the ability to evaluate the credibility of online information remains a challenge
for students (Corrigan, 2019; Kiili et al., 2018). For example, many students have misconceptions
regarding how to evaluate online information, relying instead on superficial strategies that do little to
differentiate between information and disinformation (Corrigan, 2019; Forzani, 2016). Furthermore, in
one study of 1,434 seventh graders in the United States, the ability to critically evaluate online
information was shown to be the most difficult online research skill among locating, synthesizing,
communicating, and evaluating (Forzani, 2016). Similarly, a 2018-2019 study of 3,446 American high
school students conducted by the Stanford History Education Group unveiled a number of troubling
findings regarding how youth evaluate online information. Students across the educational spectrum
from middle school to college struggled to perform “even the most basic evaluations of digital
material.” For example, two thirds of students could not tell the difference between news stories and
ads (set off by the words “Sponsored Content”) and “[n]inety-six percent of students did not consider
why ties between a climate change website and the fossil fuel industry might lessen that website’s
credibility” (Stanford History Education Group, 2019, p. 3). While there is limited research on the
Quebec context, one province-wide study of first year undergraduate students entering Quebec
universities found that students were “ill-equipped” to evaluate online information, finding that only
23% of students were able to identify credibility indicators for online information such as currency,
authority, and site sponsorship (Mittermeyer, 2005).
In light of the need to strengthen students’ ability to critically evaluate online information, the proposed
project will contribute to practice through the development of a series of instructional interventions
designed to help secondary school students better develop the 21st century competency of critical
thinking. More specifically, the study aims to improve students’ critical thinking with regard to
evaluating the credibility of online information. These interventions will be developed through a
collaboration among researchers and teachers at our participating schools. To determine whether the
interventions improve practice, our study will begin by exploring current instructional approaches
through classroom observation and teacher interviews. Following this, researchers and teachers will
collaborate on the design and implementation of the interventions, considering both current practice
and recent empirical and theoretical research. To test whether these interventions indeed improve
practice, we will compare students’ performance over time. We have strategically chosen to focus the
scope of this project on critical thinking in the context of evaluating online information to achieve
robust and actionable results regarding this complex cognitive task.
Originality and Contribution to the Advancement of Knowledge
There is a small, but growing body of empirical literature examining the effects of teaching students
how to evaluate online information credibility. A number of intervention studies have shown that
online evaluation skills are amenable to instruction (Braasch et al., 2012; McGrew, 2020; Pérez et al.,
2018; Zhang & Duke, 2011). While tools to evaluate online information currently exist such as the
CRAAP (Currency, Relevance, Authority, Accuracy, Purpose) test (California State University,
2013), checklists and similar tools to these “may encourage superficial reading by prompting readers to
gather lists of disconnected information rather than evaluating comprehensively and in relation to a
topic” (Forzani, 2019, p. 4). Therefore, rather than offering students such a checklist or test, our goal is
to create a series of interventions that enable students to comprehensively and critically evaluate online
information in a way that is responsive and contextualized to the student’s online inquiry task. By
being responsive to context, we mean that any given source of online information needs to be evaluated
considering the context of the inquiry. For example, while a Facebook mom’s group would not be a
credible source of information for discerning the safety of vaccinations, it would be a credible source
for finding out tips on how to help a scared child who is getting a vaccination. While most online
evaluation interventions and checklists are prescriptive in nature (claiming that government websites
are more credible than personal blogs, for example; Pérez et al., 2018), ours will be descriptive. This
means that rather than prescribing to students a set of rules for evaluating online information, we
describe how and why we might evaluate information in relation to the context of the inquiry. This
approach will empower students to recognize the purpose of their online inquiry and determine which
sources are most appropriate and credible, in relation to their context. Thus, our intervention will
emphasize that credibility is context dependent, which is an atypical position in online inquiry (cf.
Pérez et al., 2018) and thus makes our study a unique contribution to the advancement of knowledge.
Secondly, we are not aware of any intervention studies to date that ask students to consider their
confirmation bias (Lord et al., 1979) at the outset of the online inquiry process. As we will further
explain in the theoretical framework, a student’s pre-existing beliefs are likely to bias a student during
the credibility evaluation process. Thus, it is our intention to incorporate in the intervention teaching
around flexible thinking (Barzilai & Zohar, 2012) so that students are better able to acknowledge their
biases and follow the path of best evidence.
Thirdly, many of the pre-existing intervention studies provide scaffolding to students during the
post-test phase. For example, Walraven et al. (2013, p. 144) designed an intervention study using a post-test
wherein students are prompted (e.g., “Is this information up to date?”) to consider the currency of an
online source, alongside a variety of other prompts regarding credibility and relevancy indicators (e.g.,
type of URL, author expertise, rank in search results). In another example, Pérez et al. (2018) asked
students to rate the credibility of a number of online sources across three pre-determined criteria (i.e.,
author position, author motivation, and media quality) and then to justify their ratings. By doing so, the
post-tests used in the Pérez and Walraven studies signal to students the criteria to use when evaluating
online information instead of having the students determine these criteria on their own. While we will
design scaffolding for students during the intervention phase, our goal is to have students
independently evaluate the credibility of online information by the end of the intervention. We aim to
prepare students to evaluate information beyond this study in real world contexts where such
scaffolding will not exist. To accomplish this goal, we plan to use a gradual release of responsibility
model (Fisher & Frey, 2014). In this model, the responsibility for learning moves increasingly away
from the teacher and towards the student, such as by moving from focussed instruction, to guided
instruction, then to collaborative learning, and finally, to independent learning.
Fourthly, we are not aware of any intervention studies done in the Quebec context. While websites
such as digitalcitizenshipquebec.ca (developed by teachers, education consultants, and librarians from
the English education community of Quebec) feature a number of resources to aid in the teaching of
online evaluation skills, these have not been empirically validated in the Quebec context. Our study
will investigate the effectiveness of an intervention designed by Quebec teachers for Quebec students
across demographic variables including gender (male, female, non-binary/two-spirit), socio-economic
status (SES), first-language background (Allophone, Anglophone, Francophone), and language
proficiency (grades in first and second language classes at the end of Secondary II). Equipped with this
knowledge regarding how the interventions differentially impact segments of the population, we will be
poised to adapt the interventions to accommodate learners’ individual differences and promote learning
for all. Therefore, our study is original and will make important contributions to the advancement of
knowledge by creating a series of interventions that are comprehensive and contextualized (i.e.,
descriptive not prescriptive); consider confirmation bias during the inquiry process (i.e., by teaching
students to think flexibly); prepare students to evaluate online information independently (i.e., without
scaffolding during the post-test phase); and, are situated in the Quebec context (i.e., responsive to
Quebec demographics including language, SES, and gender).
Clarity of the Problem, Appropriateness of the Theoretical Approach, and Precision of the
Objectives
To summarize the problem, a number of studies have revealed that students lack the ability to evaluate
online information, even at a basic level (Forzani, 2016; Mittermeyer, 2005; Stanford History Education
Group, 2019). Unfortunately, even Quebec students are not immune to this problem, with Mittermeyer
(2005) reporting that a majority of undergraduate students are not aware of which criteria are pertinent to
evaluating online information. Concerning the research regarding intervention studies focused on online
evaluation, these studies have a number of shortcomings, including interventions that are systematic but
lacking in comprehensiveness; a lack of consideration for confirmation bias during the inquiry phase; a reliance
on scaffolding students’ evaluation processes instead of promoting independent critical thinking; and, a lack
of consideration for the Québecois demographic context.
Our theoretical approach will be guided by perspectives on credibility evaluation. We define information
credibility as information accuracy (Kiili et al., 2008) or the “believability” of information (Hovland et al.,
1953). We situate evaluation within the new literacies of online research and comprehension (Leu et al.,
2013). This theory views online research as a reading comprehension skill that involves both traditional,
offline reading skills as well as additional, online reading skills (see, for example, Cho, 2013; Coiro &
Dobler, 2007). It further views online reading as an inquiry process of defining questions (Leu et al., 2004)
and locating (Bilal, 2000), synthesizing (Goldman et al., 2005), evaluating (Sanchez et al., 2006), and
communicating information (Greenhow et al., 2009).
In particular, we reference the three-tiered framework for evaluating relevancy and credibility during online
inquiry (Corrigan & Forzani, 2019; Forzani, 2019). Briefly, this framework considers the multifaceted and
situated nature of online evaluation, including evaluating content (e.g., accuracy of ideas, strength of
argumentation, triangulation of evidence); source (e.g., trustworthiness of the author and/or publisher); and,
context (e.g., URL, currency [publication date], genre [e.g., blog], endorsements [e.g., advertisers and
sponsors]). However, these skills alone are insufficient for evaluating the credibility of online information.
They must be supported by metacognitive practices that enable students to critically evaluate information,
despite the well-documented phenomenon of my-side bias (Perkins et al., 1991) or confirmation bias (Lord
et al., 1979). In other words, people naturally gravitate towards information that supports their entrenched
beliefs and values. Thus, no amount of instruction will be productive if students do not concomitantly
develop critical habits of mind (Forzani, 2018), including flexible thinking (i.e., the ability to change one’s
mind, when confronted with better evidence; Barzilai & Zohar, 2012) and assuming a proactive, critical
stance towards information (Corrigan, 2019).
The objective of the proposed project is to respond to needs expressed in the call of the Action-
Research Program on Digital Technology in Education and Higher Education in the following three
ways. First, this project responds to Need 1, which concerns the identification of teaching approaches
or practices that contribute to the development of 21st century competencies. As the development of
21st century competencies is quite broad, we have limited the scope of our research, in consideration of
our timeline and budget, to investigate the effect of interventions aimed at teaching students critical
thinking in the context of online evaluation. Second, this project aims to build partnerships among
stakeholders in the educational system, thereby increasing the relevance of research in the schools and
facilitating the uptake of evidence-based research. Finally, this project responds to the call for
gender-based analysis by examining how the intervention differentially impacts segments of the population,
notably across the variable of gender. In order to accomplish these objectives, this project adopts a
collaborative action research approach between researchers and practitioners to develop, and
subsequently validate, classroom interventions that will enable students to evaluate the credibility of
online information more effectively. By building partnerships among stakeholders in the educational
system, the project will be more responsive to the needs of teachers and students, while facilitating the
uptake of evidence-based research in secondary schools.
Appropriateness, Rigour and Justification of the Methodological Approach, and Realistic
Timetable
Reflecting the objectives set forth in the call, the proposed study adopts a collaborative action research
approach. The mixed methods study will be carried out in three phases, each with distinct research
questions, objectives, and research activities.
Participants: Our study will target Secondary 3 students and their teachers, based on the feedback of
our practitioner co-researchers, who pointed out that teachers and students at this grade level face fewer
demands in preparing for Ministry testing, the results of which can have a significant impact on
students’ admission to Cégep and other post-secondary programs. Furthermore, research in
developmental psychology has demonstrated that during early adolescence, youth develop the ability to
think in more critical and abstract ways (American Psychological Association, 2002), which undergirds
their ability to evaluate the credibility of information. Therefore, all three phases of the research project
will be carried out with Secondary 3 students and their teachers.
The study will involve three schools recruited via our practitioner co-applicants. The first school will
be Collège Saint-Paul, a French-language private school where our practitioner co-applicant Andrea
Barrios serves as the Department Chair of English, Spanish, and Physical Education. At this school,
we will administer the intervention with English as a second language
students. The second and third schools will be recruited from two English language, public schools in
the English Montreal School Board where our practitioner co-applicant Caroline Dupuis serves as a
Pedagogical Consultant with Réseau Education Collaboration Innovation Technologie (RÉCIT). We
aim to recruit two teachers from each of these three schools for a total of six teachers.
Research Design: This study will use a multi-phase, participatory action research design. The
following table summarizes the research questions and methods for each phase.
Table 1. Research Design

Phase 1: Exploratory
Research Question 1: What approaches and practices are Quebec secondary teachers currently using to teach students critical thinking in the context of online evaluation? What are students’ perceptions of these practices and processes?
Participants and methods:
• Interviews with teachers: in-depth, semi-structured interviews (n = 6 teachers)
• Classroom observation: three observation sessions of approximately one hour with each participating teacher (n = 18 hours of observation)
• Focus groups with students: one focus group per participating class, for a total of 6 focus groups with approximately 7 students each (n = 42 students)
• Diagnostic assessment of students’ online evaluation skills: students in participating teachers’ classes (n = 150)

Phase 2: Intervention
Research Question 2: How do a series of interventions—informed by research regarding the Quebec context (research question 1) and by the three-tiered framework for evaluating relevancy and credibility during online inquiry—affect students’ proficiency in evaluating online source credibility?
Research Question 3: How do these interventions differentially affect students’ proficiency across the variables of gender (male, female, non-binary/two-spirit); socio-economic status (SES); first-language background (Allophone, Anglophone, Francophone); and language proficiency (grades in first and second language classes at the end of Secondary II)?
Participants and methods:
• Assessment of each student at the beginning and end of the intervention: three classes at each of the three participating schools (n = 150 students)
• Demographic questionnaire for participating students
• Interviews with participating teachers during the intervention
• Post-class reports from teachers after each intervention
• Focus groups with students about the intervention (one focus group from each participating class)
• Artefacts created by students during the intervention (screen capture, audio recordings, written products)

Phase 3: Knowledge Dissemination and Transfer
Research Question 4: How can the instructional materials be adapted for use by Secondary III teachers?
Participants and methods:
• Professional development sessions with teachers to revise and adapt materials based on the findings of Phase 2
• Dissemination of materials to professional organizations, at pedagogical days, and through a website
Phase 1 is the Exploratory Phase during which time our researcher and practitioner co-applicants will
work closely with teachers at our partner schools to explore what approaches and practices Quebec
secondary teachers are currently using to teach students critical thinking in the context of online
evaluation. We will also explore students’ perceptions about these approaches and practices, in
addition to their ability to evaluate online information. While teachers may be familiar with the Digital
Competency Framework and the 21st century competencies defined within this document, there will
likely be variation in how teachers interpret these competencies and address them through their
pedagogical practices, particularly because it is not mandatory that these competencies be evaluated
(i.e., teachers are only responsible for evaluating the competencies in the Quebec Education
Program).
We will begin this phase by inviting the teachers to participate in semi-structured interviews. The
purpose of these interviews is to generate “thick descriptions” (Geertz, 1973) about how teachers
approach the teaching of critical thinking in the context of online source evaluation. Moreover, we
hope to uncover more detail regarding the barriers and facilitators teachers experience with regard to
the pedagogical practices, approaches, and tools they use to teach students how to evaluate online
information. Next, students in the classes of participating teachers will be invited to participate in
focus groups. During the focus groups, we will probe students’ knowledge of critical thinking in the
context of online evaluation. We will also explore students’ perceptions about how this competency is
taught. Last, the teachers will administer a diagnostic assessment tool of the students’ ability to
evaluate online information, which will establish a baseline against which the performance of students
in subsequent phases will be compared.
The diagnostic tool will be an adapted version of the Online Research Comprehension Assessment
(ORCA), which has been validated for use with secondary students (Leu et al., 2012). During the
ORCA, students are presented with an online inquiry challenge (e.g., Does playing video games harm
eye health?) in a Facebook-like platform via avatars named Brianna and Jordan who are introduced as
students from a neighbouring school. ORCA tasks are centred around the inquiry challenge and require
students to locate, evaluate, synthesize, and communicate online information. We operationalize online
evaluation ability as students’ scores on the Evaluate dimension of the ORCA. An overview of the
ORCA, including video capture of some of its main features, is available online.
Phase 1 will conclude with the triangulation of data from teacher interviews, student focus groups,
classroom observations, and the diagnostic tool. With this data, we will be able to draw inferences
regarding the efficacy of the current approaches being used to teach online evaluation, students’
perceptions of these approaches, and students’ actual online evaluation performance. This data will
also enable us to design and tailor instructional targets in the subsequent phase to meet the specific
needs of participating teachers and their students.
After identifying instructional targets, in Phase 2 we will turn to the series of interventions, beginning
with the development, pilot-testing, and revision of instructional materials to help students
improve their critical thinking in the context of online evaluation. Working collaboratively, researchers
and practitioners will design instructional activities that target the components in the three-tiered
evaluation model proposed by Corrigan and Forzani (2019): content, source, and context. The
participating teachers will be invited to the Centre for the Study of Learning and Performance at
Concordia University to co-design the materials with the research team. Due to the differing
demographic characteristics at the participating schools, each school may require tailor-made
instructional materials. For example, ESL students in one school may require interventions to promote
content evaluation, whereas students at another school may require additional instruction in context
evaluation. To address the learning styles and needs of diverse students, the instructional materials will
include a variety of activity types, including individual and collaborative tasks, oral and written
performance, and computer and pen/paper activities.
Once the materials are ready, they will be implemented by the participating teachers in their classes. To
measure whether or not the interventions were successful in increasing students’ online evaluation
ability, the ORCA will be administered at the beginning and end of the intervention period. Additional
artifacts from the classroom will be collected to gain a nuanced view of students’ performance on the
instructional activities, including audio-recordings of collaborative tasks, written documents, and
screen capture of online inquiry tasks. Members of the research team will attend classes when the
instructional materials are implemented to take field notes. Instructors will also record their
impressions about the materials and student learning through post-class reports. Interviews and focus
groups will be carried out with both teachers and students to gain further insight into their perceptions
about the instructional activities designed for the intervention.
In Phase 3 (Dissemination and Transfer), we will complete data analysis and evaluate the effectiveness
of our instructional activities. Through a comparison with the student performance data (i.e., the
ORCA) administered during the Exploratory phase, we will explore whether the instructional materials
were more effective than the teachers’ prior methods of teaching online evaluation. Qualitative data
analysis of the student performance artifacts, teacher interviews, and student focus groups will
complement the quantitative findings.
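For illustration only, the core quantitative comparison described above can be sketched as a paired pre/post analysis. All values below are simulated placeholders, not study data; the score scale and the size of the gain are assumptions made purely to show the shape of the analysis.

```python
import numpy as np
from scipy.stats import ttest_rel

# Simulated placeholder data: ORCA Evaluate-dimension scores for the
# same n = 150 students before and after the intervention (the scale
# and average gain are illustrative assumptions, not study data).
rng = np.random.default_rng(42)
pre = rng.normal(8.0, 2.0, size=150)
post = pre + rng.normal(1.5, 1.5, size=150)  # assumed average gain

# Paired t-test: did scores change between the two administrations?
t_stat, p_value = ttest_rel(post, pre)
mean_gain = post.mean() - pre.mean()
print(f"mean gain = {mean_gain:.2f}, t = {t_stat:.2f}, p = {p_value:.3g}")
```

The final analysis plan would be settled with the research team; a mixed-effects model accounting for class-level clustering would be a natural refinement of this simple paired comparison.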
Our first professional development session will be a workshop for participating teachers to help revise
the materials to improve upon any weaknesses identified in Phase 2 and adapt/expand the materials for
use in a wider range of Secondary III classes. After finalizing the materials, we will provide workshops
for additional teachers during pedagogical days at the participating schools, and in the school board, so
that teachers can implement the activities in their own classes. We will also disseminate the materials
for teachers to use through professional organizations (e.g., conferences, newsletters). Additionally, we
will create a website that will provide links to both practical (e.g., lesson plans, assessments) and
theoretical (e.g., peer-reviewed articles) resources.
During Phase 3, we will also explore whether the instructional activities had differential impacts across
variables including gender (male, female, non-binary/two-spirit), first-language background
(Allophone, Anglophone, Francophone), socio-economic status, and language proficiency. We will
explore these demographic factors to determine if students with different characteristics benefit equally
from the intervention. In the final professional development workshop, materials will be revised in
cooperation with the practitioners to help meet the needs of all demographic groups. We will also
encourage the practitioners to develop new materials and assessment tools that target the same objectives
while accommodating diverse student populations. Finally, Phase 3 will also consist of dissemination
activities through conference presentations and publications.
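As a minimal sketch of the planned differential-impact analysis, the fragment below compares simulated gain scores across one demographic factor, first descriptively and then with a one-way ANOVA. The group labels come from the proposal, but every numeric value is a fabricated placeholder.

```python
import numpy as np
import pandas as pd
from scipy.stats import f_oneway

# Simulated placeholder data: per-student gain scores tagged with
# first-language background (values are illustrative, not study data).
rng = np.random.default_rng(7)
df = pd.DataFrame({
    "gain": rng.normal(1.5, 1.0, size=150),
    "language": rng.choice(
        ["Allophone", "Anglophone", "Francophone"], size=150),
})

# Descriptive comparison of mean gains per group...
summary = df.groupby("language")["gain"].agg(["mean", "count"])
print(summary)

# ...followed by a one-way ANOVA testing whether mean gains differ.
groups = [g["gain"].to_numpy() for _, g in df.groupby("language")]
f_stat, p_value = f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```

The same pattern would apply to gender, SES, and language proficiency; with multiple factors, a regression with demographic covariates would likely replace separate ANOVAs.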
Appropriateness, Rigour, and Justification of the Methodological Approach: This methodological
approach is appropriate because it will enable partnership building between researchers and
practitioners, while supporting the research-to-practice pipeline. Scholars have noted the importance of
involving stakeholders in the generation of new knowledge to make the research more socially relevant
(Chevalier & Buckles, 2009).
We have taken a number of steps to ensure the rigour of our mixed methods, action research study.
Considering that quantitative and qualitative approaches have distinct indicators of rigour (O’Leary,
2017), we will discuss each in turn. To emphasize the indicators of rigour, we have italicized
them below. In terms of the quantitative dimension of our study, we have ensured that we are using a
validated instrument (i.e., the ORCA) to collect data regarding students’ performance in evaluating
online information. Next, we have ensured that we have used an appropriate sampling plan in order to
obtain sufficient power for statistical analyses. Also, we will ensure the reliability of the test data by
administering the test under the same conditions in each of our six participating classes. By repeating
the series of interventions and the tests with six classes, we will ensure that our results are
reproducible.
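The claim of sufficient power can be made concrete with a back-of-the-envelope calculation. Using a normal approximation (an assumption; the registered power analysis would depend on the final design and statistical test), the minimum detectable effect size for a two-sided paired comparison with n = 150, alpha = .05, and power = .80 works out as follows.

```python
from scipy.stats import norm

def min_detectable_effect(n, alpha=0.05, power=0.80):
    """Normal-approximation minimum detectable effect size (Cohen's d)
    for a two-sided paired comparison with n paired observations."""
    return (norm.ppf(1 - alpha / 2) + norm.ppf(power)) / n ** 0.5

# With the proposal's sample of 150 students:
print(round(min_detectable_effect(150), 2))  # → 0.23
```

In other words, a sample of 150 would be sensitive to fairly small standardized effects, before accounting for attrition or clustering by class, both of which would push the detectable effect upward.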
For the qualitative dimensions of our study (e.g., teacher interviews, teacher post-class reports, student
focus groups, student artefacts, classroom observation), we have designed our research study to have
greater ecological validity by collecting data in the classroom (vs. in a laboratory setting).
Furthermore, we will use classroom teachers to conduct the intervention instead of a researcher.
Although using one researcher would increase the reliability of the study (e.g., by eliminating the
variable of teacher effectiveness), it would diminish the ecological validity. We wish to know if the
interventions we design are practical and accessible for classroom teachers, and thus it is important for
us to have classroom teachers administer the intervention. By soliciting feedback from both teachers
and students across six classes, we will be increasing the authenticity of our findings by recognizing
multiple voices with multiple perspectives. Considering our sampling plan using six classes, we are
confident that we will achieve saturation during our data analysis. To ensure greater dependability, we
are designing protocols for the interviews, focus groups, and classroom observation. This will ensure
that our qualitative data collection is systematic and well documented. Throughout the research
process, we will keep detailed notes and logs to increase the auditability of the research. Considering
the appropriateness and rigour of our methodology, we feel that our methodological approach is
justified.
Appropriateness of Timetable: The proposed study’s timeline is divided into three major phases, as
illustrated in Table 2. Phase 1 (Summer 2020 – Winter 2021) is the exploratory phase, during which
ethics protocols will be prepared and submitted to both universities and school boards, classrooms will
be observed, and practitioners will be consulted about their students’ needs, instructional objectives,
and pedagogical materials. Student performance will be assessed through the ORCA tool, and students’
self-efficacy regarding critical thinking in the context of online evaluation will be elicited. Toward the end of
Phase 1, the instructional plans for each participating class will be finalized.
Phase 2 (Summer 2021 – Winter 2022) is the intervention phase. All the experimental materials will
be finalized, and professional development sessions will be held with practitioners after teacher guides
for the materials are created. The main activity will be data collection, starting in the fall and ending in
the winter.
Phase 3 (Summer 2022 – Fall 2022) will concentrate on dissemination and transfer activities after
completing data analysis activities. Diffusion and transfer activities will culminate with professional
development sessions with the practitioners, wider distribution of the instructional materials,
manuscripts for journals, and submission of the final research report. We feel that this timetable is
realistic and appropriate considering our past research experience and the expertise of our research team.
Table 2: Timetable
Phase 1: Exploratory
Summer 2020
• Submit ethics protocols and consent forms to Concordia
• Submit ethics protocols and consent to school boards
• Work with practitioner partners to establish tentative schedule
for school visits
Fall 2020
• Wait for ethics approval from school boards
• Request approval from principals to do research in specific
schools
• Design/administer teacher interview protocol
• Observe classes and take field notes
• Hold professional development (PD) session with practitioners
(potential instructional targets)
• Design questionnaire to collect demographic data referred to in
research question 5 (i.e., gender, socio-economic status, first-language
background, and language proficiency)
Winter 2021
• Continue to observe classes and take field notes
• Assess students’ critical thinking in the context of online
evaluation via ORCA diagnostic assessment; at the same time,
collect demographic data via questionnaire
• Elicit teachers’ perceptions about student performance on
ORCA test
• Create/administer student focus group protocol
• Hold PD sessions with participating teachers (revisit
instructional targets based on student data)
Phase 2: Intervention
Summer 2021
• Finalize all instructional materials
• Prepare teacher guides for instructional materials
• Plan tentative data collection schedule
Fall 2021
• Confirm data collection schedule
• Hold PD session with practitioners (implementation of
instructional materials)
• Begin data collection
Winter 2022
• Finish data collection
• Hold PD session with practitioners (assess instructional
activities)
• Begin data analysis
Phase 3: Dissemination and Transfer
Summer 2022
• Continue data analysis
• Begin diffusion, transfer
Fall 2022
• Finish data analysis
• Continue diffusion, transfer
• Begin writing the research report
• Hold PD session with practitioners (share initial results)
Winter 2023
• Finalize the research report
• Dissemination through publications
• Hold PD session with practitioners (share final results &
extension activities)
Consideration of the Relevance Committee’s Comments
The committee suggested making the link between our research proposal and 21st-century
competencies more explicit, rather than focusing solely on information literacy. The development of
critical thinking in the context of online credibility evaluation was a key part of our proposal, and we
made that connection more explicit in the revised proposal. The committee further had two comments
on the anticipated outcomes of our project. First, they recommended a back-and-forth between the
development and the testing of the materials in the classroom. This was part of our research design, and
we have made this aspect of the project clearer in the current version of the proposal. Second, the
committee wondered whether all links between Quebec educational policy documents and our project
had been made explicit. The revised proposal aligns specifically to Axis 1 (le développement de l’offre
de formation) and Objective 1.1 (définir les compétences numériques et les intégrer efficacement dans
l’offre de formation) of the Digital Action Plan (Plan d’action numérique), Dimension 11 (développer
sa pensée critique envers le numérique) of the Digital Competence Framework (Cadre de référence de
la compétence numérique), and media literacy among the broad areas of learning of the Quebec
Education Program for secondary schools. Finally, in response to the committee’s comments, we have
specified the training opportunities for teachers and we have articulated the team members' roles more
clearly in the revised proposal.
COMPETENCE
Quality of the Team’s Experience and Achievements
Our team consists of researchers and practitioners from across secondary and post-secondary, private
and public, and Anglophone and Francophone educational institutions. Team members from Concordia
University include Julie Corrigan (Assistant Professor of Digital Literacies), Kim McDonough
(Professor and Canada Research Chair in Applied Linguistics), and Heike Neumann (Senior Lecturer in
English as a Second Language). Although early in her career, Corrigan’s research in the area of digital
literacies has already been recognized with $232,000 in scholarship funding, including a SSHRC
Bombardier Canada Graduate Scholarship. Her dissertation was also nominated for the University of
Ottawa’s Social Sciences and Humanities Outstanding Dissertation Award, as was her master’s thesis.
Thus far, she has published 10 articles in peer-reviewed journals and was first author on five of these
articles. Her work has appeared in some of the top publications in the field (mostly Q1), including
Research in the Teaching of English, Computers & Education, The Qualitative Report, the Nordic
Journal of Digital Literacy, and Assessing Writing. She also co-authored “Writing Research from a
New Literacies Lens,” an invited chapter for the Handbook of Writing Research (second edition),
widely considered the authoritative text for writing researchers. Aside from scholarly publications,
Corrigan has disseminated her research to pre-service teachers during invited talks with faculties of
education, as well as at professional development activities with in-service teachers.