Doherty, I. (2011). Evaluating the impact of professional development on teaching practice: Research findings and future research directions. US-China Education Review A, 703-714. Retrieved June 12, 2014, from http://eric.ed.gov/?id=ED527691
Professional development (PD) centered on meaningful learning activities is generally considered highly effective. In his article, Iain Doherty (2011) sought to address whether there is a correlation between a satisfactory participant experience following a professional development session and educators actually implementing the skills and strategies learned in their teaching practice. Previous research suggested that meaningful professional development rests on several principles: 1) contextual realism that intimately connects with teaching practice, meaning PD modules are linked to practical teaching situations and challenges; 2) content that allows learners to connect new information with preexisting schematic frameworks; 3) authentic activities that mimic how the new information will be used in future practice; 4) multiple and diverse perspectives; and 5) collaborative reflection that promotes the articulation of new knowledge (Doherty, 2011).
Doherty (2011) utilized PD modules built with the aforementioned considerations firmly in mind. The modules gave university-level educators in New Zealand information about implementing various technological tools that could enhance their students’ learning, including blogs, social networking sites, and wikis, among others (Doherty, 2011). Further, the PD sessions were not sage-on-the-stage, sit-and-get style sessions; each educator in attendance had the opportunity to create accounts and begin using the tools during the session. To measure his results, Doherty (2011) gave the educators a pretest that asked them to assess their own familiarity with the web-based tools they were about to study. Following the session, participants were again asked to self-assess their knowledge, awareness, familiarity, and ability to implement each tool in instruction. Doherty (2011) found that participants were significantly more knowledgeable about the resources following the training than they were before it.
To truly assess the modules’ effectiveness, Doherty returned to the educators three months after the PD modules had been completed and administered a survey designed to assess whether participants had, in any fashion, begun to implement the knowledge gained in the PD sessions in their instruction. Doherty (2011) found that the vast majority of participants (91-96%, depending on the technology) had not utilized any of the technology showcased, despite very strong reviews immediately following the sessions. Doherty then sought to supplement his quantitative results with qualitative information gathered through interviews with willing participants. His sample had diminished from an initial 27 participants to only seven who agreed to be interviewed; only one of the seven had made use of any of the technology tools in their instruction, and the others were unable to articulate why they had not begun to implement the strategies they had learned.
Reflecting on Doherty’s (2011) methods and results, I see connections to my own community of practice as well as areas of strength and weakness, each of which I want to address in turn. In my role, one of the things we emphasize is follow-up to ensure that educators 1) feel supported as they begin to utilize the methods discussed during the actual PD session and 2) actually implement the strategies and tools in their professional work. I think that had Doherty offered ongoing implementation support to the educators, he may have seen significantly higher rates of tool utilization. When I have been a participant in professional development sessions, I’ve left feeling very motivated by all that I am able to do with the new tools and strategies, but if I don’t begin to utilize them almost immediately, I begin to lose my grasp of their capabilities and how to integrate them into instruction.
One of the strengths of Doherty’s methods was the manner in which he assessed his participants’ knowledge before and after the session and then gauged their implementation by following up with attendees three months after the modules had been completed. This gives a good understanding of 1) the comfort and familiarity levels with which attendees entered the session, 2) the effectiveness of the facilitator(s) in communicating the desired knowledge to the participants, and 3) how valuable the content was to the educators, as measured by the rates at which they actually utilized the information conveyed. This approach was a strong one, as it assessed the participants at various, predetermined intervals, providing information that a short-term data collection period wouldn’t even come close to measuring.
Another particular strength of Doherty’s procedure is the use of interviews and qualitative methods to supplement the quantitative information. Doherty (2011) chose to interview participants to pinpoint why and how they chose to, or in his case, chose not to, utilize the information conveyed through the professional development sessions. Though the interviewees could not self-identify the root causes of their inaction, the process of interviewing participants, in addition to a simple post-assessment, offers invaluable insight that might not otherwise be communicated to the researcher. This research model provides a framework that I can utilize as I begin to plan my own innovation in the area of professional development, combining short- and long-term quantitative data with qualitative data to provide further information.
The last area I want to address regarding Doherty’s (2011) methods is a lens he lacked: one through which he ought to have collected and analyzed data to gain an even more meaningful perspective on the role of professional development in education. In the introductory paragraph, he writes, “[professional development] is important to improve and enhance student learning” (Doherty, 2011, p. 703). If educators are tasked with improving outcomes for students, and professional development is meant to play a role in that charge, then improvement in student performance, whether academic, social, behavioral, or otherwise, should be an essential consideration when measuring the effectiveness of any session, content, or implementation. Given that Doherty (2011) states this purpose of professional development, the change in student outcomes would have been a valuable lens through which he could have collected and analyzed data.