The Whole Teacher Enchilada

Twining, P., Raffaghelli, J., Albion, P., & Knezek, D. (2013). Moving education into the digital age: the contribution of teachers’ professional development. Journal of Computer Assisted Learning, 29(5), 426-437.

I chose that title today because I had no idea what I was getting myself into when it came to today’s article. Today’s title is a play on the more common phrase “the whole enchilada,” meaning the whole kit and caboodle, everything, the kitchen sink, and the dish ran away with the spoon…you get the picture, right?
Well, the article I read pretty much summed up a lot of the major research surrounding teacher professional development (TPD), both in general and as it pertains to information and communication technology (ICT).

This journal article was particularly unique because in some respects it reminded me of a meta-analysis that combines large bodies of information and distills them down to some basic, general “truths.” And yet it also reminded me of a conference paper, which tends to be a bit more loosey-goosey in its methodologies and formalities.

Let me explain. This article stemmed from a series of discussions at the 2011 EDUsummIT, and from one group in particular, the Technical Working Group on Teacher Professional Development (TWG3). This group consisted of twenty-one individuals of fourteen different nationalities, and their discussion centered on how TPD could ensure that teachers are prepared to use ICT to promote the skills and ways of learning needed today. As they began their conversation, they focused on what the current literature already said about TPD in general and anything pertinent regarding ICT.

The article then delves deeper into the discussions about the goals of integrating ICT into TPD and the obstacles and layers of infrastructure that must be considered when planning or working within this arena. Generally speaking, many of the goals centered on the idea that the effective and seamless use of ICT in the classroom would be a transformative element to the status quo.

The final portion of the article explores the topics the group came to a consensus on, the discussion still to be had, and the conclusions this highly diverse group was able to agree upon. Some of the big conclusions were that most educational settings internationally do not employ TPD best practices, that the distance between practitioner and researcher must shrink and information must flow in both directions in that relationship, and that ICT use must be modeled by and expected of educators for it to truly have a transformative effect on education.

As I stated previously, this article was similar to a conference paper in that it seemed less formal and more conversational in nature. As such, I think it was very easy to read and follow. I also believe that this one article, as I mentioned before, is similar to a meta-analysis in that it covers a large breadth of information and is able to distill it down to some very basic, compelling components. In doing so, this article has contributed much to the field of education, and specifically to the branch of education dealing with teacher preparation and continued development. Any new district or site administrator could easily come to this article to see highlighted best practices gleaned from strong, international research. And with the references in hand, you could look further into what a particular author or article says about TPD and the evidence behind those claims.

I do think this article straddles the fence in a few places, where parts of it are real strengths and contributions and others are surely areas that could be improved upon. For example, the literature review wasn’t exactly a review. It did cover a breadth of established research, but it didn’t really tease out the information from each of those articles. I also believe that the omission of some texts can be as intentional as the inclusion or use of others. To me personally, the literature review didn’t paint a well-rounded enough picture for me to feel comfortable saying it was comprehensive.

Springboarding off of that last thought, I arrive at a similar one regarding theoretical frameworks. This article explored, very briefly, three frameworks and touched a bit on how each generally applied to the present discussion of TWG3. But none of these frameworks were used extensively or explored thoroughly enough for me to really consider them a strength.

Finally, I would have to say that the “data collection” was completely absent. I understand that this article was not an empirical study, but there was some ambiguity around the practices employed in creating it. For example, a few times the authors stated “consensus in the discussion…” or even “the group’s consensus….” This term was never unpacked: how was a consensus created, who was or was not a part of it, and what counterarguments were raised against the dominant view? Some clarification could have brought more transparency to the article in general. There were also some value statements made about particular countries’ educational systems, and even proposed costs for TPD, that did not have any explanation or rationale tied to them. The findings were interesting in that they were broad and basic enough to be applied internationally, but I wonder whether, in being so, they lose their potency at the national or meso level.

As stated previously, I would recommend this article to others, like myself, who are researching best practices in TPD, because it is a great jumping-off point. However, I do believe that in examining this issue at such a macro, international level, some elements were overlooked: 1) the assumed bias that technology should be integrated into classrooms, 2) the assumption that it is imperative that all schools consider increasing the use of technology regardless of student/teacher access to initial and continued funding for such projects, and 3) the inherent complications and struggles that teacher preparation programs and districts face with regard to demands on their time, finances, and attention. I think an article that examined these issues concretely, clarified some of the value statements, and built more transparency into the means of coming to a consensus would prove to be an even more effective and powerful article that would truly offer some strong contributions to the field.

Who gets to tell the story and why.

I was particularly moved by one of our readings this week. The excerpts from Rosaldo’s (1994) text, Culture and Truth: The Remaking of Social Analysis, struck me with a few insights. I was really surprised by a comment that one of Rosaldo’s colleagues had made in regard to one of Frantz Fanon’s anecdotes. Rosaldo’s colleague discounted the account and its perspective because of the participatory narrative of the incident: “it just says that it takes one to know one” (p. 188). This entire text has helped me understand the history and range of stances that one must consider and take when capturing an ethnography, but I must say that I was so surprised to see the ethnographies of individuals from within a subaltern group devalued or undervalued.

I personally connected it to how historical museums, or even historical documentaries, are set up. The “unemotional,” fact-telling narrator is one important way to convey information, but the individual, connected storylines and first-person accounts are what help us relate our humanity to the humans who experienced the events.

I think one thing I’ve really begun to pull from our readings and discussions is the awareness that we can never truly “divorce” ourselves from our research or “storytelling.” So if we’re forcing ourselves to be “unemotional” and detached in our writing, why is that? Why am I involved in this research in the first place? Can I acknowledge my position, historical context, and perspective and incorporate that transparently and wisely into my work?

One other glaring moment from this text comes after the author shares an analogy about the interplay of power, position, and the oppressed telling their own story. In contradiction to his colleague’s dismissal of the idea of the oppressed analyzing and writing about their own state, Rosaldo states that our field and peers in fact miss out on something essential if the oppressed don’t write about their own situation (p. 189). He uses the analogy of a master-to-slave relationship to cue us into who may have the more difficult time truly encompassing all of the perspectives in a situation. The slave, or the oppressed, gambles their daily survival on analyzing and interpreting the mind of the master. The one in power, however, never has to engage in analyzing and interpreting the mind of the oppressed. They are engaged merely in ensuring that their own needs are met and stay met. Therefore, as Rosaldo states, it takes a more “imaginative leap to discover slave consciousness” (p. 189). This caveat helps establish the importance of gathering multiple accounts and perspectives in situational ethnographies and of having an awareness of our historical and societal positionality.

I will say that, in general, this week’s readings on indigenous epistemologies and struggles with linguicide have highlighted one of my own personal passions: dual language immersion schooling. Some of that passion arises out of my own life experiences with the racism and societal shame built up in the USA around speaking languages other than English. I think if I had a choice of what area of inquiry to follow, and it did not have to be tied to my current work, it would be dual language immersion or bilingual education. I personally feel heartened by these last two weeks of readings, because I sometimes feel battered down by local or national politics and propaganda that belittle people, heritages, cultures, and languages for purposes of hidden agendas, ignorance, or fear. I was about to say misplaced fear, and though I do think there is nothing to fear in honoring people, their history, and their language, I do think that people who oppress should be fearful of the oppressed. Time and time again, it has been these marginalized communities who have demonstrated centuries of hearty resistance and persistence, and they have yet to be crushed.

Rosaldo, R. (1994). Culture & Truth: The remaking of social analysis. Boston, MA: Beacon Press.

Putting the cart ahead of the horse.

Zembylas, M. (2008). Adult learners’ emotions in online learning. Distance Education, 29(1), 71-87.

As I continue down the path of gaining better insight into what research is out there around professional development, I’m also pursuing what research is out there regarding adult online learning. This particular branch of my area of inquiry is quite interesting because it takes me all over the world. Adult online education has been huge in European countries for decades now, and much research has been done abroad. I find these articles interesting because the focus and topic may be similar, but often it is the lens that is different and, at times, quite revealing.

The journal article that I most recently read centers on a qualitative study out of the Open University of Cyprus in Nicosia, Cyprus. This particular study followed the emotional ups and downs of twenty-two graduate students enrolled in a year-long, distance education course on the topic of multicultural education and social justice pedagogies. The majority of the participants were women, with an average age of 36. The researcher, Michalinos Zembylas, was the professor for the course, and he gathered data on the emotions of the online students through a monthly journal, two in-person interviews at the start and end of the course, 867 email messages between him and the students, his reflective journal, field notes from the face-to-face meetings, phone conversations, student work documents, and his own planning.

His findings include: new online students experience both positive and negative emotions in relation to their new roles and the new setting in which they’re learning, those emotions change over the course of the year, and men and women experience the outside pressures of being a student differently.

In general, I found there to be many strengths in this journal article. The organization and overall coherence of the article were good. It was very clear, understandable, and particularly easy to read. I do think that his participation in the study might have led him to write from a point of view that was easier to read. I believe that elements of this work appear to be original to an extent and do offer something new to the field. The fact that the students were self-reporting their own emotions, and the causes of those emotions, throughout the year-long course does seem to make this unique. I also think the socio-historical aspect of the setting in which the study took place provides a uniqueness to the text that I’ll speak more about later. I found the literature review to be thorough, and it built up naturally to the need and space for this particular study. The theoretical framework that was chosen, critical and poststructuralist thinking, seems to make sense for the setting. This framework places and identifies emotion in a space between the public and the individual; hence, the students’ self-reporting would occupy that same location. One last strength of the article was the many direct quotes captured from student documents, which really highlight and humanize the findings. One unique struggle that Zembylas’ findings surfaced was the difference between men and women in what emotions they reported and why. I will come back to this particular point later, as I personally connected with it.

As interesting and easy as it was to read Zembylas’ work, I did find a variety of elements in the research disconcerting, and they make me want to offer them up as areas for improvement. Since this research relies only on qualitative data, it would seem wise to have incorporated some elements of quantitative research, and even to have used a second reviewer to ensure more reliability across the coding and results. I found the fact that he was the students’ professor to be highly problematic for the whole design. In the article, Zembylas mentions talking about the research on the first night and inquiring how students felt about the study and his position of power. Having heard no negative responses, he offered them 10% of their grade if they would participate. I know there are rules about when you can and cannot “pay” someone to participate, but this method seems to run against the established code of conduct. If the journal assignments were somehow aligned to the topic of the course, then maybe. One other, similar concern was that the author never listed or illuminated the specific interview questions he used in this study.

In the author’s findings, he mentions that the students’ emotions changed over the year. First of all, I’m sure that, if I thought about it for a little while, I could have written these findings off the top of my head. These “findings” didn’t appear to be very revealing or revolutionary. In fact, they were a letdown given the development of the literature review and the scope of the study. And upon even deeper reflection, I don’t know how he made the “leap” from his findings to his very strong conclusions and implications. One conclusion he establishes is that the emotions of the students changed over the course of the year. Again, that is so vague and obvious that I could’ve written that sentence years ago without any data. What would’ve been revealing is an even deeper dive into the data and a more linear projection that truly demonstrated how the students’ emotions changed over time. There seems to be a chasm of disconnect between the findings and the conclusions. I was also surprised to see citations from other articles in the conclusions section. It would seem like those concepts and citations should be up in the literature review, helping to build the case for why this study is important.
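
To make that second-coder suggestion a bit more concrete: one common way to check agreement between two independent coders is Cohen’s kappa. The snippet below is only an illustrative sketch; the emotion codes and excerpts are invented, not drawn from Zembylas’ data.

```python
# Illustrative only: checking inter-coder agreement with Cohen's kappa.
# The emotion codes below are invented examples, not data from Zembylas' study.
from sklearn.metrics import cohen_kappa_score

# Codes assigned independently by two coders to the same six journal excerpts
coder_a = ["anxiety", "pride", "frustration", "pride", "anxiety", "relief"]
coder_b = ["anxiety", "pride", "anxiety", "pride", "anxiety", "relief"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")  # values near 1.0 indicate strong agreement
```

A figure like this, reported alongside the qualitative themes, would go a long way toward the coding reliability I was hoping to see.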

This article struck a personal chord because of the slightly tangential point it made: the women working as professionals and students expressed more negative emotions around the many roles and responsibilities they have to keep up alongside their studies. Over the past year and a half I have had to face this same battle, and I do think it speaks to something Zembylas mentioned. These women have gone back to college at the graduate level and are being forced to take on the evolution of equal opportunity and/or gender equality while still maintaining traditional family values. I face this balance daily and have wanted to engage further in research on the tension between the expansion of gender equality, or greater access for women to ladder-climbing experiences, and the lack of equal fluidity in the gender responsibilities shared within the family unit.

Lastly, I do think this study has opened up a new consideration for my field of influence: the incorporation of, or explicit focus on, user emotion in professional development that occurs online. Interventions could be applied that provide learners with innovative strategies for engaging with the importance of emotion in a learning experience, as well as with how those emotions correspond to particular events in the online platform.

In general, this article seemed to make quite a few leaps from its findings to its conclusions. So much so that I almost thought the author had written a set of other articles where this data was further unpacked and graphics were used to support the statements made and to help the reader track the trends that would then logically lead to an appropriate conclusion.

I will use the positives and improvements to make sure I don’t put the cart of my conclusions before the horse of my findings.

That came out of left field!

Jordan, M., & McDaniel, R. R., Jr. (2014). Managing uncertainty during collaborative problem solving in elementary school teams: The role of peer influence in robotics engineering activity. The Journal of the Learning Sciences, 0(0), 1-47.

Yosso, T. J. (2005). Whose culture has capital? A critical race theory discussion of community cultural wealth. Race Ethnicity and Education, 8(1), 69-91.

The articles that resonated with me this week are surprising, to say the least. The first article that grabbed my attention was Jordan & McDaniel’s Managing Uncertainty During Collaborative Problem Solving in Elementary School Teams: The Role of Peer Influence in Robotics Engineering Activity. This article focuses on the reasons why uncertainty can arise for an individual in various settings and also on how those individuals cope with feelings of uncertainty. One other novel element of this text is that it follows the causes of uncertainty and the coping mechanisms of 5th grade students in relation to their collaborative peers. The causes of uncertainty can come from the content or from the relationships within the collaborative group. This particular article really struck me as I started to consider its content through the lens of leadership. As an educator and leader within our communities of practice, I think it would be important and beneficial to be familiar with this information and its implications. Jordan & McDaniel (2014) state that some types of uncertainty can be good for a group because they can increase creativity and innovation. Other sources of uncertainty can be damaging to a group and its productivity because they pull attention away from creativity and innovation. As the leader in a community of practice, I would want to engage the learning community in examining this research and using it directly to better harness this “sustained,” productive uncertainty.

Yosso’s article, entitled Whose culture has capital? A critical race theory discussion of community cultural wealth, was a great article because of what it spoke about. In all the years that I’ve had in the classroom, I have heard countless educators critique my students’ parents, or their own students’ parents, on all the things they are doing wrong in regards to parenting or on what they lack. It seems that I’m more apt to hear why a parent is bad or can’t help their student at all than to hear colleagues expound on all of the essential and unique information that parents and students carry with them. It painfully reminds me of when I hear colleagues speak about their ELL students as if they have no knowledge or information at all and are complete “tabula rasas” (blank slates).

One of the other points of agreement that I had with this article came from two quotes drawn from outside it. “We need to de-academize theory and to connect the community to the academy” (Anzaldua, 1990). Now that I’ve concluded my third year working within higher education, I’m constantly plagued by the concept of community-benefiting research. Who could really use this new knowledge and put it to the most good use? If we are educational researchers and the findings from our work never reach or positively benefit students, what good is that research? I’ll conclude with one last quote from this particular article and a comment. This quote is simple yet powerful, and to me it speaks to the importance of not only publishing our work and knowledge, but ensuring that it has at least its intended impact on the target audience. “Change requires more than words on a page–it takes perseverance, creative ingenuity and acts of love” (Anzaldua, 2002). I truly believe that quote speaks to the short- and long-term tribulations, responsibilities, and joys of being an educational researcher. Well, being new to this role, I hope it does.

Anzaldua, G. (1990). Haciendo caras/Making face, making soul: Creative and critical perspectives by women of color. San Francisco, CA: Aunt Lute Press.

Anzaldua, G. (2002). Now let us shift…the path of conocimiento…inner work, public acts. In G. Anzaldua & A. Keating (Eds.), This bridge we call home: Radical visions for transformation (pp. 540-578). New York, NY: Routledge.

Inconclusive research = lame duck

Eynon, R., & Helsper, E. (2011). Adults learning online: Digital choice and/or digital exclusion? New Media & Society, 13(4), 534-551.

As I continue down the breadcrumb path of research in my general area of inquiry, I’ve decided to open up my reading to encompass articles that touch on a range of topics:
adult online learning,
effective practices for online learning,
assessing online learning,
unfacilitated online learning,
facilitated online learning,
adult communities engaged/disengaged in online learning,
various theories and frameworks that relate to online learning,
and, what I’m calling, the catch-all connection to online learning.

If I cast a wide enough net, I’m sure to catch a guiding research question that makes the most sense for my work, and in the meantime I can build up a great arsenal of knowledge about what research has been done in online learning and how it was orchestrated.

This particular article falls under the umbrella of “adult communities engaged/disengaged in online learning.” The study took place in England, Scotland, and Wales and aimed to investigate which adults were engaging in online learning activities. These activities were organized into twelve general areas, including fact checking, travel, shopping, entertainment, social networking, e-government/civic participation, informal learning, and formal learning/training. The researchers wanted to delve further into the characteristics that these adults had in common and also into what factors influenced them to engage or disengage with different types of online learning. They focused in on a central divide in the population and really wanted to distinguish between voluntary or choice-based reasons versus involuntary or “digital exclusion” reasons (p. 536).

The researchers’ methodology included using the 2007 Oxford Internet Surveys (OxIS), with a sample size of 2,350 people. These individuals were selected by first randomly choosing 175 regions in Britain and then randomly selecting ten addresses from there. Participants were aged 14 and above, and the OxIS was administered to all of them face-to-face.
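
For readers less familiar with this kind of two-stage design (regions first, then addresses), here is a rough sketch of the general idea. The region and address lists are placeholders I made up; the actual OxIS sampling frame is, of course, more involved.

```python
# Rough sketch of a two-stage sample: regions first, then addresses within them.
# The frames below are made-up placeholders, not the actual OxIS sampling frame.
import random

random.seed(42)

# Stage 1: draw 175 regions from a (hypothetical) frame of regions
all_regions = [f"region_{i:03d}" for i in range(1, 501)]
sampled_regions = random.sample(all_regions, 175)

def addresses_in(region):
    # Placeholder lookup; in practice this would come from an address register
    return [f"{region}_addr_{j:02d}" for j in range(1, 101)]

# Stage 2: within each sampled region, draw ten addresses
sampled_addresses = []
for region in sampled_regions:
    sampled_addresses.extend(random.sample(addresses_in(region), 10))

print(len(sampled_addresses))  # 175 regions x 10 addresses = 1,750 addresses
```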

As the researchers consolidated the results and captured their findings, they divided the disengaged respondents into two groups: ex-users and non-users. There were four key reasons why these individuals disengaged from using the internet: cost, interest in using the internet, skills, and access. It appeared that ex-users had made the choice to disengage because they were no longer interested in the internet, while the non-users highlighted issues of digital exclusion like access and cost.

When it came to investigating why users were disengaging from the internet for learning opportunities, the characteristics of the users did show trends, but these were complex, as much depended on what type of online learning activity was happening. Users who are highly educated, have children over the age of 10, and have high levels of internet self-efficacy were found to be more likely to engage in formal learning activities via the internet. An underlying element that was important for informal learning was having access to the internet at home. Also, upon analyzing the data set and its trends, the researchers began to see pockets of individuals who were “unexpectedly included” or “unexpectedly excluded” (p. 542).
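
I won’t try to reproduce the article’s exact statistical model here, but relationships like these (a yes/no engagement outcome related to education, children’s ages, self-efficacy, and home access) are commonly explored with a logistic regression. Below is a minimal, purely hypothetical sketch with invented variable names and simulated data, just to show the shape of that kind of analysis.

```python
# Hypothetical sketch: modeling engagement in formal online learning (yes/no)
# from a few respondent characteristics. Variable names and data are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "higher_ed": rng.integers(0, 2, n),      # 1 = has higher education
    "child_over_10": rng.integers(0, 2, n),  # 1 = has a child over age 10
    "self_efficacy": rng.uniform(1, 5, n),   # internet self-efficacy scale
    "home_access": rng.integers(0, 2, n),    # 1 = internet access at home
})

# Simulate an outcome loosely following the direction of the reported trends
linpred = (-3 + 1.2 * df.higher_ed + 0.6 * df.child_over_10
           + 0.5 * df.self_efficacy + 0.8 * df.home_access)
df["formal_learning"] = rng.binomial(1, 1 / (1 + np.exp(-linpred)))

model = smf.logit(
    "formal_learning ~ higher_ed + child_over_10 + self_efficacy + home_access",
    data=df,
).fit()
print(model.summary())  # positive coefficients = higher odds of engaging
```

The printed summary would show, for each characteristic, whether it is associated with higher or lower odds of engaging in formal online learning.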

They conclude the research with the statement that this investigation into users’ engagement with and disengagement from the internet and online learning is important because it demonstrates that the more information organizations or educational institutions have about a user, the more likely they are to be able to provide tailored, differentiated support that increases the amount of learning activity that takes place.

If I turn my critical eye on this article, I unfortunately find more ways to improve it than strengths. I think one of the greatest strengths of the article is that the content is organized clearly, and it did a thorough job of contextualizing the importance of finding answers to the research questions.

An immediate area for improvement is the literature review and the theoretical framework supporting the research. It truly appears that more time was spent explaining the need for the research than on the content and theory that act as the foundation of the work.

The data collection process in general appeared to be well thought out and sound in its randomization of quite a large sample. Unfortunately, there are a few elements of the methodology that are missing. Why was the OxIS survey tool used, and how was it the right tool for this research? What questions are on the OxIS? Also, there was no explanation of how the surveys were conducted beyond “face-to-face” (p. 537). Were responses recorded by hand by the participant or by the researcher? How many researchers were involved, and was there any training needed to maintain consistency amongst the team? And what protocol was used during the survey?

The analysis of the data and the findings were good and very detailed, but it almost seemed like the research really didn’t find much, and what was found goes to support what would already seem sensible. As for the discussion and conclusion, it just seems like the conclusion was that this survey could be done better in the future and that next time around they’ll also gather qualitative data, etc. It seems like, if that’s your conclusion, then did we find out anything important in this study at all? And if the conclusion is simply that there should be more studies in the future, then that’s not much of a conclusion. And since the study lacked conclusiveness, it makes sense that the authors weren’t able to offer suggestions for what educational organizations could do to provide tailored or individualized supports. It also seemed like there was no clear way to distinguish between factors of choice and factors of digital exclusion.

Personally, I think the researchers have a lot of room for development when it comes to building on this research. I agree with them that a strong next step would be to combine the survey with an interview to capture some qualitative data. I think a powerful follow-up would be partnering with one of the informal or formal institutions they identified, surveying/interviewing learners before and after their online learning experience, and capturing the actions the organization took to onboard users, orient them to the technology and the scope of the learning, and support their ongoing learning. This data, in concert with the other data collected, could help paint a clearer picture of who these online learners are, what needs they have, and whether the organization is fulfilling those needs.

I think this second round of research could have great potential in providing more access to equitable educational opportunities. If these researchers could really home in on the factors that exclude learners from online opportunities, or even on the actions that unexpectedly help to include individuals, then that information could be used directly by organizations to help these learners access learning opportunities that otherwise would not “exist” for them.

This article continues to help paint the picture of how much support, thought, and detail should go into your writing and research. I learned something important tonight: if you don’t have anything conclusive in your conclusions, then something went wrong!

Inspired to take action in action research…

Last week, after reading an article (Shulman et al., 2006) that thoroughly described the differences between the PhD and the EdD, I was really affirmed in my choice of ASU’s EdD program. The underlying concept of participatory action research as a line of inquiry that rises out of the needs of the local community speaks to my personal and professional desires. I joined and continue to work in the field of education because I want to be a force of positive impact that helps those who live in the community I serve.

As we grappled with elements of scholar and community identities last week, I really began to consider various aspects of research in general. Who is my community? Am I an insider? An outsider? Or some odd hybrid? Who ultimately is the beneficiary of the research? How do I ensure that they do benefit from the research? If I’m not a “part” of the community, can I even accurately identify what problems exist in the community?

I feel like these questions were not necessarily answered but rather enhanced by some of this week’s readings. A few texts in particular grabbed my attention and caused me to critically reconsider the above questions.

The first article that really made me reflect on the previous questions was Participatory Action Research and City Youth: Methodological Insights from the Council of Youth Research (Bautista, Bertrand, Morrell, Scorza, & Matthews, 2013). This article followed a project of the Council of Youth Research in Los Angeles as the council taught high school youth how to do research and then supported them as they altered some traditional tools and practices to fit their needs. The students conducted various branches of research around schooling in their local community, and many of the action research participants had personal connections to the findings, experiences, and systems the research illuminated. I think I particularly connected to this article because it seemed that these students immediately benefited from the process and findings of the research. The students walked away from the experience as more informed advocates for equitable educational opportunities in LA.

Another text that complicated the questions I’ve been grappling with comes from the Handbook of Critical and Indigenous Methodologies (Denzin, Lincoln, & Smith, 2008). A major component of the text is analyzing the methodologies and practices of traditional research and eradicating practices that reinforce colonization. A means of doing that is to truly allow indigenous cultures and communities to create their own research agendas, identify their own problems, and conduct the research in ways that uphold their values and practices. This speaks to at least two of my main concerns. Research that is conducted in this manner truly benefits the community because it rises out of a need they’ve established. It also addresses the question of whether or not an outsider to the community can accurately identify a problem. I think this text has caused me to believe that yes, an outsider might be able to identify elements of a problem that plagues a community, but they may not necessarily ever identify or establish the importance, ramifications, or depth of that problem themselves. I think the text establishes ways in which “outsiders” can assist communities in research, but that role is described as completely altruistic and entirely at the community’s direction.

One last text that I was particularly drawn to was an article entitled ‘Keeping Up the Good Fight’: the said and unsaid in Flores v. Arizona (Thomas, Risri Aletheiani, Carlson, & Ewbank, 2014). This particular article was crafted and organized very well and took a very interesting view and research stance on the Flores v. Arizona case and its implications for English Language Learner students in Arizona. One thing that immediately caught my attention, and was present throughout the article, was the very objective and distant feel of the text. I think the authors did a profound job of connecting novel concepts to the plight of ELL students and Arizonans, as well as crafting very poignant images that help illustrate that plight even more. However, what I didn’t get from this article, and what I felt from some of the others, is a sense of personal connection. I understand that writing articles in a small group may drown out a strong individual voice, and even the journal that this text was written for may demand very removed, distanced writing, but I couldn’t help feeling that this article was written in the fashion of an outsider looking in.

That may very well not be the case, but upon reflecting on the idea of participatory action research and the role that we as community members have in serving the needs of that community, I can’t help but believe that my research should be something that I’m not only passionate about but personally connected to. I hope to see that my writing reflects that element of community-member fervor and that it ultimately benefits my community.

Bautista, M. A., Bertrand, M., Morrell, E., Scorza, D., & Matthews, C. (2013). Participatory Action Research and City Youth: Methodological Insights from the Council of Youth Research. Teachers College Record, 115(10).

Denzin, N. K., Lincoln, Y. S., & Smith, L. T. (Eds.). (2008). Handbook of critical and indigenous methodologies. Sage.

Shulman, L. S., Golde, C. M., Bueschel, A. C., & Garabedian, K. J. (2006). Reclaiming education’s doctorates: A critique and a proposal. Educational Researcher, 35(3), 25-32.

Thomas, M. H., Risri Aletheiani, D., Carlson, D. L., & Ewbank, A. D. (2014). ‘Keeping Up the Good Fight’: the said and unsaid in Flores v. Arizona. Policy Futures in Education, 12(2), 242-261.

More questions than answers…

Heller, J. I., Daehler, K. R., Wong, N., Shinohara, M., & Miratrix, L. W. (2012). Differential effects of three professional development models on teacher knowledge and student achievement in elementary science. Journal of Research in Science Teaching, 49(3), 333-362.

I’ll admit that over the past few years, after receiving my Master’s in Educational Administration and Supervision, I’ve been plagued by the question, “What are the elements of great professional development that make it effective, and how do we know that?” Over the years I’ve gleaned bits and pieces of knowledge and experience from here and there, like sustained and supported professional development, tailored learning, collaborative learning communities, and reflection cycles. As I move forward in the inquiry process of this doctoral program, I look forward to delving deeper into this quagmire of a question and finding out more about the “how do we know that?” portion.

So that brings us to my highlighted journal article of the week. This week I read Differential effects of three professional development models on teacher knowledge and student achievement in elementary science (Heller et al., 2012). What I immediately appreciated about this research study is that it aimed to take three unique approaches to professional development for elementary science teachers and compare the results of teacher and student assessments to gauge the effectiveness of the different programs. The study took place across a range of states and utilized a train-the-trainer model. It encompassed 253 teachers after all data was collected and exclusions were made. These teachers were divided into four groups: 1) a control group that got the same science content knowledge as the other teachers, 2) a group who studied the content and analyzed student responses and teacher instructional choices via a set of case studies, 3) a group that received content instruction and analyzed their own students’ work throughout the unit, and 4) a group that utilized metacognition and analysis to develop their professional practices. The content that all of the groups were learning focused on electrical circuits, which is often covered in fourth grade.

After the initial study year, some teacher participants and their new students were asked to take the electricity test again, to see how much knowledge the teachers retained over time and whether they were still impacting their students positively. The results were that all three professional development courses produced large gains in the initial study year and continued to have positive effects on teacher knowledge and student achievement in the second year. The three courses also produced significant increases in content knowledge in both the initial year and the follow-up year. One of the more surprising results was the effect the professional development had on ELL students’ content scores. One element that was also being tracked was how well teachers and students could explain or justify the answers they provided on a selected-response assessment. In this area, one of the professional development courses really stood out: the Teaching Cases program.

I think in general this study had some really important strengths. One of the first that I recognized is that the study utilized selected-response assessments that had undergone extensive reliability and validity testing. The manner in which the teachers were taught the electrical circuit content was also very intentional, built around exploration and collaboration. Much care was taken, and procedures were in place, to make sure that assessment scoring was reliable and that individual trainer skill was not affecting teacher, and therefore student, scores. I think another strength of the study was how thoughtful the designers were in not creating professional development that had to be delivered in a perfect, almost impossible-to-achieve environment. The study included hundreds of teachers and thousands of students and employed a very difficult train-the-trainer setup that covered a large geographical range. There was also diversity in the student and teacher populations.

I think some of the critiques I have for the article concern its literature review and its organization. There were design decisions made for the study, but some of those decisions weren’t explained until the results section of the article. The literature and framework explaining why the three courses were designed the way they were seemed to be lacking. This is especially true when I consider that it was unclear why some smaller components were repeated across the three course types.

Another critique I had was about the execution of one of the methods they had chosen. The control condition and two of the other courses were delivered before the actual student unit was taught, and testing was done as a pretest and posttest. However, due to the nature of the Looking at Student Work course, that professional development happened concurrently with the unit. I think this is an essential consideration, given that that course was the only one that significantly improved the students’ abilities to justify their answers.

One other critique I had for the article, and this was the inspiration for the title of today’s blog, is the fact that though the study appeared to get some great results, its design and execution left many more questions at the end than at the beginning. One example: there was no real intentionality around impacting ELL students through these professional development courses, but nonetheless there was a significant impact. Unfortunately, it is unclear to what or to whom we owe that credit.

One last critique I had for one of the methods in this study was that the teacher participants themselves administered the student assessment. It seems that in research like this you would use a proctor, or have a partner proctor present, to help validate the execution of the protocol.

I found the reading of this article to be particularly interesting, as we just finished reading one of Dr. Amrein-Beardsley’s recent articles on value-added models in teacher evaluation (Paufler & Amrein-Beardsley, 2013). That article highlights the importance of random student placement to avoid bias and skewed data due to “stacked” classes, etc. Since that concept was not addressed in the Heller et al. article, I assume it was not factored into the analysis or design of the project.

I think one great way this study could be built upon is to take a deeper look at each of the professional development courses, really unpack the literature and theoretical frameworks behind them, make an intentional case for why these courses should impact ELL students, and then measure that element.

Overall, this article was illuminating but I really found myself asking more questions about the research design rather than “How can I take some of these elements and put them into practice?”

Paufler, N. A., & Amrein-Beardsley, A. (2013). The random assignment of students into elementary classrooms: Implications for value-added analyses and interpretations. American Educational Research Journal, 0002831213508299.

Intersectionality and Impact

One of the readings from this week, Intersectionality as a Framework for Transformative Research in Special Education, was truly life-changing and perspective-altering (Garcia & Ortiz, 2013). This article was particularly eye-opening because it highlighted and delved into an area of research that I have rarely considered, and certainly not at the depth that the authors covered it. As a classroom educator, I’ve often considered my identity, my positionality, the funds of knowledge that all stakeholders bring to the classroom, and even the power that I personally held through the topics I did or did not cover, the texts that were selected, and the people I recognized historically. In the article, Garcia and Ortiz gather an arsenal of research to support and propose a framework of intersectionality for research in special education. Through their article, however, I have really connected with the importance of delving deep into my many identities, the identities of my prospective research community, my insider/outsider status in relation to that community, my biases and stereotypes, the nature of my research, the appropriateness of the knowledge that is to be gleaned, and even who will benefit from that knowledge.

Garcia and Ortiz highlight the importance of this researcher reflexivity because of how it impacts what we deem important research, the methods we employ, and even the communities we involve in that research. It is possible that my past experiences, skills, and knowledge base comprise, in essence, who I am, and therefore shape who might or might not be integrated into the research that I orchestrate. The authors do an excellent job of highlighting that the inclusion of certain subgroups extends our knowledge of them and builds on the holistic body of research that exists. On the flip side of the coin, if our research does not include certain peoples, our knowledge of them does not increase, nor does their information, perspective, and unique knowledge become a part of our holistic knowledge from research.

The authors even highlighted an important element of valuing within the research community that stems from the What Works Clearinghouse, which excludes interventions for ELL students that are performed in languages other than English (p. 39). This inherent valuing of interventions done in English over those performed in other languages hurts the overall body of research on supporting ELL students, as it automatically excludes a whole other body of work that appears not to align with the organization’s socio-political beliefs on language instruction. If the Clearinghouse is supposed to be a gathering of what works, so that this information can guide political, district, and school leaders in their decision-making, then all interventions involving this population should be considered and analyzed.

Throughout my entire reading, highlighting, and notetaking of this article I found myself continuously nodding my head in agreement, saying, “huh, hmmm, huh.” I really connected with the topic and found myself convicted to analyze closely my own scholar/researcher identity, the community of learners who will and will not be a part of my research, the knowledge that I hope to glean, who will benefit from that knowledge, and the methods by which it will be gathered. One question, or nagging thought, that persisted throughout the article and continues to surface now that I’ve finished reading it is: when is it healthy or right to participate in research, when isn’t it, and who helps make that call? If, in my researcher reflexivity, I illuminate areas of bias and stereotype within my own lens, how do I go about remediating these deficiencies? How do I even notice that I have them? Is this something that can only be explored and identified in groupings of “different” people? If I have biases, does that mean I should not participate in research at all, only to some degree, or only in community with others? How do I move forward after my initial reflection and declaration of my position? Do I engage in this process at every stage of research, seeing as it is often the acquiring of knowledge and interaction with others that alters one’s identity?

I guess the “bottom line,” or the greatest connection I feel I can take away from this article, is that I have a lot of identity searching and clarifying to do, and that it appears the only way I can most “safely” traverse the difficult task ahead is to be transparent and in communion with many different people as I engage in the reflective and growing process. Earlier, I stated that this article was perspective-altering, which it has been, but even more honestly it appears to have been springboarding in its effect. It deeply causes me to ask, “What IMPACT will the act of my researching have? Really, what impact, and why?”