Teaching Pre-service Teachers Content Area – Easy… Technology – Not So Much

Hubbard, J. D., & Price, G. (2013). Cross-culture and technology integration: Examining the impact of a TPACK-focused collaborative project on pre-service teachers and teacher education faculty. Journal of the Research Center for Educational Technology, 9(1).

 

 

Cross-Culture and Technology Integration: Examining the Impact of a TPACK-focused Collaborative Project on Pre-Service Teachers and Teacher Education Faculty (Hubbard & Price, 2013) reports on research conducted with pre-service teachers.  The intent of the research was two-fold: first, to have pre-service teachers create an instructional lesson using the TPACK model as a basis; second, to determine how those pre-service teachers might incorporate technology into their lessons once they become in-service teachers.

The study was grounded in research supporting several sub-components of TPACK (Technology, Pedagogy, and Content Knowledge).  The investigators cited literature backing the need for research in five different intersecting categories, yet after detailing each they chose to focus on only two of those for this study, the last two of the five.  Laying out the literature section in this way proved to be one of the weaknesses of the article: the most important items should have been listed first and given more weight and supporting research, and no justification was ever offered for why those two of the five were targeted.

The researchers went on to describe that they wanted the pre-service teachers to create a Learning Activity Type (LAT) project that required the application of inquiry-based social studies skills.  They also mandated the use of Microsoft Photostory 3.0 for the technology component.  The pedagogical basis was the concept of culturally responsive instruction, which corresponded to the social studies content knowledge of multicultural and global perspectives.

The pre-service teachers were required to interview a foreign-born person and then use internet research skills to gain additional information about the country from which that person came.  They were then tasked with organizing that information and using Microsoft Photostory 3.0 to create a final project telling the story of their interviewee's life, culture, and heritage.

Eighty-three students participated in this project, all of whom were juniors at a university enrolled in a K-6 elementary education program.  These students were assigned to one of four classes of about twenty students each, and each class had one of four instructors.  The students also met periodically, as a class, in the computer lab.  A separate instructor was available there whose sole job was to help with the technology aspect of the assignment.  That person kept a journal of his observations of the classes for the research project but was not considered an instructor for purposes of this study.

The quantitative data came from two sources: surveys given to the students and a survey given to their instructors.  The student survey had nine questions describing the learning experience, with one Likert scale used for the first five questions and a different scale for the remaining four.  Of the 83 student surveys returned, 82 were usable.  The other data set came from the instructor survey.  Of the four instructors, one was also one of the researchers, and that person chose not to complete a survey in order to minimize bias in the results.  This is another weakness of the research: there were only four instructors to begin with, and one, rightly, withheld a response, reducing the instructor data by 25%; with such a small sample size, that may have had a large impact on the results.  A strength, however, is that the other three surveys were sent out to be evaluated by a separate set of technology experts rather than by the researchers working on this project, in order to minimize any conflict of interest.  The instrument had a reliability coefficient (Cronbach's alpha) of .832, and the standard error of measurement was 2.033 (Hubbard & Price, 2013).
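For readers unfamiliar with those two statistics, the standard formulas below show how they relate.  The article reports only the final values, so the back-solved standard deviation in the last line is my own inference and assumes the reported standard error of measurement was computed from the reported alpha, which the article does not state.

% Cronbach's alpha for k items, with item variances \sigma_i^2 and total-score variance \sigma_X^2
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^2}{\sigma_X^2}\right)

% Standard error of measurement, given a reliability estimate r_{xx} (here, alpha)
\mathrm{SEM} = \sigma_X\sqrt{1 - r_{xx}}

% Back-solving with the reported values (assumption: the SEM was derived from this alpha)
\sigma_X \approx \frac{2.033}{\sqrt{1 - 0.832}} \approx 4.96 \text{ scale points}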

The results from the first five questions on the student survey showed that the pre-service teachers were pleased with the class: between 86.6% and 95.1% answered fairly or very useful on items such as the hand-outs used in class and the overall usefulness of the class.  The remaining four questions garnered a more varied response, with only 37.9% reporting that they were fairly or very likely to use Microsoft Photostory 3.0 as a teaching or learning tool.  The results of the instructor survey showed that although the instructors felt “very satisfied” with the course, they also felt an overall lack of comfort with the technology tool being used, which left them unable to help their students to the extent they would have liked.

In addition to the two surveys, artifacts were collected throughout the research project: the researchers recorded classes on video, held one-on-one meetings, took notes, and interviewed pre-service teachers.  The results indicated that the project did not overwhelmingly help pre-service teachers view technology as a necessary component of teaching, though the survey showed it did help them gain an awareness of technology and content knowledge (i.e., the culturally responsive component).  The sample, however, is too small for the results to be generalizable.

Although this research focused on pre-service teachers and I want to create a TPACK action research project for my fifth-grade classroom, I still found many ways to apply some of these concepts to my own project.  One thought generated by the survey results was that the pre-service teachers did not understand why they were required to use Microsoft Photostory 3.0; they saw it only as a requirement to contend with rather than a concept to master.  I can definitely apply that finding to my research: my students may respond better to the technology I use in my study if they understand that it is something to learn in and of itself and not just a meaningless requirement.

I also made a connection between this piece and the article Rural Elementary School Teachers’ Technology Integration (Howley, Wood, & Hough, 2011).  That article described how vital the teacher’s attitude is to the successful implementation of technology.  Although the situations differ, since the instructors in this study did want the experience to be successful, their survey results showed that they were uncomfortable with the tool, and they conveyed that not knowing how to use the technology left them unable to help their students during the projects.  I wonder how much more successful this entire study might have been if the four instructors had been well-versed in the tools they were requiring their students to use.

The final connection for me is the concept of completion.  Although sharing was not strictly required, the students were not allotted time to present their finished projects.  While the authors may not have seen that as necessary for successful implementation, I wonder whether it might have made the project more appealing to the students, raised some of the pre-service teachers’ scores, and ultimately strengthened their desire to implement technology in their own classrooms down the road.  That, for me, is another lesson I will take into my own action research project.

 

Howley, A., Wood, L., & Hough, B. (2011). Rural elementary school teachers’ technology integration. Journal of Research in Rural Education, 26, 1-13. http://www.mendeley.com/catalog/rural-elementary-school-teachers-technology-integration-3/