Lu, J., & Deng, L. (2012). Reading actively online: An exploratory investigation of online annotation tools for inquiry learning. Canadian Journal of Learning and Technology, 38(3), 1-16.
Critical thinking is a difficult concept, and students need to learn the skills necessary to practice it. My research hopes to incorporate technology, critical thinking, and sound pedagogy so that students achieve the maximum benefit when learning. This research study looks at how a specific piece of technology can be used to help students engage critical thinking skills during reading. The study was conducted in Hong Kong with students who were the equivalent of tenth graders in the United States. The literature review was organized into two categories: a review of the pedagogy behind annotation and a review of currently available web-based annotation programs. The literature shows that the more annotations readers take, the more their comprehension increases; this holds for both the frequency and the quality of those notes. The annotation process is helpful whether done individually or collaboratively, and this information was factored into the study. Five online annotation tools were available. The literature detailed the differences between them and then explained the rationale for choosing Diigo (Digest of Internet Information, Groups and Other stuff). Diigo provided a few features that allowed students to interact with the text in ways that some of the others did not, and the authors believed it was best suited to support the critical thinking process.
The study consisted of two classes of students. One class was an advanced-level class; it began with 44 students, but the study assessed only 42 after accounting for factors such as excessive absences. The other was a class of regular education students; it began with 37 students, but only 27 were assessed once absences were similarly weighed. The difference between the two types of classes was purposeful. One question the researchers examined was how the two groups' behaviors and perceptions of Diigo differed. Other goals of the study were to find out how all of the students used the technology, how they perceived it, and how their actual use compared to their reported responses. The research was conducted in four sessions. First, the teacher gave the students material to read. The next session required the use of Diigo. In the third session, students worked independently and could take notes or interact with the text however they chose. The fourth session was for students to share their notes with each other and synthesize information.
The researchers individually observed and assessed each note card entered into the Diigo system, and calculations were based on those results. The notes were categorized into four sections: define, tag, record, and discuss. A Likert-scale assessment was also used to measure the students' opinions, and MANOVA tests were performed. Scores were adjusted to account for the different numbers of students in the two classes. The results showed that Class A used, and reported liking, the sticky notes more; they also used the define category the most. Both groups reported enjoying the notes according to the survey results; however, Class B produced so few notes that some categories could not be fully assessed.
Strengths and Critiques
This research compares two classes, one of which had high-achieving students. One of the big problems is that the tool being assessed depends on students' ability to read critically. The reason Class A may have had more notes in Diigo may have had nothing to do with the product or its effectiveness and everything to do with the skill of the Class B students in completing the assignment. The research did not detail the reading level of the material presented to the classes, nor did it specify whether both groups read the same passage. There were a few other problems as well. Class A and Class B were very small, making the results ungeneralizable even if both classes had contained the same type of learners. Another problem is that one class lost two students while the other lost ten. There might be some dynamic or secondary issue (behavior, illness, etc.) that had an effect on the remaining students, which could, in turn, affect the results; the authors did not address that issue. Finally, the design called for students to be grouped by their teachers. Since both classes started out larger and ended smaller, the research did not explain how groupings were changed during the project as absences occurred. Since Group B lost ten students and Group A lost only two, that might have been another factor.
The researchers evaluated each note card themselves, which left room for subjective interpretation of the cards. No independent party also evaluated the messages on the sticky notes, so the breakdown of the data could have been construed differently had someone without a bias been the arbiter.
The literature review was broken into two sections. The first section evaluated the pedagogical value behind annotations. The second section was not a review of literature at all; rather, it was a review of the annotation products currently available on the internet, and it explained to readers the rationale behind the choice of Diigo as the tool for this research.
The layout of the paper was fine; however, one typographical error was very noticeable and created difficulty when reading. At the end of the Research Question section, the paper stated that there were three questions of focus but then proceeded to list four. I reread that passage several times because I was initially unsure whether the mistake was that the three should have been a four (which is what I concluded) or whether one of the questions on the list was the error. That mistake was very confusing and distracting.
I think the researchers had a very valuable idea in choosing to research how a specific piece of technology can assist in building students' reading skills. To extend beyond this study, the research needs to be repeated with several changes. First, more students need to be involved. Second, consistent academic levels need to be considered. Once this study has been repeated with those corrections, other studies could be done to compare the other technology options presented at the beginning of the literature review section. Another direction this study could take is to compare online annotation to traditional annotation with paper and pencil or sticky notes. This study compared higher-level students with average students, but on a very small scale. The next study could be done on a larger scale with groups of students at both high and average academic levels, which would make the results generalizable. The reading passages each group receives could match their abilities, and the results could be compared afterward, thereby eliminating that variable as a factor.
Relate to Another Reading
The literature review and discussions in Effects of Technology on Critical Thinking and Essay Writing Among Gifted Adolescents (Dixon, F., Cassady, J., Cross, T., & Williams, D., 2005) argued that little research has been conducted specifically on how gifted students learn. Both studies look at how technology is used by high-achieving students. Neither study used groups large enough to make its results generalizable, so neither was able to contribute much to the overall literature on the way gifted students acquire knowledge.
Brainstorms for My Area of Interest
This study had two of the three components that I am looking to use in my study: the technology and the development of critical reading. The pedagogical base was discussed in the literature review but not analyzed in the research, so I do not consider it fully part of the study. I was completely unaware that this type of annotation technology existed until I read this research. Diigo seems like a simple, free, easily accessible piece of software, and the literature section described several of the available options and provided the sites to access them. What it made me realize is just how much may be obtainable for my research that I have not even thought about. I recognize now that I may have been limiting my options, and I am going to begin trawling through many sources to explore what else may be possible before I narrow my research decisions.
Dixon, F., Cassady, J., Cross, T., & Williams, D. (2005). Effects of technology on critical thinking and essay writing among gifted adolescents. Journal of Secondary Gifted Education, 16(4), 180-189.